Eye2Drive Patented Technology

Eye2Drive Value

The patented EYE2DRIVE sensor technology (the E10-80 and ET2-GS sensors) enables vision systems to capture images with a customizable dynamic range. The range can be set from standard linear to very high, with different discharge curves selected frame by frame, yielding hundreds of possible combinations and the sensitivity each scene requires.
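To make the frame-by-frame configurability concrete, here is a minimal sketch of what selecting a dynamic range and discharge curve per frame might look like. All class, function, and parameter names here are hypothetical illustrations, not Eye2Drive's actual API; the numeric values are placeholders.

```python
# Hypothetical sketch of per-frame sensor configuration.
# Names and values are illustrative only, not Eye2Drive's real interface.
from dataclasses import dataclass
from enum import Enum

class DischargeCurve(Enum):
    LINEAR = "linear"        # standard linear response
    LOG = "log"              # compresses bright highlights
    PIECEWISE = "piecewise"  # multi-segment knee curve

@dataclass
class FrameConfig:
    dynamic_range_db: float   # e.g. 60 dB (linear) up to very high ranges
    curve: DischargeCurve
    sensitivity: float        # analog gain multiplier

def plan_frame(lighting: str) -> FrameConfig:
    """Pick a per-frame configuration from a coarse scene estimate."""
    if lighting == "tunnel_exit":   # extreme intra-scene contrast
        return FrameConfig(120.0, DischargeCurve.LOG, 4.0)
    if lighting == "night":         # low light, moderate contrast
        return FrameConfig(80.0, DischargeCurve.PIECEWISE, 8.0)
    return FrameConfig(60.0, DischargeCurve.LINEAR, 1.0)

cfg = plan_frame("tunnel_exit")
print(cfg.dynamic_range_db, cfg.curve.value)
```

Because each frame carries its own configuration, the same sensor can alternate between, say, a linear daytime response and a wide-range curve at a tunnel exit, which is where the "hundreds of combinations" come from.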

We designed our technology to cater to the specific needs of the autonomous navigation industry, encompassing autonomous vehicles, robots, drones, and submarines.

EYE2DRIVE’s imaging sensors natively acquire the response at the desired dynamic range, eliminating the need for external digital processing. This reduces algorithm complexity, data load, and power consumption across the entire imaging system.

A Step Forward

The EYE2DRIVE approach to vision sensors is a significant advance for intelligent vision systems. It overcomes the limitations of traditional HDR techniques by pairing a fundamentally different HDR acquisition method with AI engines. With adaptive HDR, a new generation of vision systems can dynamically reconfigure the sensor under the control of a digital brain, such as a machine learning system. The AI system receives high-quality visual data that it can process without additional elaboration, easily extracting structured information for its vertical applications and decision-making systems.
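The closed loop described above, in which the digital brain reconfigures the sensor frame by frame rather than tone-mapping fixed exposures afterwards, can be sketched as follows. The sensor stub, the 1% saturation threshold, and the dB step sizes are all invented for illustration; only the feedback structure reflects the adaptive-HDR idea in the text.

```python
# Hypothetical closed-loop sketch: the vision system widens the requested
# dynamic range while highlights clip, and relaxes it toward linear otherwise.
# StubSensor and all thresholds are illustrative, not a real device model.
import random

class StubSensor:
    """Stands in for an on-chip HDR sensor; returns 8-bit pixel samples."""
    def capture(self, dynamic_range_db):
        # In this toy model, a wider configured range saturates fewer pixels.
        clip_prob = max(0.0, 0.5 - dynamic_range_db / 240.0)
        return [255 if random.random() < clip_prob else random.randint(0, 254)
                for _ in range(1000)]

def saturated_fraction(frame):
    return sum(p == 255 for p in frame) / len(frame)

def adaptive_loop(sensor, n_frames=10, range_db=60.0):
    """Adjust the per-frame dynamic range based on the previous frame."""
    for _ in range(n_frames):
        frame = sensor.capture(range_db)
        if saturated_fraction(frame) > 0.01:
            range_db = min(range_db + 10.0, 120.0)  # widen the range
        else:
            range_db = max(range_db - 5.0, 60.0)    # relax toward linear
    return range_db

random.seed(0)
print(adaptive_loop(StubSensor()))
```

The point of the sketch is the direction of the data flow: the downstream system consumes frames that are already well exposed on-chip and only sends back a small configuration update, instead of receiving raw bracketed exposures and merging them digitally.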

Representation of an electronic circuit mounting the new Eye2Drive imaging sensor.

Intellectual Property & Patents

A representation of a human eye powered by artificial intelligence, illustrating the connection between the human eye and the vision sensors created by Eye2Drive.

Eye2Drive’s proprietary technology offers key advantages to the autonomous navigation industry. The company’s patented solutions enhance safety, efficiency, and user experience, boosting performance and creating a competitive edge. Its strong intellectual property portfolio secures leadership by raising barriers for competitors. Eye2Drive’s unique features add value, fostering brand loyalty and trust. As demand for advanced driving solutions continues to grow, the company is well-positioned for future growth and innovation.
Combining patented technology with customer-centric design, Eye2Drive delivers safer and more efficient driving experiences, solidifying its role as a leading industry player.

Patent References

| Filed | WIPO | US | EPO | IT PO |
|-------|------|----|-----|-------|
| 2012 | WO2013046003 | US9918026, US20140204189 | EP2761657 | |
| 2019 | WO2018229645 | US11184568, US20200204752 | EP3639512 | |
| 2021 | WO2021234566 | | EP4154517 | |
| 2024 | | | | IT102024000011512 |

Eye2Drive’s Sensors SWOT Analysis

We conducted a SWOT analysis comparing the major sensor technologies for autonomous navigation, including traditional imaging sensors, LiDARs, radars, and the new generation of bio-inspired sensors developed by Eye2Drive. The results of our analysis are shared in the comparison table below.

| | Camera Sensors | LiDAR Sensors | Radar Sensors | Eye2Drive Sensors |
|---|---|---|---|---|
| Technology | Captures images of the environment using CMOS sensors. | Emits laser beams to create a 3D map of the environment. | Emits radio waves to detect objects and measure their distance and speed. | Captures images using bio-inspired vision technology, mimicking the human eye’s ability to adapt to changing light conditions. |
| Strengths | High resolution, cost-effective, able to perceive color and texture. | High accuracy, provides detailed 3D maps, works well in low-light conditions. | Works in all weather conditions, long range, accurate speed and distance measurements. | HDR (high dynamic range), high sensitivity, low latency, no flickering, no ghosting, and full saturation control. |
| Weaknesses | Performance is affected by poor lighting, weather conditions, and limited depth perception. Acquired images can require complex post-processing. | High cost; performance can be degraded by adverse weather conditions. | Lower resolution than cameras and LiDAR, limited ability to capture fine details. | Still under development; may be more expensive than a traditional camera sensor, but will vastly simplify the navigation system software. |
| Opportunities | Can be used in a variety of applications, including object recognition and classification, lane-keeping assist, and traffic sign recognition. | Can be used for mapping, object detection, and autonomous navigation. | Can be used for adaptive cruise control, blind spot detection, and collision avoidance. | Can be used for autonomous navigation, robotics, defense, and medical imaging. The ability to handle challenging conditions and provide high-quality images opens up new possibilities. |
| Threats | At risk of being replaced by newer technologies, such as bio-inspired imaging sensors. The increasing capabilities of other sensors could diminish their prominence in autonomous navigation. | Too expensive for some applications. Advancements in camera technology and bio-inspired sensors could challenge LiDAR’s position in the market. | Less accurate than other imaging sensors. In the future, more sophisticated sensor techniques could reduce the reliance on radar alone. | It may take a long time to be widely adopted. The technology’s novelty means it must compete with established sensor solutions and overcome potential integration challenges. |