Eye2Drive Technology


Eye2Drive Value

The patented EYE2DRIVE sensor technology, used in the ET-1080 and ET2-GS sensors, allows vision systems to capture images with a customizable dynamic range: the response can be adjusted from a standard linear curve to very high dynamic range, with different discharge curves set for each frame. This yields hundreds of possible combinations tailored to specific sensitivity requirements.
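The "hundreds of combinations" follow directly from combining a few per-frame settings. The sketch below is purely illustrative: the mode names, the number of discharge curves, and the exposure windows are assumptions for the sake of the arithmetic, not Eye2Drive's actual configuration space.

```python
# Illustrative sketch: how a few per-frame knobs multiply into hundreds of
# sensor configurations. All names and counts here are assumptions, not the
# actual ET-1080 / ET2-GS register map.
from itertools import product

response_modes = ["linear", "mid_hdr", "high_hdr", "very_high_hdr"]  # dynamic-range settings (assumed)
discharge_curves = range(1, 17)   # assumed: 16 selectable discharge curves per frame
exposure_windows = ["short", "medium", "long"]  # assumed exposure options

# Every (mode, curve, window) triple is a distinct per-frame configuration.
configs = list(product(response_modes, discharge_curves, exposure_windows))
print(len(configs))  # 4 * 16 * 3 = 192 combinations from just three knobs
```

Even this small assumed parameter space produces nearly two hundred distinct per-frame configurations, which is why the technology can be tuned so finely to a scene's sensitivity requirements.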

We designed our technology to meet the specific needs of the autonomous navigation industry, including autonomous vehicles, robots, drones, and submarines.

EYE2DRIVE’s imaging sensors natively capture responses with the desired dynamic range, eliminating the need for external digital processing. This reduces the complexity of algorithms, data load, and power consumption of the entire imaging system.

A Step Forward

The EYE2DRIVE approach significantly improves intelligent vision systems through advanced vision sensors. Unlike traditional High Dynamic Range (HDR) techniques, it combines a fundamentally different, adaptive HDR strategy with artificial intelligence: the sensor can be dynamically reconfigured by a digital brain, such as a machine-learning system. Because the AI receives high-quality visual data that requires no additional processing, it can easily extract structured information for a wide range of applications and decision-making.
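The adaptive loop described above can be sketched as a simple capture-analyze-reconfigure cycle. This is a minimal sketch under stated assumptions: the sensor API, the brightness thresholds, and the curve names are all hypothetical stand-ins, not Eye2Drive's actual interface.

```python
# Minimal sketch of an adaptive-HDR control loop. The sensor/model classes
# and curve names are hypothetical, used only to show the feedback cycle:
# capture a frame, let the "digital brain" analyze it, reconfigure the sensor.

class DummySensor:
    """Stand-in for a reconfigurable imaging sensor (illustrative only)."""
    def __init__(self, frames):
        self._frames = iter(frames)
        self.curve = "linear"
    def capture(self):
        return next(self._frames)          # native HDR frame, no post-processing
    def set_curve(self, curve):
        self.curve = curve                 # per-frame reconfiguration

class DummyModel:
    """Stand-in for the machine-learning analysis step."""
    def mean_brightness(self, frame):
        return sum(frame) / len(frame)

def choose_curve(mean_brightness):
    """Pick a response curve from simple brightness thresholds (assumed values)."""
    if mean_brightness < 0.2 or mean_brightness > 0.8:
        return "high_hdr"                  # dark or bright scene: widen the range
    return "linear"                        # well-lit scene: linear response suffices

def adaptive_loop(sensor, model, n_frames):
    """Closed loop: capture, analyze, reconfigure the sensor for the next frame."""
    decisions = []
    for _ in range(n_frames):
        frame = sensor.capture()
        brightness = model.mean_brightness(frame)
        sensor.set_curve(choose_curve(brightness))
        decisions.append(sensor.curve)
    return decisions

# Example: a dark, a well-lit, and a bright frame trigger different curves.
sensor = DummySensor([[0.1, 0.1], [0.5, 0.5], [0.9, 0.9]])
print(adaptive_loop(sensor, DummyModel(), 3))  # ['high_hdr', 'linear', 'high_hdr']
```

The key point of the sketch is that the reconfiguration happens at the sensor, frame by frame, so the downstream AI always receives data already matched to the scene instead of correcting it digitally afterwards.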

Representation of an electronic circuit mounting the new Eye2Drive imaging sensor.

Sensors SWOT Analysis

We conducted a SWOT analysis comparing the major sensor technologies for autonomous navigation: traditional camera sensors, LiDARs, radars, and the next-generation bio-inspired sensors developed by Eye2Drive. The results of our analysis are shared in the comparison table below.

| | Camera Sensors | LiDAR Sensors | Radar Sensors | Eye2Drive Sensors |
|---|---|---|---|---|
| **Technology** | Captures images of the environment using CMOS sensors. | Emits laser beams to create a 3D map of the environment. | Emits radio waves to detect objects and measure their distance and speed. | Captures images using bio-inspired vision technology, mimicking the human eye's ability to adapt to changing light conditions. |
| **Strengths** | High resolution, cost-effective, able to perceive color and texture. | High accuracy, provides detailed 3D maps, works well in low-light conditions. | Works in all weather conditions, long range, accurate speed and distance measurements. | HDR (high dynamic range), high sensitivity, low latency, no flickering, no ghosting, and full saturation control. |
| **Weaknesses** | Performance is affected by poor lighting, weather conditions, and limited depth perception. Acquired images can require complex post-processing. | High cost; performance can be degraded by adverse weather conditions. | Lower resolution than cameras and LiDAR; limited ability to capture fine details. | Still under development; may be more expensive than traditional camera sensors, but vastly simplifies the navigation system software. |
| **Opportunities** | Can be used in a variety of applications, including object recognition and classification, lane-keeping assist, and traffic sign recognition. | Can be used for mapping, object detection, and autonomous navigation. | Can be used for adaptive cruise control, blind spot detection, and collision avoidance. | Can be used for autonomous navigation, robotics, defense, and medical imaging. The ability to handle challenging conditions and provide high-quality images opens up new possibilities. |
| **Threats** | At risk of being replaced by newer technologies, such as bio-inspired imaging sensors. The increasing capabilities of other sensors could diminish their prominence in autonomous navigation. | Too expensive for some applications. Advancements in camera technology and bio-inspired sensors could challenge LiDAR's position in the market. | Less accurate than other imaging sensors. In the future, more sophisticated sensor techniques could reduce the reliance on radar alone. | May take a long time to be widely adopted. The technology's novelty means it must compete with established sensor solutions and overcome potential integration challenges. |