AEye introduces new sensor data type for autonomous vehicles


AEye has announced the introduction of a new sensor data type called Dynamic Vixels, which are designed to more intelligently acquire and adapt data for the company’s iDAR (Intelligent Detection and Ranging) perception system.

This advancement in AEye technology further strengthens its biomimicry approach to visual perception, essentially enabling vehicles to see and perceive more like humans to better evaluate potential driving hazards and adapt to changing conditions.

In simple terms, Dynamic Vixels combine pixels from digital 2D cameras with voxels from AEye’s Agile 3D LiDAR (Light Detection and Ranging) sensor into a single super-resolution sensor data type. For the first time, a real-time integration of all the data captured in pixels and voxels is combined into a data type that can be dynamically controlled and optimized by artificial perception systems at the point of data acquisition.

Dynamic Vixels create content that inherits both the ability to evaluate a scene using the entire existing library of 2D computer vision algorithms as well as capture 3D and 4D data concerning not only location and intensity but also deeper insights such as the velocity of objects.
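AEye has not published the Dynamic Vixel format, so the following is purely an illustrative sketch of the idea described above: one record fusing a 2D camera pixel with a co-registered 3D LiDAR return carrying location, intensity, and velocity. All class and field names here are assumptions, not AEye's actual data structures.

```python
from dataclasses import dataclass

@dataclass
class Pixel:
    """A 2D camera sample (RGB color)."""
    r: int
    g: int
    b: int

@dataclass
class Voxel:
    """A 3D LiDAR return: position, intensity, and radial velocity."""
    x: float          # position in meters
    y: float
    z: float
    intensity: float  # return intensity (normalized 0..1)
    velocity: float   # radial velocity in m/s, the "4D" component

@dataclass
class DynamicVixel:
    """Illustrative fusion of one pixel with its co-registered voxel."""
    pixel: Pixel
    voxel: Voxel

    def is_moving(self, threshold: float = 0.5) -> bool:
        # Flag returns whose radial velocity exceeds a threshold
        return abs(self.voxel.velocity) > threshold

# Fuse a dark pixel (e.g. a black tire) with a LiDAR return 12 m ahead
vix = DynamicVixel(Pixel(30, 30, 30), Voxel(12.0, -1.5, 0.3, 0.8, 1.2))
print(vix.is_moving())  # True: |1.2| m/s exceeds the 0.5 m/s threshold
```

Because the fused record keeps the camera pixel intact, existing 2D computer-vision algorithms can still run over the image side while the voxel side supplies depth and motion.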

Dynamic Vixels can also be encrypted. This patented technology enables each sensor pulse to deal appropriately with challenging issues such as interference, spoofing, and jamming, issues that will become increasingly important as millions of units are deployed worldwide.

Simply put, this new way of collecting and inspecting data using the iDAR system’s edge processing enables the autonomous vehicle to more intelligently assess and respond to situational changes within a frame, thereby increasing the safety and efficiency of the overall system. For example, iDAR can identify objects with minimal structure, such as a bike, and differentiate objects of the same color, such as a black tire on asphalt. In addition, Dynamic Vixels can leverage the unique capabilities of agile LiDAR to detect changing weather and automatically increase power during fog, rain, or snow.
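AEye has not disclosed how iDAR modulates laser power, so the following is a minimal sketch of the general idea, assuming a degraded return rate is used as a proxy for fog, rain, or snow and that power is boosted within a fixed safety cap. The function name, thresholds, and scaling rule are all hypothetical.

```python
def adjust_power(base_power_mw: float,
                 return_rate: float,
                 max_power_mw: float = 100.0) -> float:
    """Scale emitter power up as the fraction of usable returns drops.

    return_rate: fraction (0..1) of pulses producing a usable return;
    low values are taken here as a proxy for atmospheric scattering.
    """
    if return_rate >= 0.9:      # clear conditions: no boost needed
        return base_power_mw
    # Boost inversely with return rate, capped at 4x and at the
    # maximum allowed power (e.g. for eye safety)
    boost = min(1.0 / max(return_rate, 0.1), 4.0)
    return min(base_power_mw * boost, max_power_mw)

print(adjust_power(20.0, 0.95))  # 20.0 mW in clear weather
print(adjust_power(20.0, 0.4))   # 50.0 mW under fog-like degradation
```

The key design point this illustrates is the feedback loop: because the sensor adapts at the point of acquisition rather than in downstream processing, it can react to conditions within a single frame.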

Likewise, iDAR’s sensory perception allows autonomous vehicles to detect contextual changes, such as a child’s facial direction, which can be used to calculate the probability of the child stepping out onto the street, enabling the car to prepare to stop.

The iDAR perception system includes inventions covered by recently awarded foundational patents, including 71 intellectual property claims on the definition, data structure, and evaluation methods of Dynamic Vixels. These patented inventions contribute to significant performance benefits, including 16x greater coverage, a 10x faster frame rate, and 7-10x more relevant information that boosts object classification accuracy while using 8-10x less power.

AEye’s first iDAR-based product, the AE100 artificial perception system, will be available this summer to OEMs and Tier 1s launching autonomous vehicle initiatives.

Source: AEye