Cohda Wireless fuses V2X with surround vision on NVIDIA DRIVE AI computer


Cohda Wireless announces the porting of its V2X software stack to the NVIDIA DRIVE™ AI car computer, and the fusing of V2X with surround vision for target classification. This extends a vehicle's ability to detect threats beyond line of sight, for example around blind corners, over the crest of a hill, and behind large trucks.

Cohda Wireless has been involved in vehicle trials since 2006, and now 60% of global V2X trials use Cohda Wireless technology. Deployments of Cohda’s technology have included the world’s first production vehicles, tough underground mining environments, major commuting projects, pioneering truck platooning initiatives and smart city programs to combat congestion.

V2X (incorporating Vehicle-to-Vehicle and Vehicle-to-Infrastructure) communication is widely regarded by automakers and urban planners as a necessary technology for safer, less congested roadways and an essential stage for the eventual deployment of autonomous vehicles.

The NVIDIA DRIVE platform combines deep learning, sensor fusion and surround vision to enable autonomous driving. The system is capable of understanding in real time what’s happening around the vehicle, precisely locating itself on an HD map and planning a safe path forward. Designed around a diverse and redundant system architecture, the NVIDIA DRIVE platform is built to achieve ASIL-D, the highest level of automotive functional safety.

Cohda’s V2X solutions, which support both wireless 802.11p and 5G mobile networks, provide cars with “360-degree awareness,” detecting hidden threats by extending the horizon of awareness beyond what the driver can see or on-board sensors can detect. What distinguishes Cohda solutions is the ability to “see” beyond the vehicle-centric perspective by gathering and synthesising data from sensors on nearby vehicles and roadside infrastructure – creating a much broader zone of awareness.
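The idea of synthesising remote V2X reports with local sensing can be sketched roughly as follows. This is a minimal illustration, not Cohda's implementation: the class names, the coordinate convention, and the 2-metre association radius are all assumptions made for the example.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Track:
    x: float     # metres, vehicle-relative
    y: float     # metres, vehicle-relative
    source: str  # "vision" (on-board camera) or "v2x" (received over the air)

def fuse(vision: list[Track], v2x: list[Track], match_radius: float = 2.0) -> list[Track]:
    """Merge locally sensed tracks with V2X-reported ones.

    A V2X track within match_radius of a vision track is treated as the
    same object (the local detection is kept); any unmatched V2X track is
    retained as a threat the on-board sensors could not see, e.g. a
    vehicle around a blind corner.
    """
    fused = list(vision)
    for remote in v2x:
        if not any(hypot(remote.x - v.x, remote.y - v.y) <= match_radius
                   for v in vision):
            fused.append(remote)
    return fused

vision = [Track(10.0, 0.0, "vision")]
v2x = [Track(10.5, 0.3, "v2x"),    # same object, seen by a nearby car too
       Track(80.0, 40.0, "v2x")]   # hidden object, beyond line of sight
result = fuse(vision, v2x)         # one vision track plus one hidden V2X track
```

A production stack would also account for timestamps, heading, GNSS uncertainty, and track identity over time, but the core step is the same: reconcile duplicates and keep the remote detections that extend the zone of awareness.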

Source: Cohda Wireless
