DeepRoute.ai, a global AI company, and Qualcomm have announced a new technology collaboration to develop high-performance, cost-effective advanced intelligent driving solutions for ADAS and automated driving based on Snapdragon Ride Platforms.
Utilizing a Snapdragon Ride Platform, DeepRoute.ai’s intelligent driving solutions will cover both LiDAR-based and vision-only ADAS systems, each supporting advanced intelligent driving features such as Urban NOA (Navigation on Autopilot), Highway NOA and automated parking. The companies say that this combined solution can be deployed on both fuel vehicles and new energy vehicles, meeting the personalized needs of global automakers across a wide range of vehicles. Through their collaboration, the companies will also work together to optimize AI models, such as end-to-end Bird’s Eye View (BEV) and transformer models, on Snapdragon Ride Platforms.
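DeepRoute.ai has not published the architecture of these models, so the following PyTorch snippet is only a rough, hypothetical sketch of what an end-to-end BEV-plus-transformer perception head can look like: a grid of learned BEV queries cross-attends to flattened multi-camera features. All module names, shapes and hyperparameters are illustrative assumptions, not the companies' actual design.

```python
# Hypothetical BEV-style perception head: camera feature maps are flattened
# into tokens, and a grid of learned BEV queries attends to them with a
# standard transformer decoder. Shapes and sizes are illustrative only.
import torch
import torch.nn as nn

class BEVHead(nn.Module):
    def __init__(self, feat_dim=256, bev_h=50, bev_w=50, num_layers=2):
        super().__init__()
        self.bev_queries = nn.Parameter(torch.randn(bev_h * bev_w, feat_dim))
        decoder_layer = nn.TransformerDecoderLayer(
            d_model=feat_dim, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(decoder_layer, num_layers=num_layers)
        self.out = nn.Linear(feat_dim, 10)  # e.g. per-cell occupancy/class logits

    def forward(self, cam_feats):
        # cam_feats: (batch, num_cameras, feat_dim, H, W) image backbone features
        b, n, c, h, w = cam_feats.shape
        tokens = cam_feats.permute(0, 1, 3, 4, 2).reshape(b, n * h * w, c)
        queries = self.bev_queries.unsqueeze(0).expand(b, -1, -1)
        bev = self.decoder(queries, tokens)  # cross-attention into camera tokens
        return self.out(bev)                 # (batch, bev_h * bev_w, 10)

# Toy forward pass with random "camera features" from six cameras.
feats = torch.randn(1, 6, 256, 12, 20)
print(BEVHead()(feats).shape)  # torch.Size([1, 2500, 10])
```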
By simulating a human driver’s neural network, Snapdragon Ride Platforms help intelligent driving systems understand their surrounding traffic environments, and the logic behind common driving behaviors, more naturally. The companies pointed to long-tail scenarios (such as human-vehicle negotiation, nudging around parked vehicles, passing irregular intersections and narrow-road driving) as a key example of where the intelligent driving system can deliver enhanced performance in complex road conditions. Here, the end-to-end model can help these systems understand human driving culture. For instance, when the road ahead is blocked due to construction, vehicles equipped with the model can analyze the traffic conditions in real time and promptly change lanes to pass efficiently.
For safety, DeepRoute.ai has optimized its algorithms for complex lighting conditions such as strong light, low light, and backlight. By addressing the perceptual limitations of vision-only systems in extreme environments, it delivers core performance that DeepRoute.ai says is on par with LiDAR-based solutions.
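DeepRoute.ai has not disclosed how it handles these lighting conditions. One common, generic way to harden a vision stack against strong light, low light and backlight is to train with aggressive photometric augmentation, so the sketch below is only an illustration of that general technique; the transform choices and ranges are assumptions, not DeepRoute.ai's recipe.

```python
# Generic photometric augmentation: random over/under-exposure and contrast
# shifts so a detector also trains on strong-light, low-light and backlit
# looking frames. Ranges are illustrative, not DeepRoute.ai's actual values.
import torch
from torchvision import transforms

harsh_lighting_aug = transforms.Compose([
    transforms.ColorJitter(brightness=(0.3, 1.7), contrast=(0.5, 1.5)),
    transforms.RandomAutocontrast(p=0.3),
    transforms.GaussianBlur(kernel_size=5, sigma=(0.1, 1.5)),
])

frame = torch.rand(3, 720, 1280)          # stand-in float camera frame in [0, 1]
augmented = harsh_lighting_aug(frame)
print(augmented.shape)
```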
Through the partnership, Qualcomm is aiming to demonstrate the technical advantages of its Snapdragon Ride Platforms. The open architecture utilized by these platforms allows OEMs and suppliers to deploy various ADAS algorithms, including camera perception, sensor fusion, driving strategies, automated parking and driver monitoring.
Integrated with an AI accelerator and an image processing engine, the platforms can concurrently process data from 16 cameras, multiple radars and LiDAR sensors, enabling real-time environment perception and decision-making in functions such as pedestrian detection, lane detection and obstacle avoidance, and fully meeting the needs of advanced intelligent driving solutions.
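As a simplified illustration of the kind of concurrent multi-sensor pipeline described above, the sketch below runs per-sensor detectors in parallel and merges their outputs into one object list per perception cycle. It is a generic late-fusion example, not Qualcomm's Snapdragon Ride SDK or API; every function, type and threshold here is hypothetical.

```python
# Hypothetical late-fusion cycle: per-sensor detections are produced in
# parallel, then merged and filtered by confidence. Purely illustrative.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str
    label: str          # e.g. "pedestrian", "lane", "obstacle"
    position: tuple     # (x, y) in the vehicle frame, metres
    confidence: float

def detect_camera(frame_id):
    return [Detection("camera", "pedestrian", (12.0, -1.5), 0.91)]

def detect_radar(frame_id):
    return [Detection("radar", "obstacle", (35.0, 0.2), 0.80)]

def detect_lidar(frame_id):
    return [Detection("lidar", "obstacle", (34.6, 0.1), 0.95)]

def perception_cycle(frame_id):
    # Run all sensor pipelines concurrently, then keep high-confidence hits.
    with ThreadPoolExecutor(max_workers=3) as pool:
        futures = [pool.submit(f, frame_id)
                   for f in (detect_camera, detect_radar, detect_lidar)]
        detections = [d for fut in futures for d in fut.result()]
    return [d for d in detections if d.confidence >= 0.85]

print(perception_cycle(frame_id=0))
```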