Tactile Mobility demonstrates the feasibility and importance of combining vision-based data with tactile data to develop better, safer autonomous vehicles and cities.
Tactile Mobility collects first-principles data, such as wheel speed, wheel angle, RPM, pedal position, and gear position, and then analyzes it in real time to yield actionable insights, such as vehicle-road dynamics and vehicle and road profiles. Pairing tactile data collection with visual automotive data yields first-principles-based mapping of road conditions alongside enriched visual media for the end user.
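As a rough illustration of how such tactile signals could feed a vehicle-road-dynamics insight, the sketch below derives a crude longitudinal slip estimate from wheel speed. All names, signal fields, and the formula here are hypothetical simplifications for illustration; they are not Tactile Mobility's actual data model or algorithms.

```python
from dataclasses import dataclass


@dataclass
class TactileSample:
    """Hypothetical bundle of the tactile signals mentioned above."""
    wheel_speed_kph: float      # measured wheel speed
    steering_angle_deg: float   # wheel (steering) angle
    engine_rpm: float
    pedal_position: float       # 0.0 (released) to 1.0 (fully pressed)
    gear: int


def estimate_slip_ratio(vehicle_speed_kph: float, sample: TactileSample) -> float:
    """Crude longitudinal slip: (wheel speed - vehicle speed) / vehicle speed.
    A positive value suggests wheel spin; a negative value suggests lockup.
    This is a textbook simplification, not a production estimator."""
    if vehicle_speed_kph <= 0:
        return 0.0
    return (sample.wheel_speed_kph - vehicle_speed_kph) / vehicle_speed_kph


sample = TactileSample(wheel_speed_kph=55.0, steering_angle_deg=2.5,
                       engine_rpm=2200.0, pedal_position=0.3, gear=4)
print(round(estimate_slip_ratio(50.0, sample), 2))  # 0.1 -> mild wheel spin
```

In a real system an estimate like this would be fused with many more signals and paired, via timestamps, with the corresponding camera frames to produce the enriched road-condition mapping described above.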
The pairing of Tactile Mobility’s solution with the NVIDIA DRIVE compute platform will enable vital insights for road authorities and municipalities, such as real-time hazard detection and correction and better-informed road repair, improving urban planning, reducing road incidents, and building toward a smart-city future.