Mobileye tests autonomous fleet in Jerusalem

The first phase of the Intel and Mobileye 100-car autonomous vehicle (AV) fleet has begun operating in the challenging and aggressive traffic conditions of Jerusalem. The technology is being driven on the road to prove that the Responsibility-Sensitive Safety (RSS) model increases safety, and to integrate key learnings into our products and customer projects.

Using a 360-degree view made possible by eight cameras, an Intel Mobileye autonomous vehicle successfully maneuvers on busy Jerusalem roadways. Among the driving skills the vehicle displays are lane changes in various dense traffic scenarios.

RSS is a model that formalizes the common-sense principles of what it means to drive safely into a set of mathematical formulas that a machine can understand (safe following/merging distances, right of way, and caution around obstructed objects, for example). If the AI-based software proposes an action that would violate one of these common-sense principles, the RSS layer rejects the decision.
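
To make this concrete, below is a minimal sketch of the longitudinal safe-following-distance formula from the published RSS paper. The parameter values (response time, acceleration and braking bounds) are illustrative assumptions for the example, not Mobileye's production settings.

```python
def rss_min_following_distance(
    v_rear: float,              # speed of the following (rear) vehicle, m/s
    v_front: float,             # speed of the lead (front) vehicle, m/s
    response_time: float = 1.0, # rear vehicle's response time (rho), s -- assumed value
    a_max_accel: float = 3.5,   # worst-case acceleration during the response time, m/s^2 -- assumed
    a_min_brake: float = 4.0,   # minimum braking the rear vehicle is guaranteed to apply, m/s^2 -- assumed
    a_max_brake: float = 8.0,   # maximum braking the front vehicle might apply, m/s^2 -- assumed
) -> float:
    """Minimum safe gap so that no collision occurs even in the worst case:
    the rear car accelerates for its full response time and then brakes gently,
    while the front car brakes as hard as possible."""
    v_after_response = v_rear + response_time * a_max_accel
    d_min = (
        v_rear * response_time
        + 0.5 * a_max_accel * response_time ** 2
        + v_after_response ** 2 / (2 * a_min_brake)
        - v_front ** 2 / (2 * a_max_brake)
    )
    return max(0.0, d_min)

# Example: both vehicles travelling at ~50 km/h (13.9 m/s)
print(rss_min_following_distance(v_rear=13.9, v_front=13.9))  # ~41 m with these assumed bounds
```

If the driving policy proposed closing the gap below this distance, the RSS layer would reject that action.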

Put simply, the AI-based driving policy is how the AV gets from point A to point B; RSS is what prevents the AV from causing dangerous situations along the way. RSS enables safety that can be verified within the system’s design without requiring billions of miles driven by unproven vehicles on public roads. Our fleet currently implements Mobileye’s view of the appropriate safety envelope, but we have shared this approach publicly and look to collaborate on an industry-led standard that is technology neutral (i.e., can be used with any AV developer’s driving policy).

In the coming months, the fleet will expand to the U.S. and other regions. While the AV fleet is not the first on the road, it represents a novel approach that challenges conventional wisdom in multiple areas. Specifically, Mobileye is targeting a vehicle that gets from point A to point B faster, more smoothly and less expensively than a human-driven vehicle; can operate in any geography; and achieves a verifiable, transparent 1,000-fold safety improvement over a human-driven vehicle without the need for billions of miles of validation testing on public roads.

Why Jerusalem?

Jerusalem is notorious for aggressive driving. Roads are not always perfectly marked, merges are complicated, and people don't always use crosswalks. An autonomous car can't travel at an overly cautious speed, congesting traffic or potentially causing an accident; it must drive assertively and make quick decisions like a local driver.

This environment has allowed Mobileye to test the cars and technology while refining the driving policy. Driving policy, also known as planning or decision-making, makes all other challenging aspects of designing AVs seem easy. Many goals need to be optimized, some of which are at odds with each other: to be extremely safe without being overly cautious; to drive with a human-like style (so as to not surprise other drivers) but without making human errors. To achieve this delicate balance, the Mobileye AV fleet separates the system that proposes driving actions from the system that approves (or rejects) the actions. Each system is fully operational in the current fleet.
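
To illustrate this separation, here is a minimal, hypothetical sketch in Python. The WorldState model, the candidate actions, and the crude two-second-headway rule standing in for the RSS check are all assumptions made for the example; they are not Mobileye's implementation.

```python
from dataclasses import dataclass

@dataclass
class WorldState:
    """Highly simplified world model, for illustration only."""
    ego_speed: float   # m/s
    lead_gap: float    # distance to the lead vehicle in the current lane, m
    lead_speed: float  # lead vehicle speed, m/s

@dataclass
class Action:
    target_speed: float  # m/s

def propose_actions(state: WorldState) -> list[Action]:
    """Stand-in for the learned driving policy: candidate actions, best-ranked first."""
    return [Action(state.ego_speed + 1.0), Action(state.ego_speed), Action(state.ego_speed - 2.0)]

def violates_rss(action: Action, state: WorldState) -> bool:
    """Stand-in for the formal RSS check (here: a crude two-second headway rule)."""
    return state.lead_gap < 2.0 * action.target_speed

def select_action(state: WorldState) -> Action:
    """The policy proposes; the safety layer approves or rejects."""
    for action in propose_actions(state):
        if not violates_rss(action, state):
            return action
    # If every proposal is unsafe, fall back to slowing down.
    return Action(target_speed=max(0.0, state.ego_speed - 2.0))

print(select_action(WorldState(ego_speed=13.9, lead_gap=25.0, lead_speed=13.9)))
```

The key design point is that the assertive, human-like behavior lives entirely in the proposal layer, while the formally verifiable safety guarantees live entirely in the approval layer.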

Camera-only strategy

During this initial phase, the fleet is powered only by cameras. In a 360-degree configuration, each vehicle uses 12 cameras, with eight providing long-range surround view and four used for parking. The goal in this phase is to prove that we can create a comprehensive end-to-end solution using only camera data.

The radar/lidar layer will be added in the coming weeks as a second phase of our development; synergies among the sensing modalities can then be used to increase the "comfort" of driving.

The camera-only phase is Mobileye's strategy for achieving what it calls "true redundancy" of sensing. True redundancy refers to a sensing system consisting of multiple independently engineered sensing subsystems, each of which can support fully autonomous driving on its own. This is in contrast to fusing raw sensor data from multiple sources early in the process, which in practice results in a single sensing system.

True redundancy provides two major advantages. First, the amount of data required to validate the perception system is massively lower (on the order of the square root of 1 billion hours vs. 1 billion hours), as depicted in the graphic below. Second, if one of the independent systems fails, the vehicle can continue operating safely, in contrast to a vehicle with a low-level fused system, which must cease driving immediately.
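
The data-volume argument can be sketched with simple arithmetic, under the assumption that the independently engineered systems fail independently, so the probability of both failing at once is the product of their individual failure probabilities.

```python
# Back-of-the-envelope illustration of the validation-data claim (assumed numbers).
target_hours = 1e9                      # hours needed to directly validate a single fused system
per_system_hours = target_hours ** 0.5  # with two independent systems, each needs only ~sqrt(1e9)

print(f"Single fused system: ~{target_hours:,.0f} hours of validation data")
print(f"Each independent system: ~{per_system_hours:,.0f} hours")  # ~31,623 hours
```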

A useful analogy for the fused system is a string of Christmas tree lights, where the entire string fails when one bulb burns out.

Source: Intel-Mobileye