The Autonomous Driving Landscape: It’s the End of Your Car as You Know It

It was once only the stuff of science fiction movies and futuristic cartoons. But now, more than ever, there is growing recognition and anticipation that the launch of fully autonomous vehicles (AVs) will be as transformative and revolutionary as the assembly line was to manufacturing in 1913, and as innovative and competitive as the Space Race of the 1960s. Autonomous vehicles, sometimes referred to as driverless or self-driving cars, are becoming capable of reading and sensing their environment and operating without human input, moving us forward into a brave new world.

Companies are investing billions of dollars across the AV value chain, from traditional automakers and Tier-1 suppliers to innovative software startups, not to mention the tech giants, with the goal of being first to market. A growing number of players are developing semi-autonomous and fully autonomous vehicles, including Google (Waymo), Intel, Apple, Huawei, Continental, General Motors, Volkswagen, BMW, Ford Motor Company, Baidu, Toyota, Tesla, Audi and Jaguar, among others, all trying to keep up with a changing marketplace. While we are clearly past the initial stage of early adoption of the AV concept, consumers are watching closely to see when the industry will achieve the critical mass and acceptable safety standards needed for them to comfortably use AVs.

It seems, as of 2021, that self-driving trucks could revolutionize the shipping of goods, while an army of self-driving cars may integrate with our entire public traffic system in the near future. “[We are on the] verge of the most significant transformation since the introduction of the automobile,” then-US Secretary of Transportation Elaine L. Chao said at the Detroit Auto Show in 2017. “Automated or self-driving vehicles are about to change the way we travel and connect with one another.”1 This is seen as the next major disruption in the auto industry, as well as a huge source of revenue growth: Intel predicts the driverless economy will grow to $800 billion by 2035 and reach approximately $7 trillion by 2050.2

So what’s holding back this industry breakthrough?

A safe form of mobility 

For AVs to move beyond proof-of-concept to mass production, they must be proven to be safe, dependable and affordable. Driven by immense advances in Artificial Intelligence (AI), autonomous vehicles will offer a new form of mobility that will not just improve lives, but save them too.

The World Economic Forum anticipates that driverless vehicles will generate $1 trillion in “economic benefit to consumers and society” over the next 10 years, and that autonomous driving features will help prevent 9 percent of accidents by 2025, with the potential to save 900,000 lives in the next 10 years.3 Yet the highest barrier to autonomous vehicle deployment is consumer fear, particularly in light of headline-grabbing accidents involving autonomous vehicles.4

American Automobile Association (AAA) research in 2021 showed that only 22% of Americans feel at ease with AVs, a low rate largely explained by safety concerns. “Clearly people like being in control of their cars, which is why at Mobileye (along with many car companies) we are taking a phased approach to deploying autonomous features,” said former Mobileye executive Yonah Lloyd, backing up the statistics. Addressing this skepticism, he explained: “With our system the driver can always take control of the wheel, or choose to let the car drive itself.5 Over time, people will likely become more comfortable with this experience, and society will appreciate the safety benefits, as accident numbers decrease.” As many of the technological revolutions of the past decade have shown, if the added value of a new technology is significant and the cost is in line, new industry standards will be adopted.

In addition to AI, many different technologies go into the autonomous vehicle technology stack, such as sensors and computational hardware and software, which together enable an AV to perceive its environment, make sense of it all and decide how to act. By fusing real-time data about the surroundings with an always-alert, automated, split-second decision-making mechanism, roads will become safer for passengers and pedestrians alike. Significant social issues that impair human drivers, such as inebriation, distraction and exhaustion, will become a thing of the past.

What's the difference between a semi-autonomous and a fully autonomous vehicle?

According to Amir Freund, who served as a Ford Motors Israel executive (2015-2017), “self-driving cars employ some combination of sensors – cameras, radar, high-performance Global Navigation Satellite Systems (“GNSS”), and LiDAR (Light Detection and Ranging, a remote sensing method that uses light in the form of a pulsed laser to measure ranges) – together with AI and machine learning to achieve their respective levels of autonomy.” It is the combined coverage that these sensors provide, and the ability to process their data in real time to make decisions, that determines a vehicle’s level of autonomy.

The Society of Automotive Engineers (SAE) has issued a standard that delivers a coherent classification system for driving automation. It defines six levels of driving automation, from “no automation” to “full automation,” grouping the functionality, technology, and relevant distinctions at each level. The classifications are as follows:

Level 0: No Automation

Level 1: Vehicle performs minor steering or acceleration tasks; all other operations are under full human control.

Level 2: Vehicle automatically responds to safety situations, but the driver must remain alert and responsive.

Level 3: Vehicle performs certain “safety-critical functions” under various traffic or environmental conditions.

Level 4: Vehicle can operate without requiring human input.

Level 5: Vehicle operates with full automation in any environment (weather or traffic).6
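
To make the taxonomy concrete, here is a minimal sketch (in Python) of how a system might represent the SAE levels and gate behavior on them. The descriptions paraphrase the list above; the `requires_human_fallback` helper is a hypothetical illustration, not part of the SAE standard.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE driving-automation levels, paraphrased from the list above."""
    NO_AUTOMATION = 0           # human performs all driving tasks
    DRIVER_ASSISTANCE = 1       # minor steering or acceleration support
    PARTIAL_AUTOMATION = 2      # automated safety responses; driver stays alert
    CONDITIONAL_AUTOMATION = 3  # safety-critical functions in some conditions
    HIGH_AUTOMATION = 4         # operates without requiring human input
    FULL_AUTOMATION = 5         # full automation in any weather or traffic

def requires_human_fallback(level: SAELevel) -> bool:
    """Hypothetical helper: Levels 0-2 keep the human monitoring the road."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(requires_human_fallback(SAELevel.PARTIAL_AUTOMATION))  # True
print(requires_human_fallback(SAELevel.HIGH_AUTOMATION))     # False
```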

Level 4 has yet to be reached, and full autonomy, Level 5, is now widely considered something that will be achieved through joint effort rather than by any individual company. Once it arrives, the entire automobile market will be transformed.7

Key functions autonomous cars must perform to ensure a safe ride

A variety of sensor technologies are required to enable cars to “take control of the wheel,” and it is the fusion of data from these sensors that will make autonomous driving a reality. Sensors must enable detection and classification of objects in every weather and lighting condition, and their inputs have to be ultra-reliable to ensure safety-critical functionality. Consequently, both sensor redundancy and sensor fusion are essential.8
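
As a toy illustration of why redundancy matters, the sketch below confirms an object only when at least two independent sensor modalities agree. The sensor names, thresholds and confidence scores are made up for illustration; this is not any vendor's actual fusion logic.

```python
# Minimal majority-vote fusion sketch: a detection is confirmed only when
# at least two independent sensor modalities agree. All values illustrative.

DETECTION_THRESHOLD = 0.5  # per-sensor confidence needed to count as a hit
MIN_AGREEING_SENSORS = 2   # redundancy requirement

def confirm_object(confidences: dict[str, float]) -> bool:
    """confidences maps sensor name -> detection confidence in [0, 1]."""
    hits = [name for name, c in confidences.items() if c >= DETECTION_THRESHOLD]
    return len(hits) >= MIN_AGREEING_SENSORS

# A pedestrian seen clearly by camera and LiDAR, weakly by radar:
print(confirm_object({"camera": 0.9, "lidar": 0.8, "radar": 0.3}))  # True
# Camera blinded by glare: only radar fires, so the detection is not
# confirmed on its own and would trigger further checks instead.
print(confirm_object({"camera": 0.1, "lidar": 0.2, "radar": 0.7}))  # False
```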

Freund identifies six crucial functions that autonomous vehicles must be able to perform based on input from their sensors:

  • Navigation
  • Localization: turning corners at a safe distance from the curb
  • Obeying traffic rules and signs
  • Perfect braking (safety)
  • Obstacle avoidance (even when unexpected)
  • Crossing intersections, especially when not all cars are autonomous

Today, navigation is largely provided by GPS, but GPS alone lacks the accuracy AVs require. Autonomous vehicles also need to know their precise location, both for decision making and for path planning.9 Many AVs rely on GPS signals for positioning, but these measurements can be off by as much as 1-2 meters, too large an error given that an entire bike lane is roughly 3 meters wide on average.10 AVs must learn how to negotiate driving patterns involving both human drivers and other AVs, localizing themselves with a very high degree of accuracy.
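
One common way to shrink that error is to blend GPS with other position estimates. The sketch below is a one-dimensional Kalman-style update, shown purely as an illustration of the idea; the variances are invented, and real AV localization also draws on HD maps, LiDAR matching and camera lane detection.

```python
# One-dimensional Kalman-style fusion of dead-reckoned odometry and a GPS
# fix. Noise figures are illustrative, not real sensor specifications.

def fuse(pred_pos: float, pred_var: float,
         gps_pos: float, gps_var: float) -> tuple[float, float]:
    """Blend a predicted position with a GPS measurement.

    The gain weights the measurement by how much we trust it relative to
    the prediction; the fused variance is smaller than either input's.
    """
    gain = pred_var / (pred_var + gps_var)
    pos = pred_pos + gain * (gps_pos - pred_pos)
    var = (1.0 - gain) * pred_var
    return pos, var

# Odometry says we are 100.0 m along the road (variance 0.25 m^2);
# GPS says 101.5 m but is only good to ~1.5 m (variance 2.25 m^2).
pos, var = fuse(100.0, 0.25, 101.5, 2.25)
print(f"fused position: {pos:.2f} m, std dev: {var ** 0.5:.2f} m")
# fused position: 100.15 m, std dev: 0.47 m -- tighter than GPS alone.
```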

Sensors: The all-seeing AV eyes 

In the AV sector, sensors like radar, sonar, cameras and LiDAR are often combined to allow automobiles to perceive and react to the world around them.11 They are considered the key elements in developing AVs, as well as in current Advanced Driver Assistance Systems (“ADAS”) for enhanced safety and security. According to Freund, solving these challenges requires not only significant upfront R&D but also long test and validation periods.

ADAS applications detect potentially dangerous situations and warn drivers,12 with the goal of minimizing accidents. Among commonly used ADAS applications available today are automatic emergency braking, blind-spot detection, lane-change assist, vehicle-exit assist and pre-crash warning.13 Current ADAS applications combine several sensor systems to perceive the car’s surroundings as accurately as possible.14

The four main groups of sensor systems in AVs are camera, radar, LiDAR and IR, and different car manufacturers utilize different combinations.

No single type of sensor can provide enough coverage, so data from multiple sensors needs to be fused and layered together, though all autonomous vehicle manufacturers use cameras as one of their main sensors. According to a McKinsey report from 2017, there are three main approaches, each with its own advantages:

  • Camera over radar relies predominantly on camera systems, supplementing them with radar data.
  • Radar over camera relies primarily on radar sensors, supplementing them with information from cameras.
  • The hybrid approach combines LiDAR, radar, camera systems, and sensor-fusion algorithms to understand the environment at a more granular level.15 

“Google (Waymo) and Ford are using a camera and LiDAR combination, while Tesla is relying on cameras mainly, with radar added, which is probably the original view of Mobileye,” Freund said. “GM is relying on radar more significantly.” Improving and extending the range of current sensors is an ongoing challenge.

What is the "sweet spot" of sensor coverage? 

The optimal distance for clear detection of objects is 200 meters. Freund explains why: if an AV is merging onto a highway and a motorcycle is approaching at 120 km/h (roughly 75 mph), the merge takes about four or five seconds, and the motorcycle travels about 200 meters in that time. One also needs to factor in inclement conditions (rain, fog or night), so this simple calculation sets the bar that sensor detection has to clear.
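
The arithmetic behind the 200-meter figure is a back-of-envelope calculation, sketched below using the speed and merge times stated above:

```python
# Back-of-envelope check of the 200 m "sweet spot": how far does a
# motorcycle at highway speed travel while the AV completes its merge?

speed_kmh = 120.0                    # motorcycle speed (~75 mph)
speed_ms = speed_kmh * 1000 / 3600   # ~33.3 m/s

for merge_seconds in (4, 5, 6):
    distance = speed_ms * merge_seconds
    print(f"{merge_seconds} s merge -> motorcycle covers {distance:.0f} m")
# 4 s -> 133 m, 5 s -> 167 m, 6 s -> 200 m: hence sensors need roughly
# 200 m of reliable range, with margin for rain, fog or night driving.
```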

“Cameras have very good resolution but have range limitations, as well as difficulties with night imagery and glare from the sun. LiDAR has a detection range of up to 100 meters, and next year we’ll see LiDAR that can reach 150-180 meters,” according to Freund. “Thermal and IR cameras and radar are also used, and have a very long range but very low resolution, making it difficult to identify whether an object is a motorcycle or a truck from long distances.”

Israeli startup and OurCrowd portfolio company Arbe has developed proprietary chipset technology that it claims can overcome radar’s resolution limitation. Arbe’s product is an end-to-end 4D (distance, height, depth and speed) ultra-high-resolution imaging radar system that maintains detection accuracy at long, mid and short ranges under all weather and lighting conditions, providing a highly detailed image of the environment across a wide field of view. The company released its beta system to its first customers (including OEMs and Tier-1 companies) in 2019, and is moving it to production this year, 2022.

Sensory perception and fusion: The brains of the AV 

The real-time data generated by the multiple sensor layers in an AV must be fused together and localized so that the vehicle understands where it is, can relate to the different objects it detects, and can make decisions.
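
Conceptually, that processing loop looks something like the skeleton below. The stage names and function signatures are hypothetical placeholders for the fusion, localization and planning steps just described, not any real stack's API.

```python
# Conceptual skeleton of the per-frame AV processing loop described above.
# All stage bodies are placeholders; a real stack runs these stages on
# dedicated accelerators within a hard real-time budget.

from dataclasses import dataclass

@dataclass
class Frame:
    camera: object
    radar: object
    lidar: object

def fuse(frame: Frame):
    """Combine camera, radar and LiDAR layers into one environment model."""
    ...

def localize(env_model):
    """Pin the vehicle's precise pose within the fused model and map."""
    ...

def plan(env_model, pose):
    """Relate detected objects to the ego vehicle and choose a maneuver."""
    ...

def drive_loop(sensor_stream):
    for frame in sensor_stream:   # e.g. tens of fused frames per second
        env = fuse(frame)
        pose = localize(env)
        yield plan(env, pose)     # steering / throttle / brake command

# Tiny demo with one empty frame; each iteration emits one command.
for command in drive_loop([Frame(camera=None, radar=None, lidar=None)]):
    pass
```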

This data fusion and detection requires high processor capacity and efficiency, and size and cost are crucial factors for implementation in autonomous vehicles, according to Freund. Major chip manufacturers like Nvidia, Huawei, Intel, and Qualcomm are competing with startups like OurCrowd-funded Hailo, as well as other venture-backed companies like ThinCI, Mythic, Cerebras Systems, and Horizon Robotics, in the multi-domain controller sector.

Hailo is developing a deep learning processor with the performance of a data center-class computer, operating in real time at minimal power consumption, size and cost.

Monetization and user experience 

Like the Space Race over 60 years ago, the autonomous vehicle market is driving technology innovation across a number of spheres that have the potential to transform economies and improve lives. The main overall challenge, according to Freund, is not to create an AV for its own sake, but to monetize it while providing passengers with a better, safer and more comfortable user experience.

While it is an axiom for all new drivers to keep both hands on the wheel, as investment pours into these emerging technologies, it’s no longer hard to imagine a future, a few years from now, where all drivers might be twiddling their thumbs. 

In the meantime, keep both hands near the wheel and your eyes on the road ahead.

Sources

  1. https://www.transportation.gov/briefing-room/detroit-auto-show
  2. https://www.theverge.com/2017/6/1/15725516/intel-7-trillion-dollar-self-driving-autonomous-cars
  3. https://www.technologyreview.com/s/609450/autonomous-vehicles-are-you-ready-for-the-new-ride/
  4. https://tech.co/news/mapping-driverless-car-crash-california-2018-10
  5. https://blog.ourcrowd.com/live-chat-the-next-gen-of-driverless-tech/
  6. https://www.technologyreview.com/s/612754/self-driving-cars-take-the-wheel/
  7. https://www.mckinsey.com/features/mckinsey-center-for-future-mobility/overview/autonomous-driving
  8. https://www.electronicdesign.com/automotive/how-will-radar-sensor-technology-shape-cars-future
  9. https://www.cbinsights.com/research/startups-drive-auto-industry-disruption/
  10. https://www.cbinsights.com/research/startups-drive-auto-industry-disruption/
  11. https://www.electronicdesign.com/markets/automotive/article/21806443/how-will-radar-sensor-technology-shape-cars-of-the-future
  12. https://www.mckinsey.com/industries/automotive-and-assembly/our-insights/self-driving-car-technology-when-will-the-robots-hit-the-road
  13. https://www.electronicdesign.com/markets/automotive/article/21806443/how-will-radar-sensor-technology-shape-cars-of-the-future
  14. https://www.electronicdesign.com/markets/automotive/article/21806443/how-will-radar-sensor-technology-shape-cars-of-the-future
  15. https://www.mckinsey.com/~/media/McKinsey/Industries/Automotive%20and%20Assembly/Our%20Insights/Self%20driving%20car%20technology%20When%20will%20the%20robots%20hit%20the%20road/Self-driving-car-technology_When-will-the-robots-hit-the-road.ashx
