Are robots taking over the wheel?
Start-up CEOs are rising overnight as superstars of the Auto industry, AI is becoming the base technology for every engineering department, and consumers are talking more about embedded technologies than about horsepower. Prepare yourself: the Automotive industry is changing, and it is changing at Formula 1 speed.
Today’s vehicles are becoming computers on wheels: high-end cars now pack up to 100 Electronic Control Units (ECUs) and run close to 100 million lines of code.
The development of Advanced Driver Assistance Systems (ADAS), autonomous driving and connectivity are fast enriching the features of modern vehicles and elevating the driving experience in terms of convenience, safety, comfort, and entertainment.
The engine was the technological core of the 20th-century automobile. Today it is software, computing power and advanced sensors that take center stage, enabling most modern innovations, from efficiency to connectivity, autonomous driving, electrification, and new mobility solutions.
The Automotive industry is transitioning from hardware-defined to software-defined vehicles, and this shift is visible in the rapidly increasing software and electronics content per vehicle. Software represents about 10 percent of overall vehicle content today for a D-segment car (approximately $1,220), and its share is expected to grow at a compound annual rate of 11 percent, reaching 30 percent of overall vehicle content (around $5,200) by 2030.
Digitisation and public concerns
Safety continues to be the number one issue surrounding autonomous vehicles. Self-driving cars have already been involved in accidents and pedestrian deaths. While the causes of many of these accidents remain under investigation, or even point to the human operator being at fault, it is clear that even limited autonomous technology is not quite ready for prime time. Those fears are reflected in consumer sentiment: a 2017 Gartner survey found that 55 percent of consumers won’t even consider riding in a fully autonomous vehicle. And this despite the fact that “the automotive industry is investing in new safety and convenience technology at a rate not seen since the dawn of the automobile”, as Gartner research director Mike Ramsey puts it. Things will evolve at a staggering speed, Ramsey believes, and “the experience of owning and operating a car will be dramatically different in 10 years”.
Volvo, Audi, Mercedes, Tesla, and others already have cars on the road with a bundle of autonomous features, including full stop-and-go radar cruise control, lane centering, parking assistance and more. But none of these make a car “autonomous”. The systems of the above brands turn the wheel, at least on the expressway, slow for stopping traffic and accelerate once it has cleared. Mercedes even takes over in a panic situation with Evasive Steering Assist, which helps the driver safely avoid a spin during an emergency lane-change maneuver. To get a sense of how fast things are evolving in the mobility industry: in 2016 only about 1 percent of vehicles sold were equipped with basic partial-autonomous-driving technology, while today 80 percent of the top ten OEMs have announced plans to have highly autonomous technology on the road by 2025.
But to see more clearly where we are, we need to know what “autonomous” means for the auto industry:
Level 0 — No Automation
The human driver is in charge of all aspects of driving at all times, even when the car is enhanced by warning or intervention systems. This level applies to most of the cars on the road today.
Level 1 — Driver Assistance
The car controls either steering or acceleration/deceleration using information about the driving environment, and the human driver is expected to perform all remaining aspects of the dynamic driving task. This scenario basically covers current radar-based cruise control.
Level 2 — Partial Automation
A “driving mode” controls both the steering and acceleration/deceleration, but the human driver performs all remaining aspects of the “dynamic driving task”, meaning that the driver is responsible for changing lanes, exiting freeways, making turns and such.
Level 3 — Conditional Automation
The “automated driving system” monitors the driving environment. It controls the acceleration, braking, and steering, but expects that the human in control will respond appropriately to a request to intervene.
Level 4 — High Automation
At this level, the system controls all aspects of the driving task, even when a driver doesn’t respond appropriately to a request to intervene. Both Ford and Volvo have recently promised to offer a Level 4 car before 2021.
Level 5 — Full Automation
It’s the ultimate automation level: the car is operated full time by an automated driving system, which controls all aspects of the dynamic driving task under all environmental conditions. It’s the Holy Grail of the synergy between the tech and automotive industries, meaning that you can get in your car in the morning and tell it to drive to work while you read the newspaper, take a nap or play the saxophone. No company has yet set a timeline for bringing a Level 5 car to market, though…
Until then, remember: don’t text and drive.
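The taxonomy above can be sketched as a small data model. This is a minimal illustration of the SAE levels as described here, not any manufacturer’s API; the names and the helper function are my own:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE driving-automation levels, as summarized above."""
    NO_AUTOMATION = 0           # human does everything
    DRIVER_ASSISTANCE = 1       # system steers OR accelerates/brakes
    PARTIAL_AUTOMATION = 2      # system steers AND accelerates/brakes
    CONDITIONAL_AUTOMATION = 3  # system monitors; driver must answer takeover requests
    HIGH_AUTOMATION = 4         # system copes even if the driver ignores a request
    FULL_AUTOMATION = 5         # system drives everywhere, in all conditions

def driver_must_supervise(level: SAELevel) -> bool:
    """Up to Level 2 the human monitors the environment at all times;
    from Level 3 upward the automated system takes over monitoring."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_supervise(SAELevel.PARTIAL_AUTOMATION))      # True
print(driver_must_supervise(SAELevel.CONDITIONAL_AUTOMATION))  # False
```

The `driver_must_supervise` boundary between Levels 2 and 3 is the legally and technically significant one: it marks where responsibility for monitoring the road shifts from human to machine.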
So what’s next?
In the last couple of years, the industry has been talking about four disruptive trends changing the rules in the mobility sector: autonomous driving, shared mobility, connectivity, and electrification. Artificial intelligence (AI) is a key technology for all four of these trends. Autonomous driving, for example, relies on AI because it is the only technology that enables reliable, real-time recognition of objects around the vehicle.
One consequence of these four strategic moves is that vehicle architecture will become a service-oriented architecture based on generalized computing platforms, according to a 2018 McKinsey report. Developers will add new connectivity solutions, applications, AI elements, advanced analytics, and operating systems. Differentiation will no longer come from traditional vehicle hardware but from UI and UX elements, powered by software and advanced electronics.
Autonomous driving, connectivity, electrification, and shared mobility are the trends expected to fuel growth within the market for mobility, change its rules and lead to a shift from traditional to disruptive technologies and innovative business models.
Tomorrow’s cars will shift to a platform of new brand differentiators that will likely include infotainment innovations, autonomous driving capabilities and intelligent safety features based on “fail-operational” behaviors (for example, a system capable of completing its key function even if part of it fails).
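A classic way to get fail-operational behavior is redundancy with voting. Here is a minimal sketch of that idea, assuming three redundant wheel-speed sensors where a failed sensor reports `None`; the function name and interface are illustrative only:

```python
import statistics

def fused_speed(readings):
    """Fail-operational fusion of redundant speed sensors (illustrative):
    discard failed sensors (reported as None) and take the median of the
    survivors, so the key function keeps working as long as at least one
    sensor still delivers data."""
    valid = [r for r in readings if r is not None]
    if not valid:
        raise RuntimeError("all redundant sensors failed")
    return statistics.median(valid)

print(fused_speed([50.1, 50.3, None]))  # one sensor down: median of the survivors
```

The median is preferred over the mean here because it also tolerates one sensor that is alive but reporting a wildly wrong value, not just one that has gone silent.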
It would probably be more fruitful to think of a modern car not as a connected car but as a connected, data-driven ecosystem. A connected car’s ecosystem is not just its ability to connect to a smartphone, but the information the car can gather from the world around it.
Tomorrow’s cars will integrate dozens of sensors and edge devices gathering data from just about everything around them: road infrastructure, other vehicles, and even pedestrians.
All the data collected will help create safer cars and roads, inform smarter logistics and provide more actionable insights for many industries. This is made possible by the journey data makes from the edge to AI. To be clear, edge computing is the practice of processing data from IoT devices where it is generated. For a car, this means processing the gathered data while driving, instead of in a centralized data-processing warehouse or a public cloud. It enables real-time analytics without lag as the data is generated, letting the smart device perform as designed while reducing internet bandwidth usage.
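The bandwidth argument can be made concrete with a toy example. The sketch below, with invented names and thresholds, shows the edge-computing pattern described above: the vehicle summarizes a raw sensor stream locally and flags only out-of-range events, so that a compact summary, rather than every sample, crosses the cellular link:

```python
def edge_filter(readings, limit=100.0):
    """Illustrative edge-computing step: reduce a raw stream of sensor
    readings to a compact summary plus out-of-range events. Only the
    summary needs to be uploaded; the raw samples stay in the vehicle."""
    anomalies = [r for r in readings if r > limit]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
        "anomalies": anomalies,  # the only samples worth sending upstream
    }

summary = edge_filter([42.0, 55.5, 120.3, 61.2], limit=100.0)
print(summary["anomalies"])  # [120.3]
```

However simple, this captures the trade-off: the cloud sees enough to learn from, while the latency-critical decisions are made on data that never left the car.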
Connected car technology primarily leverages cellular networks, and streaming the full corpus of generated data over those networks would be cost- and bandwidth-prohibitive. Cars that can be bought today already carry a multitude of sensors, and an autonomous vehicle (AV) will need many more; for those sensors to deliver their value, there must be an onboard device that can process the streaming information in real time. During a snowstorm, for example, the AV’s internet connection could become unreliable, which can result in a very dangerous situation. Having edge devices built into the vehicle ensures it gets the updates it needs to drive safely.
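One standard way to cope with the unreliable link described above is store-and-forward buffering: when connectivity drops, telemetry queues locally and flushes once the link returns, so the vehicle never blocks on the network. The class below is a hypothetical sketch of that pattern, not any real telematics API:

```python
from collections import deque

class EdgeBuffer:
    """Illustrative store-and-forward buffer for vehicle telemetry:
    queue messages locally while the cellular link is down and flush
    the backlog as soon as connectivity returns."""
    def __init__(self):
        self.pending = deque()   # messages waiting for a working link
        self.sent = []           # stand-in for "delivered to the cloud"

    def record(self, message, link_up):
        self.pending.append(message)
        if link_up:
            self.flush()

    def flush(self):
        while self.pending:
            self.sent.append(self.pending.popleft())

buf = EdgeBuffer()
buf.record("frame-1", link_up=False)  # snowstorm: link down, data is queued
buf.record("frame-2", link_up=True)   # link restored: backlog is flushed
print(buf.sent)  # ['frame-1', 'frame-2']
```

Crucially, the queue preserves ordering, so the cloud still receives a coherent timeline of events once the vehicle reconnects.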
In the next five years, vehicles that adhere to Level 4 automation are expected to appear. These will have automated-driving systems that can perform all aspects of the dynamic driving task, even if human drivers don’t respond to requests for intervention. The technology is ready for testing in limited situations, but validating it might take years, because the systems must be exposed to a wide range of rare and uncommon situations. Given current development trends, fully autonomous vehicles won’t be available in the next ten years. The main obstacle is the development of the required software: while hardware innovations will deliver the necessary computational power, and prices (especially for sensors) appear to keep falling, software will remain a critical bottleneck.
The self-driving car disruption the Automotive industry is facing is largely driven by the tech industry, and it has many consumers expecting their next cars to be fully autonomous fairly soon. That might not be the case: on close examination of the technologies required for advanced levels of autonomous driving, the timeline looks significantly longer; such vehicles are perhaps ten to fifteen years away.