What Technology Is Driving Self-Driving Cars?
Self-driving cars are coming, as rideshare companies like Lyft and Uber seek to automate their businesses and Apple is rumored to be developing a vehicle of its own. To some, these vehicles might seem like magic; in reality, they are a complicated marriage of automobiles, artificial intelligence, machine vision, and GPS navigation. No single component is more important than the others, and tech manufacturers are working on each piece individually to ensure it is ready for the road.
The goal of automated vehicles is to ease traffic delays and diminish traffic collisions and other accidents caused by human error. These autonomous vehicles will have significant impacts on many services, including emergency and enforcement services, public transportation, commercial shipping, freight, rideshare or commuting, and even school buses.
This has massive ramifications for our economy and even society, as it could breathe new life into car-sharing services like Zipcar. Additionally, self-driving technology will inform the way our infrastructure like bridges, tunnels, and city streets continues to evolve, as these too will be designed to help aid self-driving cars in understanding where they are and how to continue moving safely.
Here, we’re going to take a deeper look into some of this technology to help break down the ‘magic.’ We’ll go over how obstacle avoidance vision systems work in cars, how these connect to networks and other cars, and what autonomy looks like.
What Are The Levels of Automation?
The National Highway Traffic Safety Administration (NHTSA) has a great report on the progress of automated vehicles, and it breaks down the six levels of automation in cars, from Level Zero through Level Five.
These levels were developed with the Society of Automotive Engineers (SAE) to help better understand the immediate road ahead: how vehicles will ease into automation instead of diving in headfirst.
They are as follows:
Level Zero: No Automation
This is the level of automation you can expect to find in used cars from 2015 or earlier. The human driver performs all driving tasks with no assistance from the vehicle itself.
Level One: Driver Assistance
Over the past several years, cars have adopted this level of automation, in which the car contains individual driver-assist features, such as steering adjustments for when a driver drifts out of their lane. However, the car is incapable of performing multiple tasks simultaneously, so the driver must still perform all other functions.
Level Two: Partial Automation
Here the driver still sits in a position to watch the road and other vehicles, but the vehicle has multiple automated functions. The car is able to accelerate, brake, and steer on its own, but it requires monitoring from the driver at all times. Cars at this level can perform basic functions but still need the driver's help navigating complex environments and other hazards as needed.
Level Three: Conditional Automation
The advanced driver assistance system (ADAS) is more capable of navigating terrain at this level, but it may still require a person to drive if certain conditions, like inclement weather, arise. Drivers will still be required to sit behind the wheel and may be asked to take control of the car at any time.
Level Four: High Automation
Now the automated driving system (ADS) is fully capable of performing all driving functions except in extreme driving environments. The driver has the option to control the vehicle but is no longer required to, and can relax in the vehicle unless certain conditions, such as severe weather or an accident, arise.
Level Five: Full Automation
As one might guess, this level of automation means that the driver is no longer required, even in the event of extreme conditions; all human occupants are passengers in self-driving vehicles.
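The six levels above amount to a simple classification, which can be sketched in code. A minimal sketch; the enum names and the `driver_must_monitor` helper are illustrative, not part of any standard library or official taxonomy API:

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """The six SAE driving-automation levels described above (illustrative names)."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def driver_must_monitor(level: AutomationLevel) -> bool:
    # At Levels 0-2 the human must supervise at all times;
    # at Level 3 they must be ready to take over on request;
    # at Levels 4-5 constant supervision is no longer required.
    return level <= AutomationLevel.PARTIAL_AUTOMATION

print(driver_must_monitor(AutomationLevel.PARTIAL_AUTOMATION))  # True
print(driver_must_monitor(AutomationLevel.HIGH_AUTOMATION))     # False
```

Using an `IntEnum` keeps the levels comparable, which mirrors how the taxonomy is ordered from least to most automated.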
As cars progress through these levels of automation, the technology in them grows more complex and interconnected. Over time the cars will be able to see more, understand more, and react properly to that information.
The NHTSA reports that 94% of serious crashes involve human error, and roughly 35,000 people in the United States die in automotive incidents every year. The journey toward Level Five automation is expected to reduce these numbers over time, although it is uncertain whether they will ever be fully eliminated.
What Tech Allows Self-Driving Cars To See?
There are several pieces of technology in a car that all work in tandem to enable it to see the surrounding road and other hazards, and different car manufacturers employ different configurations of these.
They all work in conjunction to provide one combined view of the road; however, different pieces of technology are intended to see different things. Landmark Dividend has a good explanation of all things self-driving cars; let's go over a few of them:
Short-Range Radar
This radar covers the immediate few inches surrounding the car to help detect things like curbs and neighboring vehicles in a parking lot. This helps with parking and keeps the vehicle from contacting other objects.
Long-Range Radar
The longer range of this radar enables it to offer a view of the blind spots adjacent to the rear wheels and to provide rear-collision warnings and alerts for crossing traffic.
Cameras
Multiple high-quality vision lenses around the car offer it a view of its entire 360-degree surroundings; Tesla's driverless cars employ eight such cameras. These can assist with parking but are also crucial for recognizing traffic signs and warning you when you are drifting out of your lane.
Rear-View Camera
Many cars have lenses in the rear, with a screen in the dashboard offering you a view of what is behind the vehicle to prevent you from bumping into anything.
LiDAR
A LiDAR (Light Detection and Ranging) sensor fires a laser in pulses and measures distance by the time taken for the light to reflect back. In your car, this laser shoots from the front to monitor and detect oncoming hazards, and it provides emergency braking assistance when needed, preventing collisions with pedestrians, vehicles, or structures.
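The time-of-flight principle behind LiDAR reduces to one formula: distance = (speed of light × round-trip time) / 2, since the pulse travels out and back. A minimal sketch (the function name and example timing are illustrative):

```python
# Speed of light (m/s)
C = 299_792_458

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to a target from a LiDAR pulse's round-trip time.

    The pulse travels out to the target and back, so we halve
    the total path length the light covered.
    """
    return C * round_trip_seconds / 2

# A reflection returning after 200 nanoseconds is roughly 30 m away.
print(round(lidar_distance(200e-9), 2))  # 29.98
```

The tiny time scales involved are why LiDAR units need very precise clocks: a 1-nanosecond timing error corresponds to about 15 cm of range error.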
Ultrasonic Sensors
Ultrasonic systems are also forward-facing systems that look ahead at oncoming hazards and directly affect your vehicle's speed. This is called an adaptive cruise control system, which increases speed as objects become more distant and brakes as the opposite occurs. These systems exist in a semi-automatic fashion in current vehicles but will soon become fully automated, with no need for interaction from the driver.
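The adaptive cruise control behavior described here, speeding up as the gap to the vehicle ahead grows and braking as it shrinks, can be sketched as a simple proportional rule. All constants and the `acc_speed` function are illustrative assumptions, not real controller values:

```python
def acc_speed(current_speed: float, gap_m: float,
              desired_gap_m: float = 40.0, gain: float = 0.5,
              max_speed: float = 30.0) -> float:
    """Proportional adaptive-cruise-control sketch (speeds in m/s, gaps in metres).

    Speeds up when the lead vehicle is farther than the desired gap
    and slows (brakes) when it is closer, clamped to [0, max_speed].
    """
    adjustment = gain * (gap_m - desired_gap_m) / desired_gap_m
    target = current_speed * (1 + adjustment)
    return max(0.0, min(max_speed, target))

print(acc_speed(25.0, 40.0))  # 25.0  (at the desired gap: hold speed)
print(acc_speed(25.0, 20.0))  # 18.75 (too close: brake)
print(acc_speed(25.0, 60.0))  # 30.0  (plenty of room: accelerate, capped)
```

Production systems use far more sophisticated control loops, but the core feedback idea, gap error driving a speed correction, is the same.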
With all of these sensors and other monitoring technologies comes a lot of data, and a need to move that data into and out of the vehicle for navigation purposes. As new network infrastructures like 5G continue to develop, they pave the way toward new types of thinking about connectivity.
What Network Are Self-Driving Cars Connected To?
5G networks promise to be almost 1,000% faster than 4G LTE, effectively eliminating high latency and delayed response times. In the context of autonomous cars, this means they can be connected to everything around them, allowing every car to both send and receive information to form a sort of mesh network.
To understand this new type of network, we need to break down the ways in which it communicates, which the MIT Technology Review refers to as V2X, or vehicle-to-everything. This is broken out into several sub-categories for what that "everything" is, so let's dive slightly deeper.
In the future, vehicles will be able to communicate with surrounding infrastructure devices, such as signs or traffic signals. This enables the car to understand the proper speed for the area it is in and any scheduled ordinances, such as a specific left turn being legal only during certain hours of the day. Here our technology helps our individual vehicles navigate the world by understanding layouts, basic rules, and routing.
This is particularly important in dense urban environments where construction can occur frequently and will affect traffic flows. By giving automated vehicles the tools to communicate and perceive the road around them, driverless cars are able to adapt as this infrastructure evolves over time.
Automated cars will not just connect to infrastructure, though; they can also connect to one another. The low latency of 5G enables these vehicles to communicate their individual positions, desired routes, and even hazardous road conditions to each other.
If Car A needs to switch to the shoulder lane, it can inform Car B of this, allowing Car B to lower its speed to make room. In addition, if Car A spots a road hazard through its vision lens, such as a pothole or debris in the road, it can alert Car B behind it, allowing both of them to slow and navigate around it.
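The Car A / Car B exchange above can be sketched as a tiny message-passing model. The `V2VMessage` fields and the 20% slow-down are illustrative assumptions, not an actual V2V protocol:

```python
from dataclasses import dataclass

@dataclass
class V2VMessage:
    """A minimal vehicle-to-vehicle broadcast, loosely modelled on the
    lane-change/hazard scenario above (all fields are illustrative)."""
    sender_id: str
    kind: str          # e.g. "LANE_CHANGE" or "HAZARD"
    lane: int          # the lane the message refers to
    position_m: float  # distance along the road

class Car:
    def __init__(self, car_id: str, lane: int, speed_mps: float):
        self.car_id = car_id
        self.lane = lane
        self.speed_mps = speed_mps

    def on_message(self, msg: V2VMessage) -> None:
        # Ease off if another car is moving into our lane, or if a
        # hazard is reported in it, to make room or avoid the obstacle.
        if msg.lane == self.lane:
            self.speed_mps *= 0.8

# Car A announces a lane change into lane 2; Car B reacts.
car_b = Car("B", lane=2, speed_mps=25.0)
car_b.on_message(V2VMessage("A", "LANE_CHANGE", lane=2, position_m=120.0))
print(car_b.speed_mps)  # 20.0
```

Real V2X messaging is standardized and far richer, but the pattern is the same: broadcast state, and let each receiver decide how to react.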
This level of coordination improves safety on the road and works to ease traffic congestion, as every car can make intelligent decisions based on actionable information, allowing traffic to flow steadily. However, there are still other important members of the road to communicate with, such as pedestrians and riders of smaller vehicles like motorcycles or bikes.
While pedestrians are not themselves 5G-enabled, almost all of them carry connected devices in the form of smartphones. These connected points enable GPS tracking and locating of the people using them, which can fit into the mesh network of automated cars and infrastructure signals. These pedestrian signals are paramount to the safety of people on foot, who are still an important part of our transportation infrastructure. By utilizing these devices on the 5G network, we can prevent accidents in real time and even respond to emergencies as they occur.
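One way such a pedestrian check could work is to compare GPS fixes from the car and a nearby phone using the standard haversine great-circle formula. A hedged sketch; the `pedestrian_alert` helper and its 30-metre threshold are arbitrary illustrative choices:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pedestrian_alert(car_fix: tuple, phone_fix: tuple,
                     threshold_m: float = 30.0) -> bool:
    """Flag a pedestrian's phone that is within the alert radius of the car."""
    return haversine_m(*car_fix, *phone_fix) < threshold_m

# A phone at the car's own position is obviously inside the radius.
print(pedestrian_alert((40.0, -74.0), (40.0, -74.0)))  # True
```

Consumer GPS is only accurate to a few metres at best, so in practice such a check would supplement the car's own cameras and radar rather than replace them.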
The rollout of 5G networks is still ongoing and uncertain, even as some mobile phone providers tout their existing 5G offerings. For the purposes of self-driving cars, these networks must be flawless, offering robust coverage with no dead zones in urban environments. As the networks continue to develop, devices will be built in tandem to understand and properly utilize these new communication infrastructures.
As we’ve gone over, there are many different technologies layered on top of one another to form the automated driving ecosystem. Individual cars have a plethora of different sensors and cameras to help them navigate the roads and each other. Robust 5G networks allow these vehicles to connect to the public infrastructure around them and to one another, allowing for safer traffic and more coordinated road maneuvers such as avoiding hazards or simply switching lanes.
However, there are still many hurdles to clear, so the arrival of self-driving cars remains uncertain. 5G networks are still being rolled out, and how widely available they will be, particularly in a country as expansive as the US, remains to be seen. Additionally, there is still a lot of work to do on the technology itself, which must function properly at highway speeds and spot small pieces of debris that can still puncture tires.
This technology is still developing but has shown extremely promising growth in the last few years as several manufacturers continue researching the area. Businesses are also poised to see great value from the automated road, and they too are investing in bringing the technology to life.