
Intel Autonomous Driving – The Technology Behind Self Driving Cars

Cars that drive themselves. It’s the stuff of sci-fi stories, shown in every futuristic movie. You may have even heard a few things about self driving cars, but do you know about the amazing technology behind them? I’m here to bring you up to speed with all you need to know. I’ll walk you through each of the pieces you need to understand what is going on with self driving (or autonomous driving) cars. Intel recently let me visit their just-launched Intel Autonomous Driving Garage and Advanced Vehicle Lab in Silicon Valley to check this technology out first hand.

An autonomous driving car (also known as a driverless car, a robot car, or a self driving car) is a vehicle capable of sensing the environment around it and navigating without any human input.

What are the different levels of Autonomous Driving Cars?

There are different levels of self driving cars ranging from level 0 to level 5.

Level 0: The driving you are already doing. The human handles all parts of driving the car: steering, brakes, power, etc.

Level 1: Most functions are still done by a human driver, but a specific function like steering or accelerating can be done automatically by the car.

Level 2: Two features combine and are automated. Your hands are off the steering wheel AND your foot is off the pedal at the same time, with both features controlled by the car. You must be ready to take back control of the vehicle at any time though. (These levels are defined by SAE International.)

Level 3: Human drivers are still needed at level 3, but the car is able to control “safety critical” features of driving on its own. You may still need to intervene at any time, but most features are controlled by the car under certain traffic or environmental conditions.

Level 4: This is what most people think of when they think of “fully autonomous”. A human driver does not need to be involved. The car can handle the entire driving experience, including all the safety-critical driving functions, for the whole trip. The trip does need to fall within the vehicle’s “operational design domain (ODD)” — meaning some driving scenarios may not be covered at this level.

Level 5: A fully autonomous car that performs as well as or better than a human in every possible driving scenario – snow, dirt roads, and more.
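The six levels above can be captured in a small sketch. This enum and helper are my own illustration (not any carmaker’s or SAE’s actual API) and simply restate the summaries given here:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Simplified summary of the SAE driving automation levels."""
    NO_AUTOMATION = 0          # human does all of the driving
    DRIVER_ASSISTANCE = 1      # one function (steering OR speed) automated
    PARTIAL_AUTOMATION = 2     # steering AND speed automated, human supervises
    CONDITIONAL_AUTOMATION = 3 # car handles safety-critical tasks in some conditions
    HIGH_AUTOMATION = 4        # no human needed within the operational design domain
    FULL_AUTOMATION = 5        # human-level driving in every scenario

def human_must_supervise(level: SAELevel) -> bool:
    # Through level 2 the human must monitor at all times;
    # from level 3 up they only need to be ready to take over (or not at all).
    return level <= SAELevel.PARTIAL_AUTOMATION

print(human_must_supervise(SAELevel.PARTIAL_AUTOMATION))  # True
print(human_must_supervise(SAELevel.HIGH_AUTOMATION))     # False
```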

What kind of sensors are in Self Driving Cars?

The first thing an autonomous driving car needs is eyes – different types of ‘sensors’ to figure out what is going on around the car. Many pieces of data need to combine to allow a machine to make the millions of decisions that a human does when driving, and different companies are tackling this in different ways. I toured for 2 miles around Silicon Valley in a self-driving Audi powered by Delphi and Intel that uses a lot more than just the cameras and GPS we are mostly all familiar with in cars today. Intel showcased multiple cars, but the one I rode in had more than 20 sensors located in various spots around the car.

Lidar is one of the main sensors used in autonomous driving. Lidar (light detection and ranging) is a laser-based 3D scanning method. This technology produces a very detailed image of the area around the car, including the distance between the car and whatever the laser is bouncing off of. You might have seen the spinning cylinder on top of a car – that’s Lidar technology, though there are more types than the spinning cylinder. Short-range radar, long-range radar, GPS, and many cameras all combine their data from multiple angles to give the car the best possible picture of the road and what is going on around it.
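To give a feel for how data from multiple sensors “combines”, here is a minimal sensor-fusion sketch using inverse-variance weighting – a standard textbook approach, not necessarily what Delphi or Intel actually use, and the sensor readings are made up for illustration:

```python
def fuse_estimates(readings):
    """Fuse distance readings from several sensors into one estimate,
    weighting each reading by the inverse of its variance (its noisiness)."""
    # readings: list of (distance_m, variance) tuples
    weights = [1.0 / var for _, var in readings]
    fused = sum(d * w for (d, _), w in zip(readings, weights)) / sum(weights)
    return fused

# Hypothetical distance-to-obstacle readings: lidar is the most precise,
# radar less so, and a camera depth estimate the least.
readings = [
    (24.9, 0.01),  # lidar
    (25.4, 0.25),  # long-range radar
    (26.0, 1.00),  # camera depth estimate
]
print(round(fuse_estimates(readings), 2))  # 24.93 - dominated by the lidar
```

The fused answer sits closest to the most trustworthy sensor, which is exactly why a noisy camera plus a precise lidar beats either one alone.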

What Are Self Healing Maps?

Another technology I saw at Intel’s Advanced Vehicle Lab was HERE HD Live Map. This makes a machine-readable live map that allows the vehicle to ‘see’ down the road and know what is coming. These maps are called “self healing” maps. They start with a prescanned map (think Google Maps) of your route, so the car knows things like the slope of a hill, where the major cross roads are, and so on. The car then scans and looks for inconsistencies against those prescanned, cloud-based references to determine if it needs to make a change or deal with a real-life situation. This data can be shared between cars, so you can effectively see farther ahead than the car in front of you.
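The “self healing” comparison can be sketched roughly like this. The feature names, numbers, and tolerance are all hypothetical – the point is just checking a live scan against a prescanned cloud reference and flagging what changed:

```python
def find_map_inconsistencies(prescanned, live_scan, tolerance=0.5):
    """Compare a live sensor scan against prescanned map features and
    flag anything missing or drifted beyond a tolerance (in meters)."""
    changes = []
    for feature, expected in prescanned.items():
        observed = live_scan.get(feature)
        if observed is None:
            changes.append((feature, "missing"))  # e.g. a sign was removed
        elif abs(observed - expected) > tolerance:
            changes.append((feature, "moved"))    # report back to the cloud map
    return changes

prescanned = {"lane_edge_offset_m": 3.5, "stop_sign_distance_m": 120.0}
live_scan  = {"lane_edge_offset_m": 3.6}  # stop sign not detected this pass
print(find_map_inconsistencies(prescanned, live_scan))
```

Flagged changes like these are what gets uploaded so every other car’s map “heals” too.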

There are also multiple points of data that you as a passenger in an autonomous car will be looking for – human-machine interfaces (HMIs). The car needs to communicate with you, and you with it. For example, you will want to see a Lyft- or Uber-style map of where you are and where you are driving to, as well as a system that shows what is going on around you. A detour ahead? The car needs to let you know why you are suddenly going “off track”. When you are riding with a taxi driver you are able to tell him things like slow down, pull over here, or I’ve changed my destination. You will need similar controls for a self driving car. Various HMIs are currently being tested to show unique options that can help you feel comfortable and in control – voice-activated systems, or screens on your device or the car door alerting you that your car has shown up, are some examples.

Not all of this processing can be done in the car. Some of it will need to be shared back to a main data server for extra processing and more advanced analysis. All this data also needs to be stored somewhere, and the terabytes coming in can fill up storage pretty quickly. 5G will play a part in transferring this data around, so let’s go over that a bit.

5G Network – The Autonomous Driving Car Network of the Future

5G is the step after 4G/LTE and is expected to start rolling out around 2020. 5G – multi-gigabit-per-second wireless broadband – is needed to deal with all those points of data the car is producing. How much data? Roughly 45 terabytes an hour will need to be sent back to a data center for “on the fly” processing of what is going on around the car.
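A quick back-of-the-envelope check shows why that figure is such a challenge (assuming decimal terabytes):

```python
# Convert the quoted 45 TB/hour into a sustained data rate.
terabytes_per_hour = 45
bytes_per_second = terabytes_per_hour * 1e12 / 3600
gigabits_per_second = bytes_per_second * 8 / 1e9

print(f"{bytes_per_second / 1e9:.1f} GB/s")  # 12.5 GB/s sustained
print(f"{gigabits_per_second:.0f} Gbit/s")   # 100 Gbit/s
```

That works out to roughly 100 Gbit/s of sustained throughput per car – which is exactly why, as noted above, most of the processing has to happen in the car itself and only selected data gets sent back.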

Self-driving vehicles will opportunistically connect to 5G cells when available and needed, then seamlessly fall back to 4G LTE to maintain network connectivity. 5G should also include direct vehicle-to-vehicle connections as well as vehicle-to-infrastructure and vehicle-to-network connectivity.
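That fallback behavior is simple to picture in code – a hypothetical sketch of the preference order, not a real modem API:

```python
def pick_network(available):
    """Prefer 5G when a cell is in range, otherwise fall back to 4G LTE
    so the car never loses connectivity. Technology names are illustrative."""
    for tech in ("5G", "LTE"):  # ordered from most to least preferred
        if tech in available:
            return tech
    return None  # fully offline: buffer data until coverage returns

print(pick_network({"5G", "LTE"}))  # 5G
print(pick_network({"LTE"}))        # LTE
```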

The cars need to be able to take all the data in and process it. What is a street sign? What is a person? Where are the street lights? The car needs to determine these pieces of information from the incoming stream. Intel’s FPGA and CPU combination can process 900 frames per second using roughly 40 watts of power – about the same as a light bulb at home.
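Those two numbers imply an energy budget per processed frame, which is easy to check:

```python
# Energy efficiency implied by the quoted 900 FPS at roughly 40 W.
fps = 900
watts = 40
millijoules_per_frame = watts / fps * 1000  # joules -> millijoules
frames_per_watt = fps / watts

print(f"{millijoules_per_frame:.1f} mJ per frame")  # 44.4 mJ
print(f"{frames_per_watt:.1f} frames/s per watt")   # 22.5
```

Under 50 millijoules per frame is the kind of efficiency that makes in-car vision processing practical on a car battery.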

Buildings in your local city will have off-car cameras that will be part of this future as well. Let’s say your car is turning around a sharp corner – a camera located on the street corner will be able to send its information directly to your car. This near-real-time data will help with the moments your car cannot see for itself.

On top of that, the car will need about 15 to 20 teraflops of computing power to process all this information. That is about 10 to 20 times the computing power in cars today. A brand new Xbox One X (codenamed Project Scorpio) has 6 teraflops, for example. Cars still have a ways to go.

Now that we have the data and it can be transferred back and forth – how does the car get the intelligence required to make driving decisions like a human? Artificial intelligence plays a big part here. The ability to learn from and process the information coming in will be a key component of the self driving cars of the future. Development in this area is ongoing, with many different companies playing roles in analyzing all that data. Intel has been developing in this area as well.

Intel has developed a Software Development Kit (SDK) that will help future system designers and developers create the autonomous driving solutions of the future. There are countless uses for what companies can create, and an easy-to-use SDK will greatly help with that advancement. There will be lots more to come in this space – including buying upgrades on the fly, like advanced cruise control, or having extra features pushed directly over the air to upgrade your car with ease. The future is really awesome!

Final thoughts about the technology behind self driving cars

So we have learned a bunch about the technology behind self driving cars, including the different levels of self driving cars, how they work, and what technology advances still need to come. What information needs to be saved? What information can be removed? What needs to be sent back to a main data center? How will 5G work exactly? What artificial intelligence is available, and what is still needed? All of these pieces are still being worked out, and as new companies develop in this space, many of the answers to these questions will change in the upcoming years. The potential to sit in a Level 5 completely self driving car is very real, and it is coming. It will be interesting to see what the companies of tomorrow do to solve the many issues still involved today.

Intel flew me to their Autonomous Driving Garage event in Silicon Valley but thoughts as always are my own.
