The promise of a car that can drive on its own seems closer than ever. Still, there are some big hurdles left to leap.
Adam Gopnik’s recent New Yorker piece on learning to drive in middle age reminded me how many things we take for granted about driving. There are dozens of mental calculations we make every second on the road: is there another lane the car can move to? Is it an HOV lane? Did the radio just say there’s a thunderstorm headed this way? Are my tires slipping? Oh, wait, is that a mom crossing the street with a baby carriage?
Today, self-driving cars are often tested in near-perfect conditions with camera and radar towers that can spot other cars and objects on the road, yet they still have huge blind spots. These sensors have a limited field of vision and lack a highly detailed view of the static road ahead, let alone information that we humans process effortlessly, like weather and traffic. While self-driving cars today work in carefully plotted-out test areas, they can’t quite mimic the way real people drive.
The stakes are high: cars that act like machines rather than people risk alienating drivers. Sure, it may be legal and even safe for a self-driving car to take a narrow, curvy road at 120 kilometers per hour, but it would make for a hair-raising ride.
At HERE we’re working to infuse cars with the intelligence they need to react to the environment like people do. Today, at the Automotive Tech.AD show in Berlin, Dietmar Rabel, who heads a number of our automated driving projects, explains:
“To enable cars to act like people, they need to understand their environment, and our location cloud aggregates information from various sources including other cars on the road and make sense of it all for the vehicle.”
Live Roads, a key part of HERE’s highly automated driving offering, processes real-time sensor data and integrates real-time traffic, weather and even road-condition information.
Let’s say, for example, that in a particular area tire sensors on certain cars report that the tires are slipping. Meanwhile, other cars close by send information that their windshield wipers are on. At the same time, the local weather agency sends out an alert that temperatures have dipped below freezing.
Live Roads aggregates and analyzes all of that information to understand that there is black ice in a certain area and can then send that information back to all cars headed there.
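To make the idea concrete, here is a minimal sketch of that kind of multi-source inference. The data class, thresholds and function names are all hypothetical illustrations, not HERE’s actual Live Roads logic: the point is simply that no single signal proves black ice, but traction loss plus precipitation plus sub-zero temperatures together do.

```python
from dataclasses import dataclass

@dataclass
class AreaSignals:
    """Hypothetical aggregated signals for one road segment."""
    tire_slip_reports: int   # cars reporting loss of traction
    wiper_on_reports: int    # cars reporting active windshield wipers
    temperature_c: float     # local weather-agency reading, in Celsius

def infer_black_ice(signals: AreaSignals, min_reports: int = 3) -> bool:
    """Combine independent signals: enough slip reports (traction loss),
    enough wiper reports (precipitation), and freezing temperatures
    together suggest black ice on this segment."""
    slipping = signals.tire_slip_reports >= min_reports
    precipitation = signals.wiper_on_reports >= min_reports
    freezing = signals.temperature_c <= 0.0
    return slipping and precipitation and freezing

# A segment with several slip and wiper reports in freezing weather:
segment = AreaSignals(tire_slip_reports=5, wiper_on_reports=4, temperature_c=-2.0)
if infer_black_ice(segment):
    print("warn vehicles headed to this segment: possible black ice")
```

In a real system each signal would be noisy and time-decayed rather than a simple count, but the fusion step, combining evidence that no single car could gather alone, is the essence of what the article describes.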
Rabel calls Live Roads, “A heavily dynamic view of things like lane closures, slippery roads and construction, at much higher precision than what you are used to with today’s traffic information.”
He says that the ability to pull information from various sources beyond a car’s own sensors is crucial to advancing car technology for a host of applications that eventually pave the way to automated driving.
“Most OEMs don’t have fleets of cars everywhere, so that’s where we can come in: to ingest and integrate this sort of live information across individual OEMs’ fleets and then share it broadly, which will ultimately make our roads safer,” he says.
Along with the HD Map and Humanized Driving, Live Roads will make self-driving cars that mimic people more of a reality.
Three things from HERE that will make autonomous cars a reality
1. HD Map – because autonomous cars can only understand the real world through a map
2. Live Roads – because autonomous cars have to see around the corner
3. Humanized Driving – because autonomous cars have to make passengers feel relaxed and comfortable
image credit: Umberto Salvagnin