Why self-driving vehicles depend on self-healing maps

Sanjay Sood, Head of Highly Automated Driving at HERE
Chicago 41° 53' 2.976" N, 87° 37' 56.748" W

Vision, although our most dominant sense, has its limits. Self-driving vehicles cannot afford those limits: to maneuver safely, they need to see through buildings, around corners and 20 miles ahead. Success is contingent upon maps that can learn.

Earlier this week, I discussed how HERE is addressing this critical need for self-healing maps at NVIDIA's GPU Technology Conference in San Jose, Calif., where I joined partners, developers, researchers and other technologists showcasing the vital work being done around deep learning, AI, big data analytics, virtual reality, self-driving cars and much more.

The road ahead

To fully realize the vision of autonomous driving, vehicles will need to understand the road environment beyond the range of their onboard sensors while, at the same time, instilling full public confidence in their safety.

Vehicles will need real-time, accurate and semantically rich data to, for example, pinpoint lane-level positions, identify lane boundaries and road objects, and proactively maneuver in response to changes or incidents that affect driving conditions.
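To make "semantically rich, lane-level data" concrete, here is a minimal sketch of what such a record might look like and how a vehicle could act on it. All type, field and road-segment names below are illustrative assumptions, not HERE's actual schema.

```python
from dataclasses import dataclass

@dataclass
class LanePosition:
    road_id: str        # hypothetical road-segment identifier
    lane_index: int     # 0 = leftmost lane in the direction of travel
    offset_m: float     # distance along the lane from its start
    lateral_m: float    # signed offset from the lane centerline

@dataclass
class RoadEvent:
    kind: str           # e.g. "lane_closure" or "stalled_vehicle"
    position: LanePosition
    ttl_s: int          # seconds the report remains valid

def affects_my_lane(me: LanePosition, event: RoadEvent) -> bool:
    """True when a reported event lies ahead of the vehicle in its own lane."""
    return (me.road_id == event.position.road_id
            and me.lane_index == event.position.lane_index
            and event.position.offset_m > me.offset_m)

# A closure 350 m ahead in the vehicle's lane would justify a proactive
# lane change long before onboard sensors could detect it:
me = LanePosition("I-90-E-seg42", 2, 120.0, 0.1)
closure = RoadEvent("lane_closure",
                    LanePosition("I-90-E-seg42", 2, 350.0, 0.0), ttl_s=600)
```

The point of the sketch is the semantics: because the event is expressed per lane rather than per road, the vehicle can decide whether it is affected at all before planning a maneuver.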

Our HD Live Map helps vehicles plan these maneuvers beyond sensor visibility by providing precise positioning, enhanced sensor functionality, contextual awareness of the environment and local knowledge of road rules. HERE provides this cloud-based service in a data-efficient manner through its tiled, layered format. These capabilities will also build consumer trust through a more comfortable driving experience.
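The tiled, layered delivery described above can be sketched in a few lines: the world is cut into tiles so a vehicle fetches only the patch of map it is driving into, and each tile is split into layers so it downloads only the data a given maneuver needs. The tile scheme (a standard Web Mercator grid) and the layer names here are assumptions for illustration, not the actual HD Live Map API.

```python
import math

TILE_LEVEL = 14  # zoom level: higher level = smaller tiles = less data per fetch

def tile_id(lat, lon, level=TILE_LEVEL):
    """Map a WGS84 coordinate to an (x, y) tile index on a Web Mercator grid."""
    n = 2 ** level
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

def layers_for(maneuver):
    """Request only the layers a given maneuver needs, keeping transfers small."""
    base = ["road-centerline"]
    if maneuver == "lane_change":
        return base + ["lane-geometry", "lane-attributes"]
    if maneuver == "localization":
        return base + ["roadside-objects", "barriers"]
    return base

# A vehicle approaching Chicago's Loop would request something like:
x, y = tile_id(41.8842, -87.6324)
request = {"tile": (TILE_LEVEL, x, y), "layers": layers_for("lane_change")}
```

Tiling and layering together are what make a continent-scale HD map practical over a cellular link: the vehicle streams small deltas for the tiles ahead instead of synchronizing the whole map.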

Location platform

As we've discussed in a previous blog post, highly automated driving relies on the seamless fusion and ingestion of static and dynamic road data by the vehicle's driving and safety systems. Where am I exactly? What lies ahead? How can I get there comfortably? These are the questions automated vehicles must process continuously.

These symbiotic relationships are only possible through a community of data gathering and reporting that keeps maps fresh and feeds the self-healing process flow: sensor collection and ingestion, aggregation in the cloud and, ultimately, publication of an update or creation of new features.
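The aggregation step of that process flow can be sketched simply: many vehicles report a noisy observation of the same road feature, the cloud clusters the reports, and an update is published only once enough independent reports agree. The thresholds and field names below are assumptions for illustration, not HERE's actual pipeline.

```python
from collections import Counter
from statistics import median

MIN_REPORTS = 5      # never act on a single car's possibly noisy detection
MAX_SPREAD_M = 2.0   # reports must cluster tightly before they are trusted

def aggregate(reports):
    """Reduce per-vehicle sign observations to one map update, or None."""
    if len(reports) < MIN_REPORTS:
        return None
    xs = [r["x_m"] for r in reports]
    ys = [r["y_m"] for r in reports]
    if max(xs) - min(xs) > MAX_SPREAD_M or max(ys) - min(ys) > MAX_SPREAD_M:
        return None  # reports disagree: leave the map unchanged
    return {
        "feature": "speed_limit_sign",
        "x_m": median(xs),                 # robust to a few outlier detections
        "y_m": median(ys),
        "limit_kph": Counter(r["limit_kph"] for r in reports).most_common(1)[0][0],
    }

# Five vehicles spot the same new speed-limit sign within half a meter:
reports = [{"x_m": 10.0 + d, "y_m": 5.0, "limit_kph": 80}
           for d in (-0.3, -0.1, 0.0, 0.2, 0.4)]
update = aggregate(reports)
```

Requiring agreement across vehicles is what makes the map "self-healing" rather than merely crowd-edited: a single faulty sensor cannot corrupt it, yet a real-world change is reflected as soon as a handful of cars have driven past it.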

By collecting data from cars' sensors and cross-referencing it with other map data in the cloud, HERE's machine learning capabilities are delivering the industry's first "self-healing" map at scale.

Through our initial three automotive company investors we have unique access to car sensor data. What's more, key strategic technology partners -- Intel, Mobileye and Nvidia -- broaden the flexibility and capability of our HD Live Map.

In addition, our Chinese investors Tencent, NAVINFO and GIC unlock thirty percent of the world's growth market, and our potential in Japan has expanded through our partnership with Pioneer.

Most recently, we announced that we are updating the HERE HD Live Map using anonymized data from BMW cars equipped with camera-based Advanced Driver Assistance System (ADAS) technology from Mobileye.

To realize the true promise of autonomous driving, it's essential that HD mapping be offered at scale, both in the participating population of vehicles and in geographic coverage.

We're making that happen now -- we invite you to join us.

Click here to download the HD Live Map tech brief!

