Autonomous car services are coming to the market, but are they ready to interact with humans?
Automotive startup Zoox recently received California’s first permit to transport passengers in its self-driving test vehicles (with a backup human safety driver on board). With this new permit, companies will have the chance to test an entirely new aspect of autonomous vehicle operation. We spoke with Matt Preyss, Product Marketing Manager at HERE Highly Automated Driving, to learn more.
“Previously, the only thing we’ve had insight into with these autonomous vehicles on a public road test were technical disengagements,” said Preyss. “That’s how we’ve measured how good the technology is. Now we’re also going to start seeing how people interact with the machines.”
Disengagements are instances when an autonomous vehicle stops operating autonomously or requires the backup safety driver to take over. But the bigger point Preyss makes is that companies now have the opportunity to test how the passenger will factor into the operation of an autonomous car.
“We’ll start to understand what's best in terms of pickups, drop-offs, and what are passengers doing inside the vehicle?” Preyss added. “We’ll also get more insight on how consumers feel. How much do you trust/like/dislike the autonomous vehicle? What are you doing in the car? All new insights for how people interact with the car are going to come to the forefront... Those insights create a huge opportunity for autonomous car makers.”
Human-Computer Interaction (HCI) is the design discipline concerned with how people and machines connect and exchange intentions and information. It's an understated element in discussions about autonomous cars. Tech companies rightfully focus on perfecting the hardware and software that make self-driving cars work in the first place. But to make them truly functional, autonomous vehicles must also be humanly accessible.
Designing human interaction is no light task
Think about the interaction you have when you get in a taxi (or a rideshare). You'd likely say hello to your driver, tell them where you’re going, and maybe add how you want to get there.
Along with exchanging pleasantries, the human behind the wheel knows exactly where to stop the car so you can get in easily. They can quickly adapt if you need to add an extra stop. They can respond when you ask them, at the last minute, to pull over at the far right corner. No AI assistant can reliably recreate these kinds of interactions just yet.
Most of us have a strong preconception of what should happen when we enter a car. Any designed in-car experience that doesn’t account for those expectations is likely to deliver a shock to the earliest autonomous car passengers.
We would like to see that shock avoided, because this area of experience and interaction is legitimately exciting. If thoroughly tested and delivered well, these early experiences will influence not just the future design of vehicles, but how people and machines interact for a long time to come.