Testing an autonomous vehicle can be a struggle. Operate one in a closed environment and it’s an unrealistic reflection of the roads it will one day have to traverse; throw one in at the deep end and you risk people’s safety. What is needed is an environment in which real-world conditions can be found, but the stakes aren’t real. Enter AirSim: a simulator which allows users to crash autonomous vehicles.
AirSim is part of Microsoft’s Aerial Informatics and Robotics Platform research project, which aims to drive understanding of how autonomous vehicles will actually move around in the real world. AirSim provides a dry run of sorts for developers, allowing them to write code to control autonomous vehicles and drones in a highly realistic simulator.
According to a Microsoft blog post, the simulator, based on the Unreal engine, essentially gives developers the freedom to crash autonomous vehicles and drones, enabling “the study and execution of complex missions that might be time-consuming and/or risky in the real-world.”
Worryingly, this could also be a hit with those who take J.G. Ballard’s Crash a little too seriously.
Make it real
The simulator, which is still under heavy development, currently supports drones, though Microsoft is planning to add support for other vehicles soon, with the open source community being called upon to contribute code extending support to other types of hardware.
According to Microsoft, there are a number of ways that AirSim can contribute to the development of robotics. The first is the opportunity to train self-driving cars and drones to better cope with real-world environments, helping them to understand what to avoid – curbs for cars, say, or glass doors for drones.
It also aims to help these vehicles differentiate between parts of the environment which need to be avoided and parts which may merely mislead the A.I. This is explained in the blog post, which states that these “tools could help researchers develop better perception abilities, which would help the robot learn to recognise elements of its environment and do things like differentiate between a real obstacle, like a door, and a false one, like a shadow.”
Indeed, Microsoft stresses that the reason self-driving cars and autonomous drones are still considered emerging technologies is the difficulty they have in differentiating between and anticipating scenarios that occur in the real world. Ashish Kapoor, a Microsoft researcher, says in the post, “That’s the next leap in AI, really thinking about real-world systems.”
By offering a near-lifelike representation of the real world (with graphics not far off your average PS4 game), AirSim helps developers quickly and cheaply test robotics without the need to wander around a closed circuit with a dustpan and brush, scraping up bits of drone.
In fact, the very nature of crashing can be illuminating – just ask crash test dummies. Microsoft, in another post, claims that by allowing vehicles the opportunity to crash in an environment where no one can be harmed, lessons can be learned. It says, “collisions in a simulator cost virtually nothing, yet provide actionable information for improving the design.”
While the future of the autonomous vehicle relies heavily on the refinement of A.I. and the use of sensor data, AirSim also provides a healthy, productive learning resource – the opportunity to make mistakes.
What do you think are the main obstacles for autonomous vehicles? Let us know in the comments below.