Technology

Driverless cars learn from landscape pics before going off-road

By Conor Gearin

13 July 2016

[Image: two vehicles on a partly forested hillside. Just mind the trees. Credit: James Cheadle/Alamy Stock Photo]

Where we’re going we don’t need roads, just a large database of pictures. A team at New York University have taught a machine learning system to drive off-road by showing it photos of different landscapes.

It’s one thing to train driverless cars to navigate city streets or highways. But taking robot drivers off-road is tricky because the surroundings are highly variable, says Karl Iagnemma at the Massachusetts Institute of Technology, who wasn’t involved in the work.

It is hard to give a robot an exhaustive set of driving rules for every environment. Even a simple dirt road can be a challenge if it changes a lot in appearance along its length. “That’s a harder problem than just avoiding a traffic cone,” says Iagnemma.

To prepare the AI to encounter many different landscapes, Artem Provodin and his colleagues taught it to recognise drivable terrain in pictures from the ImageNet database. It’s not the first time this online collection of about a million photos has been instrumental in teaching AIs to recognise objects in a scene, from sunsets to cats – sometimes with greater accuracy than humans.

Provodin’s team selected a small set of images showing landscapes and labelled them by hand, highlighting areas where a vehicle could be driven and non-navigable areas – those with trees, for example. The system used these initial examples to learn to label landscape features on its own.
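In broad strokes, that kind of training resembles a standard image-segmentation setup: a handful of hand-labelled photos, a network that scores every pixel as drivable or not, and a loop that fits the network to those labels. The minimal PyTorch-style sketch below is illustrative only, with toy data standing in for the team's photos; it is not the researchers' actual code.

```python
import torch
import torch.nn as nn

# Toy stand-in for a handful of hand-labelled landscape photos:
# each "photo" is a small RGB image, and each label marks every pixel
# as drivable (1) or non-navigable (0), e.g. where the trees are.
images = torch.rand(8, 3, 64, 64)                    # 8 toy RGB images
labels = (torch.rand(8, 1, 64, 64) > 0.5).float()    # per-pixel drivable mask

# A tiny fully-convolutional network that predicts a drivability score
# for every pixel. A real system would use a much deeper model,
# typically pretrained on ImageNet as the article describes.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, kernel_size=1),                 # per-pixel logit
)

optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Fit the network to the few hand-labelled examples.
for epoch in range(50):
    optimiser.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimiser.step()

# On an unseen photo, threshold the per-pixel scores to get a drivable map.
new_photo = torch.rand(1, 3, 64, 64)
drivable_map = torch.sigmoid(model(new_photo)) > 0.5
```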

Training a robot using photos is a good idea, says Panagiotis Tsiotras at the Georgia Institute of Technology. It’s a quick way to teach it what surfaces are slippery or provide good grip, for example, without having to try them out for real.

The images don’t tell a robot everything, however. To fine-tune their system, the team took a four-wheeled robot called Corobot Jr for a spin.

Corobot Jr isn’t built for off-road driving and can’t drive over grass that is more than a few centimetres tall. The images gave it a head start, but limitations like this are best learned by trial and error.

When the robot drives over a surface, it adds that to its list of things that are traversable, says team member Liila Torabi of GE Global Research in New York. Over time, the robot will become more independent – and less likely to get stuck in the mud.
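As an illustration only, the bookkeeping Torabi describes could be as simple as a running record of surfaces the robot has successfully crossed. The terrain names and outcomes in the sketch below are hypothetical, not the team's system.

```python
# Hypothetical sketch of trial-and-error updates: surfaces the robot
# actually drives over without getting stuck are added to its set of
# traversable terrain; failed attempts are not.

class TraversabilityMemory:
    def __init__(self):
        self.traversable = {"packed dirt", "gravel"}   # learned so far

    def record(self, terrain: str, succeeded: bool) -> None:
        """After driving over a surface, remember whether it worked."""
        if succeeded:
            self.traversable.add(terrain)

    def is_known_traversable(self, terrain: str) -> bool:
        return terrain in self.traversable

memory = TraversabilityMemory()
memory.record("short grass", succeeded=True)     # drove over it fine
memory.record("tall grass", succeeded=False)     # got stuck, not added
print(memory.is_known_traversable("short grass"))  # True
```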
