For years, luxury carmaker Mercedes-Benz has been working on a prototype of the car of tomorrow, the F 015. It may not fly, but it’ll be able to do just about anything else: park itself, change lanes, avoid pedestrians, play your favorite music, alert you when friends are nearby, and interact with you through lasers and touchscreens. It’ll be at your beck and call through an app, like an Uber you own and don’t have to pay extra for.
“It’s seen as a completely autonomous driving vehicle,” Simon Tattersall, Mercedes-Benz’s resident expert on autonomous driving, told me during a phone interview. But before you get too excited about this sexier, shinier version of robocar KITT from Knight Rider, know that “the logical technological advances necessary for such autonomous driving haven’t been made. We’re on the road with current and research vehicles, but we’re not close to this vision yet.”
And to make good on that vision, Benz and others working on autonomous cars will have to engineer vehicles with superhuman, all-seeing, almost infallible eyes. And that's not easy. It requires sophisticated artificial intelligence techniques in computer vision, pattern recognition, and predictive planning. These are all things your iPhone does relatively well, with one big caveat: If iOS suddenly poops out, you might lose your cool, but you won’t crash and kill an innocent bystander. There’s very little room for error with algorithms that control a speeding chunk of metal.
To sense their environment and avoid collisions, robocars will need not just GPS, but cameras and sensors that face every direction. These will connect to powerful computers that will make sense of all the incoming data and help the car learn its routes, pick out anomalies, obey traffic laws, and avoid precocious children crossing roads without their parents.
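To make that sense-then-react loop concrete, here is a minimal sketch of the kind of decision a car's planner has to make constantly: given objects its vision system has detected, is anything inside the car's stopping distance? Everything here — the `Detection` class, the parameter values, the stopping-distance formula — is an illustrative assumption, not how Mercedes or anyone else actually implements it; real planners weigh far more signals.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A single object the vision system picked out of camera/sensor data."""
    kind: str          # e.g. "pedestrian", "vehicle", "unknown"
    distance_m: float  # estimated distance ahead of the car, in meters

def plan_action(detections, speed_mps, reaction_s=1.0, max_brake_mps2=6.0):
    """Toy planner: brake if any detected object sits inside the stopping distance.

    Stopping distance = reaction distance (v * t) + braking distance (v^2 / 2a).
    """
    stopping_m = speed_mps * reaction_s + speed_mps**2 / (2 * max_brake_mps2)
    for d in detections:
        if d.distance_m <= stopping_m:
            return "brake"
    return "continue"

# A pedestrian 10 m ahead of a car doing 15 m/s (~34 mph) is well inside
# the ~34 m stopping distance, so the planner brakes.
action = plan_action([Detection("pedestrian", 10.0)], speed_mps=15.0)
```

The point of the sketch is the margin for error the article describes: the whole chain from camera pixels to that one `"brake"` decision has to run reliably, many times per second.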
But they’ll also change the definition of what a car is.
In a very fundamental way, Benz, Audi, Tesla and other companies getting into the robocar game are essentially remaking the car into a very powerful and expensive spying smartphone on wheels, almost by necessity. Without its Big Brother capabilities, an autonomous car just wouldn't work, the same way Google wouldn't be as good if it didn't "read" our emails or archive our search history.
For instance, an app that controls the F 015 can also turn the cameras it uses to see the road into remote prying eyes. Through the app, you can connect to the car’s cameras to spy on the car's surroundings through your phone. It effectively turns your car into a lurking Dropcam that can be used to watch unknowing passersby, anywhere, anytime. Or as another journalist on the junket put it, it turns every single vehicle into a Google Street View car. The privacy implications will be huge.
But it doesn't stop there. Just like your iPhone or Android device, your car will communicate with other internet-connected devices in your life. It’ll learn your habits and adapt to your needs. For instance, say your car “realizes” you’re on your way home at dinner time. It “knows” your smart fridge is stocked with nothing but booze, so it prompts you to go to the grocery store or local eatery to pick up some grub. It’ll pull up the number of your favorite restaurant or suggest a new one based on your preferences. While you call, your robo-butler adjusts its course to take you where you need to go. By the time you arrive for curbside pickup, your credit card will already have been charged.
“We call it predictive learning,” said Mercedes' Tattersall. “This will be something not so far away.” And he's probably right. This is the type of machine learning that already underpins Google, Facebook and Amazon and that makes your iPhone or Android device such a powerful little tool.
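At its simplest, the "predictive learning" Tattersall describes can be nothing fancier than counting where you go and when, then betting on the pattern. The sketch below is a hypothetical illustration of that idea (the `HabitModel` class and its methods are my invention, not anything Mercedes has described); production systems would use far richer models and many more signals.

```python
from collections import Counter, defaultdict

class HabitModel:
    """Toy habit learner: count where the driver goes at each hour of the day,
    then suggest the most frequent destination for the current hour."""

    def __init__(self):
        self.by_hour = defaultdict(Counter)

    def observe(self, hour, destination):
        """Record one observed trip (hour of day, where it ended)."""
        self.by_hour[hour][destination] += 1

    def predict(self, hour):
        """Return the most common destination for this hour, or None if unseen."""
        if not self.by_hour[hour]:
            return None
        return self.by_hour[hour].most_common(1)[0][0]

# After a few observed evenings, the model starts suggesting "home" at 6 p.m.
model = HabitModel()
for dest in ["home", "home", "home", "restaurant"]:
    model.observe(18, dest)
suggestion = model.predict(18)
```

Even this crude frequency count captures the dinner-time scenario above: the interesting (and invasive) part is not the math but the stream of personal data it runs on.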
What that means is that the next advertising turf war will be in cars. We already see and hear ads everywhere we drive, on billboards and radio shows and podcasts we listen to. Many times these digital ads are tailored to the places we’re frequenting and to what algorithms think we’ll like. But if the future looks like what Tattersall is describing, the car of tomorrow will actually help us act on our impulses. Call it the next evolution of the in-app purchase, materialized IRL.
And the same artificial intelligence that powers a car's spy eyes will be central to the new ad war. Experts call it deep learning.
Deep learning already underpins voice and image recognition on the web. Companies are using it to target ads and make sense of our Facebook posts and tweets. It's even being used to decipher facial expressions and body language, though its accuracy there is a little more tentative. Recent research has put it to the test in picking out different types of objects on streets and walkways. When these systems are trained on graphics processing units (GPUs) — chips traditionally used by the video game industry — researchers find they can train models far faster, and faster training makes it practical to build more accurate ones.
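For readers curious what "training" actually looks like, here is a miniature version of the idea: a small neural network learning to separate two classes of synthetic data points by gradient descent. The data, network size, and labels ("pedestrian" vs. "background") are stand-ins I've chosen for illustration; real systems train much deeper networks on millions of labeled photos, which is exactly why GPUs matter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for image features: two well-separated blobs,
# class 0 ("background") and class 1 ("pedestrian").
X = np.vstack([rng.normal(-1, 0.5, (200, 2)), rng.normal(1, 0.5, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

# One hidden layer of 8 units -- a tiny cousin of the deep nets in the article.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)                   # hidden layer
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))       # sigmoid output probability
    return h, p.ravel()

lr = 0.5
for _ in range(300):  # full-batch gradient descent on cross-entropy loss
    h, p = forward(X)
    grad_out = (p - y)[:, None] / len(y)       # gradient at the output
    grad_h = (grad_out @ W2.T) * (1 - h**2)    # backpropagate through tanh
    W2 -= lr * (h.T @ grad_out); b2 -= lr * grad_out.sum(0)
    W1 -= lr * (X.T @ grad_h);   b1 -= lr * grad_h.sum(0)

accuracy = ((forward(X)[1] > 0.5) == y).mean()
```

The same arithmetic — big matrix multiplications repeated over and over — is what GPUs happen to be very good at, which is why a chip built for video games ended up at the center of the self-driving car.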
Just last week, for example, GPU maker Nvidia announced its $10,000 Drive PX card would go on sale in May. It comes pre-packed with deep-learning software that recognizes different types of objects. “It’s a system that can be trained, and retrained, with more data,” Senior Automotive Director Danny Shapiro wrote in a blog post. “Every time your self-driving car gets an over-the-air update, it can get smarter.”
Which all self-driving cars need to be. Benz flagged its $12-million one-of-a-kind car as an autonomous vehicle, and, yes, it drove around in a loop with no one behind the wheel. It even started, sped up, slowed down, and stopped all on its own. When one of the engineers at the event beckoned it through an app, it started up and wheeled itself over. But all this was preprogrammed, and even so it stumbled. It worked really well during my ride, but during previous test-rides, someone had to take over for the car's algorithms. If we'd taken it outside the Alameda Naval Air Station where it was being showcased, it might have crashed rather quickly.
Tattersall wouldn't say what kind of software or hardware Mercedes is experimenting with to make its cars smarter, but given what other leaders in the field, like Audi, Tesla and Google, are doing, it's likely the company has also turned to deep learning and GPUs. I’d be willing to bet — based on other experimental self-driving projects I’ve seen — that the rumbling computers packed in the F 015’s trunk were equipped with some GPUs. It’s just where the industry is going, both on the web and on the road.
Daniela Hernandez is a senior writer at Fusion. She likes science, robots, pugs, and coffee.