Earlier this year, when a 22-year-old factory worker at a Volkswagen plant in Germany died after being crushed against a metal plate by an assembly machine that mistook him for a car part, most news outlets reported it as a case of robot murder. Countless headlines blared "Robot kills worker at Volkswagen plant," and a series of entirely predictable Skynet jokes followed.
But last week, when news broke that Volkswagen had outfitted 11 million of its diesel cars with software that allowed them to cheat on their emissions tests, leading its customers to unwittingly release many tons of pollutants into the atmosphere, nobody blamed the company's "robot cars" for the error.
And yet, both Volkswagen situations were technically the fault of robots. (The Oxford English Dictionary defines a robot simply as "a machine capable of carrying out a complex series of actions automatically, especially one programmable by a computer.") In fact, the diesel cars were arguably more sophisticated robots than the factory assembler. The factory machine simply grabbed and moved large objects according to pre-programmed instructions. But according to investigators, the diesel Volkswagens were equipped with onboard sensors that could detect when a car was being subjected to an emissions test, and respond by going into a full "catalytic scrubbing" mode that allowed it to pass. In other words, Volkswagen's factory machine was inadvertently diabolical, while its diesel cars were actually programmed that way.
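The "defeat device" investigators described boils down to a simple conditional: watch the sensors for a test-like pattern, then pick an emissions mode accordingly. Here's a minimal sketch of that kind of logic, in which every name, signal, and threshold is hypothetical and illustrative, not Volkswagen's actual code:

```python
# Hypothetical sketch of defeat-device-style logic: detect conditions that
# resemble a dynamometer emissions test, then switch emissions modes.
# All function names and thresholds are illustrative assumptions.

def looks_like_emissions_test(steering_angle_deg, speed_kmh, duration_s):
    """On a test bench, the wheels spin at speed while the steering wheel
    never moves -- a pattern rarely seen in real-world driving."""
    return steering_angle_deg == 0 and speed_kmh > 0 and duration_s > 60

def select_emissions_mode(steering_angle_deg, speed_kmh, duration_s):
    if looks_like_emissions_test(steering_angle_deg, speed_kmh, duration_s):
        return "full_scrubbing"  # clean mode: pass the test
    return "normal"              # road mode: better performance, more pollution

# Stationary steering at highway speed for two minutes reads as a test:
print(select_emissions_mode(0, 50, 120))   # "full_scrubbing"
# Any steering input reads as real driving:
print(select_emissions_mode(15, 50, 120))  # "normal"
```

The point of the sketch is how little "intelligence" the cheat requires: a few sensor readings and one branch are enough to behave one way for regulators and another way for everyone else.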
Today, many of the devices in our lives are really robots working under pseudonyms. A "smart thermostat" is a robot that raises and lowers the temperature of your house. A "smart home security system" is a robot that keeps you safe. A coffee maker with a Bluetooth chip is a robot that keeps you caffeinated. And then there are all the so-called "software robots": personal assistants like Siri and Cortana, financial "robo-advisors," and apps that translate foreign languages on the fly.
As more and more household tasks become automated, the number of robots in our lives is growing rapidly. And the rise of connected devices raises a thorny semantic question: namely, where does "automated process" stop and "robot" begin? Why is a factory machine that moves car parts considered a "robot," but a Volkswagen with a much more sophisticated code base is just a Jetta?
Instead of trying to gerrymander a definition for "robot" that could account for the differences between all of the varied types of machine intelligence, I propose a different solution: what if we just stopped saying "robot" altogether?
I asked some experts in robotics and automation what they think of the word "robot," and many agreed that the term had outlived its usefulness.
"Robots are things that don't do useful things," Chris Anderson, the CEO of drone-maker 3DR (which was formerly known as 3D Robotics, before a recent name change) told me over email. "Once they eventually work, we call them what they are, like 'dishwasher' or 'toaster" or 'drone.'"
The word "robot," Anderson said, "implies an immature developmental state, a technology looking for a problem to solve. Once it's found its natural form, it finds a name that describes its function."
Since it entered the English language in the 1920s, from the Czech robota (meaning "forced labor"), the word "robot" has been used primarily to describe things that make us uncomfortable. During the 1930s, people worried about robots stealing human jobs, gaining sentience, and eventually turning on their masters. And the popular sci-fi robots of the following decades were either clichéd in their malevolence, like Skynet, HAL 9000, and Ultron, or notable precisely because they didn't want to kill us, like C-3PO, R2-D2, and Samantha from "Her."
Part of the popular obsession with the word "robot" is that it's evocative. (Why explain how a company applied assembly-line techniques to pharmaceutical distribution when you could fret about "drug-dealing robots" instead?) And part of it is that the term is a catch-all that allows for the kind of technical hand-waving that often accompanies narratives about complex systems we don't quite understand.
"It's the only shorthand that works," says James Kotecki, the head of communications at Automated Insights, a company that makes automated writing software for news organizations like the Associated Press. "We could say, our natural language generation software automatically writes stories for you. Or we could say, 'robot writer.'"
The media's collective willingness to use "robot" as a stand-in for anything even remotely automated has led to some fairly convoluted headlines, like:
San Francisco’s Newest Fast Food: Healthy, Cheap and Served by Robots (actually, the restaurant just replaced cashiers with iPads—all of the food is still made and placed into delivery cubbies by human hands)
AOL's New Video Audience May Consist Largely of Robots (aka "software")
Here come the robot lawyers (less sexy version: "Law firms experiment with new machine-learning technology that helps them sort through large documents")
Kevin Albert, the CEO of soft-robotics maker Pneubotics, defended his company's use of the term "robot," since it actually manufactures machines that perform coordinated movements with multiple degrees of freedom. But he agreed that the term had been defined too broadly by others. "There's actually a taxonomy of machines," Albert said. "Saying 'robot' is equivalent to talking about a certain species as if it's just an animal."
Ultimately, Albert added, the word "robot" may end up becoming little more than a rhetorical flourish for companies hoping to signal their interest in innovation. "It's a signpost, right up there with the flying car," he said. "Saying the thing you're making is a robot is an easy shortcut for saying, 'We're working toward the future.'"
But the biggest problem with the word "robot" isn't that it's overly broad marketing jargon, or that it engenders lazy reporting. It's that it obscures human agency. When we talk about robots misbehaving, we often forget that those robots didn't just execute bad code on their own. Generally, they were programmed to do so by humans. If Volkswagen's emissions fiasco had been portrayed as a case of malicious robots cheating on emissions tests, as its factory machine death several months earlier was, the human executives who signed off on those software tweaks would have gotten off easy.
I confess to a certain level of guilt here. I've used "robot" before in contexts when "software" or "connected device" might have been more accurate. I've illustrated stories about automated software using metal automatons from sci-fi movies. And I've invoked the "OMG ROBOTS ARE RISING UP TO KILL US!!!!" trope in an attempt to garner interest in a fairly unremarkable feat of automation.
But from now on, I'm taking a vow of accuracy. Whenever possible, I'll do my best to describe machine behaviors as the automated processes they actually are, rather than leaning on the crutch of robotics. Fewer lazy robot clichés means more attempts to understand the ways automated systems work. And ultimately, the less we say "robot," the better grasp we'll have on the mechanics of the machines that run our lives, and the more accountability we'll insist on for the humans who ultimately control them.