When we drive, we're constantly looking backwards. We have to see if there's a car approaching us from behind if we want to switch lanes or if there are objects we should steer clear of when we're backing up.
One of the obvious problems with this is that if your eyes are focused on what's behind you, you're blind to what's happening in front of you, and vice versa. A machine, though, can theoretically look everywhere at once.
On Monday at the Consumer Electronics Show in Las Vegas, graphics chip maker Nvidia unveiled a new computer chip, dubbed the Drive PX 2, that the company hopes will make self-driving cars safer and better able to navigate their environment. The PX 2 can run complex artificial-intelligence algorithms to help cars make sense of what's around them. The availability of faster, more powerful computers is one of the ingredients necessary to make cheap self-driving cars a reality.
For autonomous vehicles, the biggest challenge is knowing where they are in space relative to other (moving) objects and then planning a path given constantly changing conditions. Like a human driver, they need to do all of this in real time. For a person, that requires a lot of concentration and focus (one of the reasons distracted driving is so dangerous). For a computer, it requires an extraordinary amount of computing power. The lunchbox-sized PX 2, which will be deployed first on a test fleet of Volvo cars, has the computing power of 150 MacBook Pros, according to the company.
Leveraging that power, the chip can process all the information the car takes in through its sensors and 12 cameras, using special software to create an accurate, detailed map of where the car is. It can then use that processed information to "decide" how to move around safely.
The system makes it easier for each car company to develop and scale its own artificial-intelligence systems. That's really important because that software will determine what the driving experience in different robocar models will be like, said Jen-Hsun Huang, co-founder and CEO of Nvidia, during a press conference. "This body of work has to be owned by the car companies," he said. Nvidia wants to be there to facilitate that.
The Drive PX 2 system can also be hooked up to a car's infotainment system, so that you can see what the car "sees," also in real time.
Picture a car coming up from behind in the rear left-hand corner. The car's rearview camera "sees" and tracks it, and displays that information for you, cartoon-style. Suddenly, "we don't need rearview mirrors anymore," said Huang.
He's not the only one who thinks that. A recent survey by the Institute of Electrical and Electronics Engineers showed that experts largely believe rearview mirrors will be axed from cars by 2030, along with horns and emergency brakes. Steering wheels and gas pedals will join them on the R.I.P. list five years later.
That aligns with Google's vision for self-driving cars, in which software, not humans, is in charge. Google's concept robovehicles have attachable steering wheels for legal reasons, but the search giant largely imagines cars without them. (They do have rearview mirrors, though.) Last year, Mercedes-Benz, whose parent company is an Nvidia client, showed off a concept car that also had an optional steering wheel and no rearview mirrors.
How long it'll actually take to get there is still T.B.D. In December, California drafted regulations for self-driving cars requiring a licensed driver to be in the car, just in case there's a bug and a human needs to take over.
Daniela Hernandez is a senior writer at Fusion. She likes science, robots, pugs, and coffee.