Self-driving cars will rarely have to deal with a pack of drivers who think they are in a “Fast and Furious” movie, but training them to do so might just be what it takes to reach true autonomy.
But sending driverless vehicles drifting around curves at high speeds isn’t exactly practical or safe. That’s why Ascent Robotics Inc. is building a virtual simulation that it believes will help create self-driving automobiles capable of handling any scenario, however unlikely. The Tokyo-based startup is raising 1.1 billion yen ($10 million) in its first funding round, led by SBI Investment Co.
The total distance traveled by driverless vehicles on public roads has long been considered the main metric of progress in the industry. By that measure alone, the 8 million kilometers (5 million miles) logged by Alphabet Inc.’s Waymo would appear to be an insurmountable lead. Ascent is betting that only a fraction of that data is actually useful for training autonomous cars, because it’s the rare, unexpected events that matter most.
“Driving data isn’t everything,” said Fred Almeida, who founded Ascent in September 2016 after leaving another deep-learning startup he co-founded. “We can learn key behaviors in a simulation that transfer effectively to a physical system.”
While automakers and Silicon Valley rivals in the autonomous space are also using simulations to fine-tune their technology, Ascent is counting on virtual environments to let it speed ahead of the competition. To create the “Fast and Furious”-style simulation, engineers donned VR headsets and started off by performing drift maneuvers using PlayStation wheel controllers. Algorithms take that data to clone driving styles, then instruct the software to try to predict trajectories and react accordingly.
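In machine-learning terms, that cloning step resembles behavior cloning: supervised learning on pairs of observations and recorded human controls. The snippet below is a minimal sketch of the idea in PyTorch; the dataset, network shape and every parameter are invented for illustration and do not describe Ascent’s actual pipeline.

```python
# Minimal behavior-cloning sketch (illustrative only, not Ascent's code).
# Assumes a dataset of (sensor_features, [steering, throttle]) pairs recorded
# while humans drove drift maneuvers in the simulator.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical demo data: 10,000 frames of 64-dim sensor features and 2 controls.
features = torch.randn(10_000, 64)
controls = torch.randn(10_000, 2)          # [steering, throttle], normalized
demos = DataLoader(TensorDataset(features, controls), batch_size=256, shuffle=True)

# Small policy network that maps observations to control commands.
policy = nn.Sequential(
    nn.Linear(64, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 2),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

for epoch in range(10):
    for obs, target in demos:
        pred = policy(obs)                  # predicted [steering, throttle]
        loss = nn.functional.mse_loss(pred, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```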
Ascent’s Atlas simulation environment is built on Unreal Engine, a game development platform behind Batman: Arkham Knight, Assassin’s Creed Chronicles and other titles. The virtual world comes complete with pedestrians, buildings and traffic signs, but it isn’t a replica of any particular city. Instead, the software is more like a set of Lego blocks that can be used to build training scenarios.
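As a rough illustration of that Lego-block idea, a training scenario can be thought of as a composition of reusable pieces such as road layouts, weather, pedestrians and scripted aggressive drivers. The class names and fields below are hypothetical and are not Atlas’s actual API.

```python
# Illustrative sketch of composing a training scenario from reusable pieces.
# All names here are hypothetical; they do not describe Ascent's Atlas API.
from dataclasses import dataclass, field


@dataclass
class Pedestrian:
    position: tuple[float, float]
    speed_mps: float = 1.4          # typical walking speed


@dataclass
class Scenario:
    road_layout: str                # e.g. "four_way_intersection"
    weather: str = "clear"
    pedestrians: list[Pedestrian] = field(default_factory=list)
    aggressive_drivers: int = 0     # scripted "Fast and Furious"-style cars


# Rare-event scenario: aggressive drivers cutting across a busy crossing in the rain.
scenario = Scenario(
    road_layout="four_way_intersection",
    weather="rain",
    pedestrians=[Pedestrian(position=(3.0, 12.0))],
    aggressive_drivers=2,
)
```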
“You want to have that uncertainty and noisiness of human driving as part of the learning experience of the car,” Almeida said. “This should result in a more robust result at the end.”
Robot cars have come a long way in the past decade. In the 2004 DARPA Grand Challenge, a competition held to spur development, none of the driverless vehicles made it to the finish line; they hit barriers, swerved off roads and froze in place. Since then, advancements in sensors and breakthroughs in deep-learning algorithms have vastly improved the ability of machines to perceive the world, making basic navigation a solved problem.
A harder challenge for driverless cars is making predictions based on what they see. Errors can be costly in the real world, but not in virtual environments. Ascent can accelerate learning in Atlas by operating multiple vehicles simultaneously and making the simulation’s internal clock run up to 20 times faster than real time.
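Conceptually, that speedup comes from decoupling simulated time from wall-clock time and stepping many vehicle environments in parallel. The toy sketch below shows the arithmetic; only the 20x figure comes from the article, and the environment class is a stand-in, not Ascent’s simulator.

```python
# Toy sketch of time-scaled, parallel simulation (illustrative assumptions only).
import concurrent.futures

TIME_SCALE = 20          # simulated seconds per wall-clock second (figure from the article)
SIM_STEP_S = 0.05        # simulated time advanced per physics step


class ToyVehicleEnv:
    """Stand-in for one simulated vehicle; not a real Atlas environment."""

    def __init__(self, vehicle_id: int):
        self.vehicle_id = vehicle_id
        self.sim_time = 0.0

    def run(self, wall_clock_budget_s: float) -> float:
        # With a 20x clock, one wall-clock second yields 20 simulated seconds.
        target = wall_clock_budget_s * TIME_SCALE
        while self.sim_time < target:
            self.sim_time += SIM_STEP_S   # step physics, collect experience, etc.
        return self.sim_time


# Run several simulated vehicles concurrently to multiply experience further.
envs = [ToyVehicleEnv(i) for i in range(8)]
with concurrent.futures.ThreadPoolExecutor() as pool:
    sim_times = list(pool.map(lambda env: env.run(wall_clock_budget_s=1.0), envs))
print(sim_times)   # each vehicle logs ~20 simulated seconds per budgeted wall-clock second
```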
Waymo, which brings together all of the Google parent’s autonomous driving efforts, said last month that its software had driven 2.7 billion miles in its virtual world. The company uses high-definition 3-D maps, built from data gathered by its cars on real streets, to test challenging scenarios such as flashing yellow signals, drivers going the wrong way, and unpredictable pedestrians and cyclists.
“Every major player in this space is aiming for a combination of real road testing and a magnitude more in miles driven inside a simulated environment,” said Ali Izadi-Najafabadi, an analyst at Bloomberg New Energy Finance. “Focusing on the edge cases is going to be key for everyone.”
Ascent plans to use some of the money it’s raised to equip four Lexus SUV hybrids with a full sensor array and begin tests on Japanese roads as early as this summer. The cars will carry computers worth $100,000 each to gather data, especially for road conditions that are difficult to simulate, such as snow and ice. A driver will be present, hands poised over the steering wheel, ready to take control.
The startup has a research partnership with at least one major Japanese automaker and three robotics companies, Almeida said, declining to give names because the agreements aren’t public. Ascent also plans to at least double its 30-person workforce by the end of the year.
While it’s not yet clear what combination of technologies will deliver fully autonomous vehicles, the good news is we won’t have to wait long to find out. Most automakers agree that commercial deployment of cars that can drive without human assistance will start around 2020, according to BNEF. That’s also when Ascent plans to begin licensing its AI agents to car companies.
Fans of the “Fast and Furious” series and drift racing may have something extra to look forward to.
“The car itself should be able to learn how to drift,” Almeida said. “The reason why drifting is dangerous is because people don’t have full control. A robot car would be able to give you just the right amount of thrill.”