Tesla is creating its own chips optimized for machine learning, CEO Elon Musk said during a call with investors on Wednesday. Musk acknowledged the rumored project last December but is now providing more details.
“We’ve been in semi-stealth mode for the last two to three years,” Musk said.
During that time, Tesla created what Musk described as “the world’s most advanced computer specifically for autonomous operation.”
Musk said that the new chip, due out next year, will deliver an “order of magnitude improvement in operations per second” compared to “current Nvidia hardware.” And the new hardware is designed to be backwards-compatible with the current generation of Tesla vehicles, allowing Tesla to swap out the old hardware to give current cars a big performance boost.
The decision to make the chip backwards-compatible with current Tesla vehicles may be a tacit admission that the “hardware 2” chips Tesla began shipping two years ago were not, in fact, powerful enough for full self-driving capabilities, despite being marketed as self-driving capable. Musk didn’t specify when upgrades might be available to existing Tesla owners, or whether those upgrades would be offered for free or require an additional payment.
Tesla hired an Apple chip guru for the project
Tesla’s chip efforts have been spearheaded by Pete Bannon, an engineer who oversaw the development of the A5 chip at the heart of the iPhone 4S and worked on a number of iPhone chips since then.
“Two years ago when I joined Tesla we did a survey of all the solutions that are out there for running neural networks, including GPUs,” Bannon said on the earnings call. He continued:
Pretty much everywhere we looked, if someone had a hammer, whether it was a CPU or a GPU or whatever they were adding something to accelerate neural networks. But no one was doing a bottoms-up design from scratch, which is what we like to do.
We had the benefit of having the insight into seeing what Tesla’s neural networks looked like back then and having projections of what they would look like in the future, and we were able to leverage all of that knowledge and our willingness to totally commit to that style of computing to produce a design that is dramatically more efficient and has dramatically more performance than what we have today.
The key to the chip’s performance, Musk added, is to “run the neural net at a bare metal level.” Other CPU and GPU chips do neural network calculations “in some kind of emulation mode,” he said.
The Tesla chip has the ability to do “a huge number of very simple computations with the memory needed to store the results right next to the circuits doing the matrix calculations,” Musk said. For a lot of competing systems, Musk said, “the transfer between the GPU and the CPU ends up being one of the constraints on the system.”
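Musk’s point about the GPU-to-CPU transfer can be illustrated with a toy sketch. This is not Tesla’s design, just a hedged NumPy analogy: a discrete accelerator must copy inputs and results between host and device memory around each matrix calculation, while compute-near-memory hardware keeps the result next to the circuits and skips the copies entirely. The explicit `.copy()` calls here stand in for those transfers.

```python
import numpy as np

# Hypothetical sizes chosen only for illustration.
rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 256)).astype(np.float32)
activations = rng.standard_normal(256).astype(np.float32)

# "Discrete accelerator" path: copy in, compute, copy out.
# On real hardware, these transfers (e.g. over PCIe), not the
# arithmetic itself, are often the constraint Musk describes.
device_in = activations.copy()      # host -> device transfer
device_out = weights @ device_in    # matrix calculation on device
host_out = device_out.copy()        # device -> host transfer

# "Compute-near-memory" path: the result stays where it was
# computed, so no transfer step is needed.
local_out = weights @ activations

# Both paths produce the same numbers; the difference is the
# extra data movement, which costs time and power.
assert np.allclose(host_out, local_out)
```

The arithmetic is identical in both paths; the architectural claim is purely about eliminating the data movement around it.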
Of course, some companies have designed chips specifically for machine-learning applications. Google, for example, has developed machine-learning chips called tensor processing units (TPUs), though so far it has focused on offering them as a cloud computing service rather than selling chips to third parties. Musk didn’t explain how Tesla’s chips compare to other chips that are custom-designed for machine-learning applications.
Musk said that the result was an order-of-magnitude improvement over current Nvidia chips. But he didn’t specify which chips he was talking about, making the comparison somewhat unclear. Tesla adopted Nvidia’s Drive PX 2 platform in 2016. Then last year, Nvidia announced the PX 2 Pegasus, a new platform with 10 times the performance of the original PX 2, due out this year.
If Tesla’s chip performs ten times as well as the PX 2 chips in current Tesla vehicles, that would put it roughly on par with Nvidia’s Pegasus chips. On the other hand, if Tesla is planning to deliver a tenfold improvement over Nvidia’s Pegasus chips, that could give the company a substantial advantage for the next couple of years. We’ve asked Tesla for more details and will update if we hear back.
“It’s an amazing design and we’re looking to increase the size of our chip team and our investment in that as quickly as possible,” Musk said.