Tesla’s artificial intelligence team took to the stage on Friday evening for the company’s second annual AI Day to demonstrate how far its autonomous robot and vehicle research has come. Tesla AI Day 2022 gave us our first glimpse of the Optimus prototype strolling around the stage, updates on self-driving software and a look at the Dojo hardware powering Tesla’s AI research.
Tesla CEO Elon Musk delayed the showcase, originally planned for June, until September to get the Optimus prototype working. “This event is meant for recruiting AI & robotics engineers, so will be highly technical,” Musk tweeted Thursday. Mirroring Musk’s stance that the Tesla Bot will be “friendly,” Tesla tweeted an animation of robot hands forming a heart symbol.
The event kicked off with a short keynote from CEO Elon Musk, punctuated by a brief demo of the latest version of Optimus, which had plenty of exposed parts but waved and walked around the stage on its own without a tether. It’s a long way from last year at the inaugural AI Day, when a human cavorted around the stage in an Optimus costume. But there’s plenty of work to be done before the Optimus becomes the ultra-capable autonomous helper that Musk believes it could be.
“The Optimus has two times the economic output [of people],” Musk said on stage. “Actually, it’s not clear what the limit actually is.”
Better still, Optimus could go on sale in three to five years, Musk said.
The night had more than Musk’s high hopes for Optimus. Engineers described the design challenges of making the robot move around and recognize objects the way humans do. Then researchers from the team behind Tesla’s Autopilot driver-assistance software explained progress on Full Self-Driving, or FSD, software, which is intended to take the highway-navigating Autopilot to more complex city streets. Last came the hardware team, who showed what Dojo will be able to do once cabinets stacked with hundreds of its custom chips start arriving early next year.
Here’s what we learned at Tesla AI Day 2022.
The Tesla Bot walks and waves on its own
We’re still a ways away from the final version we saw envisioned in concept art at last year’s AI Day, but a working version of Optimus was finally unveiled. Weighing 73kg (161 pounds), packing a 2.3kWh battery in its servo-exposed chassis and using third-party actuators, it walked around and waved under its own power.
The next Optimus version was then hauled out to tease the audience: a sleeker model with metal casing covering its torso and limbs, built with Tesla-designed actuators. It wasn’t far enough along in development to move under its own power, so it simply waved. As Musk said repeatedly over the night, Tesla’s goal is to “produce the robot as quickly as possible and have it be useful as quickly as possible.”
Yes, Optimus will come in a catgirl model
Shortly after revealing Optimus, and quite literally as he was backstage while his team was continuing Tesla’s presentation, Musk tweeted that “Naturally, there will be a catgirl version of our Optimus robot.” A second reply tweet showed a photo of a lady action figure in the foreground — possibly Zero Suit Samus — with rows and rows of robot chassis shaped like female humanoids behind it.
Whether Musk is being serious is impossible to tell, though in response to a question during the Q&A period, he implied there could be different appearances for Optimus. “We want to have really fun versions of Optimus,” Musk said. “You can skin the robot in many different ways.”
Full Self-Driving grows to 160,000 beta users
Tesla’s Autopilot team explained how far they’ve come with the FSD technology, whose beta has expanded from 2,000 Tesla drivers last year to 160,000 so far in 2022. It’s still only available in the US and Canada, though Musk said that, were it not for the regulatory approvals required in each new country, it would be technically possible for Tesla to open the FSD beta globally by the end of the year.
In their show-and-tell session, Tesla engineers explained how they’ve sped up the car’s decision-making, cutting the time to weigh options from milliseconds down to 100 microseconds, roughly a tenfold improvement. The team showed how FSD maps the world around a Tesla in 3D geometry and makes choices based on what’s around it.
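That latency claim is easy to sanity-check. The sketch below assumes the “milliseconds” baseline means roughly 1 millisecond per decision, since the presentation didn’t give an exact figure:

```python
# Back-of-the-envelope check of the FSD decision-latency speedup.
# Assumption: the old "milliseconds" figure means roughly 1 ms per decision.
old_latency_us = 1_000   # ~1 millisecond, expressed in microseconds
new_latency_us = 100     # quoted new latency: 100 microseconds

speedup = old_latency_us / new_latency_us
decisions_per_sec = 1_000_000 // new_latency_us

print(f"Speedup: {speedup:.0f}x")                    # 10x
print(f"Decisions per second: {decisions_per_sec:,}")  # 10,000
```

At 100 microseconds per decision, the system could in principle re-evaluate its options 10,000 times a second.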
Training the FSD model to make those choices is no mean feat, either. Tesla cobbled together three supercomputers comprising 14,000 GPUs in total: 10,000 for training and 4,000 for auto-labeling. If you’ve struggled to find scarce GPUs, some of them might have gone to Tesla.
Other Tesla engineers explained wonky topics like creating an entire neural network just to recognize lanes in roads. Their early image-based models could identify the lane the car was driving in, plus the lanes to its left and right. That worked on simple roads like highways, but the team wanted a system that could handle much more complex maneuvers, like turning left or right at intersections with multiple crossing lanes of cars, buses, bicycles and pedestrians.
Dojo is faster than stacks of GPUs
Tesla is starting to put together Dojo, a massive supercomputing platform, to train its AI on all the video its cars are picking up and beaming back to the company. To get the performance the AI team needs to churn through a 30-petabyte footage vault, Tesla went dense with its hardware.
As the engineers explained, 25 of Tesla’s custom Dojo dies (called D1) are collected into a single tile that can replace six off-the-shelf GPU boxes. System trays of six tiles, paired with 640GB of DRAM split across 20 cards, are only 75mm high (about 10 iPhone 14s stacked up), weigh 135kg and are capable of 54 petaFLOPS of computing power, or 54 quadrillion floating-point operations per second.
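Those tray figures imply per-tile and per-die numbers that are easy to work backward to. A rough sketch, using only the counts and the 54-petaFLOPS figure from the presentation:

```python
# Work backward from a Dojo system tray's quoted throughput
# to per-tile and per-die figures.
tray_pflops = 54        # quoted: 54 PFLOPS per six-tile system tray
tiles_per_tray = 6
dies_per_tile = 25      # 25 D1 dies per tile

tile_pflops = tray_pflops / tiles_per_tray          # PFLOPS per tile
die_tflops = tile_pflops / dies_per_tile * 1_000    # TFLOPS per D1 die

print(f"Per tile: {tile_pflops} PFLOPS")      # 9.0 PFLOPS
print(f"Per D1 die: {die_tflops:.0f} TFLOPS")  # 360 TFLOPS
```

So each tile delivers about 9 petaFLOPS, and each D1 die roughly 360 teraFLOPS, by this arithmetic.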
Two of those trays are placed in a cabinet (called an ExaPOD), along with all the power infrastructure needed to run them. The power draw is serious: in testing last year, engineers pushed a cabinet past 2 megawatts, tripping their substation and prompting a call from the city. Even so, they were able to cut the coefficient of thermal expansion, or CTE, a measure of how much components physically expand as they heat up, by a factor of three.
To compare with off-the-shelf performance: a standard processing run takes five microseconds on a stack of 25 D1 dies, but 150 microseconds on a stack of 24 GPUs.
That’s just the first generation of these devices. Tesla plans to build its initial ExaPOD by the first quarter of 2023, with six more to follow. The next generation, built on more advanced hardware, is expected to be 10 times better.
Tesla could loan out Dojo for companies to train their AIs
Tesla will certainly have its hands full building Dojo and integrating it to train its own AI, but in response to a question from an AI Day audience member, Musk said the company probably won’t sell its custom cabinets as a business. Instead, Musk theorized, Tesla could sell compute time on Dojo, much like Amazon Web Services sells cloud computing.
“Just have it be a service that you can use that’s available online and where you can train your models way faster and for less money,” Musk said.