Elon Musk reveals a humanoid robot at Tesla AI Day 2022

Hi, this is Wayne again with a topic “Elon Musk reveals a humanoid robot at Tesla AI Day 2022”.
Welcome to Tesla AI Day 2022. I do want to set some expectations with respect to our Optimus robot. As you know, last year it was just a person in a robot suit, but we've come a long way since then, and compared to that, I think it's going to be very impressive. So, should we bring out the bot? Before we do that, we have one little bonus tip for the day.

This is actually the first time we've tried this robot without any backup support: no cranes, no mechanical mechanisms, no cables, nothing. Yeah, we wanted it to join you guys tonight, but it was the first time.

Let's see. You ready? Let's go. [Applause] So this is essentially the same self-driving computer that runs in your Tesla cars. By the way, this is literally the first time the robot has operated without a tether on stage tonight. So the robot can actually do a lot more than we just showed you.

We just didn't want it to fall on its face. So we'll show you some videos now of the robot doing a bunch of other things.

We wanted to show a little bit more of what we've done over the past few months with the bot, apart from just walking around and dancing on stage. Just humble beginnings, but you can see the Autopilot neural networks running as-is, just retrained for the bot directly on that new platform. (That's my watering can.)

When you see a rendered view, that's the world the robot sees, so it's very clearly identifying objects: this is the object it should pick up, and it's picking it up. We use the same process as we did for Autopilot to collect data and train neural networks that we then deploy on the robot. That's an example that illustrates the upper body a little bit more.

What you saw was what we call Bumble C. That's our rough development robot using semi-off-the-shelf actuators, but we've actually gone a step further than that already. The team's done an incredible job, and we actually have an Optimus bot with fully Tesla-designed actuators, battery pack, control system, everything. It wasn't quite ready to walk, but I think it will walk in a few weeks.

But we wanted to show you the robot, something that's actually fairly close to what will go into production, and show you all the things it can do. So let's bring it out. Let's do it. All right, this has the degrees of freedom that we expect to have in Optimus production unit one, which is the ability to move all the fingers independently, and for the thumb to have two degrees of freedom, so it has opposable thumbs on both left and right hands.

So it's able to operate tools and do useful things. Our goal is to make a useful humanoid robot as quickly as possible. Optimus is designed to be an extremely capable robot, but made in very high volume, probably ultimately millions of units, and it is expected to cost much less than a car; I would say probably less than twenty thousand dollars. The potential for Optimus is, I think, appreciated by very few people. As usual, Tesla demos are coming in hot. There have been software integration and hardware upgrades over the months since then, but in parallel we've also been designing the next generation, this one over here. Obviously there's a lot that's changed since last year, but there are a few things that are still the same, you'll notice.

We still have this really detailed focus on the true human form. On the screen here, you'll see in orange our actuators, which we'll get to in a little bit, and in blue our electrical system. In the middle of our torso, actually it is the torso, we have our battery pack, sized at 2.3 kilowatt-hours, which is perfect for about a full day's worth of work.
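As a rough back-of-the-envelope check (the eight-hour work day and the arithmetic below are our own assumptions, not figures from the presentation), a 2.3 kWh pack over a full day implies a sustained power budget of a few hundred watts:

```python
# Rough energy-budget sketch. Only the 2.3 kWh pack size comes from the talk;
# the work-day length is an assumed figure for illustration.
PACK_ENERGY_KWH = 2.3    # stated battery pack capacity
WORK_DAY_HOURS = 8.0     # assumed length of "a full day's worth of work"

avg_power_w = PACK_ENERGY_KWH * 1000 / WORK_DAY_HOURS
print(f"Average sustained power budget: {avg_power_w:.0f} W")  # ~288 W
```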

What's really unique about this battery pack is that it has all of the battery electronics integrated into a single PCB within the pack. Going on to sort of our brain: it's not in the head, but it's pretty close, also in our torso, where we have our central computer. It's going to do everything that a human brain does: processing vision data, making split-second decisions based on multiple sensory inputs, and also communications. To support communications, it's equipped with wireless connectivity as well as audio support, and it also has hardware-level security features, which are important to protect both the robot and the people around the robot.

So now that we have our sort of core, we're going to need some limbs on this guy, and we'd love to show you a little bit about our actuators and our fully functional hands as well. There are many similarities between a car and the robot when it comes to powertrain design. What matters most here is energy, mass, and cost. In this particular case, you see a car with two drive units, and the drive units are used to accelerate the car in a 0-to-60-mph sprint or to drive a city drive cycle, while the robot has 28 actuators.

It's not obvious what the tasks are at the actuator level. We have tasks at a higher level, like walking, climbing stairs, or carrying a heavy object, which need to be translated into joint specs. The rotary actuator in particular has a mechanical clutch and an angular contact ball bearing integrated on the high-speed side, and a cross roller bearing on the low-speed side; the gear train is a strain wave gear, and there are three integrated sensors along with a bespoke permanent magnet machine. Our actuator is able to lift a half-ton, nine-foot concert grand piano. Our fingers are driven by metallic tendons that are both flexible and strong. We have the ability to complete wide-aperture power grasps while also being optimized for precision gripping of small, thin, and delicate objects. Some basic stats about our hand: it has six actuators and 11 degrees of freedom.
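As a hedged illustration of what translating a high-level task into a joint spec can look like (a simplified static model with assumed numbers of our own, not Tesla's), consider the shoulder torque implied by carrying an object at the end of an outstretched arm:

```python
# Minimal static-torque sketch: what shoulder torque does "carry a heavy object"
# imply? Simplified rigid-arm model with assumed numbers, not from the talk.
G = 9.81                 # gravity, m/s^2

payload_kg = 9.0         # assumed payload held in the hand
arm_mass_kg = 4.0        # assumed mass of the outstretched arm
arm_length_m = 0.6       # assumed shoulder-to-hand distance

# The payload acts at full arm length; the arm's own weight roughly at its midpoint.
shoulder_torque_nm = G * (payload_kg * arm_length_m + arm_mass_kg * arm_length_m / 2)
print(f"Required static shoulder torque: {shoulder_torque_nm:.0f} N*m")  # ~65 N*m
```

A real spec would also account for dynamic loads, gear ratios, and efficiency, but even this static bound shows how a task-level requirement becomes a per-joint torque target.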

It has an in-hand controller which drives the fingers and receives sensor feedback. Ported directly from Autopilot to the bot's situation, it's exactly the same occupancy network, which we'll go into in a bit more detail later with the Autopilot team, that is now running on the bot here. In this video, the only thing that really changed is the training data that we had to recollect. We're also trying to find ways to improve those occupancy networks using work done on neural radiance fields to get really great volumetric rendering of the bot's environment, for example here, some machinery that the bot might have to interact with.

We've also been training more neural networks to identify high-frequency features, keypoints within the bot's camera streams, and to track them across frames over time as the bot navigates its environment, and we're using those points to get a better estimate of the bot's pose and trajectory within its environment as it's walking.

This is a video of the motion control code running in our simulator, showing the evolution of the robot's walk over time. As you can see, we started quite slowly in April and started accelerating as we unlocked more joints and deeper, more advanced techniques like arms balancing over the past few months.
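To make the keypoint-tracking idea above concrete, here is a minimal generic sketch of detecting high-frequency features in one camera frame and tracking them into the next, using OpenCV. This is an illustration of the general technique, not Tesla's actual pipeline; the function name and parameters are ours.

```python
# Generic keypoint-tracking sketch (not Tesla's code): detect corner features
# in one grayscale camera frame and track them into the next frame.
import cv2
import numpy as np

def track_keypoints(prev_frame_gray, next_frame_gray):
    # Detect strong corner features ("high-frequency" keypoints) in the previous frame.
    prev_pts = cv2.goodFeaturesToTrack(prev_frame_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=7)
    if prev_pts is None:
        return np.empty((0, 2)), np.empty((0, 2))

    # Track those points into the next frame with pyramidal Lucas-Kanade optical flow.
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_frame_gray,
                                                      next_frame_gray,
                                                      prev_pts, None)
    good = status.ravel() == 1
    return prev_pts[good].reshape(-1, 2), next_pts[good].reshape(-1, 2)

# The matched point pairs can then feed a pose estimator, e.g. recovering the
# relative rotation and translation between frames (visual odometry).
```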

We wanted to manipulate objects while looking as natural as possible, and also get there quickly, so what we've done is break this process down into two steps. First is generating a library of natural motion references, or we could call them demonstrations, and then adapting these motion references online to the current real-world situation. So let's say we have a human demonstration of picking up an object. We can get a motion capture of that demonstration, which is visualized right here as a bunch of keyframes representing the locations of the hands, the elbows, the torso. We can map that to the robot using inverse kinematics, and if we collect a lot of these, we now have a library that we can work with. But a single demonstration is not generalizable to the variation in the real world; for instance, this would only work for a box in a very particular location. So what we've also done is run these reference trajectories through a trajectory optimization program, which solves for where the hand should be and how the robot should balance when it needs to adapt the motion to the real world.
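A minimal sketch of that two-step idea, under assumptions of our own (toy 2D hand trajectories, a quadratic stay-close-to-the-reference objective, and scipy as the optimizer); this is not Tesla's trajectory optimizer, just an illustration of warping a stored demonstration to a new target:

```python
# Toy two-step sketch: (1) a "library" of reference hand trajectories from
# demonstrations, (2) online adaptation of a reference to a new object location.
# Assumed toy setup (2D hand positions, quadratic objective), not Tesla's code.
import numpy as np
from scipy.optimize import minimize

# Step 1: library of reference trajectories (each: T keyframes of 2D hand position),
# e.g. retargeted from human motion capture via inverse kinematics.
reference_library = {
    "pick_up_box": np.array([[0.0, 0.0], [0.2, 0.1], [0.4, 0.3], [0.5, 0.4]]),
}

def adapt_trajectory(reference, new_target, target_weight=100.0):
    """Warp a reference trajectory so its final keyframe reaches new_target,
    while staying as close as possible to the demonstrated motion."""
    T, D = reference.shape

    def cost(flat):
        traj = flat.reshape(T, D)
        stay_close = np.sum((traj - reference) ** 2)        # imitate the demo
        reach_goal = np.sum((traj[-1] - new_target) ** 2)   # hit the new box pose
        return stay_close + target_weight * reach_goal

    result = minimize(cost, reference.ravel(), method="L-BFGS-B")
    return result.x.reshape(T, D)

# Usage: the box is now at (0.7, 0.2) instead of the demonstrated (0.5, 0.4).
adapted = adapt_trajectory(reference_library["pick_up_box"], np.array([0.7, 0.2]))
print(adapted.round(3))
```

A full version would also include balance and collision constraints in the cost, which is what makes a proper trajectory optimizer more than a simple curve warp.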

So, for instance, if the box is in this location, then our optimizer will create this trajectory instead. I think the first thing within the next few weeks is to get Optimus at least at par with Bumble C, the other bot prototype you saw earlier, and probably beyond. We're also going to start focusing on the real use case at one of our factories, and make this product a reality and change the entire economy.

All of this was done in barely six or eight months. Thank you very much. [Applause]