Rodney Brooks has a great essay on why he's skeptical that the current humanoid hype will deliver. The central claim is that human dexterity is extremely advanced, and today's humanoids lack even the sensors and data needed to start building the models needed to match human performance.
https://rodneybrooks.com/why-todays-humanoids-wont-learn-dex...
I saw him post this article on his Bluesky, saying they're the first company he's seen that is close to cracking this problem (he's an investor/adviser).
I wonder how accurate joint positions and muscle activations can be from just a POV camera. Maybe it’s not crazy to think someone could get tens of millions of hours of well-labeled training data.
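(Toy illustration, not from the article: assuming some pose estimator has already given you 2D keypoints, recovering a joint angle from them is just geometry. The hard parts are the estimator itself and the depth/occlusion ambiguity of a single POV camera; the keypoint values below are made up.)

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (radians) formed by 2D keypoints a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])  # vector from joint to first keypoint
    v2 = (c[0] - b[0], c[1] - b[1])  # vector from joint to second keypoint
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.acos(dot / (math.hypot(*v1) * math.hypot(*v2)))

# Hypothetical shoulder-elbow-wrist keypoints (units don't matter):
shoulder, elbow, wrist = (0.0, 0.0), (1.0, 0.0), (1.0, 1.0)
print(math.degrees(joint_angle(shoulder, elbow, wrist)))  # ~90 degrees
```

Muscle activations are a much harder ask than joint angles, though; nothing in a camera frame observes them directly.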
Yeah I’m going to completely disregard this because I feel like we are less than a year away from completely human feeling humanoids. This is based on nothing but obsessively watching and following humanoid progress on the internet.
What was eye-opening, or rather sobering, for me was when I read an interview with an engineer who explained how incredibly difficult it is for a robot to orient itself when it is lying on the floor and wants to stand up.
Yes, it can do the required motions just fine, that’s not the point. But think about yourself when you are lying on the floor: it’s really easy to determine if this is safe, if you are lying underneath something and so on. You just feel that.
A robot cannot do that; all it can do is look around as well as possible and visually assess its situation.
I naively assumed they have a gravity sensor, so they will generally have an approximate up vector?
Yeah, but imagine yourself lying on the floor with vision as your only sense, plus a note floating in your mind: "fyi, you are no longer upright".
That’s all, you feel nothing else. Now your job is to move all parts of your body in just the right way.
But I have more than that. I can definitely sense which way is up unless I'm underwater.
Or have an amusing inner ear infection. So OK, sure, it's a vector, not a flag.
They have an IMU; what they don't generally have is the various aspects of touch.
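(Rough sketch of what the IMU buys you: when the body isn't accelerating, the accelerometer reading is just gravity expressed in the body frame, so roll and pitch fall out of trigonometry. This is the generic textbook formula, not any particular robot's stack, and note what it can't give you: yaw, or any of the touch information discussed above.)

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate roll and pitch (radians) from a static accelerometer
    reading. Valid only when the body is not accelerating, so the
    reading is dominated by gravity; heading (yaw) is unobservable."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

print(math.degrees(tilt_from_accel(0.0, 0.0, 9.81)[1]))   # upright: pitch ~0
print(math.degrees(tilt_from_accel(-9.81, 0.0, 0.0)[1]))  # on its back: pitch ~90
```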
The point about being aware of lying underneath some object was interesting. Sound might matter, like the frequency of background noise changes when you're in an enclosed space, and listening to your own shuffling noises helps you know when you've planted your feet right - or something. I have some really effective ear plugs and I notice they make it harder to move around.
Having said that, I've probably hit my head on the underside of an open cupboard door five or six times in my life, and I expect to do it again.
It is also things like "I can feel that my left knee is bearing a little too much weight, I should shift weight to my right hand and use that to push myself up" - things that come automatically to animals after learning the hard way in infancy (some of it is innate; baby animals are clumsy, but usually more mobile than human infants). Regardless of learned-vs-instinct, these abilities rely on sophisticated "sensors" and cognition. I suspect engineering the sensors is actually a bit harder, but I'm also not optimistic about a deep learning approach to the cognition.
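(To make the weight-shifting intuition concrete, a deliberately toy sketch; the sensor names, readings, and threshold are all made up for illustration. The real problem is that humanoids mostly lack dense contact-force sensing in the first place.)

```python
def suggest_shift(contact_loads, threshold=0.5):
    """If one contact point carries more than `threshold` of the total
    load, suggest shifting toward the least-loaded contact."""
    total = sum(contact_loads.values())
    most = max(contact_loads, key=contact_loads.get)
    least = min(contact_loads, key=contact_loads.get)
    if contact_loads[most] / total > threshold:
        return f"shift weight from {most} to {least}"
    return "load is balanced enough"

# Imaginary readings (kg-equivalent) from hypothetical contact sensors:
loads = {"left_knee": 41.0, "right_knee": 18.0, "right_hand": 6.0}
print(suggest_shift(loads))  # shift weight from left_knee to right_hand
```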
A significant underappreciated advantage of animals over AI: lifeforms can "learn the hard way" more easily than 2020s robots because of cheap self-repair. AI labs are reluctant to damage their robots, but an essential part of humans learning to move safely is severely bonking your head and reckoning with the consequences - "hey, dummy, why did you trip and fall and bonk your head? Because you were running like an idiot."
I am learning the hard way to this day :) I have been practicing with work knives. A few months ago I got stupid and impatient, and sliced my thumb nastily. If I didn't block the cut with my thumbnail (still ruined) I might have chopped bone. It is hard to say precisely what I learned from this experience - "don't be stupid and impatient" is facile - but I know I learned a lot. I am actually optimistic about targeted surgical robotics. But for a general-use humanoid robot, I would not want to give it a knife if it's not capable of feeling pain. I never use big knives anywhere near my cats because I understand intuitively that they are nimble and unpredictable and easily stabbed by knives. I didn't need to be trained on this. A robot kind of does. Yikes.
I obsessively avoid any kind of "technology is going thataway" content. So I haven't seen anything that looks like humanoid progress in quite some time. About the only thing that has snuck around my barrier is Musk apparently claiming he'll have it by the end of the year, which is pretty conclusive evidence that they won't have it by the end of the year.
So if you're seeing anything that actually seems to merit attention, I'd love a few pointers. I could use some good news.
Well, as someone who has tried to build at least a couple of small robot arms, I think we are probably closer to 20-50 years away. Neither the power nor the dexterity is there.
Right now, only a human can both push over a boulder and pick up a tiny speck from the floor using the same actuator.
Beware generalising from a carefully curated and presented set of demos to real life.
This one is different? What about unitree? What about their demo at the Spring Festival Gala?
https://www.youtube.com/watch?v=Ykiuz1ZdGBc
That sure felt "different".
No doubt hands are important, but I think you've missed a lot here, Wired.
Many of the Chinese companies are doing very impressive open-loop sim2real. They make great demonstrations. They are not great at dealing with the real world and unpredictable environments.
(That's not true of all Chinese companies - some are doing really impressive work with closed loop systems in unpredictable environments. But many of the highly viewed ones with coordinated dance performances or martial arts are intended more as theater to government financial sponsors than useful function. The technically impressive performances do not look as visually impressive.)
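(For anyone unfamiliar with the open-loop vs closed-loop distinction, here's a minimal 1-D toy, nothing to do with any specific company's stack: open loop replays a planned command sequence, while closed loop measures the state each step and corrects. An unplanned disturbance, the thing real environments are full of, wrecks the first and barely bothers the second.)

```python
class Plant:
    """1-D toy plant with a constant disturbance the planner didn't expect."""
    def __init__(self, disturbance=0.1):
        self.x = 0.0
        self.d = disturbance
    def step(self, u):
        self.x += u + self.d
    def measure(self):
        return self.x

def open_loop(plant, trajectory):
    """Replay a pre-computed command sequence; errors go uncorrected."""
    for u in trajectory:
        plant.step(u)

def closed_loop(plant, target, kp=0.8, steps=50):
    """Proportional feedback: measure each step, correct toward target."""
    for _ in range(steps):
        plant.step(kp * (target - plant.measure()))

p1, p2 = Plant(), Plant()
open_loop(p1, [0.02] * 50)      # planned to reach 1.0 assuming no disturbance
closed_loop(p2, target=1.0)
print(round(p1.x, 3))  # 6.0   -- drifts far past the goal
print(round(p2.x, 3))  # 1.125 -- holds near the goal (small P-control offset)
```

Impressive sim2real dance routines are essentially very good open-loop trajectories; the hard part is the feedback.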
Those were impressive, but they were also RC. I think an important part of robotics is not just the mechanics of humanoid motion, but the independent control of those mechanics.
no paywall: https://archive.is/Wro1e
archive.is is malicious -- as in, uses your browser to launch DDoS attacks, and other things.
Stop using it.
https://arstechnica.com/tech-policy/2026/02/wikipedia-bans-a...
Anyone else here have happy memories of playing with Armatron? Circa 1984?
Apparently yes: https://news.ycombinator.com/item?id=43718493
Yes! The most amazing part about those things was that they achieved all those axes of motion with one or two motors.
> a ChatGPT moment for the physical world.
That's not a good thing, WIRED.
I want Rosie (fictional robot from the TV show "The Jetsons")
Basically, I want a robotic butler / maid that will do most of the cleanup around the house.
Haha! Instead, you’ll get a robot that will make you art, music, and tell you stories and you get to toil away cleaning the house.
Back in the 90s, I developed a rule of thumb: if I saw it in Wired, it's because it was either already over, or it wasn't going to happen at all.
I was so disappointed when I saw BetterPlace (the car with replaceable batteries) on the cover of Wired. It seemed like such a good idea. Too bad the rule of thumb meant it wouldn't work.
Rules of thumb were made to be broken. Maybe this time it will be different.