Yeah, but in RoboCup 3D the robots fall all over themselves and can barely play: https://www.youtube.com/watch?v=7T1ElDs5eSQ - RoboCup seems more focused on simulating robot movement than on the game itself.

This new Google competition appears to be more about ML and playing the game, judging from videos: https://www.youtube.com/watch?v=F8DcgFDT9sc

That's a fair characterization; the 3D league is meant to incorporate physical constraints, and learning "through" those to the higher-level aspects of the game is challenging. That's what the 2D league exists for, though.

The Google simulator lets you learn from pixels if you want to, but the agent you're controlling only has 8 actions available to it [1], so the learning problem here really has no bearing on robotics or anything else in the real world that I can think of.

[1] https://arxiv.org/pdf/1907.11180.pdf
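For anyone curious what that looks like in practice, here's a rough sketch of spinning up the environment from the open-source gfootball package with pixel observations. The scenario name and keyword arguments are from memory of the docs, so treat this as an illustration rather than a verified setup:

    # Rough sketch, not tested: create a Google Research Football env
    # with raw pixel observations (pixel representations need rendering).
    import gfootball.env as football_env

    env = football_env.create_environment(
        env_name='11_vs_11_stochastic',  # full-game scenario (name from memory)
        representation='pixels',         # rendered frames instead of engineered features
        render=True,                     # pixel observations require rendering enabled
    )

    obs = env.reset()
    print(env.action_space)  # small discrete action set: directions, pass, shoot, sprint, ...

    done = False
    while not done:
        action = env.action_space.sample()  # random policy, just to step the simulator
        obs, reward, done, info = env.step(action)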


RoboCup also has leagues for wheeled robots, where there's a lot less falling over: https://www.youtube.com/watch?v=_Y5_iGxWFrQ

The humanoid robot leagues are constrained by cost - several of the simulated leagues, including the one shown in your video, have a corresponding physical league. The physical league needs two teams of NAO robots, which cost ~$8000 each, with 11 on each team - so you're looking at roughly $176,000 of hardware for a student project. You've correctly identified that they don't work very well, but more agile robot hardware would cost even more. I've never seen two Atlas robots together, let alone 22!
