
Hey everyone, I am one of the authors of the paper described in this article. We used reinforcement learning and recurrent neural networks to learn a controller which can ascend and descend stairs without any vision-based perception, meaning that it must rely entirely on proprioception to walk. This is the first time (to our knowledge) that a human-sized bipedal robot has been able to climb real-world stairs blind to the world, and we're pretty excited about the results of this research.
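
To give a rough sense of what "no vision-based perception" means mechanically, here's an illustrative PyTorch-style sketch of a recurrent policy being stepped with proprioceptive observations only; the layer sizes and dimensions here are placeholders, not our actual architecture or code:

    import torch
    import torch.nn as nn

    class RecurrentPolicy(nn.Module):
        """Illustrative: maps proprioceptive observations (joint states,
        orientation -- no camera or depth input) to motor targets."""
        def __init__(self, obs_dim=40, hidden_dim=128, act_dim=10):
            super().__init__()
            self.rnn = nn.LSTMCell(obs_dim, hidden_dim)
            self.action_head = nn.Linear(hidden_dim, act_dim)
            self.state = None  # memory carried across control steps

        def forward(self, obs):
            self.state = self.rnn(obs, self.state)
            h, _ = self.state
            return self.action_head(h)

    policy = RecurrentPolicy()
    obs = torch.randn(1, 40)   # joint positions/velocities, pelvis orientation, etc.
    action = policy(obs)       # e.g. targets for the actuated joints

The point is that the controller has to infer what the terrain is doing (like the presence of a step) from its own recent history of joint measurements rather than from a camera.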

Here's a link to the arXiv submission: https://arxiv.org/abs/2105.08328

And here's the accompanying video: https://youtu.be/MPhEmC6b6XU

And an uninterrupted five-minute video of a test on an outdoor staircase: https://youtu.be/nuhHiKEtaZQ

Happy to answer any questions the HN crowd may have!


I was always very unimpressed with VR until I borrowed a headset to play Half Life Alyx. That game convinced me VR is the inevitable future of gaming. The rest of the industry may not be there yet, but it is an obviously superior experience.


As a Valve Index owner myself, I will actually disagree with VR being "THE future".

Not all games work well in VR. For example, I would not want to play World of Warcraft in VR. Your character has so many abilities that you couldn't possibly map them all onto a VR controller, and even if you could, it'd be immersion-breaking.

Games played from a top-down view like RTS or colony/base/city builders, or games that might involve looking at a lot of text, would probably not work well either.


I beg to differ. The major problem with VR for me is fatigue - can’t wear the headset for more than about an hour. If that was solved I’d never leave.

> you couldn't possibly map them out on a VR controller

Hand tracking + 10 fingers, possibilities are endless! Like casting spells for real.

> Games played from a top-down view like RTS or colony/base/city builders

Fly over the map like some kind of a demigod and manipulate cities directly? Sign me up.


The Index controller's individual finger tracking isn't good enough to serve as a critical game function. If I try to fold in just my middle finger (i.e., to make "the shocker" gesture), it often detects my ring finger as slightly curled too.

Gestures in general just might not work well, especially using the same gesture over and over. Have you played WoW? If you're a caster, you're casting a spell every 1-3 seconds, and targeting as a healer might not work well, not to mention seeing all the health bars.

Also, I don't know how it would work for melee abilities at all.

Sure, you could do an RPG with gestures and the like for spell casting, and make melee combat interesting, but it wouldn't be WoW.

> Fly over the map like some kind of a demigod and manipulate cities directly? Sign me up.

The problem is the UI. Graphical UIs in VR are very limited since motion controls are not nearly as precise as a mouse, so buttons have to be big. Cities: Skylines would certainly look cool in VR, but the actual gameplay would suffer.


> I was always very unimpressed with VR until I borrowed a headset to play Half Life Alyx.

The word "borrowed" stands out for me: I also tried a cutting edge VR headset loaned from a co-worker but did not then rush out and buy one. Because the experience felt like more of a "novelty", not a day-to-day thing that I would switch over to.


One of the problems is that competitive FPS gaming is probably not going to move away from mouse/keyboard inputs in the short term.

And sitting down, looking forward in a VR headset, and turning your character around with a mouse induces wild nausea for a number* of people.

*I’m unfamiliar with the exact ratio.


“The future” != “future of gaming”


Actually, it may be.


The future of gaming is already here; it's called "pay to win mobile multiplayer shooter".

Video games are only a way to get a quick dopamine rush, and that's the local optimum that gets you your rush quickest and easiest.

VR is not needed.


This is mostly true for supervised and unsupervised learning models, but for reinforcement learning the LSTM is king because of the convenient fact that it can be evaluated one time step at a time, instead of just outputting a whole sequence at once like a transformer. For things like robotic control, etc., attention-based models are pretty nonsensical.
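
A rough sketch of what I mean, using a PyTorch LSTM cell in a control loop (sizes are arbitrary):

    import torch
    import torch.nn as nn

    lstm = nn.LSTMCell(input_size=32, hidden_size=64)
    action_head = nn.Linear(64, 8)
    h = torch.zeros(1, 64)
    c = torch.zeros(1, 64)

    # One recurrent step per control tick: constant memory and compute,
    # with (h, c) summarizing everything observed so far.
    for t in range(1000):
        obs = torch.randn(1, 32)   # observation arriving at this time step
        h, c = lstm(obs, (h, c))
        action = action_head(h)

    # A transformer would instead have to store and re-attend over a window
    # of past observations, whose cost grows with the history length.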


Not true: a transformer can be used in models without any lookahead, for example the way it is used in GPT-2. The real difference is the complexity of the model and the large increase in computational cost.
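
For anyone curious, the "no lookahead" part is just a causal mask on the attention scores; a minimal single-head sketch (not GPT-2's actual implementation):

    import torch
    import torch.nn.functional as F

    def causal_self_attention(x, Wq, Wk, Wv):
        # Position t may only attend to positions <= t, so nothing in the
        # output depends on future inputs (GPT-2 style decoding).
        q, k, v = x @ Wq, x @ Wk, x @ Wv
        T, d = q.shape
        scores = (q @ k.T) / d ** 0.5
        future = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        scores = scores.masked_fill(future, float("-inf"))
        return F.softmax(scores, dim=-1) @ v

    T, d = 6, 16
    x = torch.randn(T, d)
    Wq, Wk, Wv = (torch.randn(d, d) for _ in range(3))
    out = causal_self_attention(x, Wq, Wk, Wv)
    # out[t] depends only on x[0..t], but computing it still touches the
    # whole history -- hence the extra computational cost per step.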


Python doesn't really have scope in that way: variables assigned inside an if statement remain accessible outside of the if statement.
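
A quick illustration:

    def f(flag):
        if flag:
            x = "assigned inside the if block"
        # No block scope: x is just a function-local name, so this works
        # as long as the branch actually ran.
        return x

    print(f(True))   # prints the string assigned inside the if
    print(f(False))  # raises UnboundLocalError: the assignment never happened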


Control-flow blocks aren't scoped in most popular programming languages.


Name one besides Python.


Wow. I have been looking for a good resource on implementing self-attention/transformers on my own for the last week - can't wait to read this through.


Unfortunately, it looks like KSP 2 will also be using Unity: https://forum.kerbalspaceprogram.com/index.php?/topic/187315...


Won't be a problem if they're using ECS and DOTS


What do those acronyms mean?


ECS = Entity Component System. DOTS = Data Oriented Tech Stack, which is the new implementation of ECS in Unity.

Current KSP is plagued by performance issues during advanced gameplay, particularly threading granularity issues with physics simulation when running large craft such as space stations or complex rocketry. There are some amazing KSP videos on YouTube with orbital construction on a grand scale, gigantic interplanetary vehicles, complex planetary bases and so on; their dirty secret being that the original gameplay is actually occurring at a painfully frustrating <10 fps and has been massively sped up for consumption.

GP is suggesting that using DOTS those issues can be relieved, by making more efficient use of available system resources.

refs:

https://en.wikipedia.org/wiki/Entity_component_system

https://unity.com/dots
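
As a very loose sketch of the data-oriented idea behind ECS (written in Python/NumPy purely for illustration; Unity's DOTS does this in C# with the job system and the Burst compiler): component data lives in contiguous arrays, and a "system" updates every entity in one pass instead of calling a per-object Update().

    import numpy as np

    N = 100_000                        # entities, e.g. parts of a large craft
    position = np.zeros((N, 3))        # Position component
    velocity = np.random.randn(N, 3)   # Velocity component

    def physics_system(dt, gravity=np.array([0.0, -9.81, 0.0])):
        # Same operation applied to all entities at once: cache-friendly
        # and easy to split across worker threads.
        velocity[:] += gravity * dt
        position[:] += velocity * dt

    physics_system(dt=0.02)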


ECS and DOTS aren't quite ready for prime time by Unity's own standards, so there aren't many games actually shipping with them; the API isn't stable yet.


As a game developer I was hoping to see them make the switch to UE4, but I guess they have too much already built in Unity for that to make sense.


I was hoping they’d switch to Nintendo Switch :-)


Interesting, lots of people say the exact same things about Portland. I would guess every booming city is facing this dilemma.


Yes, the locals just blame it on procedurally generated things which are totally hot-swappable with the things people in another city blame: "I'm mad at those people with money (but would buy here at market rate if I could afford it)."

Marginally fascinating.

The irony being that if you get to travel enough to notice this, you are practically already exempt and privileged in some way.


> theoldnet.com is currently unable to handle this request.

Not sure if intentional or hilarious irony.


I've been writing a C-based deep learning library for the past year or so, which I've recently been trying to clean up to be a bit more presentable. It was pretty heavily inspired by Darknet, although it admittedly has far fewer features: https://github.com/siekmanj/sieknet


Andrej Karpathy has a pretty good introductory article to RNNs here: http://karpathy.github.io/2015/05/21/rnn-effectiveness/

He has some code which is pretty easy to follow to go along with the article: https://gist.github.com/karpathy/d4dee566867f8291f086
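
If it helps, the forward pass in that gist boils down to the vanilla RNN recurrence, roughly (shapes simplified, not the exact code):

    import numpy as np

    hidden_size, vocab_size = 100, 65
    Wxh = np.random.randn(hidden_size, vocab_size) * 0.01   # input -> hidden
    Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden
    Why = np.random.randn(vocab_size, hidden_size) * 0.01   # hidden -> output
    bh, by = np.zeros((hidden_size, 1)), np.zeros((vocab_size, 1))

    def step(x, h):
        # One character in, scores for the next character out; h is the memory.
        h = np.tanh(Wxh @ x + Whh @ h + bh)
        y = Why @ h + by
        return y, h

    h = np.zeros((hidden_size, 1))
    x = np.zeros((vocab_size, 1)); x[10] = 1.0  # one-hot input character
    y, h = step(x, h)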

