Hey everyone, I am one of the authors of the paper described in this article. We used reinforcement learning and recurrent neural networks to learn a controller which can ascend and descend stairs without any vision-based perception, meaning that it must rely entirely on proprioception to walk. This is the first time (to our knowledge) that a human-sized bipedal robot has been able to climb real-world stairs blind to the world, and we're pretty excited about the results of this research.
I was always very unimpressed with VR until I borrowed a headset to play Half Life Alyx. That game convinced me VR is the inevitable future of gaming. The rest of the industry may not be there yet, but it is an obviously superior experience.
As a Valve Index owner myself, I will actually disagree with VR being "THE future".
Not all games work well in VR. For example, I would not want to play World of Warcraft in VR. Your character has so many abilities that you couldn't possibly map them all onto a VR controller, and even if you could, it'd be immersion-breaking.
Games played from a top-down view like RTS or colony/base/city builders, or games that might involve looking at a lot of text, would probably not work well either.
The Index controllers' individual finger tracking isn't good enough to hang a critical game function on. If I try to fold in just my middle finger (i.e., to make "the shocker" gesture), it often detects my ring finger as being slightly curled in as well.
Gestures in general just might not work well, especially repeating the same gesture over and over. Have you played WoW? If you're a caster, you're casting a spell every 1-3 seconds; targeting as a healer probably wouldn't work well either, not to mention keeping track of all the health bars.
Also, I don't know how it would work for melee abilities at all.
Sure, you could do an RPG with gestures and the like for spell casting, and make melee combat interesting, but it wouldn't be WoW.
> Fly over the map like some kind of a demigod and manipulate cities directly? Sign me up.
The problem is the UI. Graphical UIs in VR are very limited since motion controls are not nearly as precise as a mouse, so buttons have to be big. Cities: Skylines would certainly look cool in VR, but the actual gameplay would suffer.
> I was always very unimpressed with VR until I borrowed a headset to play Half Life Alyx.
The word "borrowed" stands out for me: I also tried a cutting edge VR headset loaned from a co-worker but did not then rush out and buy one. Because the experience felt like more of a "novelty", not a day-to-day thing that I would switch over to.
This is mostly true for supervised and unsupervised learning models, but for reinforcement learning the LSTM is king because of the convenient fact that it can be evaluated one time step at a time, instead of operating on a whole sequence at once like a transformer. For things like robotic control, attention-based models are pretty nonsensical.
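To make the "one time step at a time" point concrete, here's a minimal PyTorch sketch of a recurrent control policy that is stepped once per control tick and carries its hidden state between observations. This is my own toy example, not the paper's code; all names and dimensions are made up:

    import torch
    import torch.nn as nn

    class RecurrentPolicy(nn.Module):
        """Toy recurrent policy evaluated one timestep at a time."""
        def __init__(self, obs_dim=42, act_dim=10, hidden_dim=128):
            super().__init__()
            self.cell = nn.LSTMCell(obs_dim, hidden_dim)
            self.head = nn.Linear(hidden_dim, act_dim)

        def step(self, obs, state=None):
            # obs: (batch, obs_dim); state: (h, c) from the previous tick, or None at reset
            h, c = self.cell(obs, state)
            return torch.tanh(self.head(h)), (h, c)

    # Control loop: only the current observation is needed at each tick.
    policy = RecurrentPolicy()
    state = None
    for t in range(1000):
        obs = torch.randn(1, 42)            # stand-in for proprioceptive sensors
        action, state = policy.step(obs, state)

The hidden state acts as a fixed-size memory, so the per-step cost stays constant no matter how long the episode runs.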
Not true: a transformer can be used in models without any lookahead, for example the way it is used in GPT-2. The real difference is the complexity of the model and the large increase in computational cost.
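For the curious, the "no lookahead" part is just a causal mask on the attention scores, so position t can only attend to positions 0..t. A rough single-head sketch (my own toy code, not GPT-2's actual implementation):

    import torch
    import torch.nn.functional as F

    def causal_self_attention(x, w_q, w_k, w_v):
        """x: (seq_len, d_model). Toy single-head attention with a causal mask."""
        q, k, v = x @ w_q, x @ w_k, x @ w_v
        scores = (q @ k.T) / (k.shape[-1] ** 0.5)
        # Mask everything above the diagonal to -inf: no attending to future steps.
        mask = torch.triu(torch.ones_like(scores), diagonal=1).bool()
        scores = scores.masked_fill(mask, float('-inf'))
        return F.softmax(scores, dim=-1) @ v

    d = 16
    x = torch.randn(8, d)                    # 8 timesteps of d-dimensional features
    out = causal_self_attention(x, torch.randn(d, d), torch.randn(d, d), torch.randn(d, d))

The computational-cost point is that at inference you still attend over (or cache keys/values for) the whole history, whereas an LSTM carries a fixed-size state from step to step.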
Wow. I have been looking for a good resource on implementing self-attention/transformers on my own for the last week - can't wait to read this through.
ECS = Entity Component System. DOTS = Data Oriented Tech Stack, which is the new implementation of ECS in Unity.
Current KSP is plagued by performance issues during advanced gameplay, particularly threading-granularity issues in the physics simulation when running large craft such as space stations or complex rockets. There are some amazing KSP videos on YouTube with orbital construction on a grand scale, gigantic interplanetary vehicles, complex planetary bases and so on; their dirty secret is that the original gameplay actually ran at a painfully frustrating <10 fps and was massively sped up for consumption.
GP is suggesting that those issues could be relieved by using DOTS to make more efficient use of available system resources.
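Very roughly, the data-oriented idea behind ECS/DOTS is to store each component in a contiguous array and update it in bulk, instead of chasing pointers through thousands of individual part objects, which keeps the data cache-friendly and easy to split across threads. A hand-wavy NumPy sketch of the layout idea (nothing like Unity's actual DOTS API, just an illustration):

    import numpy as np

    # "Struct of arrays": one contiguous array per component, indexed by entity id.
    N = 100_000                               # e.g. parts on a very large craft
    position = np.zeros((N, 3))
    velocity = np.random.randn(N, 3)

    def physics_step(dt, gravity=np.array([0.0, -9.81, 0.0])):
        # One vectorized pass over tightly packed data; trivially chunkable across threads.
        velocity[:] += gravity * dt
        position[:] += velocity * dt

    for _ in range(60):
        physics_step(1.0 / 60.0)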
ECS and DOTS aren't quite ready for prime time by Unity's own standards, so there aren't many games actually shipping on them yet; the API isn't stable.
Yes, the locals just blame it on procedurally generated scapegoats, which are totally hot-swappable with the things people in another city blame: "I'm mad at those people with money (but I'd buy here at market rate if I could afford it)."
Marginally fascinating.
The irony is that if you get to travel enough to notice this, you're practically already exempt and privileged in some way.
I've been writing a C-based deep learning library for the past year or so, which I've recently been trying to clean up to be a bit more presentable. It was pretty heavily inspired by Darknet, although it admittedly has far fewer features: https://github.com/siekmanj/sieknet
Here's a link to the Arxiv submission: https://arxiv.org/abs/2105.08328
And here's the accompanying video: https://youtu.be/MPhEmC6b6XU
And an uninterrupted five-minute video of a test on an outdoor staircase: https://youtu.be/nuhHiKEtaZQ
Happy to answer any questions the HN crowd may have!