Hacker News
Full-body tracking in VR using AprilTag markers (github.com/ju1ce)
117 points by podiki on Aug 20, 2021 | hide | past | favorite | 15 comments



YMMV. Having worked with AprilTags and ArUco markers, I can already see the limitations in the short video. Resilience is very good, but not under fast movements, and especially not under motion blur (I'm looking at you, cheap webcams); note the video only has quite slow movements. A decent marker size is also a hard requirement: around 10cm per meter of distance from the camera (with a 4K camera I got down to 2cm per meter with ArUco markers). Pose estimation is still a tricky problem and solving it is challenging at best. All of this is impressive in a very controlled environment, but can be disappointing in real-world environments (and with untrained users). I'm very curious how they'll solve all of these!

ps: oh and you need totally flat markers.
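For a rough sense of why the marker-size-per-distance figures above come about, here's a back-of-the-envelope pinhole-camera calculation. The field-of-view value and the "roughly 20+ pixels across the tag" rule of thumb are assumptions for illustration, not numbers from the project:

```python
import math

def marker_pixels(marker_m, distance_m, h_res_px, hfov_deg):
    """Pixels spanned by a flat square marker of side marker_m at
    distance_m, for an idealized pinhole camera with horizontal
    resolution h_res_px and horizontal field of view hfov_deg."""
    focal_px = (h_res_px / 2) / math.tan(math.radians(hfov_deg) / 2)
    return marker_m * focal_px / distance_m

# Assumed 70-degree HFOV webcam; detectors typically want very
# roughly 20+ px across the tag to decode it reliably.
px_1080p = marker_pixels(0.10, 1.0, 1920, 70)  # 10 cm tag, 1 m away
px_4k    = marker_pixels(0.02, 1.0, 3840, 70)  # 2 cm tag, 1 m away
print(round(px_1080p), round(px_4k))  # → 137 55
```

Both configurations land comfortably above a ~20 px decoding threshold at 1 m, but the pixel count falls linearly with distance, which is why small tags stop working a few meters out.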


Very cool. Has somebody already attempted to make clothing with a visual fiducial system like AprilTag? Like gloves, socks, t-shirts, etc? Could be a really comfortable way to enable full body tracking without a complicated setup.


I think the limitation would be the flat-plane requirement of AprilTag. Velcro bands + premade tags with plastic backing would be nice.


OptiTrack is marker-based. The Zozosuit could have been a "poor man's" alternative, but apparently not enough hackers got one. Nevertheless, there's always Kinect, and ML-based markerless approaches look interesting as well; I wonder which ones are popular these days.


This sounds very doable, given how many made-to-order internet manufacturing options are out there. And I agree: it would be quite useful to many.


This looks like it works great for 180-degree tracking (i.e. if you are facing the camera). Is there any way to make it work when you turn all the way around, e.g. by adding more AprilTag markers on the opposite side from the front ones?


In the video it kind of looks like the person has a marker on the side as well as the front, so it could potentially work. Maybe the side marker could be used to rotate the character.


Very surprised to see this repo linked on HN, when I just saw it over at the Steam Community forum for this very topic: https://steamcommunity.com/app/250820/discussions/0/30438576...


I think that's where I saw it! Was just browsing around some SteamVR stuff and thought that was a very cool project. Will be fun to try out.

And I was surprised it hadn't been on HN before, as far as I could see.


Given unlimited AprilTags, how would this approach compare to a depth camera and Xsens/Rokoko mocap suit?


* less robust high-speed tracking (2D barcode features often get destroyed by motion blur from consumer-grade cameras)

* slower tracking in general (specialty hardware such as depth cameras is often higher spec: Intel RealSense can do 90 fps, and some mocap systems run at 1000 Hz)

* worse accuracy and precision in general (though this depends on tag size: bigger tags are more precise/accurate)


I wonder if a more sophisticated software system could use the motion blur as a velocity estimate to improve tracking accuracy.
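A crude sketch of that idea (the linear-motion model, the function name, and all numbers here are assumptions for illustration, not anything from the project): if you can measure the length of a tag's blur streak in pixels and you know the exposure time, an average image-plane velocity falls out directly:

```python
def velocity_from_blur(blur_px, exposure_s, meters_per_px):
    """Estimate a marker's average speed during one exposure from the
    length of its motion-blur streak, assuming roughly linear motion
    and a known metric scale (meters per pixel at the marker's depth)."""
    return blur_px * meters_per_px / exposure_s

# Hypothetical numbers: a 12 px streak during a 10 ms exposure, where
# one pixel covers ~1.5 mm at the marker's distance.
v = velocity_from_blur(12, 0.010, 0.0015)
print(round(v, 2))  # → 1.8 (m/s)
```

The hard part in practice would be measuring the streak length and direction reliably once the tag's bit pattern is smeared; this only shows that the arithmetic itself is trivial if you can.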


The Luxonis OpenCV AI Kit (OAK) project team is also actively implementing AprilTags [1], and my understanding is that it will handle tag detection and pose estimation on the camera.

[1] https://github.com/luxonis/depthai-python/pull/298


This is sort of going backwards and then forwards in SteamVR. In "The Lab" you can explore (sort of) the secret Valve developer room, and it's full of similar markers used to let the device track 3D space.

This puts the markers on the person instead of the room, but maybe Valve played with that too - I don't know.

Cool stuff - I might mess with this next month if I can find some time.


How much latency does this add compared to Lighthouse trackers, for example?
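No numbers in the repo to answer this with, but the stages a webcam pipeline adds are easy to enumerate. A rough latency budget, with every stage time below being an assumed illustrative number (a hypothetical 30 fps webcam), not a measurement of this project:

```python
def pipeline_latency_ms(exposure_ms, frame_interval_ms, detect_ms,
                        pose_ms, transport_ms):
    """Sum the stages between a marker moving and the new pose reaching
    the VR runtime. All stage durations are caller-supplied assumptions."""
    return exposure_ms + frame_interval_ms + detect_ms + pose_ms + transport_ms

# Hypothetical 30 fps webcam: ~33 ms between frames, plus exposure,
# tag detection, pose solve, and local transport to the driver.
print(pipeline_latency_ms(10, 33, 15, 2, 1))  # → 61
```

Even with generous guesses this lands well above Lighthouse-class tracking, which updates at hundreds of hertz; the frame interval alone dominates.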



