Flying a Drone in First Person View Using the Oculus Rift (laughingsquid.com)
81 points by radley on July 25, 2013 | 34 comments



This is cool, but the Oculus is not the optimal tool for the job right now.

I've been in the FPV hobby for over half a year and fly quadcopters and model airplanes using video goggles. The Rift currently has one important problem that, until overcome, will prevent it from gaining traction in this area: latency. It doesn't have an analog video input, and having to digitize and process the video for the Rift introduces a delay into the signal that makes controlling your model uncomfortable at best and impossible in many cases.

Because of this problem, pretty much everybody uses analog video for FPV these days. There are goggles on the market made specifically for the job: the FatShark series, the Zeiss Cinemizer HD and others. Some of them are HD and 3D capable (the Cinemizers), many are fitted with built-in motion trackers (e.g. the FatShark Attitude SD) that are easily bound to servos on the model controlling camera movement (pan-tilt setups), and external motion tracker modules are available for the goggles that don't come with one. With no delay, flying even very fast aircraft is actually much easier in FPV than with traditional line-of-sight control. The latency problem is worse than it might sound: the GoPro cameras, for example, introduce a very small delay on their analog outputs (<50ms), and even this is clearly noticeable to an experienced pilot in FPV flight and considerably affects control; the best latency currently achieved with various FPV setups using the Oculus Rift is in the 200ms region.
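To put some rough numbers on where the time goes (assumed, purely illustrative figures - only the GoPro and Rift delays above are measured):

    # Rough, assumed stage delays (ms) - only the camera and Rift figures come
    # from the numbers quoted above; the rest are illustrative guesses.
    analog_link = {
        "camera (GoPro analog out)": 40,
        "analog video TX/RX": 1,        # effectively instant
        "goggle display": 5,
    }
    digital_rift_link = {
        "camera": 40,
        "capture + encode": 60,         # digitizing and compressing the frame
        "wireless TX/RX": 20,           # packetized digital link
        "decode + render": 60,          # decompress and warp for the Rift optics
        "Rift display": 16,             # the Rift's own stated latency
    }

    for name, stages in (("analog", analog_link), ("digital/Rift", digital_rift_link)):
        print(f"{name}: ~{sum(stages.values())} ms end to end")

The exact split doesn't matter; the point is that every digital stage adds up, which is how you end up in the ~200ms region while an analog chain stays well under 50ms.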

Another advantage of using "traditional" motion trackers over the Oculus Rift is that they output a PWM signal, ready to be fed into your R/C transmitter and then used on the model to drive the servos that move the camera.
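For illustration, the mapping is trivial - a pan/tilt angle becomes a standard 1000-2000µs servo pulse (the angle range and function name below are assumptions, not from any particular tracker):

    # Sketch: map a head-tracker pan angle onto a standard R/C servo pulse.
    # Hobby servos expect a pulse of ~1000us (one end stop) to ~2000us (the
    # other), centered at 1500us, repeated every ~20ms. The +/-60 degree
    # tracker range is an assumption.
    def angle_to_pulse_us(angle_deg, angle_range_deg=60.0, min_us=1000, max_us=2000):
        # Clamp so a fast head turn can't command the servo past its end stops.
        angle_deg = max(-angle_range_deg, min(angle_range_deg, angle_deg))
        frac = (angle_deg + angle_range_deg) / (2 * angle_range_deg)
        return int(min_us + frac * (max_us - min_us))

    print(angle_to_pulse_us(0))    # 1500us - camera centered
    print(angle_to_pulse_us(30))   # 1750us - panned halfway to the stop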

Now, I understand that HD and 3D are eventually the future of this hobby, and the Oculus Rift might very well at some point become a very viable goggles-and-motion-tracker combo. But until the latency issue is solved, it's not going to gain a wide audience.


They acknowledge a latency of about 120ms. I can see this being a problem with smaller drones, but high latency hasn't stopped the military from remotely flying drones in Afghanistan from facilities in Nevada.


It hasn't stopped them, because nothing stops them. Dead children don't stop them, so why would latency?

The latency on the military drones is actually far greater, due to the latency of the satellite connections. The military drones are controlled via radio on takeoff and landing, but switch to satellite control for the mission. The lag does indeed affect control - US military drones crash all the time. [http://www.zerohedge.com/news/us-drones-dropping-flies]

None of this matters when you can just get the Fed to print more USD, buy US Treasury bonds with them, and use that money to buy more drones.


This is because the UAVs have a high degree of autonomy - the technology is probably closer to the Mars probes than to RC drones.


If you use an autopilot to assist you in flying the model, a delay is not a big problem. However, most people fly models in full manual, with the FPV setup giving them an "in-cockpit" experience. Even for fairly slow multicopters, a latency of >100ms is really bad for anything except slow hovering over open terrain (which gets incredibly boring very quickly).


The stated latency for the Rift was ~16ms--the other 100ms or so came from the rest of the pipeline.

If that could be brought down - say, by an FPGA or a Parallella or something - maybe it wouldn't be so bad.


Isn't a 'delay' inherent to digital encoding? I mean, even if the encoding could be done virtually instantly, I seem to recall most of these methods encode differences between frames, including some frames ahead of the current one. Since we can't look into the future, you necessarily have at least a few frames of delay? Unless there are some efficient digital encoding algorithms that don't need this?
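Back-of-the-envelope (assumed numbers, just to illustrate the point): an encoder that references even a couple of future frames has to buffer them before it can emit the current one, so the floor on added delay is roughly the look-ahead depth times the frame period.

    # Illustrative only: minimum extra delay from an encoder that buffers
    # "lookahead" future frames before it can output the current one.
    def min_encode_delay_ms(fps, lookahead_frames):
        return lookahead_frames * 1000.0 / fps

    # At 30 fps, buffering just 2 future frames already adds ~67ms, before
    # any time spent on capture, transmission, decoding or display.
    print(min_encode_delay_ms(30, 2))   # ~66.7
    # A low-latency mode avoids this by using no look-ahead at all.
    print(min_encode_delay_ms(30, 0))   # 0.0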


I used to fly RC airplanes back when I was in high school. I've been out of the loop for 20 years now... are there good sites to re-engage the RC world and/or get into FPV?


I recently got into the hobby and found that YouTube was honestly the best source of information. There's a channel called RC Model Reviews that's particularly good. It's run by a guy named Bruce Simpson who's a bit of a crazy old Kiwi, but he has a ton of really solid explanations of nearly everything you need to know when starting out.

One cool thing is that the past couple of years have seen some serious disruption in the field. Open source firmwares for controllers are a thing now, and they've had nearly the same effect on RC that the WRT54G had on the consumer router business - you can now do things with very cheap hardware that used to cost hundreds or thousands of dollars.

Some links for you:

ER9X Firmware - https://code.google.com/p/er9x/

FrSky, a company doing a lot of interesting things in terms of radio equipment - http://www.frsky-rc.com/

The aforementioned RC Model Reviews - http://www.youtube.com/user/RCModelReviews


Re-engage the RC world:

http://rcgroups.com/
http://flitetest.com/

RCGroups - endless gold for enthusiasts, hobbyists, and mad scientists alike. I suggest you get right into the forums in whatever subject interests you, and start looking for the build threads. This site alone contains enough information to get you seriously hooked on RC again.

FliteTest - fun, easy-to-watch videos on getting into the hobby. These guys also produce easy-to-build planes that will give you all the basics - my 6-year-old and I recently got the FT Basics kit (nutball, delta, flyer) along with an FT Spitfire 2-pack, and we're having the best time of our lives flying a nice variety of cheap, easy-to-build/repair planes. We'll probably go FPV in a little while.


http://fpvlab.com is a good forum for all things FPV.


This is my favorite RC/FPV YouTube channel: http://www.youtube.com/user/flitetest?feature=watch


Oculus is perfect for this.

I was really hoping the cameras would be mounted on a 3-axis gimbal, with each axis controlled by the Oculus motion sensing.

The Oculus is a great option for first-person flying. Even used as a regular 2D display, $300 is pretty cheap compared to most head-mounted displays.


The latency of physically moving cameras with head tracking would be horrendous. Perhaps if you also used wide-angle/fisheye lenses and somehow synchronized the virtual panning with physical panning it would be workable, as long as you don't move your head too quickly.


FPV on RC planes with head tracking+servo controlled cameras is already pretty common in the hobby.


The control doesn't have to be 100% absolute. You can have the video 'projected' within a 3D environment and adjust its position within that environment based on the camera's physical movements, but let the 1:1 tracking from the headset overshoot the boundaries so the user doesn't get too jarring an experience. It would feel as if they were in a cockpit.


Trying that approach will probably result in people throwing up. In an immersive environment even short latencies are extremely disorienting. When I first started recording audio for film, the industry-standard recorders used Digital Audio Tape and there was a switch to monitor either the live feed from the preamps or the recorded feed from the tape (so you could be sure you were recording - you'd be surprised how easy it is to mess this up over the course of a 12-hour workday).

The latency was small, on the order of 10-12 milliseconds, but it took me months to overcome the weirdness of opening your mouth to speak but not hearing anything until ~1/100th of a second later. I shudder to think what it would be like to have this in your visual cortex but with a longer delay and also while your eyes are telling you that you're floating above the ground.


I think you're right. Latency is very disorientating.

Here's how I would do it: The craft has two wide angle cameras. The Oculus viewer is at the center of a sphere in 3D space. Project the video feed onto the inside of the sphere, and keep the position of the feed anchored to where the camera is currently positioned in relation to the craft.

Moving your head moves the frustum immediately, and the camera position lags behind, so if you moved too quickly (say, turned 180 degrees) the frustum moves right away, but the video only comes into view after the cameras are repositioned. A numbered checkerboard or other pattern could be shown wherever the video feed is not projected, so spatial disorientation is minimized.

That way you get the cockpit feel and immediate movement in the Oculus, but avoid the uneasiness that latency would give with a more simplistic approach.
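A minimal sketch of the decoupling (hypothetical names, assumed servo speed - the point is just that the view angle follows the head instantly while the video patch is drawn at whatever angle the gimbal has actually reached):

    # Tiny simulation of the idea: the virtual view follows the head with zero
    # added latency, while the gimbal chases it at a limited slew rate and the
    # video is projected onto the sphere at the gimbal's (lagging) angle.
    GIMBAL_SLEW_DEG_PER_S = 180.0   # assumed servo speed
    FRAME_DT = 1.0 / 60.0           # render loop at 60 Hz

    def step(head_yaw, gimbal_yaw):
        view_yaw = head_yaw  # frustum updates immediately
        # Gimbal moves toward the head angle as fast as the servos allow.
        error = head_yaw - gimbal_yaw
        max_step = GIMBAL_SLEW_DEG_PER_S * FRAME_DT
        gimbal_yaw += max(-max_step, min(max_step, error))
        # The video frame is projected at gimbal_yaw; everything else the
        # frustum sees shows the static reference pattern.
        return view_yaw, gimbal_yaw

    head_yaw, gimbal_yaw = 90.0, 0.0   # user snaps their head 90 degrees right
    for frame in range(60):            # one second of rendering
        view_yaw, gimbal_yaw = step(head_yaw, gimbal_yaw)
    # The view jumps to 90 on the very first frame; the gimbal catches up
    # over roughly half a second, and the video patch slides with it.
    print(view_yaw, round(gimbal_yaw, 1))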

Another option would be to use 360 degree cameras to fill up the entire sphere, and then use the binocular cameras to provide high detail.


This is basically what I was hinting at, but you described it a lot more clearly than I did: a virtual 3D environment in which the player can 'be' and view the video feed through a virtual 'window'.


Interesting approaches - don't have an OR or a high-powered drone, but that's intriguing enough to do a duct tape test at home with existing gear. Thanks for the ideas.


So, kind of like ghetto dual-paraboloid mapping?


Yes, exactly.


It's really interesting to introduce a longer delay (100 ms?) and try to speak while monitoring your delayed voice. At least for me it comes out completely garbled, like I'm having a stroke or something.


This is really cool. A few buddies and I got this setup working with an AR Drone and a Rift a few months back. I'll try to get videos up somewhere soon. If you are into this kind of stuff and in the Bay Area, check out this meetup: http://www.meetup.com/SF-Drones-Startup-Meetup

We had the people from Velodyne, who make the 3D lidar for Google's self-driving cars, present last night, and they demoed their newer 1kg unit.


This really doesn't seem impressive. Why the heck do they even make the copter fly around with a WHOLE GODDAMN LAPTOP? This is about as wasteful as you can possibly get…

Something better that I've already seen is a gimbal[1] (which you can build for <100€) with one or two (for 3D) GoPro cams. It weighs less and is way more awesome.

[1] http://s3files.core77.com/blog/images/2013/04/freefly-system...

Addendum:

This is the sort of steadiness you get with a gimbal btw: https://www.youtube.com/watch?v=NmSFZydbVnc


People have been flying RC planes with FPV headsets for a while, but hopefully the Rift will be a cheaper and better quality option. Also, this makes it easier to build a rotating camera using the Rift's integrated IMU, for that truly immersive experience!


I'll second this. The IMU (and Rift SDK) is really, really good. I can't wait to get my hands on the HD version of the Rift.


Looks like fun. A shame the left/right views aren't on opposite sides; then one could view it cross-eyed to get the 3D effect without glasses.

I wonder if looking through the Oculus would make you nauseous?


For those of us with 3D stereographic displays or TVs, does anybody know if there's an MPO file of this test?


Does anyone know which RC drone they are using for this?


I think this is the drone they are flying with... http://blackarmoreddrone.com/store/ Probably just a touch out of the hobby market price range with a $49k base price. I think that's US dollars, but I'm not 100% sure.


They're based in Sweden, I believe, so even if it were in kronor it would still be around $7,500 USD, which is still pretty pricey. Though based on their website (which doesn't appear to use geolocation by IP), I think it's USD as well.


What is the point of this?

The cool thing about the Rift is how modern components and graphics capabilities allow a light VR device to provide a very immersive experience.

This is basically being used to show video from the drone's camera. No stereoscopic image, no controlling using the gyro/accelerometers. Also, low latency - one of the most important aspects of what makes the Rift experience interesting - certainly wasn't part of this experiment.

There's no point in posting a video showing new/trendy devices being used in such a meaningless way, just for the sake of combining cool toys.


> No stereoscopic image

Did you even click the link? Of the three sentences, the second one starts out with "Using its two cameras", and the two cameras are shown within the first 10 seconds of video.

>What is the point of this?

The stereo view allows for depth perception in flight. They're getting fisheye lenses, and one thing that's hard to express about VR helmets like this one is the sense of space around you, even without stereo vision. This will create a feeling of flying that no monitor could ever hope to achieve.





