
Also once you fix a software bug, it’s fixed forever (generally). Humans introduce all sorts of fun variability every time the program is run.



Expecting mechanical accuracy from humans is a fool's errand.

ATC should be wearing VR goggles, visualizing approach and takeoff routing as planned versus as actually flown, with machines spotting the dangers in a manner similar to, but distinct from, TCAS.
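For what it's worth, the "machines spotting the dangers" part is conceptually simple at its core: TCAS-style threat detection boils down to projecting each pair of trajectories forward and checking separation at the closest point of approach. A toy sketch of that idea (made-up positions, units, and alert threshold; nothing like real TCAS logic, which also handles altitude, maneuvering, and coordinated advisories):

```python
# Toy closest-point-of-approach (CPA) conflict check for two aircraft
# assumed to fly at constant velocity. Positions in nautical miles,
# velocities in knots, so time is in hours. Illustrative only.

def time_of_cpa(p1, v1, p2, v2):
    """Time (>= 0) at which the two aircraft are closest."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    dv2 = dvx * dvx + dvy * dvy
    if dv2 == 0:                 # same velocity: separation never changes
        return 0.0
    # Minimize |relative position + t * relative velocity|, clamped to the future.
    return max(0.0, -(dx * dvx + dy * dvy) / dv2)

def cpa_distance(p1, v1, p2, v2):
    """Horizontal separation (nm) at the closest point of approach."""
    t = time_of_cpa(p1, v1, p2, v2)
    x1, y1 = p1[0] + v1[0] * t, p1[1] + v1[1] * t
    x2, y2 = p2[0] + v2[0] * t, p2[1] + v2[1] * t
    return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5

# Two aircraft converging nearly head-on, slightly offset laterally:
d = cpa_distance((0, 0), (400, 0), (10, 0.5), (-400, 0))
alert = d < 3.0  # hypothetical 3 nm alert threshold
```

Real systems also have to cope with the noisy, delayed position data the parent comments mention, which is exactly why the input quality matters more than the geometry.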


I interned at a VR lab at NASA Ames in the late 1990s. This very idea (ATC operations in low-vis conditions using VR or AR) was what fed their grant proposals. It has always been 20 years away; some of the things I learned:

1) VR itself can lead to spatial disorientation and will introduce its own control issues.

2) A significant percentage of people (~1 in 4) cannot use VR without motion sickness. This is independent of #1. Modern VR (Oculus etc.) at first claimed to be better, but guess what, plenty of people still get sick. Sinus congestion can cause this even in the tolerant.

3) Position reporting of planes today is nowhere near accurate, reliable, or real-time enough to present a whole picture of runway ops. This is fixable with enough $$$...but who pays?

4) I suspect "VR ops" procedures from the FAA would take years to be developed and approved, without some kind of urgent mandate.

My gut feeling is that we'll have automatic ground traffic control at major airports by the time the necessary systems are in place, and skip the goggled humans entirely.


And the VR goggles help them read pilots' minds about how fast they'll move their planes around on the ground, and when?

"Move fast and break things" has no place in aerospace or aviation. Not just rolling out whatever fancy new tech exists is deliberate, for good reasons, and that caution is what made aviation as safe as it is today.




