Vision-Correcting Displays [video] (youtube.com)
172 points by pioul on Aug 2, 2014 | hide | past | favorite | 48 comments



If I understand this correctly, this is a light-field display.

The implications are bigger than vision correction - LF display can reconstruct actual 3d images, as opposed to the stereo images being marketed as "3d" today. Stereo displays give two different pictures to two different eyes, but they don't provide perspective shift (the picture doesn't change when you move your head left and right), and they don't provide different focal planes (your eyes focus on the screen plane regardless of how far the object is supposed to be, creating a dissonance between the distance inferred from the angle between the eyes and the focusing distance).


Yup, I wonder if this technology could be used to improve head-mounted displays, to be able to focus on different parts of the virtual scene, not just being always focused on infinity (probably together with high-precision eye-tracking to figure out depth where you try to look at).


It can and will in relatively short order: https://www.youtube.com/watch?v=deI1IzbveEQ

Douglas Lanman, the researcher behind this technology at Nvidia, was hired a few months ago by Oculus VR.


Magic Leap got $50M first round funding to do just that:

http://www.nytimes.com/2014/07/15/science/taking-real-life-s...


Don't get this, what's the point in 1 device being corrected while the rest of the real world is blurry?

One use case example was a guy in a car with GPS navigation. So he can then see the GPS nav but how does he drive if he can't see properly!?


If someone's far-sighted, as many people above the age of forty are, they might be able to see the street perfectly well, but have trouble reading a GPS screen that's only forty cm away from their face.

And speaking as one of the people whose vision defect can't be corrected by glasses, but only by uncomfortable contact lenses, there'd be immense value to me in a computer I can use without lenses - especially since reading on a screen is basically the only thing I need the damn things for while I'm at home. For someone whose work is mostly computer-based, you might be able to get away with taking your lenses out for the entire day and only wear them for the drive to work and back.


> they might be able to see the street perfectly well, but have trouble reading a GPS screen that's only forty cm away from their face

So they would also not be able to see their speedometer, or their fuel gauge...

This is a technological solution looking for problems that are far more conveniently solved by conventional means.


A GPS screen is not a simple dial. Reading small text on a computer you interact with in your car is quite different from determining the orientation of a glowing pointer.


There's nothing convenient about corrective lenses, especially if you are far-sighted or need bifocals and are driving. This is also why the portion of the video that mentioned the GPS also mentioned the speedometer.

However, until it is universally installed in cars and easily adjusted for your personal needs when moving from car to car, I doubt many people will be throwing away their bi/tri-focals in favor of lenses for distance only.


By conventional means, do you mean reading glasses? Observing people needing/using them, to me it seems like a hassle and definitely not a solution that most are satisfied with.

The video mentions both the GPS and speedometer as potential targets for this technology. Giving it some thought and being close to the age where I might benefit from this, it isn't such a bad idea after all.


Obviously I can't speak for other bespectacled folk, but I wouldn't call them a huge hassle. I put them on in the morning, and take them off at night. Once in a while I clean them. That's about the extent of my interaction with them.

(Although I'm short-sighted, so the arguments above don't apply to me anyway.)


While I've seen my dad struggle with reading glasses (and now bifocals) after not needing to correct his vision for the first 40+ years of his life, I've had mostly the same experience you describe for the last 20 years (thanks to inheriting my vision problems from my mom's side). However, in the last couple of years I've started taking my glasses off if I am reading for long periods of time, because my distance vision is getting bad enough that the correction is making the closer text slightly more difficult to read than it is without the lenses. Eventually, this might also lead to needing bifocals myself, despite the fact that I have no problems reading very small text within arms-length without glasses.

Essentially, I'm getting close to doing the opposite of what most people do with reading glasses. For now, I do most minor reading tasks and my work with my glasses on, but for lengthy reading I take them off. Over time, I'm sure, I'll end up taking them off (or looking below my glasses) for almost every reading task, and eventually for work as well.

In the end, though, I don't think the technology will be likely to really solve the issue for me, except to correct a few displays for my corrected vision. I have to pin most of my hopes, at the moment, on improvements in surgery and eventually being able to afford the surgery.


You're in for some future fun, then. I'm "Mr. Magoo" myopic, but in my late 40s the presbyopia fairy dropped in for an extended visit (and still hasn't indicated any desire to go home). So I have different needs for distance vision, for "conversational distance", for monitor distance and for the reading of tiny things, and there are gaps into which things may fall as well. Depending upon which field of range you're talking about (and which eye) I require correction somewhere between +3 and -6 dioptres. It is definitely a hassle, and quite unlike my carefree four-eyed youth.


Same for me with regular glasses, but reading glasses for far-sightedness are another issue, as you can't put them on temporarily while driving.


This would be great for computer work.

Many people have vision problems where the eye can't easily focus on a variety of distances. So if you have glasses, they're set up to make focusing on one distance easy, but things closer or farther (if that distance is not "infinity") are more difficult. I have progressive bifocals to get some amount of varying correction with distance, but this is thrown off for computers since they take up more of your field of vision than a book. (I also find looking at my phone through the bottom of my glasses to be pretty awkward, even though I've had bifocals since elementary school.)

One solution is an extra pair of glasses for computer work, but this is annoying because you can't get up and go to the bathroom without changing glasses. If the computer monitor were set to add the extra correction between my normal prescription and the intermediate distance prescription, my life would be much better. (Same for my phone.)

Right now, I can pretty much focus without using anything more than my distance prescription, but as I get older, this will become more and more difficult. So I'm pretty excited about this; less eyestrain is always good.

Also, the ability to tweak the correction in software is great. Where I have my monitor isn't quite what the optometrist was expecting when I had my computer glasses made, so I have to change positions to use them. With control in software, I could just adjust some config file somewhere when I rearrange my desk.


You'll find out when you hit your forties and you're trying to decide whether you need glasses or just slightly longer arms.


Charity school with 25 tablets, shared among 500 poverty-stricken children in a third world country. If the correction can be adjusted in software and this can be made cheaply, it's a no-brainer application. The video indicates that it can indeed be made cheaply and adjusted in software.

Sounds much simpler than trying to ensure a continuous supply of glasses for 500 energetic kids.


Not necessarily, see this guy for example: https://www.ted.com/talks/josh_silver_demos_adjustable_liqui...

Although I'm skeptical that third world countries will have access to high tech vision correcting tablets but not eyeglasses.


I've been nearsighted for a long time and wear glasses. Recently I developed presbyopia, and now cellphone screens look blurry unless I remove my glasses. Most of the time my vision with glasses is fine, until I want to look at my phone. A screen that looked sharp to my ancient eyes with my glasses would be wonderful. First world solutions!


If you're nearsighted, maybe you could use this in combination with glasses to project near displays to look like they're farther away. This would eliminate the need to focus on near objects when you're reading or computing.


There are already tablets for the kitchen, for blind people, and so on. I think the "tablets for old people" niche could be a pretty big one, and all of them could feature such displays.


Yes, I was also questioning the benefits of a less blurry GPS if I am about to plow into oncoming traffic.




One can imagine a pair of normal eyeglasses which have an IMU or accelerometer of some sort in them. When they sense they are being taken off, the user's phone switches profiles, the screen blurs, and the user moves seamlessly from glasses to phone without ever realizing what took place.

Would be a neat touch. I like connected appliances that aren't. (if that makes sense)


I've always thought it would be interesting to correct vision at the brain/neural level rather than the physical level. Can anyone comment on whether this would be possible?


This is indeed an interesting question. Why can't the brain develop a reverse blur function? We can do this with algorithms (http://en.wikipedia.org/wiki/Adaptive_optics). There was a fad (I guess) once about correcting your vision with practice (http://en.wikipedia.org/wiki/Bates_method).

I guess this is a limitation of the plasticity of our brains. People who have hearing or vision or other losses as young children adapt faster and better than people who have these losses when older.

It would be interesting to understand if young children with vision deficits can learn to see better with time.

My personal recollection is that I had NO idea things were blurry until my first visit to the optometrist when they put glasses on me. Things were so SHARP!

I think the core of this is that our adaptability depends on sensori-motor loops. We can calibrate our responses for faulty sensors, but we don't correct just for the sake of correctness.


> Why can't the brain develop a reverse blur function? We can do this with algorithms

Adaptive optics requires more than just algorithms -- the algorithms' output is fed into a rapidly moving reflective surface to de-blur the incoming light [1] -- so this is a bad analogy. You might be thinking of some of the deconvolution algorithms [2] that the Hubble Space Telescope used before its "eyeglasses" were installed in 1993 to improve its flawed images.

[1] "Adaptive optics works by measuring the distortions in a wavefront and compensating for them with a device that corrects those errors such as a deformable mirror or a liquid crystal array." https://en.wikipedia.org/wiki/Adaptive_optics

[2] "The error was well characterized and stable, enabling astronomers to optimize the results obtained using sophisticated image processing techniques such as deconvolution." https://en.wikipedia.org/wiki/Hubble_Space_Telescope#Flawed_...


Excerpt from http://www.inference.phy.cam.ac.uk/itprnn/book.pdf (page 564):

"Deconvolution in humans

A huge fraction of our brain is devoted to vision. One of the neglected features of our visual system is that the raw image falling on the retina is severely blurred: while most people can see with a resolution of about 1 arcminute (one sixtieth of a degree) under any daylight conditions, bright or dim, the image on our retina is blurred through a point spread function of width as large as 5 arcminutes (Wald and Griffin, 1947; Howarth and Bradley, 1986). It is amazing that we are able to resolve pixels that are twenty-five times smaller in area than the blob produced on our retina by any point source. Isaac Newton was aware of this conundrum. It's hard to make a lens that does not have chromatic aberration, and our cornea and lens, like a lens made of ordinary glass, refract blue light more strongly than red. Typically our eyes focus correctly for the middle of the visible spectrum (green), so if..."

I recommend the read for those interested. There's even an experiment you can do yourself to experience your own eyes' limitations!


If the input (vision) is not corrected before it enters the eye then anything done afterwards, like at the neural level, would likely be similar to image post-processing filters in image editors.

By no means an expert in this area, but I would think the best that could be done with this type of post-processing would likely amount to applying a highly specialized "Find Edges"/"Sharpen" filter in Photoshop to a blurred photo (except at a really high resolution and at > 60 frames per second).

An interesting possibility with that kind of neural-level post-processing could be an on-demand digital-zoom effect so you could do a 2X-32X zoom on a faraway road sign or to "zoom in" to something really close at a macroscopic level (for surgeons/jewelers).
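As a rough sketch of what such a "Sharpen" filter does (pure illustration on a 1-D brightness signal with made-up numbers, not a model of actual neural processing), unsharp masking adds back the detail that a blur removed:

```python
# Unsharp masking: sharpened = original + amount * (original - blurred).
# Toy 1-D example; real filters work on 2-D images.

def box_blur(signal, radius=1):
    """Simple moving-average blur (a spatial low-pass filter)."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, amount=1.0, radius=1):
    """Sharpen by adding back the detail the blur removed."""
    blurred = box_blur(signal, radius)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]   # a soft step edge
sharp = unsharp_mask(edge, amount=1.0)
# The filter exaggerates the step (overshoot on both sides of the edge),
# which is perceived as increased sharpness.
```

Note that, as the comments above point out, this only exaggerates edges that survived the blur; it can't recover detail that was lost entirely.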


I think this happens to an extent. The poor focus of bad eyesight is effectively the same as a spatial low-pass filter, and based on my own perceptions, I think the visual processing chain applies sharpening or spatial equalization to try to compensate. However, it's not enough; it can't actually restore detail that was never provided to the rods and cones in the first place.


You lose information when an image is not in focus - and you can't recover that information. You can only guess what the "real" image might have been, with some (hard-coded) heuristics and statistical methods of inference. Then again, this is already done by your brain.
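A toy illustration of that information loss, with a two-tap averaging blur standing in for defocus (everything here is made up for illustration): two different "scenes" can blur to exactly the same image, so no post-processing, neural or digital, can tell them apart.

```python
# Two distinct signals that become identical after blurring,
# demonstrating that defocus blur is not invertible in general.

def blur2(signal):
    """Average each adjacent pair (a crude 1-D defocus model)."""
    return [(a + b) / 2 for a, b in zip(signal, signal[1:])]

scene_a = [1.0, 0.0, 1.0, 0.0]
scene_b = [0.0, 1.0, 0.0, 1.0]

print(blur2(scene_a))  # [0.5, 0.5, 0.5]
print(blur2(scene_b))  # [0.5, 0.5, 0.5] -- identical
```

The averaging kernel zeroes out the highest spatial frequency, so any detail living at that frequency is gone for good; the best a "reverse blur" can do is guess.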



Link to the paper and supplemental information:

http://web.media.mit.edu/~gordonw/VisionCorrectingDisplay/


I'd love for this to become common, since I have worn glasses for near sightedness since childhood. However, the biggest problem I see with this is that it is customized on a per user basis.

So, while I can use my phone just fine, you can't, because it's calibrated to correct my vision deficiency. I guess it can be used on something extremely personal, like a phone, but I don't see it becoming mainstream for most displays, like a shared tablet or a computer or something.


It's an interesting idea, but even for the narrow use case of looking at a display, it doesn't replace glasses/lenses for everyone. This kind of display can only "fix" a defect that affects both eyes in equal measure. If the distortion has to be corrected differently for each eye, then individual-eye-level correction is needed.


Can this be done at the software level? I.e. feature built in to OS that modifies displayed image in the same way this screen does.


> In the researchers’ prototype, however, display pixels do have to be masked from the parts of the pupil for which they’re not intended. That requires that a transparency patterned with an array of pinholes be laid over the screen, blocking more than half the light it emits.


This is required to create the intended light field, but the light-field approach isn't the only one; you can also do simple deconvolution. I wonder how well that would work.


AFAIK it's possible, yes, but you would need the distance of the eye from the display (maybe through a webcam?), and the inverse filter required may yield poor results depending on your aberration.

In their case they are synthesizing a light field that inverts the aberration everywhere; you'd be doing an inverse filter specifically for your viewpoint.

I don't know how good the results would be in practice, but I would like to see someone try.
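A minimal 1-D sketch of that per-viewpoint inverse filtering, with a made-up two-tap blur standing in for the eye's point-spread function (this is not the paper's light-field method, and a real PSF is 2-D and usually not exactly invertible):

```python
# Prefilter the displayed signal with the exact inverse of a simple
# causal blur, so that eye_blur(prefilter(x)) reproduces x.

A = 0.4  # made-up blur coefficient for illustration

def eye_blur(signal):
    """Causal defocus model: y[i] = x[i] + A * x[i-1]."""
    out, prev = [], 0.0
    for x in signal:
        out.append(x + A * prev)
        prev = x
    return out

def prefilter(target):
    """Exact inverse of eye_blur, computed recursively:
    x[i] = t[i] - A * x[i-1]."""
    out, prev = [], 0.0
    for t in target:
        x = t - A * prev
        out.append(x)
        prev = x
    return out

target = [0.2, 0.9, 0.4, 0.7]      # what the viewer should perceive
shown = prefilter(target)          # what the display would emit
perceived = eye_blur(shown)        # what the blurry eye sees
# perceived matches target (up to float rounding)
```

One practical catch: the prefiltered values can fall outside the displayable brightness range (negative or above maximum), which is why plain deconvolution displays tend to suffer from contrast loss.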


Cool, but I think most people with glasses wouldn't want to rely on gadgets even more. If you can see the display but can't see physical controls or paper, that's an issue waiting to happen.


I think this is a great idea, and wonder why it took so long. I'd love to be able to take my glasses off and read in bed, or take my glasses off for a few hours at work while in front of my monitor.



I wonder if there's an analogy to be drawn between the pinhole mask they use and the lithography masks used to etch ICs on silicon.


Sounds like the old "Magic Eye" pictures


I was told the following legend about how shoes were invented. A haughty princess wanted to leave her pristine castle and explore the world. But she found the world very dusty. She told her wise men to come up with a plan to cover the world in a lush carpet. The wise men pondered and said that this could not be done. The princess threw a fit. The wise men pondered some more. Then they said "Princess, we can not cover the world in a large carpet, but we can, however, cover your feet in a small one". And the princess was pleased. And that, kids, is how we got shoes.


It's not crazy to go barefoot/slippers in your home though.


Yeah, but I don't even like shoes ....

On a serious note, this does seem like a dumb idea. Why would you create a million corrective devices for the myriad objects in the real world? Why would you not rather create something small and portable that you could apply to all those objects equally?

Oh wait, they have that. They're called eyeglasses, I think.



