
Can this be done at the software level? I.e. feature built in to OS that modifies displayed image in the same way this screen does.



In the researchers’ prototype, however, display pixels do have to be masked from the parts of the pupil for which they’re not intended. That requires that a transparency patterned with an array of pinholes be laid over the screen, blocking more than half the light it emits.


This is required to create the intended light field, but the light field approach isn't the only one; you could also do simple deconvolution. I wonder how well that would work.


AFAIK it's possible, yes, but you would need to know the distance of the eye from the display (maybe through a webcam?), and the inverse filter required to achieve it may yield poor results depending on your aberration.

In their case they are synthesizing a light field that inverts the aberration everywhere; you'd be applying an inverse filter specifically for your viewpoint.

I don't know how good the results would be in practice, but I would like to see someone try.
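For anyone curious, here's a rough sketch of what that single-viewpoint inverse filter could look like. Everything here is illustrative, not the paper's method: the Gaussian PSF is a crude stand-in for the eye's actual defocus/aberration, and the Wiener-style regularization (`k`) is one common way to avoid dividing by near-zero frequencies.

```python
# Hypothetical sketch: precompensate an image for a known blur (PSF) so that
# after the eye blurs it, the result is closer to the intended image.
import numpy as np

def gaussian_psf(size, sigma):
    # Crude stand-in for the eye's point-spread function; real aberrations
    # (astigmatism, higher-order terms) would need a measured PSF.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

def precompensate(image, psf, k=0.01):
    # Wiener-style inverse filter in the frequency domain. The constant k
    # regularizes frequencies the PSF nearly zeroes out -- a naive 1/H
    # inverse would blow up there, which is one reason results can be poor.
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    F = np.fft.fft2(image)
    G = F * np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(G))

# Toy example: a white square, precompensated for a sigma=2 Gaussian blur.
image = np.zeros((64, 64))
image[24:40, 24:40] = 1.0
psf = gaussian_psf(64, sigma=2.0)
pre = precompensate(image, psf)
```

Blurring `pre` with the same PSF gives something closer to `image` than blurring `image` itself does. The catch is that `pre` generally needs values outside the display's [0, 1] range (negative light), so a physical screen can only show a clipped version, and the correction only holds for one exact viewpoint.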



