Hacker News

Phone cameras don’t have multiple lenses working together that you can interpolate between to create a single giant virtual lens. The technique the OP is referring to is already used for the largest ground-based radio telescopes.
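To make the "giant virtual lens" point concrete, here is a rough sketch of the diffraction-limited resolution formula (θ ≈ 1.22 λ/D), comparing an assumed ~5 mm phone lens aperture against a ~36 km synthesized baseline like the VLA's largest configuration. The specific apertures, wavelengths, and function name are illustrative assumptions, not anything from the thread:

```python
import math

def resolution_arcsec(wavelength_m, aperture_m):
    """Diffraction-limited angular resolution, theta ~ 1.22 * lambda / D,
    converted from radians to arcseconds."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600

# Hypothetical phone lens: visible light (550 nm), 5 mm aperture.
phone = resolution_arcsec(550e-9, 5e-3)

# Aperture synthesis: 7 cm radio waves, but a 36 km baseline acting
# as the effective aperture (VLA A-configuration scale).
vla = resolution_arcsec(0.07, 36_000)
```

Despite the radio wavelength being over 100,000 times longer, the synthesized 36 km aperture resolves finer detail (sub-arcsecond) than the phone lens (tens of arcseconds), which is why baseline length, not dish or lens size, dominates.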



Well, a lot of phones do have multiple sensors and lenses. For example, the iPhone's computational depth effect uses multiple cameras for a single shot.


That’s qualitatively different from the kind of stitching together that happens with, e.g., the VLA. The computational limits of phone photography don’t apply here.


Exactly correct. And Labeyrie has a design in place for the exact type of fleet I'm talking about.


Do you have a link to the design? I'm assuming it's a radio telescope?


No, the design is optical. Here's a white paper on the concept. https://www.google.com/url?sa=t&source=web&rct=j&url=https:/...

Labeyrie has also proposed fleets at L2 and on the Moon.


Not the OP, but I guess he is talking about the hypertelescope concept: https://hypertelescope.org/hypertelescope-en/

For additional background, there are already optical interferometry telescopes in use; see the VLTI, operated by the European Southern Observatory in Chile (a shared facility combining the four VLT unit telescopes and several smaller auxiliary telescopes).
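The gain from interferometry over a single telescope can be sketched with the same λ/D scaling. The numbers below are assumptions for illustration: an 8.2 m VLT unit telescope mirror, a ~130 m maximum VLTI baseline between unit telescopes, and K-band (2.2 µm) observing:

```python
RAD_TO_MAS = 206_265e3  # radians -> milliarcseconds

def resolution_mas(wavelength_m, aperture_m):
    """Diffraction-limited resolution in milliarcseconds."""
    return 1.22 * wavelength_m / aperture_m * RAD_TO_MAS

wavelength = 2.2e-6   # K band, 2.2 micrometres
single_ut = resolution_mas(wavelength, 8.2)    # one 8.2 m unit telescope
vlti      = resolution_mas(wavelength, 130.0)  # 130 m baseline as aperture

gain = single_ut / vlti  # baseline / mirror diameter, ~16x finer
```

The resolution improvement is just the ratio of baseline to mirror diameter, roughly 16x here: the interferometer reaches a few milliarcseconds where a single 8.2 m mirror is limited to tens of milliarcseconds.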



