Phased arrays are very cool tech. Personally, I can't wait for visible-wavelength optical phased arrays to hit the mainstream (they're just now being implemented), since they'd enable tech like legitimately holographic displays and video cameras with digitally programmable optical zoom.
Microlens arrays, for all practical purposes of consuming and recording media for human viewing, are essentially equivalent to ideal phased-array optics (though their element spacing is much larger than a phased array would require). We perceive light incoherently; phase information is lost to our eyes, so microlens arrays suffice to reproduce a light field sans phase effects[1]. That includes the perception of most phenomena affected by coherence, like oil films on water or watching laser experiments. (I'm not totally sure, but I don't think there are visual phenomena that a geometric-optics light field can't reproduce unless they rely on coherent measurement; maybe some interference phenomena?)
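To make the "light field sans phase" point concrete, here's a toy pinhole model of a single lenslet: each sub-pixel under it maps to a distinct ray direction, so the panel encodes position plus angle and nothing else. All the parameter values and names below are made up for illustration.

```python
import math

# Illustrative (made-up) geometry for one lenslet in a microlens array.
LENSLET_PITCH_MM = 0.5   # center-to-center lenslet spacing
LENSLET_FOCAL_MM = 2.0   # sub-pixel plane sits at the lenslet's focal plane

def ray_for_subpixel(lenslet_index: int, subpixel_offset_mm: float):
    """Return (origin_x_mm, angle_deg) of the ray one sub-pixel produces.

    A sub-pixel displaced by `subpixel_offset_mm` from the lenslet's axis
    is collimated into a ray tilted by atan(offset / focal_length), so the
    display controls L(x, angle) -- a purely geometric light field with
    no phase information anywhere in the model.
    """
    origin_x = lenslet_index * LENSLET_PITCH_MM
    angle = math.degrees(math.atan2(subpixel_offset_mm, LENSLET_FOCAL_MM))
    return origin_x, angle

# Two sub-pixels under the same lenslet send light in different directions;
# that angular dependence is exactly the view-dependent part of a light field.
print(ray_for_subpixel(0, 0.0))  # on-axis ray
print(ray_for_subpixel(0, 2.0))  # tilted ray from the same lenslet
```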
There are practical problems with the technology, though: the light sources we can currently make have minimum size limitations, and incoherent optical behavior starts to degrade at very small lens sizes (at or below the micron scale, I'd guess).
[1] You could probably build a good approximation of phased-array optics using LED-scale (~10 micron) coherent lasers as light sources, but again I don't see any application that isn't scientific.
Microlens arrays seem very interesting! I haven't seen any that are very high resolution... Is that because they kill the resolution of the display they're placed upon (on top of the technical problems of making very small microlenses)? So we'd need extremely high-resolution displays for microlens-array displays to look reasonable by modern standards?
Perhaps actual phased array optics wouldn't have that issue?
For those just entering this thread, [1] is an example of a rudimentary microlens array display.
Looking Glass Factory is currently producing a display that puts microlenses on an 8K screen. The video of it looks quite good. It outputs 45 different angles, so the effective resolution is roughly 640p, right around the bottom end of HD. Not the best, but it's good enough for a lot, and I'm sure the limit will improve over time. It's only horizontal parallax, but that's generally fine for a fixed screen.
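The "45 views from an 8K panel gives roughly 640p" figure checks out with quick arithmetic, assuming the full pixel budget is split evenly among the views and each view keeps a 16:9 aspect ratio (both simplifications; the actual panel distributes views only along the horizontal axis, so this is a rough equivalence, not the real sub-pixel layout):

```python
# Back-of-envelope check: total 8K pixel budget divided across 45 views.
total_px = 7680 * 4320           # 8K UHD panel
views = 45
px_per_view = total_px / views   # ~737k pixels per viewing angle

# Height of a 16:9 image with that many pixels: w * h = px, w = h * 16/9,
# so h = sqrt(px * 9/16).
height = (px_per_view * 9 / 16) ** 0.5
print(round(height))  # → 644, i.e. roughly "640p"
```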
Holographic displays would use eye trackers to show each eye a different image. Solid-state zoom is maybe a bit of a stretch, but it would involve pixels becoming sensitive to angles more inward or outward relative to the sensor's center.
I'm not an expert, but I believe that's not how holographic displays would work with optical phased arrays. I believe a phased array can make it seem that light is being emitted from any point above the display (within the display's viewing cone). There's no need to track observers, because it would be an honest reconstruction of the light emitted by a real three-dimensional object.
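The "emit as if from a virtual point" idea comes down to per-element phase control: drive each emitter with the phase a wave would have after propagating from that point, so the superposed wavefront looks like it diverged from there. A toy 1-D sketch (wavelength, spacing, and all names are illustrative assumptions, not any real device's parameters):

```python
import math

WAVELENGTH_UM = 0.532            # green light
K = 2 * math.pi / WAVELENGTH_UM  # wavenumber

def element_phases(element_xs_um, virtual_point_um):
    """Phase (radians, mod 2*pi) for each emitter in a 1-D array so the
    combined wavefront appears to come from `virtual_point_um` = (x, z)."""
    vx, vz = virtual_point_um
    phases = []
    for x in element_xs_um:
        dist = math.hypot(x - vx, vz)               # emitter-to-point distance
        phases.append((-K * dist) % (2 * math.pi))  # conjugate the path phase
    return phases

# A line of 11 emitters 1 um apart, virtual point 100 um above the center.
xs = [i * 1.0 for i in range(-5, 6)]
ph = element_phases(xs, (0.0, 100.0))
# The phase profile is symmetric about the center element, as expected
# for a point source directly above it; no observer tracking involved.
```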
This chapter by Brian Wowk is a good start. It explains the basics and compares optical phased arrays against holography (that poor word keeps getting abused!):
The light field conceptually belongs to geometric optics. It doesn't involve phase or wave theory at all. It would be either an input to or an output from a phased array.
Phased arrays admittedly use wave theory, but in a geometrically simple way. Maybe "far field" is the way to describe it.
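In the far-field (Fraunhofer) picture the geometry really is that simple: steering a uniform 1-D array's beam to angle theta is just a linear phase ramp, element n delayed by k * n * d * sin(theta). A minimal sketch with illustrative values:

```python
import math

def steering_phases(n_elements, spacing_um, wavelength_um, steer_deg):
    """Linear phase ramp (radians, mod 2*pi) that steers a uniform 1-D
    array's far-field beam to `steer_deg`: element n gets k*n*d*sin(theta)."""
    k = 2 * math.pi / wavelength_um
    s = math.sin(math.radians(steer_deg))
    return [(k * n * spacing_um * s) % (2 * math.pi) for n in range(n_elements)]

# Steering to 0 degrees needs no phase differences at all:
flat = steering_phases(8, 0.8, 1.55, 0.0)
print(flat)  # eight zeros

# A nonzero angle produces a uniform phase step between neighbors.
ramp = steering_phases(8, 0.8, 1.55, 10.0)
```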