
It's all IR. It's color-mapped over RGB, but the sensors are IR.



Didn't the observed light shift into IR wavelengths due to the expansion of space over the last billions of years while it traveled to us?

Do the recolored images have any relation to what the original view would have looked like, or is it just arbitrary "artistic license"?


Hey, that's an interesting and uncommon question that I had not seen elsewhere --

They haven't released or talked about the technical details of how the images were assembled in much depth, which I'm sure will be done at some point.

I do not think the colors do correspond (at least not deliberately), for 2 reasons:

First, the Deep Field image ("SMACS") contains galaxies at a range of redshifts. Some may be quite near us (redshift z = 0, or close to it), while others are much more distant -- the lensing galaxy cluster this image is famous for sits at redshift z = 0.39, and the arced galaxies being lensed behind it are more distant still. Here redshift measures how much the wavelength of the light has been stretched: a factor of (1+z).

So regardless of what color mapping you choose, it would not be a perfect fit for every object in the field of view. For the galaxies near us in the image (z ≈ 0), the infrared wavelengths being mapped to RGB don't correspond to what we would see in the optical anyway.

Secondly, even for the galaxy cluster of interest in this image, I don't think the color mapping was specifically tuned to undo its redshift either.

In more detail:

Consider the optical color spectrum we see, ROYGBIV, or reversed in order of increasing wavelength, VIBGYOR -- and take the three "RGB" colors that might make up an image (BGR in that ordering). These span a wavelength range of very roughly 400nm, 550nm, 700nm.

The imaging filters available on NIRCam span 900nm to 4400nm (4.4 microns), and there are 29 of them [0]. Researchers choose which filters to use based on what they wish to study. And recall that the imaging sensor actually outputs grayscale only; it is the filters that give it a color view, and the individual images taken through each filter are assembled into a color composite.
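As a minimal sketch of that assembly step (assuming three already-aligned grayscale exposures loaded as NumPy arrays; the names, sizes, and stretch are illustrative, not the actual pipeline):

    import numpy as np

    # Hypothetical grayscale exposures, one per filter, already registered.
    short_wave = np.random.rand(512, 512)  # stand-in for the shortest-wavelength filter image
    mid_wave   = np.random.rand(512, 512)  # stand-in for the middle filter image
    long_wave  = np.random.rand(512, 512)  # stand-in for the longest-wavelength filter image

    def stretch(img):
        # Simple percentile stretch so faint structure becomes visible.
        lo, hi = np.percentile(img, [1, 99])
        return np.clip((img - lo) / (hi - lo), 0, 1)

    # Assign redder display channels to longer wavelengths: R <- long, G <- mid, B <- short.
    rgb = np.dstack([stretch(long_wave), stretch(mid_wave), stretch(short_wave)])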

According to an example science program designed to take such images[1], the filters selected to be imaged might be 900nm, 1150nm, 1500nm.

If you undo the redshift of those cluster galaxies (divide by 1 + z = 1.39, using the cluster redshift above), the sampled wavelengths in the image correspond to rest-frame wavelengths of roughly 647nm, 827nm, and 1079nm -- still at or beyond the red end of the visible spectrum compared to what we would see if the galaxies were right in front of us.
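A back-of-the-envelope version of that arithmetic (z = 0.39 is the cluster redshift quoted above; the filter wavelengths are the example program's picks):

    z = 0.39                         # redshift of the lensing cluster's galaxies
    filters_nm = [900, 1150, 1500]   # example imaging filter wavelengths, in nm

    # Undo the cosmological stretch: emitted wavelength = observed wavelength / (1 + z)
    rest_frame_nm = [w / (1 + z) for w in filters_nm]
    print([round(w) for w in rest_frame_nm])   # -> [647, 827, 1079]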

So, no, I don't think the color mapping was chosen to be accurate in a scientific sense of showing what you would see if the galaxies were brought back to the "original" view.

[0] https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam...

[1] https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam... ("select filters")


A tangential question. If these galaxies are so much redshifted, then they are probably very, very far. Like more than 1 billion ly far. And if that is so, they should look exactly the same today, tomorrow, in one year, or in a hundred years. If for some reason we wanted these images in much higher resolution, could we just point the camera to the same spot and take millions of shots and then apply a super-resolution algorithm?


I believe the image assembly pipelines for these astronomical images already make use of the "dithering" patterns you're hinting at. (Often the telescope is deliberately pointed in a pattern of sub-pixel offsets over multiple exposures to do exactly this.)
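A toy version of that shift-and-add idea (this is not the actual drizzle pipeline; the function name, offsets, and upscaling factor are made up for illustration):

    import numpy as np

    def shift_and_add(frames, offsets, scale=2):
        # Place each low-res frame onto a finer grid at its known sub-pixel
        # offset (in original pixels), then average the accumulated samples.
        h, w = frames[0].shape
        acc = np.zeros((h * scale, w * scale))
        weight = np.zeros_like(acc)
        for frame, (dy, dx) in zip(frames, offsets):
            y0 = int(round(dy * scale)) % scale
            x0 = int(round(dx * scale)) % scale
            acc[y0::scale, x0::scale] += frame
            weight[y0::scale, x0::scale] += 1
        return acc / np.maximum(weight, 1)

    # e.g. four exposures dithered by half a pixel in each direction:
    # hi_res = shift_and_add([a, b, c, d], [(0, 0), (0, 0.5), (0.5, 0), (0.5, 0.5)])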

However, 2 factors:

1) There is, I believe, an intrinsic limit to how much extra resolution you can recover (maybe a factor of roughly 2x?), and it costs a lot more exposure time. At these faint brightness levels you're also competing against intrinsic photon and sensor noise.

2) Practically, given the value of the telescope's time and how little more there is to be gained (science-wise) from that next increment of spatial resolution, they would rather spend the time on other new targets than sit on the same patch for much longer.

(you can even try this at home: https://petapixel.com/2015/02/21/a-practical-guide-to-creati... )


Thanks for the write-up! It's definitely important to mention the single-channel aspect of the sensors because I guess many people probably don't understand that.


The typical digital camera of today works the same way, just with 3 filters (R, G, B) kind of like a CRT pixel array in reverse[0].
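A rough sketch of that idea (a toy RGGB Bayer mosaic; a real camera interpolates the missing samples at every pixel rather than just splitting channels like this):

    import numpy as np

    # Toy raw readout: one grayscale mosaic behind an RGGB color filter array.
    raw = np.random.rand(4, 6)

    r = raw[0::2, 0::2]                            # red-filtered photosites
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2    # the two green photosites, averaged
    b = raw[1::2, 1::2]                            # blue-filtered photosites
    rgb = np.dstack([r, g, b])                     # quarter-resolution color image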

This kind of mimics the human eye, which is sensitive to roughly those three color bands, but it's interesting to note that other species besides humans (and, apparently, even some humans [1]) have vision that works with more "filters", or on different spectra (honeybees, for instance).

I've always found it kind of amazing that so many satellite imaging devices work on far greater spectral ranges with far more color filters, being able to discern far more information than we could with the naked eye (but in essentially the same way).

[0] https://en.wikipedia.org/wiki/Bayer_filter

[1] https://en.wikipedia.org/wiki/Tetrachromacy


Looked like from what perspective? A human eye on Earth?


OP clearly means from an observer moving at (approximately) the same velocity as the object emitting the light.


The objects aren’t really moving in that sense. The expansion of space over time is what stretches the light.


Without a fixed point of reference, what is the difference?


The difference is that the expansion of space is accelerating over time. Also, certain wavelengths of light get blocked by gas in the intervening space, and which wavelengths are blocked changes over time based on how red-shifted the light is when it reaches that gas. Special relativity alone isn't enough to explain this.


NIRCam’s range starts at 0.6 microns (600 nm), so it picks up a little of the red end of our visible range. But it definitely can’t differentiate the full range of human-visible colors.



