For the professional (or former professional) astronomers among us, I will make my somewhat amused observation that what people are most paying attention to is not really the distinguishing features of JWST.
People seem most impressed by the apparent increase in resolution of the images, which, from a certain point of view, is not the hardest thing to achieve. HST might have managed it if its instruments had had a different pixel size, or a different imaging array size / focal length. OK, the much larger mirror is an achievement. But the resolution of the images is often not the limiting factor for photometric observations. Yes, it is sharper / higher resolution, but that wasn't the key selling point.
The new thing is observation in the IR, which is often relegated to a technical footnote in the gushing announcements of these images (and in some of the discussion here too). And the general public knows little about why that detail matters, especially since the images are stylized / colorized to look just like the RGB images we are so familiar with. But everyone can easily appreciate a sharper image.
Anyway, still a momentous achievement. And thank god we have a scientific field where stunning images are enough to get the public to support a $10B project.
**
Edit to add: I did not mean to detract from or diminish anyone's appreciation of the images and the accomplishment, at whatever level they are enjoyed. And of course many here are technically knowledgeable about the IR aspect. I just write to point out that for the most headline-grabbing outlets and newspaper writers, it is the sharpness of the images, rather than the actual IR frontier, that grabs the attention.
Agreed! In the SMACS 0723 image, there is a red spiral galaxy near the top right which is effectively not present in the HST image because it was redshifted out of HST's spectral range. This implies it's one of the galaxies in the image receding fastest from us, right? And therefore also among the oldest and farthest away?
Two possibilities: 1) as you said, its flux is predominantly in the IR
2) it could have been fainter than the sensitivity of the HST instruments but now seen because of the sensitivity of JWST
But given that it appears so bright in the JWST image, similar to other nearby galaxies that do appear in the HST image, your bet on #1 seems reasonable.
There is another possibility too: rather than a highly redshifted galaxy, it could be a very dusty nearby galaxy (which also appears very red), but if I remember right, that would have a slightly different signature. Dusty galaxies often aren't entirely dusty; they have "lanes/channels/streaks" of dust interspersed among normal stellar regions, so if it were that, you would see some bright spots outside the infrared. But this one has the shape of a normal galaxy yet is red all over, suggesting something affects the whole galaxy -- i.e. redshift.
Isn't one way to detect redshift to check for absorption at certain frequencies due to the light passing through matter on the way here? The absorption lines will be redshifted as well?
Related question: to confirm, some of the additional detail we’re seeing in the JWST images is in fact IR that has been “hue clamped” into the visible spectrum?
Hey, that's an interesting and uncommon question that I had not seen elsewhere --
They have not released much technical detail about how the images were assembled, which I'm sure will come at some point.
I do not think the colors correspond (at least not deliberately), for two reasons:
The first is that the Deep Field image ("SMACS") contains galaxies at a range of redshifts. For example, there may be galaxies quite near us (redshift z = 0, or close to it) while others are more distant (the lensing cluster this image is famous for sits at redshift z = 0.39, and the arced galaxies it lenses are farther still), where redshift is defined so that (1 + z) is the factor by which the wavelength of the light has been multiplied.
So regardless of what color mapping you chose, it would not be a perfect fit for all objects in the field of view. For the galaxies near us in the image (z = 0), the wavelengths being mapped to RGB don't correspond to what we would see in the optical.
Secondly, even if the mapping were chosen specifically for the galaxy cluster of interest in this image, I don't think the colors are tuned for that either.
In more detail:
Consider the optical color spectrum we see, ROYGBIV -- or, reversed into order of increasing wavelength, VIBGYOR -- and take the three "RGB" colors that make up an image (BGR in that ordering): these span wavelengths of roughly 450 nm, 550 nm, and 650 nm.
The imaging filters available on NIRCam span 900 nm to 4400 nm (4.4 microns), and there are 29 of them [0]. Researchers choose which filters to use based on what they wish to study. And recall that the imaging sensor outputs grayscale only; it is the filters that give it a color view, and the individual images in each filter are assembled to create a color composite.
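The composite step can be illustrated with a toy sketch (all pixel values hypothetical; real pipelines also calibrate and stretch each frame before combining):

```python
# Three grayscale exposures of the same (tiny, 2x2) field, one per filter.
# Mapping the longest-wavelength frame to the red channel and the
# shortest to blue mimics how the released color composites are built.
f1500 = [[0.9, 0.1], [0.2, 0.8]]  # longest wavelength  -> red channel
f1150 = [[0.5, 0.1], [0.2, 0.5]]  # middle wavelength   -> green channel
f0900 = [[0.1, 0.1], [0.2, 0.3]]  # shortest wavelength -> blue channel

# Zip the three single-channel frames pixel-by-pixel into RGB triples.
rgb = [[(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
       for row_r, row_g, row_b in zip(f1500, f1150, f0900)]

# Top-left pixel is bright mainly in the longest-wavelength filter,
# so it comes out "red" in the composite.
print(rgb[0][0])  # -> (0.9, 0.5, 0.1)
```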
According to an example science program designed to take such images[1], the filters selected to be imaged might be 900nm, 1150nm, 1500nm.
If you applied the redshift of the cluster galaxies (dividing by 1.39, per the z = 0.39 above), those sampled wavelengths would still correspond to redder parts of the spectrum than what is visible if we were seeing the galaxies up close now: 647, 827, and 1079 nm.
So, no I don't think the color mapping was chosen to be accurate in a scientific sense of seeing what you would see if the galaxies were brought to the "original" view.
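The de-redshifting arithmetic above is a one-liner; here it is as a small sketch (the filter wavelengths and cluster redshift are the ones quoted in this thread):

```python
# Observed wavelength = emitted wavelength * (1 + z), so dividing the
# observed filter wavelengths by (1 + z) recovers what the cluster
# galaxies actually emitted.
z = 0.39  # redshift of the SMACS 0723 lensing cluster

filters_nm = [900, 1150, 1500]  # example NIRCam filter wavelengths (nm)

rest_frame_nm = [w / (1 + z) for w in filters_nm]
for obs, rest in zip(filters_nm, rest_frame_nm):
    print(f"{obs} nm observed -> {rest:.0f} nm emitted")
# -> 647, 827, and 1079 nm: still the red end of (and beyond) the visible
```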
A tangential question. If these galaxies are so redshifted, then they are probably very, very far away -- more than a billion light years. And if that is so, they should look exactly the same today, tomorrow, in one year, or in a hundred years. If for some reason we wanted these images in much higher resolution, could we just point the camera at the same spot, take millions of shots, and then apply a super-resolution algorithm?
I believe that in the image assembly pipelines for processing these astronomical images, they already do take into account / use the "dithering" patterns that you're hinting at. (Often the telescope will be pointed in a pattern with sub-pixel offsets over multiple exposures to do exactly this).
However, 2 factors:
1) there is, I believe, an intrinsic limit to how much extra resolution you can recover (maybe a factor of roughly 2x?), and it costs a lot more exposure time; at these faint levels of brightness you're also competing against intrinsic photon and sensor noise
2) practically, given the value of the telescope's time and not much more to be gained (science-wise) from achieving this next order of spatial resolution, they want to spend the time on other new targets instead of sitting on the same patch for much more time.
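The sub-pixel dithering idea can be sketched with a toy 1-D example (everything here is hypothetical and simplified; real pipelines such as Drizzle do a weighted 2-D version of this):

```python
# Two exposures of the same 1-D "scene", offset by half a detector
# pixel, interleaved onto a grid with twice the sampling.

def scene(x):
    """A made-up continuous sky brightness profile: a triangular
    'source' peaking at x = 5."""
    return max(0.0, 1.0 - abs(x - 5.0))

def expose(offset, n_pixels=10, pixel_size=1.0, oversample=100):
    """Simulate one exposure: each detector pixel integrates (averages)
    the scene across its width."""
    image = []
    for i in range(n_pixels):
        start = i * pixel_size + offset
        samples = [scene(start + (j + 0.5) * pixel_size / oversample)
                   for j in range(oversample)]
        image.append(sum(samples) / oversample)
    return image

exp_a = expose(offset=0.0)  # first pointing
exp_b = expose(offset=0.5)  # dithered by half a pixel

# Interleave the two exposures: 10 coarse samples -> 20 finer samples.
combined = [v for pair in zip(exp_a, exp_b) for v in pair]
print(len(exp_a), len(combined))  # prints "10 20"

# The dithered exposure straddles the peak, so the combined sequence
# localizes the source better than either single exposure does.
print(max(exp_a), max(combined))
```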
Thanks for the write-up! It's definitely important to mention the single-channel aspect of the sensors because I guess many people probably don't understand that.
The typical digital camera of today works the same way, just with 3 filters (R, G, B) kind of like a CRT pixel array in reverse[0].
This roughly mimics the human eye, which is sensitive to those three approximate color bands, but it's interesting to note that other species besides humans (and, apparently, even some humans)[1] have vision that works with more "filters", or on different spectra (such as honeybees).
I've always found it kind of amazing that so many satellite imaging devices work on far greater spectral ranges with far more color filters, being able to discern far more information than we could with the naked eye (but in essentially the same way).
The difference is the expansion of space is accelerating over time. And certain wavelengths of light will be blocked by gas in the intervening space, and which light is blocked changes over time based on how red-shifted it is. Special Relativity isn't enough to explain this.
NIRCam’s range starts at 0.6 microns (600 nm) so picks up a little bit of the red end of our visible range. But it definitely can’t differentiate a range of human visible colors.
Also, when comparing the deep field images, the difference in exposure time between the HST and JWST images can't be stressed enough (Hubble's exposure was 10x longer). Many, many more distant galaxies would become visible in the JWST image with a much longer exposure. I look forward to seeing some long-exposure deep field images from JWST!
This is underappreciated. Much lamentation has been made of the fact that JWST's current mission length is only ten years (maybe twenty or thirty at best, but hard-limited by on-board propellant), but with the speed of its observations that ten years will be as productive as a century of Hubble time.
The increased resolution is extremely important given that the diffraction limit is a function of the wavelength and the mirror diameter.
And you can clearly see that in the MIRI images at longer wavelengths, which have noticeably lower resolution compared to NIRCam.
If Webb's mirror were only as big as Hubble's, the resolution would have been poor at the long wavelengths.
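The wavelength/diameter trade-off can be made concrete with the Rayleigh criterion (mirror diameters are the published 2.4 m for Hubble and 6.5 m for JWST; the chosen wavelengths are just representative of each instrument's band):

```python
import math

ARCSEC_PER_RAD = 180 / math.pi * 3600

def diffraction_limit_arcsec(wavelength_m, mirror_diameter_m):
    """Rayleigh criterion: theta ~ 1.22 * lambda / D (small-angle)."""
    return 1.22 * wavelength_m / mirror_diameter_m * ARCSEC_PER_RAD

cases = [
    ("Hubble, 0.5 um (optical)", 0.5e-6, 2.4),
    ("JWST NIRCam, 2 um",        2.0e-6, 6.5),
    ("JWST MIRI, 10 um",         10.0e-6, 6.5),
]
for name, lam, d in cases:
    print(f"{name}: ~{diffraction_limit_arcsec(lam, d):.3f} arcsec")
```

Running this shows MIRI at 10 microns is diffraction-limited to about 5x coarser resolution than NIRCam at 2 microns on the same mirror, and a 2.4 m mirror at 10 microns would be coarser still, which is the point being made above.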
Hubble couldn't have had better resolution with better instruments; it was already limited by its aberration problem, and the newer instruments were designed to mitigate that problem.
Right. After the initial "wow factor" has settled down, what's been most striking to me is the level of detail that's no longer obscured by gas and dust in these nebulae due to MIRI. I know very little about the study of stellar nurseries or planetary nebulae but I've seen enough pre-JWST images of them to know that astrophysicists just got a whole lot to sink their teeth into and I look forward to seeing further developments as more data is collected and existing data is studied.
That is a good point -- my simplification of the advance provided by JWST is aimed at saying, JWST is kind of the first major telescope with such depth and resolution to complement the deep fields of HST that everyone is so familiar with.
Herschel, while also impressive, was far-IR (if I recall) and much lower resolution, which was good for certain research areas but less complementary to the HST deep fields for the, well, currently fashionable study of galaxies at high redshift.
Well, also, one thing to note is that it took Webb only hours to capture these images, while Hubble was pointed for days and months to capture comparable ones. It will be interesting to see the results when they have more time to capture longer exposures.
Yeah, more or less. People gripe about it occasionally, especially on the left, but then some shit will go down like Ukraine and suddenly nobody wants to look like an idiot (which will absolutely happen in an American context if you suggest cutting military spending right after Russia invades someone).
It’s not so much public support as the lack of public opposition.
Everyone in “big science” remembers the cautionary tale of the Superconducting Super Collider, which was cancelled mid-project when it became politically viable to oppose it as a waste of money. The circular tunnel is still sitting dusty and abandoned down in Texas while CERN runs another round on the LHC.
Big results that gather public praise go a long way toward making sure the next big science project will at least be seriously considered.
Global force projection, to the benefit of all Western economies, is very expensive. This money is required to even allow the form of economy "the West" is running.
How else were you going to keep up the Pax Americana that enables globalization by making significant global trade networks even possible in the first place? Who's gonna insure your freighter if international waters aren't protected by Western navies? Pirates, rogue states closing important channels, at will seizures for no reason... the list is long.
As the Pax Americana will likely soon fade through growing influence of the BRICS nations and "America First"-style ideologies, the 700B will probably wither away quite fast in the next decades - along with all the benefits we enjoyed since WW2.
Just one illustration: how many South American or African nations support the sanctions against Russia? How many Asian nations other than Japan?
Public support for NASA drives the members of Congress to allocate funds for its work. So, NASA spends a whole lot of effort showing the public how cool its work is. That, and it divides its facilities and subcontractors across the various states to ensure that every member of congress has constituents who benefit from its payroll.
> is there public support for the >$700B a year spent on the military?
Of course. It's tempting to think we're in this new lovey-dovey age of an improved/superior humanity, but the reality is man's baser instincts are kept in check by BFGs and MAD.
In the second image it is clear that more of the nebula is visible, isn't this because more wavelengths of light are being detected? In this case an amateur absolutely can appreciate the technical improvements, the IR is mapped to a visible RGB spectrum...
Seeing through dust is part of it. Another part is redshift: because of relativity, things moving away from us appear redder (longer wavelength) the faster they’re moving away. That’s the same principle (Doppler effect) as the lower-pitched siren sound as the ambulance drives away. Because the universe itself is expanding, the farther something is from us, the faster it’s moving away relative to us, and the more redshifted it appears to us. And again because of relativity, the oldest objects we can see are the ones that appear farthest away (i.e. their light is just now getting here, after 13 billion years, from 13 billion light years away). Thus, if we want to study the earliest times of the universe, we must study the most redshifted objects—which have shifted all the way out of the visible spectrum and into the infrared. Hence, Webb is an infrared telescope.
Your statements about redshift and distance are correct, but I do not see where relativity comes into it. Even in Newtonian physics redshift occurs. If you could elaborate I would love to learn. Thank you!
Newtonian physics does have the Doppler effect which can cause red shift, but it doesn't accurately explain the red shift that occurs when observing distant galaxies. That's because this red shift occurs due to three factors:
1) The galaxy is moving away from us. This is most like the classic Doppler effect, but because of the high relative velocities involved, time dilation needs to be taken into account to model the red shift accurately, thus at least special relativity should be used.
2) The light travels through space with different curvature. For example, light originating near a very massive star will red shift when moving away from that star because it moves into less curved space. General relativity is needed to explain this effect.
3) The light travels through expanding space. For very distant galaxies this becomes the dominating factor of red shift, as we see an amount of red shift directly corresponding to their distance from earth. General relativity also explains this effect.
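For factor 1, the gap between the classical and special-relativistic Doppler formulas can be made concrete with a small sketch (a source receding along the line of sight; the 0.1c velocity is just illustrative):

```python
import math

C = 299_792_458  # speed of light, m/s

def z_classical(v):
    """Classical Doppler redshift for a receding source: z = v / c."""
    return v / C

def z_relativistic(v):
    """Special-relativistic Doppler redshift (line-of-sight recession):
    1 + z = sqrt((1 + beta) / (1 - beta)), where beta = v / c.
    The extra redshift relative to the classical formula comes from
    time dilation of the source."""
    beta = v / C
    return math.sqrt((1 + beta) / (1 - beta)) - 1

# At 10% of c the two formulas already disagree noticeably.
v = 0.1 * C
print(f"classical:    z = {z_classical(v):.4f}")     # 0.1000
print(f"relativistic: z = {z_relativistic(v):.4f}")  # 0.1055
```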
Yeah, you’re correct, you don’t need to invoke relativity to explain the Doppler effect. Sibling comment did a good job explaining the ways relativity does impact redshift, but my initial statement (“because of relativity…”) was not correct.
Good point, I shouldn't have said "only." I have heard of reflected light being used for exoplanet observations, although I think it's quite faint. I guess some nebulae are illuminated by stars. I don't know if the reflected light is longer wavelength than the incident light, but I suspect so. BTW I'm not an astronomer!
> This is so dismissive and insulting. ... I'm sure you were well intentioned, but this comment read all kinds of rude and negative.
Your comment is way, way over the top. Their observation is entirely correct, maybe not in your circles, but certainly it's what I'm also seeing on Facebook and twitter.
Hell, NASA posted "The razor-sharp resolution of the @NASAWebb imagery was enough to bring astrophysicist Jane Rigby to tears."
Apologies, I did not mean for it to come off that way. I edited it to not make such broad brush statements.
Certainly people here, discussing it among those who get to watch these announcements during their day, are a group more knowledgeable about and appreciative of the details. They have the time and info to know what the "new" aspect is.
I was just making the point that in the most headline-grabbing, CNN/newspaper science-writer genre, they and their audience probably just see the image sharpness as they flip through the news.
I think this is a little harsh, but I actually generally agree. I'm a total layman here and my PRIMARY TAKEAWAY has been the IR component and how much more, and how much farther away, you can see because of it.
I get what you are trying to say here, but someone posting on HN about "the IR component" using the nom de guerre "Enginerrd" might not be "a total layman" :-)
Although this reference nicely explains the origin of the diffraction pattern in the JWST images, it does not explain why the spikes seem to extend much farther than in the Hubble images. My hunch is that JWST succeeds much better at gathering all the light into a real point, which makes the primary diffraction pattern stronger too.
While this is very important for scientific work (easier to see planets!), it is less appealing to the eye. Also note that some JWST images have faint blue streaks which are in effect diffraction spikes from bright stars outside the field of view.