Hacker News

Thanks. But excuse my ignorance, I just need to get this right: so the three channels of data can be converted into RGB to give the "shades" that make a photo ... but you can pick a dominant colour filter, like blue, green, or, in the Horsehead's case, pink?

I am mesmerised by Juno's photos, but I'm wondering whether what they show is true to the human eye (as if we were there). The photo caption hints otherwise [1], but I'm not sure if my understanding is correct!

[1] Multiple images taken with the JunoCam instrument on three separate orbits were combined to show all areas in daylight, enhanced color, and stereographic projection.




Basically, each narrow-band filter gives you a grey-scale image. You can assign whatever color you like to each of them, and then mix them to get a color image. If you have filters for the Red, Green, and Blue wavelengths, and you tint each image with the matching Red, Green, or Blue color before mixing, that gets you as close to a true-color image as possible. (That's essentially what's going on in your camera when it captures an image and then displays it to you.)
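To make that concrete, here's a minimal sketch (using NumPy, with made-up pixel values; the array names are my own) of assigning three narrow-band grey-scale exposures to the R, G, and B channels of one image:

```python
import numpy as np

# Hypothetical grey-scale exposures taken through red, green, and
# blue filters, with intensities normalized to the 0..1 range.
red_filter   = np.full((4, 4), 0.8)
green_filter = np.full((4, 4), 0.5)
blue_filter  = np.full((4, 4), 0.2)

# Stack the three planes along a new last axis, giving a
# (height, width, 3) RGB image ready for display.
rgb = np.dstack([red_filter, green_filter, blue_filter])

print(rgb.shape)  # (4, 4, 3)
```

Swapping which grey-scale plane goes into which channel (or tinting a plane a non-RGB color before summing) is exactly the freedom that false-color images exploit.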

Red, Green, and Blue are used because their wavelengths match the wavelengths that the cells in our retinas are sensitive to. That allows cameras to approximate human vision. But scientists use lots of other wavelengths too, to see specific things more clearly, like Juno's methane filter. Every chemical gives off specific wavelengths of light when it releases energy, so filters tuned to those wavelengths make it easier to detect those chemicals. (I'm simplifying a bit here.) The false-color images you see from these missions combine multiple filters (that aren't RGB), mixed using contrasting colors so that you can still pick out the contribution of each filter.
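The mixing step above can be sketched like this (NumPy again; the filter names, pixel values, and color choices are all invented for illustration): each non-RGB filter exposure gets an arbitrary contrasting display color, and the final pixel is the clipped sum of intensity × color:

```python
import numpy as np

# Two hypothetical narrow-band exposures (say, a methane band and a
# nearby continuum band), normalized to 0..1.
methane   = np.array([[0.2, 0.9], [0.4, 0.1]])
continuum = np.array([[0.7, 0.3], [0.5, 0.8]])

# Assign each filter a contrasting display color. These choices are
# arbitrary -- that's what makes the result "false color".
methane_color   = np.array([1.0, 0.0, 1.0])  # magenta
continuum_color = np.array([0.0, 1.0, 0.0])  # green

# Mix: each pixel is the sum of (intensity * assigned color),
# clipped back into the displayable 0..1 range.
img = np.clip(methane[..., None] * methane_color
              + continuum[..., None] * continuum_color, 0.0, 1.0)

print(img.shape)  # (2, 2, 3)
```

Because magenta and green are complementary, a pixel that's strong in both filters comes out near white, while a methane-only pixel stays magenta, so you can still see where each filter contributed.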


A simple example: converting IR images to "human-color" images [1].

Your eyes can't see infrared, so you apply a mapping from the infrared intensity scale to a visible color scale. Then, where you see more red (for example), you know the infrared signal was stronger.

[1] https://en.wikipedia.org/wiki/Infrared_photography
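A toy version of that mapping (NumPy, with a made-up IR intensity grid; the red-to-blue ramp is just one possible color scale):

```python
import numpy as np

# Hypothetical IR intensity map, normalized to 0..1.
ir = np.array([[0.1, 0.5],
               [0.9, 0.3]])

# Simple false-color ramp: strong IR -> red, weak IR -> blue,
# green channel left at zero. Result is (height, width, 3) RGB.
false_color = np.dstack([ir, np.zeros_like(ir), 1.0 - ir])
```

Real tools typically use richer perceptual colormaps than this two-color ramp, but the principle is the same: one scalar per pixel, mapped to a visible color.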


> wondering if it's true to the human eye

Not really connected to your main question, but here goes -

The point is moot in the case of the Horsehead Nebula. The image is so dim (in the eyepiece of a telescope) that the human eye cannot perceive colour.

Also, it's a lot smaller in the eyepiece than most people expect - so it's pretty easy to miss.



