NASA's Webb captures an ethereal view of NGC 346 (phys.org)
92 points by wglb on Oct 16, 2023 | 26 comments



It's funny how physical systems of vastly different scales and physics get visually mapped by our brains into images that feel somehow "familiar" (wispy clouds).

In fact, truly "alien" imagery is not easy to come by. Star fields and galaxies are too abstract; they don't generate a sense of place. On the other hand, most inner solar system images are startlingly familiar; the sense that you could be walking there is palpable.

One exception I recall is the high-resolution images of Jupiter's clouds in its polar regions: a magical cauldron where bizarre formations are the norm.


> In fact, truly "alien" imagery is not easy to come by.

It's an interesting observation, and it goes to show that matter follows the same rules everywhere when it interacts like a particle system.

I think if we were to come across something that appears truly alien, it would be a system with a computational property of some kind. Life is one example of this, but it doesn't have to be.


> matter follows the same rules everywhere when it interacts like a particle system

Yes, I suspect that is a major factor, though it falls in the realm of complexity physics, which, AFAIK, has not yet formalized such universality.

The other factor might be purely internal (our visual processing latching on to known patterns). This bias starts from the moment astronomers process the collected data and compose these images.


But the forces do change at different scales. Gravity is negligible for individual particles, for example. And my understanding is that at gigantic scales (galaxies), gravity alone is not sufficient to explain all observations. Thus we create those dark concepts (dark matter, dark energy) that might well be misunderstood new forces operating at these scales.


"Particles" as in a model of particle-like interaction. So the particles in question could be individual stars on a larger scale. Or asteroid sized clumps in solar system scale.

In fact, many of the simulations of these systems use a particle representation, as in the sketch below.
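
To make the particle idea concrete, here is a minimal, purely illustrative sketch (my own, not from the thread) of a direct-summation gravitational N-body update in Python; the code doesn't care whether each "particle" stands for a star in a cluster or an asteroid-sized clump in a disc:

    import numpy as np

    def nbody_step(pos, vel, mass, dt, G=1.0, softening=1e-3):
        """One symplectic-Euler (kick then drift) step of a direct-summation
        N-body simulation. pos, vel: (N, 3) arrays; mass: (N,) array.
        Units are arbitrary -- the update rule itself is scale-free."""
        # Pairwise separation vectors r_j - r_i, shape (N, N, 3)
        dr = pos[None, :, :] - pos[:, None, :]
        dist2 = (dr ** 2).sum(axis=-1) + softening ** 2
        inv_dist3 = dist2 ** -1.5
        np.fill_diagonal(inv_dist3, 0.0)  # no self-interaction

        # Acceleration on each particle from all the others
        acc = G * (dr * (mass[None, :, None] * inv_dist3[:, :, None])).sum(axis=1)

        vel = vel + acc * dt   # kick
        pos = pos + vel * dt   # drift
        return pos, vel

    # Tiny example: 100 random particles evolved for a few steps
    rng = np.random.default_rng(0)
    pos = rng.normal(size=(100, 3))
    vel = np.zeros((100, 3))
    mass = np.ones(100)
    for _ in range(10):
        pos, vel = nbody_step(pos, vel, mass, dt=0.01)

Real astrophysical codes replace the O(N^2) direct sum with tree or grid methods, but the particle abstraction is the same.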


How about anglerfish or giant tube worms?


"fun fact" JWST only has 65 gigabytes of storage and has to be downloaded daily at 10mbps

I believe I read that the European Extremely Large Telescope is going to be doing a full-sky survey every night, generating 8 TB daily.


I don't think the 39 m primary E-ELT is designed to do full-sky surveys. AFAIK it is meant for targeted observations using multiple instruments.

The Vera Rubin Observatory (formerly known as LSST) will survey the southern sky every 2 or 3 nights, with an expected data rate of 20-30 TB/night, depending on the source.

https://www.lsst.org/about

https://www.lsst.org/sites/default/files/docs/Site%20Selecti...


> I don't think the 39 m primary E-ELT is designed to do full-sky surveys. AFAIK it is meant for targeted observations using multiple instruments.

Correct. I wasn't able to quickly find the maximum FOV available at the E-ELT, but the FOVs of the first-generation instruments[0] are not large enough to efficiently survey the sky. The largest-FOV instrument for the E-ELT is 6.78 arcmin across, compared to 3.5 deg for the Rubin Observatory[1] (i.e., ~960x larger area).

The E-ELT is large (39 m diameter vs. a 6.5 m effective aperture for Rubin), so the E-ELT has a factor of ~36 more collecting area.

However, the étendue (collecting area * instantaneous FOV) of Rubin is ~26x larger than that of the E-ELT. So for surveying a large portion of the sky to a fixed depth, Rubin is ~26x more efficient than the E-ELT.

[0] https://www.eso.org/sci/facilities/eelt/instrumentation/phas...

[1] https://www.lsst.org/scientists/keynumbers
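
For anyone who wants to check those ratios, a quick back-of-the-envelope script using only the figures quoted above:

    # Figures quoted above: E-ELT first-generation instruments vs. Rubin Observatory
    eelt_fov_diam_deg = 6.78 / 60   # 6.78 arcmin converted to degrees
    rubin_fov_diam_deg = 3.5        # 3.5 deg field of view
    eelt_aperture_m = 39.0          # primary mirror diameter
    rubin_aperture_m = 6.5          # effective aperture diameter

    fov_area_ratio = (rubin_fov_diam_deg / eelt_fov_diam_deg) ** 2
    collecting_area_ratio = (eelt_aperture_m / rubin_aperture_m) ** 2
    etendue_ratio = fov_area_ratio / collecting_area_ratio  # area * FOV, as defined above

    print(f"FOV area ratio (Rubin / E-ELT):        ~{fov_area_ratio:.0f}x")         # ~960x
    print(f"Collecting area ratio (E-ELT / Rubin): ~{collecting_area_ratio:.0f}x")  # ~36x
    print(f"Etendue ratio (Rubin / E-ELT):         ~{etendue_ratio:.0f}x")          # ~27x

(Rounding lands at ~27x here rather than the ~26x above; same ballpark.)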


Both magnificent but in different ways.


JWST's specs were set in stone in the late '90s. Having a 65 GB drive and a 10 Mbps internet connection back then would have been god-like.


The ELT, of course, has the advantage of being situated on Earth.

Then again, plenty of people reading this will be wondering why their broadband is somehow slower than something 1.5 billion metres away.


NASA's Flickr account for high-resolution images from Webb:

https://www.flickr.com/photos/nasawebbtelescope/albums/72177...


I've been reading an excellent book on the building of the Palomar Observatory[0]. It's amazing how small we used to think the universe was: a few tens of thousands of light-years across. The book is very much a page-turner.

[0] https://store.palomar.caltech.edu/products/the-perfect-machi...



All of these videos that zoom into the static image get me every time I see one. Anyone have any insights on how to create a somewhat accurate 3D image like this? I'm sure there's a data set available for public use, since it was publicly funded. I'm really curious how much hardware (RAM/CPU/GPU) it takes and what kind of render times are involved to make these types of videos.


Here's how you get JWST data: https://jwst-docs.stsci.edu/accessing-jwst-data

There's lots of info there to get you started on what all the datasets mean, how to work with them, etc.
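
If you'd rather script it than click through the archive, here's a minimal sketch using astroquery's MAST module (the target name and product filters are just illustrative choices, not the only way to do it):

    from astroquery.mast import Observations

    # Find public JWST observations of NGC 346 in the MAST archive
    obs = Observations.query_criteria(obs_collection="JWST",
                                      target_name="NGC 346")

    # List data products for the first observation and keep science FITS files
    products = Observations.get_product_list(obs[:1])
    science = Observations.filter_products(products,
                                           productType="SCIENCE",
                                           extension="fits")

    # Download a handful of them (lands under ./mastDownload/ by default)
    manifest = Observations.download_products(science[:5])
    print(manifest)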


Thanks. I'm familiar with accessing the imagery data from the various instruments; I've written several scripts that use curl to download files once I learned their naming scheme.

I was hoping for more on how to go about creating the 3D aspect. Are people using custom 3D engines written specifically for astro work, are they using something an FX post house would use, or can it be done in Unity (shriek) or Unreal? Are these 3D models/layouts even remotely close to being accurate, or are they just faking it for creative license?

So I guess it's more of a tooling question than a how-to-obtain-pretty-pictures question.


How much of the image is processed? What does the raw photo look like? How much of this is an artist's impression?


Curious what the crowd-favorite Webb image is now, if you exclude the first batch.


What does the Milky Way look like from the SMC?


If you have a VR headset, it's definitely worth trying Space Engine, which models the entire universe (it has known stars and galaxies catalogued and procedurally generates the rest). One of my favorite things to do in that game is to travel to a star with a good vantage point on a galaxy, one that also has planets (the procedural generation gives the majority of stars planets), land on a planet, and look at the night-sky view of the galaxy from the planet's surface. I should try this from the Small Magellanic Cloud some time.


It’d be very large, but also very faint (if you were using human eyes). Not nearly as spectacular as what you see in illustrations.

The Andromeda galaxy spans about six times the width of the full Moon on the sky, but human eyes can only see the brightest part of the core, and then only faintly.
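
For the parent question, some rough textbook numbers (mine, not from the thread): the Milky Way's disc is roughly 100,000 light-years across and the SMC sits roughly 200,000 light-years away, so the Galaxy would stretch across a huge swath of the SMC's sky:

    import math

    milky_way_diameter_ly = 100_000   # approximate disc diameter
    smc_distance_ly = 200_000         # approximate SMC-to-Milky Way distance

    angle_deg = math.degrees(2 * math.atan((milky_way_diameter_ly / 2) / smc_distance_ly))
    print(f"Apparent size of the Milky Way from the SMC: ~{angle_deg:.0f} degrees")  # ~28 deg

That's on the order of 50-60 full-Moon widths, which squares with "very large, but faint".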


Go there and find out!

We don't know; we can only make informed guesses about the structure of our universe.


Anyone else “see” a woman holding her right hand to her face in the upper-left blue cloud?


I see a blue skeleton holding some kind of scepter in its left hand.





