
Yes, and in particular the way light reflects off different materials is often still in the uncanny valley. Perhaps ray tracing will finally get us there in the next few years.



That's the thing, light doesn't just reflect off materials. A portion of the light gets absorbed, changing the color and amount of light that is reflected back from deeper layers of the surface. There is just so much happening with light, most of which we don't fully understand, yet we seem to instinctively notice when it's off.


Actually, we do understand everything that happens with light under normal conditions (i.e. when we're talking about bunches of photons and not a single one; objects big enough that quantum effects can be neglected; relatively low intensities, so that self-collimation and related effects don't play a role), at least from a physics standpoint. Maxwell's equations do a good job of explaining light and its behaviour. The catch is that so many effects follow from those equations: refraction, reflection, diffraction, interference, dispersion, scattering, etc., which are just too numerically complicated to all be taken into account by a rendering engine.
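To make the "numerically complicated" point concrete: even plain reflection off a dielectric surface varies strongly with viewing angle. The exact Fresnel equations follow from Maxwell's equations; below is a sketch using Schlick's widely used approximation (the refractive indices `n1`/`n2` are assumed air- and glass-like values, picked for illustration):

```python
def schlick(cos_theta, n1=1.0, n2=1.5):
    """Schlick's approximation to Fresnel reflectance for unpolarized light.

    cos_theta is the cosine of the angle between the incoming ray and the
    surface normal; n1/n2 are the refractive indices on each side.
    """
    r0 = ((n1 - n2) / (n1 + n2)) ** 2  # reflectance at normal incidence
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

# Glass reflects about 4% of light head-on...
print(round(schlick(1.0), 3))   # 0.04
# ...but at a grazing angle it reflects most of it.
print(round(schlick(0.05), 3))
```

A renderer has to evaluate something like this (and the spectral, polarized, rough-surface generalizations of it) for every bounce of every ray, which is where the cost piles up.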


The general principles, maybe, but you also need to know a bit about the specific materials being modelled.

There are some neat papers where they do very high resolution CT scans (microCT) to examine the microscopic structure of different materials, like the size and orientation of the fibers in fabrics. If you model that, the material takes on the appearance of the fabric, which is amazing:

See Figure 8 here: https://www.cs.cornell.edu/~kb/publications/SIG11CT.pdf


The future (well, the present too) is machine learning. You don't have to understand the physics: just feed in pictures of what you want it to look like, and out come materials that look like that.

Many techniques like this are highlighted on Two Minute Papers.


Path tracing can replicate that effect, including things like an object near a window lighting the room with reflected light in its colors, and refraction effects through some materials.

See https://raw.githubusercontent.com/Dalamar42/rayt/master/samp... for one example; notice that the white light reflecting off the green wall makes the side of a nearby object look green, for instance.
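The color bleeding in that image falls out of how path tracing accumulates light: at each bounce, the ray's throughput is multiplied componentwise by the surface's albedo, so light that has hit a green wall carries green onto whatever it strikes next. A minimal sketch (the albedo values are made up for illustration):

```python
def bounce(light, albedo):
    # Componentwise product: the surface absorbs whatever it doesn't reflect.
    return [l * a for l, a in zip(light, albedo)]

white_light = [1.0, 1.0, 1.0]        # RGB throughput of the ray
green_wall_albedo = [0.2, 0.8, 0.2]  # reflects mostly green
white_box_albedo = [0.9, 0.9, 0.9]   # near-white diffuse object

after_wall = bounce(white_light, green_wall_albedo)
after_box = bounce(after_wall, white_box_albedo)
print(after_box)  # green channel dominates: the box's side looks green
```

A real path tracer does this per ray over many random bounces and averages the results, but the green tint on the box comes from exactly this multiplication.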


Path tracing with colors still presents some challenges. Most mainstream renderers use an RGB model, but of course the real world has the full EM spectrum.

For example, in the Cornell box scene, imagine the wall were a spectrally pure green material and the box a spectrally pure yellow material. Then there shouldn't be a visible secondary bounce, but with naive RGB path tracing there would be, because RGB can't distinguish between pure yellow and yellow mixed from red and green.

In practice I would expect the effect to be pretty subtle in most scenes, but e.g. scenes lit by sodium streetlights could show a stark difference. For example, something like https://commons.m.wikimedia.org/wiki/File:Red_and_black_cars...
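The pure-green/pure-yellow argument above can be checked numerically. Using hypothetical narrow-band spectra (band centers and widths chosen only for illustration), the spectral product of the two materials is zero, while the naive RGB product is not:

```python
import numpy as np

# Wavelengths sampled every 10 nm across the visible range.
wavelengths = np.arange(400, 701, 10)

# Spectrally "pure" green: reflects only a narrow band around 530 nm.
green_wall = np.where(np.abs(wavelengths - 530) <= 10, 1.0, 0.0)
# Spectrally "pure" yellow: reflects only a narrow band around 580 nm.
yellow_box = np.where(np.abs(wavelengths - 580) <= 10, 1.0, 0.0)

# Spectral two-bounce path: the bands don't overlap, so no light survives.
spectral_bounce = green_wall * yellow_box
print(spectral_bounce.sum())  # 0.0 -- no visible secondary bounce

# Naive RGB: green wall (0,1,0) times mixed yellow (1,1,0).
rgb_bounce = np.array([0.0, 1.0, 0.0]) * np.array([1.0, 1.0, 0.0])
print(rgb_bounce)  # [0. 1. 0.] -- a spurious green bounce appears
```

Spectral renderers avoid this by tracing per-wavelength samples instead of three fixed channels, at extra cost.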


Yes, there's definitely more going on than just reflectance. That's why most virtual materials appear too glossy to me. From what I understand, ray/path tracing will enable better material modeling as well, given enough processing power. But I'm not a graphics person, so I'd be happy to be corrected.


Lighting and water are kinda hard to nail perfectly... there's just something about the real thing that's hard to duplicate.


And fire. Fire always looks fake in CGI, whether that's in games or in movies/TV.


More generally: fluid simulations. Fire, smoke, water, etc.




