I'm pretty sure the acronym was for "Physically Based Ray Tracing". The original preface to The Book explicitly says that
> pbrt is based on the ray-tracing algorithm
I suspect it's moved to "Rendering" in common usage to avoid getting hung up on distinctions like ray tracing versus path tracing versus hybrid techniques.
PBR shaders are not trivial. It's incredible to me that they are readily available in all the shader languages, for free. The average gamer thinks that the game engine is responsible for the quality of graphics in a game. That is largely untrue: the quality mostly comes down to how well the art assets have been authored to take advantage of PBR. And that tech is, for all practical purposes, free (in both senses of the word).
My only gripe with modern-day graphics documentation is the huge gap in guidance on building a well-performing rendering pipeline. There are hundreds of tutorial series that get you as far as rendering a single lit, textured, normal-mapped model. But managing multiple models, multiple lights, and different types of materials takes a very different design for resource management, and that part is basically ignored.
The current situation is certainly better than it was 20, even 10, years ago. But that last, missing piece is pretty vital.
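To make the gap concrete: one pattern that engine write-ups often describe (and that tutorials rarely reach) is recording a draw item per object, packing a sort key, sorting, and only rebinding state when the key changes. This is just a hedged C++ sketch of that idea; DrawItem, makeKey, bindMaterial, and drawMesh are hypothetical names, not anything from a specific engine.

```cpp
// Minimal sketch: sort draw submissions by a packed key so draws that share
// a shader/material/mesh run back-to-back, minimizing state changes.
#include <algorithm>
#include <cstdint>
#include <vector>

struct DrawItem {
    uint64_t key;        // [shader:16 | material:16 | mesh:16 | depth:16]
    uint32_t materialId;
    uint32_t meshId;
};

uint64_t makeKey(uint16_t shader, uint16_t material, uint16_t mesh, uint16_t depth) {
    return (uint64_t(shader)   << 48) |
           (uint64_t(material) << 32) |
           (uint64_t(mesh)     << 16) |
            uint64_t(depth);
}

// Stubs standing in for whatever the real renderer would do here.
void bindMaterial(uint32_t) {}
void drawMesh(uint32_t) {}

void submitFrame(std::vector<DrawItem>& items) {
    std::sort(items.begin(), items.end(),
              [](const DrawItem& a, const DrawItem& b) { return a.key < b.key; });

    uint32_t boundMaterial = ~0u;
    for (const DrawItem& item : items) {
        if (item.materialId != boundMaterial) {   // rebind only when it changes
            bindMaterial(item.materialId);
            boundMaterial = item.materialId;
        }
        drawMesh(item.meshId);
    }
}
```

How you lay out the key (depth-first for transparency, material-first for opaque passes) is exactly the kind of tradeoff the tutorials never get to.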
I spent a lot of time this summer reading this literature as an outsider to graphics development. I agree that resource management is the real engineering problem; the graphics code itself is the idealized part and relatively small in comparison.
My conclusion is that resource management across different hardware is the secret sauce that lets individual engines push the limits of the current generation. Listening to interviews with developers, they rarely talk about novel lighting formulas. Instead they talk about squeezing in higher-resolution textures or more colors, how they managed so many assets, or how they faked a reflection. I imagine the work is incredibly tedious and makes browser differences look trivial.
Differences between hardware largely come down to how memory can be mapped between CPU, shared, and GPU memory.
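For anyone who hasn't hit this yet, here is roughly what that looks like at the API level. This is a sketch of the standard Vulkan memory-type query; findMemoryType is my own helper name, but vkGetPhysicalDeviceMemoryProperties and the property flags are real Vulkan.

```cpp
// Pick a memory type index that satisfies a resource's requirements and the
// properties we want (HOST_VISIBLE for CPU-written staging data,
// DEVICE_LOCAL for GPU-only resources).
#include <vulkan/vulkan.h>
#include <cstdint>
#include <stdexcept>

uint32_t findMemoryType(VkPhysicalDevice gpu,
                        uint32_t allowedTypeBits,        // VkMemoryRequirements::memoryTypeBits
                        VkMemoryPropertyFlags wanted) {  // e.g. HOST_VISIBLE vs DEVICE_LOCAL
    VkPhysicalDeviceMemoryProperties props;
    vkGetPhysicalDeviceMemoryProperties(gpu, &props);

    for (uint32_t i = 0; i < props.memoryTypeCount; ++i) {
        bool allowed  = (allowedTypeBits & (1u << i)) != 0;
        bool hasFlags = (props.memoryTypes[i].propertyFlags & wanted) == wanted;
        if (allowed && hasFlags)
            return i;
    }
    throw std::runtime_error("no suitable memory type");
}
```

Whether a single memory type offers both DEVICE_LOCAL and HOST_VISIBLE (common on integrated/UMA parts) or you have to allocate a host-visible staging buffer and copy (typical on discrete GPUs) is exactly the per-hardware difference I mean.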
Did you look at Unreal Engine? The entire source is open to examine and modify (registration required).
Resource management can be a really big chunk of the core game engine. One reason there is less documentation is that there is generally not one right way to do it. Different games have very different requirements and solutions that work for one game/genre will hurt performance (or team productivity) for another.
One infamous example was the EA Superman game, which was developed using the Madden NFL engine. At a very top level that may have seemed like a good idea (a battle-tested, multiplatform engine with great character rendering). But the game's requirements of long draw distances, scripted gameplay, and detailed city environments were, very sensibly, not things that engine had been designed for, and they became enormous issues for the Superman team.
> One reason there is less documentation is that there is generally not one right way to do it.
Yes, but where most OpenGL tutorials leave you by the time you're done is distinctly the wrong way to do it.
> Different games have very different requirements and solutions that work for one game/genre will hurt performance (or team productivity) for another.
I wouldn't say different games have very different requirements; different genres of games do. We can make a lot of assumptions about flight simulators versus real-time strategy games, for example. There doesn't even seem to be discussion at that level: what the particular tradeoffs are, and where, why, and when you'd want to take them.
This is only tangentially related, but if you are into Rust & GPU programming you should check out femtovg (https://github.com/femtovg/femtovg), a Rust port of nanovg.
Right now it supports OpenGL and WebGL; a Metal backend is almost done, and WebGPU is planned.
Related: I've read numerous reviews on Amazon about the book's poor printing quality. Is that true? Is there a better printing of it, or should I hope the updated version that's in progress will rectify those issues?
I'd love to buy the book, but would hate to shell out money for a crappy printing.
Amazon has a terrible counterfeiting problem, including for books. It's possible that a third-party seller made a counterfeit, low-quality print of the book and set a lower price that meant they got chosen for the “buy button” by the Amazon algorithm (this has happened to many books). Since the reviews don't distinguish between sellers, you get this problem.
If you buy from a legitimate book store, perhaps this can be avoided?
No, the actual publisher (Elsevier) printed a batch of terrible quality books. This led to our ending our relationship with them. We are in the midst of finding a new publisher for the 4th edition; not repeating that disaster is of great importance to us...
>> The latest version v4 [0] has GPU support, some of my colleagues are using it already.
That's good news. I was trying to write a shader recently (I'm not an OpenGL guy) to add the Fresnel term to an otherwise Snell's-law-only shader. It's easy to find the math, but hard to find a simple implementation. I'd like the full basic model with that and a roughness term with a proper reflectance function (Schlick should be fine), but again, it's hard to find code for what should be a common 50-line shader.
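For anyone in the same spot, the usual "basic model" people implement is the Cook-Torrance specular term with Schlick's Fresnel approximation and a GGX roughness distribution. Here is a hedged sketch in plain C++ with scalar floats so it can be transliterated almost line-for-line into a shader language; in a real shader F0 and the result would typically be RGB vectors, and this only covers the specular lobe.

```cpp
// Cook-Torrance specular:  f_spec = D * F * G / (4 * NdotV * NdotL)
// D: GGX normal distribution, F: Schlick Fresnel, G: Smith/Schlick-GGX geometry.
#include <algorithm>
#include <cmath>

constexpr float kPi = 3.14159265358979f;

// Schlick's approximation: F(theta) = F0 + (1 - F0) * (1 - cos(theta))^5
float fresnelSchlick(float cosTheta, float F0) {
    return F0 + (1.0f - F0) * std::pow(1.0f - cosTheta, 5.0f);
}

// GGX / Trowbridge-Reitz normal distribution, with alpha = roughness^2
float distributionGGX(float NdotH, float roughness) {
    float a  = roughness * roughness;
    float a2 = a * a;
    float d  = NdotH * NdotH * (a2 - 1.0f) + 1.0f;
    return a2 / (kPi * d * d);
}

// Schlick-GGX geometry term for one direction (k remapped for direct lighting)
float geometrySchlickGGX(float NdotX, float roughness) {
    float r = roughness + 1.0f;
    float k = (r * r) / 8.0f;
    return NdotX / (NdotX * (1.0f - k) + k);
}

// Smith's method: masking (view) * shadowing (light)
float geometrySmith(float NdotV, float NdotL, float roughness) {
    return geometrySchlickGGX(NdotV, roughness) * geometrySchlickGGX(NdotL, roughness);
}

// Specular reflectance for one light; dot products assumed clamped to [0, 1].
float cookTorranceSpecular(float NdotL, float NdotV, float NdotH, float VdotH,
                           float roughness, float F0) {
    float D = distributionGGX(NdotH, roughness);
    float F = fresnelSchlick(VdotH, F0);
    float G = geometrySmith(NdotV, NdotL, roughness);
    return (D * F * G) / std::max(4.0f * NdotV * NdotL, 1e-4f);
}
```

For a dielectric, F0 is around 0.04; metals take F0 from the albedo. Add a Lambertian diffuse term weighted by (1 - F) and you have the common 50-line model.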