I got nice and downvoted for it too, serves me right.
The noisy / Monte Carlo one is path tracing (which I've been doing since I was about 15, and commercially for over a decade), and that's indeed not what I meant; but I guess all the expert gfxcoders on HN have already done the efficiency analysis versus rasterisation.
Meh, I always have to remind myself how bad it is here for gfx stuff; might as well have been discussing cryptocurrency...
Curious if anyone has plotted GPU compute increases against display resolution + update frequency increases? When do the two lines cross?
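For a rough back-of-the-envelope version of that question: if both lines grow roughly exponentially, the crossover point falls out of a one-line log calculation. All the numbers below (growth rates, starting gap) are made-up illustrative assumptions, not measured data:

```python
import math

# Hypothetical, illustrative numbers -- not real benchmark data.
# Assumed GPU compute growth: ~1.4x per year.
# Assumed display demand (pixels * refresh rate) growth: ~1.15x per year.
gpu_growth = 1.4
display_growth = 1.15

# Assume display demand currently sits 8x above available compute
# (i.e. the per-pixel budget falls short by that factor today).
initial_gap = 8.0

# Solve gap / (gpu_growth / display_growth)^t = 1 for t:
years_to_cross = math.log(initial_gap) / math.log(gpu_growth / display_growth)
print(f"crossover in ~{years_to_cross:.1f} years")
```

Swap in whatever growth rates you believe; the interesting part is how sensitive the crossover year is to small changes in the ratio of the two rates.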