
Overdraw doesn't just use some CPU. If that were all, it would be an easy problem to solve. It causes extra data to be transferred over various buses, extra stalls in a pipeline of operations that is already cramped trying to produce a frame every few milliseconds, and probably other problems that I don't know about. This problem isn't as easy as 'throw some more CPU at it'.
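Rough numbers, since people keep asking what the cost actually is (all figures below are assumptions I picked for illustration, not measurements from any real system):

    # Back-of-the-envelope cost of overdraw in memory traffic alone.
    # Every number here is an assumption for illustration.
    width, height = 2560, 1440        # assumed display resolution
    bytes_per_pixel = 4               # RGBA8 framebuffer
    refresh_hz = 60                   # assumed refresh rate
    overdraw_factor = 3.0             # each pixel written ~3x per frame (assumed)

    # Writes alone; alpha blending also reads the destination pixel,
    # roughly doubling the traffic for the blended layers.
    writes = width * height * bytes_per_pixel * refresh_hz * overdraw_factor
    blend_reads = width * height * bytes_per_pixel * refresh_hz * (overdraw_factor - 1)

    total_gb_s = (writes + blend_reads) / 1e9
    print(f"~{total_gb_s:.1f} GB/s of framebuffer traffic")  # ~4.4 GB/s here

That traffic competes with everything else the frame needs to move across the same buses.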


I think it’s only that complex and taxing for GUI frameworks that mix CPU and GPU rendering, like Win32 GDI or GTK+.

For modern GUI frameworks that were designed for the GPU, like MS WPF/XAML or Mozilla WebRender, overdraw just uses some GPU resources, but that’s it. GPUs are designed for much more complex scenes: in-game features like particles, lights, volumetric effects, and foliage involve tons of overdraw. By comparison, a 2D GUI with overdraw is a very light workload for the hardware.
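To put rough numbers on "very light workload" (the fill-rate figure is an assumed ballpark for a midrange discrete GPU, not a spec for any particular card):

    # Sketch: overdrawn frames vs. a GPU's pixel fill rate. Assumed numbers.
    width, height = 2560, 1440
    refresh_hz = 60
    gui_overdraw = 3.0                 # assumed average overdraw in a 2D GUI
    game_overdraw = 10.0               # assumed overdraw in a particle-heavy scene
    gpu_fill_rate = 50e9               # assumed ~50 Gpixels/s ROP throughput

    def fill_demand(overdraw):
        return width * height * refresh_hz * overdraw

    print(f"GUI:  {fill_demand(gui_overdraw) / gpu_fill_rate:.2%} of fill rate")   # ~1.3%
    print(f"Game: {fill_demand(game_overdraw) / gpu_fill_rate:.2%} of fill rate")  # ~4.4%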


Overdraw actually matters quite a bit for integrated GPUs, especially on HiDPI displays. On high-end NVIDIA or AMD GPUs, sure, 2D overdraw doesn't matter much. But when power is a concern you don't want to be running on the discrete GPU, so it's worth optimizing overdraw.
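Back-of-the-envelope for the integrated case (the bandwidth and scale numbers are assumptions): an iGPU's framebuffer traffic comes out of the same DRAM bandwidth the CPU is using, and HiDPI quadruples the pixel count.

    # Sketch: overdraw on an integrated GPU driving a HiDPI display.
    dram_bandwidth = 51.2e9            # assumed dual-channel DDR4-3200, shared with the CPU
    scale = 2                          # HiDPI 2x scale => 4x the pixels
    width, height = 1920 * scale, 1080 * scale
    refresh_hz = 60
    overdraw = 3.0                     # assumed
    bytes_per_pixel = 4

    traffic = width * height * bytes_per_pixel * refresh_hz * overdraw
    # Burning a steady slice of shared bandwidth also keeps the memory
    # controller out of low-power states, which is the real cost on battery.
    print(f"{traffic / dram_bandwidth:.1%} of shared DRAM bandwidth")  # ~12% here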


That seems like a dubious claim to me. In my experience, maxing out memory or PCIe bandwidth is rarely the bottleneck, and it's extremely unlikely, if not impossible, with low CPU usage.

Do you have a link or more detail about 'already cramped pipelines'?


2D rendering on GPUs is almost entirely memory bound.

Source: I've been working on GPU 2D rendering full time for years now.


Depends heavily on the GPU and the scene being rendered. On Android I've seen low-end GPUs bottlenecked by the shader ALU, of all things, even though the 2D renderer only produces trivial shaders. It turns out that in some cases it's easier to cut shader compute cores than to muck with the memory bus.

The most common situation, though, is that the GPU isn't taxed at all and never even leaves its idle clocks. We end up limited by GL draw call overhead instead.
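For anyone curious, the standard mitigation when draw call overhead dominates is batching. A minimal sketch, with the actual GL upload/draw stubbed out (a real renderer would upload the vertex list and call glDrawArrays once):

    # Sketch: batch many rectangles into one draw instead of one draw per rect.
    def quad(x, y, w, h):
        # Two triangles per rectangle, as (x, y) pairs.
        return [(x, y), (x + w, y), (x, y + h),
                (x + w, y), (x + w, y + h), (x, y + h)]

    def draw_batched(rects):
        verts = []
        for r in rects:
            verts.extend(quad(*r))    # CPU-side accumulation is cheap
        issue_draw(verts)             # one driver round-trip for N rects

    def issue_draw(verts):
        # Stub for the GL upload + single glDrawArrays call.
        print(f"1 draw call, {len(verts)} vertices")

    draw_batched([(0, 0, 100, 20), (0, 30, 100, 20), (0, 60, 100, 20)])

N draws collapse into one, so the per-call driver overhead stops scaling with scene complexity.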


You seem to be implying both that the rendering of a text editor has to be all 2D and that rendering a frame pushes memory bandwidth to its limits, neither of which can possibly be true. Why can games run at 144 Hz and above, but a text editor can't afford to overdraw for decreased latency?



