> It seems the Pi’s raw CPU frequency is still not powerful enough for decoding 100% of the time. While 97-98% of the time is good enough, you will get the occasional “screen tearing”
I don’t think that’s inadequate hardware performance. I think that’s the Linux GPU stack. More specifically, the parts where hardware acceleration integrates with that decades-old X11.
I once made a toy project for the Pi4 that can render GLES content either with or without a desktop manager: https://github.com/Const-me/Vrmac/ I did observe occasional tearing on the desktop, with both 3D content and accelerated h264 video. Maximizing the window into borderless fullscreen didn’t help. However, rebooting into the console and running the same code on top of DRM/KMS without X11 resulted in no tearing.
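For context on why the bare DRM/KMS path doesn’t tear: the kernel latches a page flip at vblank, so only complete framebuffers ever reach scanout. Here’s a rough sketch of a typical DRM/KMS flip loop (not the actual Vrmac code; the DRM fd, CRTC id and framebuffer setup, plus all error handling, are omitted):

    // Rough sketch of a DRM/KMS presentation loop (not the actual Vrmac code).
    // Assumes the DRM fd, the CRTC id and two framebuffer ids were set up
    // elsewhere; error handling omitted.
    #include <xf86drm.h>
    #include <xf86drmMode.h>
    #include <poll.h>

    static bool flip_pending;

    static void page_flip_handler(int, unsigned, unsigned, unsigned, void*)
    {
        flip_pending = false;  // the new buffer is now being scanned out
    }

    void present_loop(int fd, uint32_t crtc_id, uint32_t fb[2])
    {
        drmEventContext ev = {};
        ev.version = 2;
        ev.page_flip_handler = page_flip_handler;

        int front = 0;
        for (;;)
        {
            int back = front ^ 1;
            // renderFrame(back);  // draw into the back buffer (not shown)

            // Queue the swap; the kernel latches it at the next vblank,
            // so the display never scans out a partially rendered frame.
            drmModePageFlip(fd, crtc_id, fb[back], DRM_MODE_PAGE_FLIP_EVENT, nullptr);
            flip_pending = true;

            // Block until the flip completed; this is the implicit vsync.
            while (flip_pending)
            {
                pollfd pfd = { fd, POLLIN, 0 };
                poll(&pfd, 1, -1);
                drmHandleEvent(fd, &ev);
            }
            front = back;
        }
    }

There’s no compositor anywhere in that path, which is why it never tears: the only thing that can reach the screen is a finished framebuffer, swapped at vblank.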
As far as I understand, most composition managers under X have issues with screen tearing not because they are based on "decades old X" but because they do not even attempt to sync to the screen refresh rate. Some just run on an internal timer, which is a bit like catching a train by walking onto the platform at random points.
If you tell the compositor to fuck off by setting the window's override-redirect flag, or have a compositor that at least detects fullscreen windows, you generally get no screen tearing at the cost of everything the compositor normally does.
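In code that trick is a single window attribute; a rough Xlib sketch (the helper name is made up, error handling omitted):

    // Unmanaged, screen-sized window that most compositors won't redirect.
    // You lose decorations, alt-tab and everything else the WM normally gives you.
    #include <X11/Xlib.h>

    Window createUnredirectedFullscreen(Display* dpy)
    {
        int screen = DefaultScreen(dpy);
        XSetWindowAttributes attrs = {};
        attrs.override_redirect = True;  // the WM/compositor won't manage this window

        Window w = XCreateWindow(
            dpy, RootWindow(dpy, screen),
            0, 0, DisplayWidth(dpy, screen), DisplayHeight(dpy, screen),
            0, CopyFromParent, InputOutput, (Visual*)CopyFromParent,
            CWOverrideRedirect, &attrs);

        XMapRaised(dpy, w);
        XFlush(dpy);
        return w;
    }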
> because they do not even attempt to sync to the screen refresh rate
I don’t disagree with that, but I think the underlying reason is the X server protocol. It’s hard to implement vsync properly when there’s a socket connecting the application to the display server.
On Windows, various GPU APIs and even parts of the GPU driver are DLLs loaded into the process. DLL functions don’t add latency and API designers don’t need to account for it (except for stuff that has good reasons for latency, e.g. asynchronous draw/compute/copy calls). Works quite well in practice; I don’t remember screen tearing issues on Win10 unless I explicitly disable vsync, e.g. with the DXGI_PRESENT_ALLOW_TEARING flag. Most apps don’t do that and don’t tear.
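To illustrate what I mean, on Windows the choice is literally one argument to Present on the swap chain; a minimal sketch (swap chain creation not shown, and the ALLOW_TEARING swap chain flag is only needed if you actually want tearing):

    // Sketch of the two presentation modes, assuming an IDXGISwapChain1*
    // created elsewhere (with DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING if tearing
    // is wanted). Everything happens in-process, no display server socket.
    #include <dxgi1_5.h>

    void present(IDXGISwapChain1* swapChain, bool allowTearing)
    {
        if (allowTearing)
            swapChain->Present(0, DXGI_PRESENT_ALLOW_TEARING);  // uncapped, may tear
        else
            swapChain->Present(1, 0);  // wait for vblank; the default path, no tearing
    }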
The problem statement “implement proper vsync” only looks easy on the surface. If one considers all the hairy details, a good implementation is gonna affect many components of the OS, not just the GUI-related ones but also power management and others. Otherwise it's gonna introduce presentation latency (especially bad for online games), consume much more RAM and VRAM, and/or waste too much electricity (especially bad for laptops).
> If you tell the compositor to fuck off by setting the window's override-redirect flag or have a compositor that at least detects fullscreen windows
If a developer is in a position to replace OS components, that’s probably an embedded environment. For that case the DRM/KMS combo is already flawless, at least in my experience. That’s not just on the Pi4, I’ve developed stuff for a few other ARM SoCs, too.
Desktop software developers can’t replace OS components or ask users to do so. They need something that works out of the box, is reliable and efficient.
> It’s hard to implement vsync properly when there’s a socket connecting the application to the display server.
That would affect all X11 based applications, yet somehow screen tearing consistently seems to disappear as soon as I bypass the compositor.
> Otherwise it's gonna introduce presentation latency (especially bad for online games)
I once went around looking at how much latency various Linux desktop environments introduce. The most widely used ones (KDE and Gnome) on their default settings are outright catastrophic. KDE lets you disable desktop effects and the compositor; Gnome only bypasses it for fullscreen applications. As always, eye candy > functionality.
Hardware is fast enough when it can run your software fast enough to do what you want. :)
Approximately all software in the world is very suboptimal, as is occasionally proven by people who make things go fast on slow hardware for fun and games (e.g. see 3D engines on the C64, or realtime video decoding & streaming from floppy on an 8088).
I’m not an expert in Linux graphics, but I spent about a day trying to fix that and failed.
eglSwapInterval does nothing. The GPU driver exposes no *_swap_control extensions anywhere. There are 3 extendable parts (EGL, GLES client and GLES server) with many extensions each, but nothing resembling GLX_EXT_swap_control, GLX_MESA_swap_control or GLX_SGI_swap_control.
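Roughly the kind of check I mean, as a sketch (function name made up; assumes an initialized EGLDisplay with a current context):

    // Request vsync through core EGL and dump the extension strings,
    // looking for anything resembling swap control.
    #include <EGL/egl.h>
    #include <cstdio>
    #include <cstring>

    void checkVsyncSupport(EGLDisplay display)
    {
        // Core EGL way to request vsync; on that stack it has no visible effect.
        EGLBoolean ok = eglSwapInterval(display, 1);
        printf("eglSwapInterval(1) -> %s\n", ok ? "EGL_TRUE" : "EGL_FALSE");

        // Client extensions (may be unsupported, then NULL) and display extensions.
        const char* clientExt  = eglQueryString(EGL_NO_DISPLAY, EGL_EXTENSIONS);
        const char* displayExt = eglQueryString(display, EGL_EXTENSIONS);
        printf("client extensions:  %s\n", clientExt  ? clientExt  : "(none)");
        printf("display extensions: %s\n", displayExt ? displayExt : "(none)");

        if (displayExt && strstr(displayExt, "swap_control"))
            printf("found a swap_control-like extension\n");
    }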
Other applications on that Pi4 that render many FPS (the web browser or VLC player) also have screen tearing. I assume their developers know way more about Linux GPU stuff, yet they failed too.
Do you have any advice on how I can do that on that Pi4, with EGL 1.1 and GLES 3.1?
P.S. Based on the internets, the problem was introduced in Pi4. In earlier versions things were good.