Wayland’s mistake is assuming every application has a local GPU. In reality the user has just one GPU, attached to their monitor, which is fine for email and gaming, but serious tools run miles away in datacenters. We wouldn’t be rewriting everything in JavaScript if we hadn’t forgotten how cool remote X was.


> but serious tools run miles away in datacenters

I work on a serious tool, specifically CAM/CAE software. Despite Google, Amazon, and MS salespeople applying pressure to upper management (they want us to move to their clouds and are offering gazillions of free compute credits), our software runs on desktops and workstations, and I have reasons to believe it’s going to stay that way. With recent progress in HPC-targeted CPUs and the steady downward trend in RAM prices, I believe our customers are happier running our software on their own computers, as opposed to someone else’s.

> We wouldn’t be rewriting everything in JavaScript if we hadn’t forgotten how cool remote X was.

It was cool in the era of OpenGL 2. By the time Metal, Direct3D 12, and finally Vulkan arrived, it had stopped being cool. Essentially, these APIs were designed to let applications saturate PCIe: a PCIe 3.0 x16 slot moves roughly 16 GB/s, while 10 Gb Ethernet tops out around 1.25 GB/s. You can’t push that kind of bandwidth over any reasonable network.
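
To make the gap concrete, here’s a back-of-envelope sketch (not from the thread; the link rates are nominal peak figures and the 256 MiB buffer is just an illustrative payload):

    # Rough time to move one GPU buffer across different links.
    # Rates are nominal peaks; real-world throughput is lower.
    GB = 1e9  # bytes per GB/s

    links = {
        "PCIe 3.0 x16": 15.75 * GB,   # ~15.75 GB/s
        "PCIe 4.0 x16": 31.5 * GB,
        "10 GbE":       1.25 * GB,    # 10 Gb/s divided by 8
        "1 GbE":        0.125 * GB,
    }

    payload = 256 * 2**20  # a 256 MiB texture/vertex upload

    for name, rate in links.items():
        print(f"{name:>12}: {payload / rate * 1e3:7.1f} ms")

At 60 FPS you get about 16 ms per frame. The upload that costs ~17 ms over PCIe 3.0 takes ~215 ms over 10 GbE and over two seconds over gigabit, so per-frame resource streaming that is routine locally is hopeless over the wire.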


True, that’s one of the domains where you can assume everyone on staff has a really expensive workstation (sometimes for licensing reasons).



