So much is written about Wayland not "being ready." I've used it exclusively on my desktop and laptops for... as long as it's been available in Ubuntu. It works fine. I don't have any issues with input methods. It puts pixels on my monitors. The screen doesn't tear when I move windows around like X did. It's fine for me.
A friend and I both installed Arch at the same time, and both tried GNOME + Wayland.
We both hit issues: games with inconsistent frame times, black rectangles over Electron apps like Spotify and Discord, and, for him, a multi-monitor setup that was somehow broken on Wayland but fine with X.
Chromium apps in general appear to have varying levels of issues. Some applications don't support Wayland at all; most can run through XWayland, which IMO mostly defeats the purpose, especially when it's still not seamless.
We've both switched to X and no longer have any of the above issues. That's not to say the desktop experience on X is seamless of course, but the above issues were all solved.
I'm really under the impression that people who use Wayland either don't use a wide range of applications (which is perfectly fine!) or are used to putting up with "typical linux issues" and accept the quirks.
I remember when Wayland came out and was supposed to solve the "fragmented/bloated mess of xorg", but it appears to have just been a half-baked solution for ~15 years.
IMO the issue with the "linux desktop" has always been consistency. On macOS/Windows you don't have to worry about adding launch arguments, compositor support, graphics drivers, AMD vs. Nvidia, Wine, or broken audio/networking when you do an update, etc. They "just work" - at least with far more consistency.
I'm afraid you're a bit optimistic saying that Windows is consistent, or has no audio or graphics problems. My Linux laptop (xfce) is way more visually consistent than my wife's Windows desktop, and I had to troubleshoot audio issues on it, and on Windows laptops at work, but never on Linux (pipewire, onboard / USB / Bluetooth audio).
And yes, I run X, Wayland is too restrictive currently. I should give it another try some day though.
>and I had to troubleshoot audio issues on it, and on Windows laptops at work, but never on Linux (pipewire, onboard / USB / Bluetooth audio).
Funny, it's always been the other way around for me.
My Ubuntu 22.04 at work keeps randomly switching the default audio output from my Bluetooth headphones to the machine's headphone jack, leaving me without sound in my headphones until I go into the settings and switch it back. Never had such issues with Windows; it's always been rock solid in this regard.
Would you mind sharing the exact audio issues you had with Windows?
Win10, desktop, USB Bluetooth: audio not switching correctly when connecting / disconnecting headphones. Now appears to be fixed by some update. Digging through audio settings to find controls of particular audio interfaces is also fun.
HP Spectre x360 laptop (2020), factory-installed Win10: audio just dies sometimes. The Realtek's control panel (also factory-installed) periodically crashes. Funnily, booting a Linux live from a flash drive gave audio that worked without said problems. The laptop was later replaced with an updated one, when its graphics hardware visibly failed. The replacement did not have the audio problem (IDK if it was a different audio chip revision, firmware, or driver), but the Realtek control panel was still unstable.
(Visually Win10 has at least three UI toolkits that can barely agree on colors, and cannot agree on fonts or the shape of controls. Under X, I have all GTK2, GTK3, Qt5, Qt6 programs use the same fonts and a common theme, with controls, if not exactly uniform, at least having common colors, shapes, and sizes.)
Wouldn't it be installed by Windows Update anyway? I'm pretty sure Windows installs whatever software is in its repo the moment a device is discovered, by default.
No, Windows update will update it if it's already there from the factory, not if you have a fresh install from the official Microsoft ISO.
On a fresh install, Windows Update only installs the correct drivers, no vendor apps.
The Realtek app is put there by the vendor (HP/Dell/Lenovo) from the factory in their spin of Windows. If you install a fresh vanilla copy of Windows from Microsoft's website, you won't have any of that nonsense.
Sadly, this is no longer true. At least, not for modern laptops.
It's possible to achieve what you wrote, but doing so is borderline impossible in practice. You need to set up group policies, or use custom software, to block installation of specific hardware devices and specific Windows updates.
I have a 2021 Lenovo that had a bunch of crapware on it. Flashed a clean version of Windows and the crapware never came back after any update. The link you posted mentions nothing of a clean install, most likely it was a factory install.
> Never had such issues with Windows, it's always been rock solid in this regard
I can't even count the number of times Windows has reinstalled the sound driver, or whatever it does at runtime, requiring a reboot.
I think people are very likely to blame the software layer even when they use subpar hardware: no software will run fine on shitty hardware, and the wildly inconsistent reports about basically every OS are more than likely the result of some underlying, failing hardware component. This also mostly explains why macOS is considered more stable by some: it is much easier to support 10 different configs than every config out there.
I think Linux has undergone a huge improvement in this regard and actually has the best general hardware support of any OS nowadays. It can still happen that a given driver is better on, or only exists for, Windows, but mainline driver support has shifted toward Linux now.
> and/or are used to putting up with "typical linux issues" and accept the quirks
As you later point out more explicitly, there is no way around that.
Linux is the epitome of bazaar style development - it has endless positives, but a huge drawback is not having a consistent direction, nor any real force behind any of the directions.
Apple can just say that they will now support a new compositor: if you want to stay in business, change. And it will happen. But that's not a technical thing at all. Wayland's first 10 years were very different from the last 5; now that it has become mainstream, the support and the hands working on it will produce exponentially more improvements. A new direction needs critical mass, and Wayland has only recently acquired that, imo.
Mainline Chromium (and subsequently Electron) support for wayland has only really landed in the last year or so. I'm curious how long ago this trial of yours was?
Yeah, I think I remember seeing that. We did this around 6 months ago, so not many applications actually used the newer version of Electron; although IIRC a lot of Electron applications don't update Electron itself very frequently.
I might try Wayland again; I'm just really frustrated with it. It's always been advertised as better than X, and 15 years in it's still a rocky mess (as of ~6 months ago).
This isn't 100% Wayland's fault, but Nvidia cards seem to add extra issues. I need CUDA support, so AMD is not an option.
> This isn't 100% waylands fault - but nvidia cards seem to add extra issues. I need CUDA support
Well, there you go. NVIDIA for the longest time didn't support Linux, period. They ship proprietary binary blobs that patch the X server into working. It's no surprise that some normal userspace program can't do something the kernel can't do either.
I also know input methods work great on Ubuntu (Gnome Wayland) because Gnome has first-class support for ibus.
But the article is specifically about incompatibility of input methods among different DEs: Gnome, KDE, wlroots-based environments such as Sway and Hyprland.
After many years, ibus still doesn’t work with wlroots, because ibus supports input method protocol v1 and wlroots supports v2: https://github.com/ibus/ibus/issues/2182
I'm not disagreeing with any of the technical points in the article. I'm just saying that my experience with Wayland is pretty good. Someone (GNOME? Ubuntu? thousands of generous developers?) is doing all the work to make it work pretty well for me.
I've also been using wayland for a couple years now. Switched from i3 to sway and never had screen tearing as I did with X server.
Just recently moved to Hyprland and similar experience. Barely had any hiccups and switching to a non-english layout such as Arabic works like a charm.
Perhaps our systems are just ideal? For the record, I'm using a ThinkPad, which generally has good support, but even on my Lenovo G505 it worked perfectly.
I can never go back to X11 and I think the greatest condemnation of it is that we're still talking about screen tearing in 2023.
Screen tearing is something that simply shouldn't exist by default. A person should be able to count on its absence, like a person should be able to count on a basic USB keyboard working without any headaches.
Now I'll brace myself for the "I've never noticed screen tearing" from people whose brains run on a different refresh rate than mine does.
Screen tearing is not really an X11 issue. It's an XFree86/X.Org issue in that they are to this day in many ways the lowest common denominator implementations, and at some point Intel removed VSync circuitry from their IGPs completely - so the only way to prevent screen tearing was to have global compositor doing page flips. So, a lot of that "never noticed screen tearing" is due to people running different GPUs and/or enabling some driver features mitigating vendor weirdness.
Similarly, nothing says you need to request a redraw when moving a window instead of redrawing it from backing store - but at some point backing store became useless in X.Org even if you explicitly try to program its use.
> Now I'll brace myself for the "I've never noticed screen tearing" from people whose brains run on a different refresh rate than mine does.
I'm actually pretty sure this is a thing. I have a dissociative disorder and sometimes the "refresh rate" of my senses seems to go down. I have no idea why or how, but it's really weird to literally experience reality feeling like a slideshow.
But other times I can detect any flickering lighting, so I dunno.
Wow, can you please elaborate on the disorder like the name? When I was a kid this slideshow effect would happen to me repeatedly and then never again since high school. Nobody I've ever mentioned this to has a clue what it was.
oh it's dissociative identity disorder, but there are other dissociative disorders that don't include actual dissociative identities, like "derealization/depersonalization disorder" (DPDR). The slideshow thing isn't necessarily DPDR, but maybe you can use it as a search term.
Thanks, slideshow is such a great way to describe it. I always explained it as time slowed down, but maybe people did not understand how time went ... like ... this as a flip-book being released really slowly.
This verges on sounding like a neural disorder to me. I notice screen tearing. So what? Please kill my framerate and bloat my latency an order of mag so I'm rid of this awful graphical artifact!! While the compositor's at it, maybe it can add delays with animations to when I close or minimize windows! (Just kidding; don't do that; that's way worse -- unless there's people who don't notice that that's the cost of eliminating tearing.) I'm using a computer. It's drawing frames. There should be tearing. Why would you want to boil the ocean and make the snappiness of your game/entire desktop worse to sustain some denial of the fact that you're at a computer?
Here's another thing, which proves that directly addressing tearing is false dharma: it gets better with higher refresh rates. Having higher Hz is awesome, and it's improving the issue to boot! It's like antialiasing. You're rendering stuff multiple times, taking enormous performance hits, sometimes now even using semi-sentient AI to smooth the jaggies out... when it turns out that the jaggies get better when you increase your resolution. False dharma: trying to live the lie of no jaggies on 768p @ 43 FPS; TRUE DHARMA: accepting and advancing your computerized existence on 4K @ 53 FPS. Same with tearing and refresh rate.
Also??? These days, adaptive sync (on the monitor hardware level, NOT software driver level) completely eliminates tearing. Another dharmically truer solution.
Also, while you're so hung up on tearing that you'd leave X11 for Wayland, you've clearly never experienced the awesomeness of xdotool, which isn't available on Wayland. You're not truly living. You're trapped in a gilded cage, satisfying merely the basest petty natures of man.
> It was such a relief the first time I used Mac OS X. Finally a unix that renders to the screen without artifacts.
That was one of the reasons I used macOS too. Not for artifact-free rendering but for the fact that it's a real production-ready desktop OS with first-party app support from most companies, while being a true Unix with all the powerful Unix capabilities, like being able to run Valgrind and etc. without nesting virtual machines.
Because screen tearing is something that honestly showed up pretty late, around the time of the i3/i5/i7 CPUs with IGPs; before that, tearing was mitigated at least a little by VSync. (There's the issue that X11 didn't expose good sync primitives, but that's fixable with an extension, not a total replacement.)
The design can deliver perfect frames: it's extensible, so essentially all you need to provide is a way to handle double buffering properly. And guess what, extensions for that were proposed (the Present extension by Keith Packard, building on top of the X Synchronization extension).
I notice it I just don't care at all. Latency is a much bigger deal to me and it may not be technically part of Wayland or the widget toolkits used by compatible apps but it is consistently a problem.
If latency is a real problem for you, buy a monitor with a higher refresh rate. That's the only meaningful way to decrease latency on modern software. Mind you, literally every single program you use will double buffer, unless you exclusively use Xt/Motif or the like.
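To put rough numbers on that (back-of-envelope only; real pipelines vary by compositor and driver):

```python
# Worst-case present-to-photon latency with a double-buffered pipeline is
# roughly N frame periods: one waiting for the next vblank, one scanning out.
def worst_case_ms(hz: float, buffers: int = 2) -> float:
    return buffers * 1000.0 / hz

for hz in (60, 144, 240):
    print(f"{hz} Hz -> up to ~{worst_case_ms(hz):.1f} ms")
```

At 60 Hz that's about 33 ms of worst-case buffering latency versus about 8 ms at 240 Hz, which is why refresh rate dominates anything the software layer can do.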
I actually do use a lot of Xt/Motif apps. Mostly because of the lower latency, but also because they weren't infected with whatever brain damage generated the GTK3 design language.
GNOME supports my trackpad better than Windows and, dare I say it, macOS does. As in, it literally supports trackpad gestures and inputs that Apple doesn't, and it's actually really smooth and responsive.
You can right-click-drag, and you can middle click. macOS doesn't support either one of these, and Windows only supports assigning a bottom-right quadrant of the trackpad to right-clicking.
GNOME has the fatal flaw that you can't configure trackpad scroll sensitivity. That means, if your trackpad is too sensitive out of the box, GNOME is simply unusable. Every laptop I've used in the past many years (a couple of Dells and a Mac) have had touch pads with too sensitive scrolling in GNOME.
It's fine in Sway or KDE, as they have scroll sensitivity settings.
> GNOME has the fatal flaw that you can't configure trackpad scroll sensitivity.
Oh, I got bitten by that actually. I ended up just configuring Firefox since that was the main app I used, but most apps aren't Firefox and don't have settings for things that they would expect to be handled by your WM/DE/DM.
I don't believe it's a fundamental GTK problem. When you scroll, the Wayland compositor sends scroll events to the app, with a float for the delta. KWin and Sway have an option to let you multiply that delta with a constant; multiply it by 0.5 and you have halved your scroll speed. GNOME doesn't have such an option.
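For anyone looking for it, in Sway this is the `scroll_factor` input option (the value below is just an example):

```
# ~/.config/sway/config: halve touchpad scroll speed
input type:touchpad {
    scroll_factor 0.5
}
```

KWin exposes the same knob in its touchpad settings; GNOME has no equivalent.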
GTK may have other scroll issues as well (I recall seeing some stuff about that, though I don't remember the details), but there is definitely stuff the compositor can do.
> When you scroll, the Wayland compositor sends scroll events to the app, with a float for the delta
GTK works through a bunch of heuristics and built-in multipliers, but the fundamental problem is that the scroll effect depends on the "weight" of the content itself; kinetic scrolling is unfortunately largely missing from the Linux desktop, AFAIK. But do correct me if you know better; I'm not too well versed in this.
I wish there were an equivalent to BetterTouchTool for Linux. That program adds so many trackpad and peripheral shortcuts and gestures it's crazy, and it doesn't involve editing config files from a wiki and half a dozen Stack Overflow pages.
I've also been using Wayland for years via GNOME. Everything works, I have cool new D-Bus APIs to control my desktop, the "press Ctrl+Alt+F2 to bypass the lock screen" thing isn't a thing anymore, and libinput is leaps and bounds above everything that came before it.
I don't think anyone doubts it works in limited circumstances with a curated set of software, provided you don't need anything like remoting or screen sharing.
I share my desktop (single windows and/or whole desktop) at least once every day. Usually with Google Meet in Firefox, but also with Slack, Microsoft Teams, and Zoom. I don't need remote access to my desktop or laptop (windowing system) though. SSH is just fine for me.
As a relative Linux outsider, the X/Wayland situation is definitely putting people off switching. We consistently hear "X is deprecated, don't use X, Wayland is the future" yet every single time we find, read about, or are recommended something cool it always ends with "oh by the way this only works on X not Wayland".
I used Plan 9 extensively (almost all my machines, including my mail server and primary desktop) before I went to college. That doesn't mean it's ready for most people to use.
Have we (those who are not paid to work on this) "followed the money?"
The glacial Wayland rollout to usability (it took AT LEAST a decade, don't try to fight me on this) is just so odd to me, to the point that I feel it would be useful to examine the most powerful entities involved.
It's just really hard for me not to at least consider: was the push for Wayland in the hands of entities who perhaps didn't have a particularly strong interest in a classically useable desktop? As in, a combination of entities who really have no interest in a Linux desktop (Ubuntu, who, let's be real, has caved to Microsoft) and entities that are trying too hard to be cool and in doing so broke stuff that just works? (Lookin' at you, GNOME?)
>Was the push for Wayland in the hands of entities who perhaps didn't have a particularly strong interest in a classically useable desktop? As in, a combination of entities who really have no interest in a Linux desktop
You answered your own question.
The Linux community's (read: most graybeards') love for Linux is inversely proportional to their love for a GUI. The terminal is still the One True Way(tm) of doing anything serious in Linux.
Consider that among the hottest topics in Linux in the year 2023 is still the holy war between vi vs. emacs, nearly a quarter of the way through the 21st century. The people in charge of Linux (be it literally or figuratively) do not give a damn about GUI, which includes desktops. Using a mouse is heresy, a touchscreen is sacrilege.
I don't know. How did KDE, the best desktop in the world, come about? It's better than Windows's -- the DE of a multi-billion dollar OS with 98% planetary market share that's been iterated upon for like 30 years across uncountable dev hours. Dolphin vs. Explorer! Billions of dollars!!! Versus five Dutch guys, working for free!!!!!! Split views! Terminal integration! Dark mode BEFORE just a couple of years ago! File metadata under the filename! Tabs! TABS!!!!
I don't think you need a conspiracy theory to explain Wayland's underperformance: free OSS labor doesn't produce great things when those great things require a lot of unsexy work. If you want to follow the money, recognize that there is no money in a Linux desktop, so there are no professional developers turning out a professional product.
You have an ecosystem where most software is made for free by enthusiasts, and orders of magnitude less people may be paid a bit for some of the core parts.
How do you expect the much bigger former group to work on your new thing? They won't, unless it starts spreading. And it can't really spread unless it is supported by the whole community. That decade was mostly build-up; once it has critical mass, improvements will be exponentially greater.
Anyone writing it off because it already took a decade is just not familiar with how open source/bazaar projects work.
They benefit from their purer FOSS licensing behind GTK (in contrast to their competition, Qt) and place in history. They're working on a doomed experiment, on the fumes of enchantment of Apple design and the false hope that, maybe someday, "regular people" will use Linux. On a tablet or something. It'll all be tablets -- nobody'll even have desktops anymore. The hardware will converge, if you will. There will be a Unity of interfaces.
A lot of people say that X is slow because it's a communication protocol, and that it's also outdated, buggy, and difficult to maintain. However, from what I know, Wayland is also a communication protocol, and Wayland is weaker than X in terms of functionality, so it is not suitable for desktop use. Although it has been 14 years since its release, there has been little progress in protocol development, and there are a lot of bugs in the implementations. Moreover, the performance of the implementations is no different from X's, the quality is below expectations, it is unstable, and the development maturity is low. Also, Wayland input-method v1 and wlroots input-method v2 are technical and functional regressions from the XIM that appeared in the 1990s.
Simply put, Wayland in 2022 is more technologically obsolete than X. Many people have been interested in Wayland and have been cheering for it, but Wayland seems to have no hope. How can there be no books published on the subject of Wayland programming in 14 years? If each Linux distribution accelerates the migration to Wayland, users will ditch the Linux desktop. I look forward to seeing a new display server to replace Wayland.
The difference being that X development has essentially _stopped_ because it's so complicated that nobody wants to touch it anymore. Does it work? Yes. Is it mature? Yes. Are apps more compatible with it? Yes. Will it evolve from there? No. Wayland, meanwhile, has been picking up the slack slowly, and it has been usable for years now apart from the screen-sharing issues and the occasional driver problem. Wayland will continue to get better and produce a graphics stack that's better adapted to modern needs.
The OpenBSD team is still developing the Xenocara X11 system, though AFAIK they aren't particularly adding features because the system is feature-complete (they aren't adding features to relayd either but it's not "abandoned" in any sense: as much as it conflicts with the CADT philosophy it's actually fine for software to be mature).
While I don't like making predictions, if I had to guess I'd guess that in a decade's time the OpenBSD team will be maintaining the upstream display server for most Linux distros just like they maintain the encrypted shell and the BGP server for them.
I'm one of the many users who will take a mature, well tested product over one that's experiencing "Ongoing Development" and lacks many of the features they need.
Too much ink has already been spilled over the faults of Wayland, but I'll put it pretty simply: the people on Wayland have no business designing user interfaces, and are questionable at developer interfaces.
If you mix up Wayland with user interfaces, you are not even in the ballpark of understanding what it is; pretty pretentious to have such a strong opinion on something you know jack shit about.
Especially that the people on Wayland are literally the ex-maintainers of X.
> How can there be no books published on the subject of Wayland programming in 14 years?
Just my personal observation here, but I don't see a lot of value in programming books any longer due to how often things change. Usually I'm looking for official documentation or recently updated blog posts on the Internet.
Mhmm, this is what will push users from the Linux desktop. Not the billion other usability issues. I think this is a case of someone caring a whole heck of a lot about a small thing.
I have a Compose key (RAlt on my current laptop). It behaves inconsistently across apps, because that stuff is implemented in each app, and they’re not implementing the same functionality.
Some ambiguous sequences are interpreted one way in one program (e.g. <-> = ↔ in Alacritty) and another way in another (e.g. <-> = ←> in Firefox). Order and whether includes are involved may or may not influence these things.
And I’ve come across one or two programs (generally your do-things-from-scratch, pure-canvas sort of things) where the Compose key just didn’t work at all. And I’ve encountered more than one or two web apps where the Compose key is basically broken.
(In general, Compose seems to be handled the same way as IMEs—speaking as a user, not a developer. Pressing Compose in GTK apps inserts an underlined ·, which is then replaced with the characters of the sequence as you type, underlined, which is then finally replaced by the resolved sequence when you’re done.)
All this was a problem under X. It’s still a problem under Wayland. It’s not a problem under Windows with WinCompose: it handles it all, not each app; so you don’t get any progress indicator (no underlined · or anything, it just consumes the keystrokes until you finish a sequence), but it all behaves consistently between apps, so long as they run as the current user. (That is: “run as admin” and WinCompose’s hooks don’t apply so the Compose key just doesn’t work.)
As for input methods in general… ugh, I recall the trouble I had under i3 with xim and ibus and whatever the other *im things were, it was even more inconsistent than I think I’ve observed under Sway. I don’t remember altogether why or what I did, and I haven’t tried doing exactly the same thing, so maybe it is actually about as bad as ever. But I do know that I’m generally finding a much better experience of input-related stuff under Sway than under i3. I haven’t had to tweak GTK_IM_MODULE or QT_IM_MODULE or XMODIFIERS environment variables or whatever other things for years.
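For reference, this is the kind of environment plumbing I mean; the classic X11 setup looked something like the following (assuming ibus; fcitx users would substitute its module names), and I no longer need any of it under Sway:

```
# ~/.xprofile (X11 sessions)
export GTK_IM_MODULE=ibus
export QT_IM_MODULE=ibus
export XMODIFIERS=@im=ibus
ibus-daemon -drx   # -d daemonize, -r replace running daemon, -x enable XIM
```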
Hmm… I also remap Caps Lock to Backspace, and I’ve noticed a couple of apps (e.g. Chromium, no idea if XWayland use is a factor) failing to handle repeat on that key. Little niggles like that aren’t uncommon, again I think there’s too much being left to the apps rather than being handled by the compositor or whatever.
That part is easy, and is basically the same in Sway: I have `input * xkb_options "caps:backspace,compose:ralt"`. The problem is what pressing the Compose key does: inconsistent stuff.
Huh. I admit I haven't tried it on a huge variety of clients but among emacs, firefox, and urxvt (which probably make up 99.9999% of my typing time) I get consistent results.
Do you customise your .XCompose? If you don't, you'll probably get fairly consistent results. But I do things like include https://raw.githubusercontent.com/kragen/xcompose/master/dot... and add a bunch of things of my own. Overlapping definitions and the like become common.
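To illustrate the kind of overlap I mean, here's a minimal ~/.XCompose (the sequences are made-up examples, not from my actual file):

```
include "%L"   # pull in the locale's default compose table first

# A custom 3-key sequence that shares a prefix with a stock 2-key sequence,
# which is exactly how ambiguities like <-> arise:
<Multi_key> <minus> <greater>        : "→"  U2192
<Multi_key> <less> <minus> <greater> : "↔"  U2194
```

Once two definitions share a prefix like this, each app's compose implementation gets to decide how greedily to match, and they don't all decide the same way.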
I disagree with many of the conclusions about protocols. It does lead to some counter intuitive results, but if you want backwards compatible interfaces, you should indeed implement each version of the interface. For something like an input method, having the constraint that new versions of the protocol must be backwards compatible would be highly problematic. Having display servers and input method buses implement the versions of protocols that people use would be the best way to go, as it still means that input methods and applications themselves do not need to deal with this bifurcation.
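The version-bifurcation point can be sketched with a toy model (the interface names are the real ones from the input-method protocols; the negotiation logic is deliberately simplified compared to actual wl_registry binding):

```python
# Toy sketch: a compositor advertises the input-method interfaces it
# implements; an IME binds the newest one both sides support. If the
# compositor implements every version people actually use, old IMEs
# keep working and new ones get the new features.
def negotiate(server_globals: set, client_supported: set):
    for proto in sorted(client_supported, reverse=True):
        if proto in server_globals:
            return proto
    return None

wlroots_today = {"zwp_input_method_v2"}    # v2 only
old_ime       = {"zwp_input_method_v1"}    # v1 only, like ibus currently
flexible_ime  = {"zwp_input_method_v1", "zwp_input_method_v2"}

print(negotiate(wlroots_today, old_ime))       # None: the v1/v2 mismatch
print(negotiate(wlroots_today, flexible_ime))  # zwp_input_method_v2
```

The `None` case is the ibus-on-wlroots situation; implementing both versions on the compositor side makes it go away without forcing backwards compatibility into the protocol itself.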
Another highly misleading aspect is pointing out how many years Wayland has been in development. It's true, but it's not like progress has been linear. A vast majority of the Linux desktop ecosystem has been on NVIDIA graphics cards. I reckon Google probably has one of the largest deployments of Linux workstations internally and I believe they've only just begun the transition into Wayland. Thus, it should not be surprising that say, Chromium, has been behind on Wayland support.
While I empathize with people who think that the Wayland transition has been very expensive and slow, I don't really know what the alternative is supposed to be. There WERE and ARE some competing options. There was DirectFB, Mir. Arcan still exists. Probably others. Wayland gained mindshare as the baseline protocol for windowing, and here we are.
Will it fix everything wrong with the Linux desktop? Well, no. However, the painful irony of this post is that practically nobody uses XIM anymore due to limitations, they use out-of-band input buses. So each toolkit has its own plugin architecture for input buses, and then each input bus needs to have plugins for each toolkit, and you need to have the input methods themselves ported onto these. Note that you can also still use this approach if you want to in Wayland, especially important since Qt has been slow to adapt to protocol evolution lately.
All I can say is, it ain't going to be finished tomorrow, but a lot of what Wayland does really is architecturally a step in the right direction. There are some inherent downsides to the approach, but it also TODAY, right now, solves a lot of practical problems that users face that are hard blockers.
> A vast majority of the Linux desktop ecosystem has been on NVIDIA graphics cards
Source? Nvidia's proprietary, closed drivers and the moving target that comes with that means I avoid Nvidia graphics hardware at all costs and I'd thought it was widely known their cards aren't great if Linux is the primary use for your hardware. (Linux is my daily driver and I also develop on/with it, but I don't game at all)
Usually it's because someone bought a non-Linux computer that came with Nvidia, or they wanted/needed CUDA for some reason, as there's not yet a very good equivalent for AMD (ROCm is not a CUDA equivalent).
It is already quite widely known that Nvidia is not necessarily great for Linux desktops. I know people who bought entirely separate AMD graphics cards just to drive their display even if they still needed the Nvidia GPU for compute.
It's usually possible to get Nvidia displays working, but the experience isn't nearly as good as with AMD.
Exact figures are hard to come by, but estimates vary from 60-85%; Steam says around 75%[1]. NVIDIA proprietary drivers were immensely popular on both Linux and BSD in the early 2000s when I first got into Linux, and having tried the then-closed-source ATI drivers a single time, I can absolutely attest to why: ATI and NVIDIA were basically the only serious desktop GPU options at the time, and the closed-source (Catalyst, I believe?) drivers were as good as junk. Almost everything about them screamed fucking useless.

For a really long time, they defaulted to setting multihead systems up with multiple X displays instead of Xinerama, which is unbelievably user-hostile; it sounds okay on paper, but if you've ever used X with multiple separate displays, you'll understand that it stopped making sense the moment Xinerama existed, and that was already quite a thing in the late '90s. Even after I set up Xinerama, the problems continued into every other aspect of the desktop.

In that era, where you pretty much always needed discrete graphics, NVIDIA was your Linux option. No doubt that will change so long as NVIDIA's Linux desktop support continues to stagnate, or at least stagnates more than it adapts.
(I have not used NVIDIA on Linux as a daily driver for five years. I tried it again when the open source kernel driver came around, but quickly switched back after realizing it still wasn't stable enough in Sway or KDE.)
Steam's numbers are going to be heavily skewed toward Nvidia, since many games either need it or run poorly without it. Gamers seem willing to tolerate an otherwise nearly unusable computer in exchange for Nvidia graphics acceleration, where most users won't.
Steam merely confirms what I already knew as a long-term Linux user: when it comes to discrete GPUs, NVIDIA has long been the biggest player. This isn't just Linux-specific either; NVIDIA commands a large portion of the GPU market, period. Forgetting gamers, NVIDIA has the most robust options for hardware acceleration in many cases, especially for AI use cases like PyTorch, and for a lot of productivity tools like Blender. ROCm and HIP exist, but AMD has often struggled to gain widespread support.
If there's any strong evidence that NVIDIA still isn't winning in Linux desktop marketshare, I'd certainly be interested to hear it, and it would be news to me. I can imagine Intel technically has a lot of marketshare due to laptops, but then again, Intel and NVIDIA users are hardly mutually exclusive on laptops, either.
I don't really believe the majority of desktop Linux users have Steam installed. I do believe that the majority of Linux users historically (at least leading into the 2010s) used NVIDIA GeForce graphics cards, for a multitude of reasons. Even today where AMD and Intel GPUs are clearly the better option for stability, that doesn't mean they're the best option for doing work with. If anything, gaming on AMD is pretty comparable today, in part thanks to Valve. Productivity on AMD is where things are not in good shape. I wish it weren't so, but compare the CUDA and OpenCL/ROCm ecosystems and it's pretty obvious. Most recently, when AMD released some pretty enticing GPU options into the market, ROCm users were surprised to find that their cards either were entirely unsupported, or support had not yet been stabilized. That's just not how things work with CUDA.
I'd pay for things to get better here. I desperately hope that Vulkan as an option for compute can mature and compete with CUDA, but I also think that to do so, quite an ecosystem will need to be built up around it, even if the API and drivers prove sufficient to work as a solid runtime.
Zoom used to work in Wayland quite some time ago, then it started crashing on meeting join under Wayland, and it was still like that last time I tried it under 5.11.10. Then they finally implemented screen sharing properly in 5.10 or something, then in 5.12 they actively broke it again (popping up a message box saying it won’t work except under GNOME or something like that), so I downgraded to 5.11.10 and it’s fine again. I tried 5.13.0 and it was still broken. I haven’t tried upgrading since; maybe they’ve unbroken it again, one can always hope.
Anyway: Zoom 5.11.10 is working fine for me in all regards so long as I run it under XWayland. My ~/bin/zoom:
I believe a shell alias would also work if that’s how you run things.
Minor suboptimalities in its window management and UI scaling compared to native Wayland under Sway, but nothing big, with two exceptions. One: it only handles input and screen updates around once a second if you have any window open but not visible (e.g. as an inactive tab; tiled or floating is fine). Curiously, the main window is immune, but settings, a meeting, chat, participants are all affected. Two: I get REPLACEMENT CHARACTER when typing astral-plane characters with my Compose key, which strongly suggests stupid UTF-16 stuff. I can't remember if either of those happens under native Wayland anyway.
Meet shares my screen just fine. Zoom claims that it should, but it refuses to recognize my setup as "supported", probably because I'm using sway and not gnome.
I got it working for a while, but after an update it stopped. We mostly switched to Meet, and I stopped caring. If I need to share my desktop in Zoom, I just drop from the meeting and rejoin from Chrome instead of the app.
So is there a project/protocol that everyone can get behind?
If what is claimed here is true (that Wayland is a bad protocol), that might help explain the slow development.
I mean hackers might not like doing the boring stuff, but they'll do even the boring stuff on a project that's exciting, technically innovative or excellent, etc.
For all y'all saying "I'm still using X because Wayland isn't ready"... this is your reminder that X is DEPRECATED. The entity committing resources to maintaining it -- Red Hat -- has abandoned it and committed to Wayland only going forward. Unless another entity steps up with funding, X is doomed to bit rot and Wayland is the future. The responsible thing to do is to suck it up, file bug reports, and write code and/or documentation to improve the Wayland side of things because that's where the money and developers are -- period.
(And before you slag off Red Hat, remember that they are the Atlas holding up the entire Linux userland ecosystem for the past couple decades or so now.)
Nobody is going to take the effort to actually remove support from released versions of GTK. You're just being silly.
Will GTK 5 not support X? Possibly! I don't really care though. I pretty much just use 2 as it is when I have to use GTK. But GTK and Qt still support DirectFB, which is almost two decades old at this point. X isn't going away.