This is great. I recently moved to KDE Plasma (on Nobara Linux) after many years of Pop OS, because I moved into a new office and no longer have room for my Mac and Linux setups side by side, so I have to consolidate them into a single seat.
There are 4-5 monitor arms and the monitors are a mix of 4K and 8K.
AFAIK (and I have tried quite a lot, including the latest of all major distros like Fedora, Ubuntu, OpenSUSE, etc.) KDE Plasma using Wayland is the only way to have a typical "full desktop environment" (like GNOME, KDE, MATE, etc.) that supports this kind of multi-monitor setup (where the monitors have significantly different ppi and thus need different scaling factors) and "just works" like Windows does.
(BTW even macOS doesn't support this well in my experience - it renders the 8K at 6K and downscales that to 4K, and also with all monitors connected, one of them (seemingly random) blanks out for 1 second every few minutes. But I have only tested this on an M1 Max laptop.)
Any modern Linux "kinda" works — if you are OK with the 4K monitors being HUGE or the 8K monitors unreadably small. Per-monitor fractional scaling is an iffy thing on Linux, and in my experience only KDE + Wayland gets it right.
It almost works on X11, but just doesn't. The usual jank happens when switching monitors via a KVM, or unplugging one and plugging it back in — and then it goes into a "jank loop".
So this is what finally dragged me reluctantly to Wayland. But I'm grateful to the KDE guys for hammering doggedly on multi-monitor Wayland support lately, because it is literally the only thing I have found that lets me boot up Linux and have all my displays work.
I still feel like KDE is a little buggy overall, but not significantly more so than GNOME, which is the only other desktop environment with a similar breadth.
P.S.
If you want to deal with a desktop that is simpler (yet more DIY and higher maintenance), I think you can make Sway or Hyprland work with arbitrary-resolution multi-monitor setups. But it is a lot of work. I mainly use desktop Linux for writing code at my day job, so I wanted something I could just install and use right away without tweaking and reading docs (fun as that might be).
Pop OS works really well on 8K or 4K, but not both at the same time.
Not who you asked, but it's quite common for folks in finance (especially trading, etc.) to have a 4-5 (or even more) monitor setup - they need to keep their eyes on multiple blotters/charts simultaneously. In time-sensitive work, you can't afford to miss something happening on a window that is hidden behind others, visual comparisons can be super fast and reduce mistakes, and switching windows is bothersome. I use a 43" along with two 24" monitors, and programming with 8 split panes (4x2) on the main one, with chat/email/tailing logs on the other two, makes things so much better.
This sounds like a great use case for a VR workspace. Is that a thing yet?
I haven't touched anything VR-related since the original Oculus - how far away are we from buying a $1000 headset and it replacing all my monitors (and ideally one I can travel with)?
Anyone here using such a setup? What gear would you recommend?
Yep, when I contacted Level1Techs to ask if their "1.4 Display Port KVM Switch - Dual Monitor - Four Computer" KVM switch would support it (the monitor needs 2 cables, so you need a dual-monitor KVM to make it work at 60Hz), the proprietor Wendell told me that it does indeed; they know a bunch of traders who use it with these monitors.
The 8K (I only actually have one of these) is for coding. I normally go into an evangelistic fervor when somebody asks me about it, but I am on my way to dinner, so let me cut to the TL;DR:
The 280ppi of an 8K 31.5" display is so much better than the 220ppi or so of a 27" 5K or Apple's ludicrously expensive 6K Pro Display XDR that it is night and day, and I couldn't ever go back.
Even though the Dell UP3218K is like 5+ years old, it is so much crisper and better than any other monitor I've ever seen (including that Apple and the 5K UltraFine popular with Mac heads).
I didn't even think my middle-aged eyes were good enough for that to matter at all — I cannot see pixels on the 220ppi monitors. But after trying it, I realized that seeing the pixels or not wasn't the right metric. It's way, way better, and almost everybody who stopped by when I had the Apple and the Dell side by side on the desk (I eventually sold the Apple, of course) said they could indeed totally see that the 280ppi screen was way better. So YMMV, but everybody should try it. :)
I completely switched from Mac to Linux, for all of my day-job coding work, specifically and solely because of the monitor. (Almost no Macs support 8K monitors. The latest M2 Pro and better models do support 8K, but only on TVs at the moment: there are still no 8K 32" monitors with HDMI input on the market, and Macs only support 8K via HDMI.)
As for why I need 3-4 additional 4K monitors, those are only "needed" for music — which I do on the Mac. I like to just leave my virtual gear (guitar amps, effects boxes, synthesizers, etc) spread out, like I would in real life.
But since I do have them, I also use them for the usual multi-monitor stuff — put Slack and Discord out of the way on the far monitor, have one monitor for the browser and dev tools, code and terminals arranged on the other two, just to avoid having to context switch.
On Pop OS, my middle-aged eyes also liked to enable the GNOME "Large Text" option in the Accessibility settings. I haven't found the equivalent on KDE yet. But it still looks great and super crisp, just a little bit small.
Great to hear that multi-DPI setups work seamlessly on KDE Plasma. Actually, that capability is what brought me to Wayland 4 years ago. I switched from i3 to sway, using wdisplays and kanshi to control multi-monitor setups... and it was glorious even then.
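For anyone curious, a kanshi profile for that kind of mixed-DPI setup is only a few lines; something like this (output names, modes and scale factors here are placeholders, not my real hardware):

    profile desk {
        # 8K panel at 200%, 4K panel at 125%; positions are in scaled (logical) coordinates
        output DP-1 enable mode 7680x4320 position 0,0 scale 2
        output DP-2 enable mode 3840x2160 position 3840,0 scale 1.25
    }

kanshi then re-applies the matching profile automatically whenever the set of connected outputs changes (docking, KVM switching, etc.).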
In 20+ years on Linux I've never been able to make my variety of environments "just work" this seamlessly on X. It's one of my favorite things about Wayland.
Now that screensharing "just works" I've run out of complaints about wayland, though. Just that it's turned me into an insufferable snob about the blurriness and incompatibility of the X apps I still have to use in xwayland.
But then, I stopped buying NVIDIA more than a decade ago, and linux compatibility is a priority for me in my device selection. So I'm sure there are plenty of other users out there with a less rosy experience than mine.
Does KDE let you change workspaces on each monitor independently? Or does it carry over the X11 EWMH baggage that requires a single root window across all monitors, with each workspace stretched out across it? So when you change a workspace, it changes across all monitors.
By "workspaces" do you mean what KDE calls "Virtual Desktops"? AFAICT, no, they don't seem to be changeable per-monitor. Additionally, when I was poking around in the System Settings to see whether there was some option for that, I enabled "Show animation when switching" -- and then half of my monitor froze!!! :-D
It still has Extended Window Manager Hints, presumably because it's a standard that applications can adhere to to get some semblance of coherency across applications and environments. It would be interesting if a modern standard could be created, but let's be real: unless a competing standard gets pushed by Red Hat it won't take off.
Yeah, this is why I had to go write my own WM to have sensible multi-monitor support. I had to break one particular part of EWMH to do it. And indeed, some apps (like pagers) don't work with my WM because of that.
I'm using a mixed-monitor setup with KDE on Pop_OS (I know, I added KDE to Pop_OS just to try Plasma, and loved it). It all works on X11, but the jank is real - I am indeed afraid of any plugging/unplugging, to avoid disturbing the setup (using a Lenovo Thunderbolt dock with a ThinkPad X1 Extreme Gen 2, which uses Nvidia Max-Q graphics).
Maybe I'll go back to using straight-up Kubuntu (or even just Debian) - the only worry is the quality of support for the Nvidia card. The Pop_OS folks were the only ones with a fully working Nvidia setup when I started using it, but maybe by now Kubuntu and/or Debian have a similar level of support. It would be great if someone using Kubuntu/Debian on a mixed-monitor setup with a laptop with Nvidia graphics could share their experience - ideally with Wayland, since I'm interested in moving to it for better multi-monitor support.
Are you doing different scaling per monitor in X11? Different resolutions are fine. I ran many different monitor sizes and resolutions with X11 and i3wm. I just configured everything with xrandr and had a few scripts that I ran depending on where/if I was docked.
But per-monitor scaling seemed impossible. I just gave up and ran some monitors at a lower resolution so that the DPI was similar.
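The closest I ever got was the xrandr downscaling hack (roughly the Arch wiki recipe): render everything at 2x for the HiDPI panel and let xrandr shrink it on the low-DPI one. It "works", but it's blurry and fragile. A sketch, with made-up output names:

    # render everything at 2x first (e.g. GDK_SCALE=2 / QT_SCALE_FACTOR=2), then
    # downscale the low-DPI output so UI elements end up the same physical size
    xrandr --output DP-0 --mode 7680x4320 --pos 0x0 \
           --output HDMI-0 --mode 3840x2160 --scale 2x2 --pos 7680x0

Compositors and some toolkits fight this, which is a big part of why people give up on mixed-DPI under X11.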
> Are you doing different scaling per monitor in X11?
I do this, but only some applications cooperate. I couldn't even figure out how to configure it right on KDE, even though the underlying capability is there.
It definitely isn't impossible, but the popular toolkits make it much harder than they should.
Yes, that is exactly the problem that KDE+Wayland solves for me. The 8K 32" is way higher ppi than a normal 4K (mine are 27") so you really need per-monitor scaling factor.
I used to run everything at 200%, and then the 8K looked great, but the 4Ks were all huge and clownish, like scaled up 17" screens. It worked, but it wasn't good.
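These days each output just gets its own scale factor. If you prefer the terminal, kscreen-doctor can do the same thing the display settings GUI does; a rough sketch (output names are whatever `kscreen-doctor -o` prints on your machine, and exact syntax can vary a bit by Plasma version):

    kscreen-doctor -o                       # list outputs, modes and current scales
    kscreen-doctor output.DP-1.scale.2      # 8K 32" at 200%
    kscreen-doctor output.DP-2.scale.1.25   # 4K 27" at 125%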
Yeah, I didn't really mention it but my days usually involve working all day on Linux, but then switching to the Mac (using the same set of monitors, via a KVM switch) in the evening to do non-work stuff.
So that switch has to really work and not like... work most of the time, but jank out on a regular basis. (Which I found to be the case with X11.)
KDE+Wayland works with this daily routine too (although it is better to lock the screen first so it doesn't have to try to re-layout a bunch of windows).
It's probably more an issue with the thunderbolt dock. I had the same exact problems and they were traced to bugs in the dock firmware. You can try updating the firmware and see if that fixes it for you. Unfortunately, Lenovo bios/firmware has been really bad for Linux in the last few years.
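If the dock is one of the models published to LVFS, the firmware can often be updated straight from Linux with fwupd; a rough sketch:

    fwupdmgr refresh        # pull the latest metadata from LVFS
    fwupdmgr get-updates    # see whether the dock (or anything else) has newer firmware
    fwupdmgr update         # apply it

Whether a given Lenovo dock is actually on LVFS varies by model, so this is worth a try rather than a guarantee.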
No, I use a ThinkPad X1 Extreme Gen 2 - trackpad et al. work fine (I think I also got the fingerprint reader to work, but it was a somewhat hit-and-miss affair).
Do I understand you correctly: you use a mix of 4-5 4K and 8K monitors for just writing code? I mean, I like a 2-monitor setup for coding, but I don't know what I'd do with 4 monitors.
Also, if you're happy with defaults, a Hyprland setup should not take you longer than 15 minutes (sway probably longer), but both are quite a different way of working, so it would take getting used to.
One thing I regard as a big lost opportunity is that Wayland doesn't allow one to simply specify a desired DPI and have it scale all monitors accordingly. As it is now, one still has to tweak scaling factors to get two monitors of different size/resolution to display windows at exactly the same size. All the information is there, so why can't this be done? Am I misunderstanding something?
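The arithmetic really is trivial; a back-of-the-envelope sketch using the panel sizes mentioned upthread (31.5" 8K and 27" 4K, with 96 dpi as the traditional 1x baseline):

    # ppi = diagonal resolution / diagonal inches; scale = ppi / 96
    awk 'BEGIN { p = sqrt(7680^2 + 4320^2) / 31.5; print p, p/96 }'   # ~280 ppi -> ~2.9x
    awk 'BEGIN { p = sqrt(3840^2 + 2160^2) / 27;   print p, p/96 }'   # ~163 ppi -> ~1.7x

The EDID usually reports the physical panel size, so a compositor could in principle derive those factors itself; today you still end up typing them in per output.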
The new welcome screens are something baffling to me. When I open Kate, a text editor, I immediately want to type my text, not be greeted by options and look for "new file".
Like a lot of things in KDE, this is configurable: Settings -> Configure Kate -> Session (from the sidebar) -> Uncheck "Show welcome view for new windows" under "Application Startup/Shutdown Behavior"
DISCOVERABLE configurability. Things I can find in the UI. I hate the universal "add this line to <obscure config file> to fix that" thing that's prevalent in Linux.
Having a gui for settings helps with not having to memorize in what forsaken directory an application keeps its config files, because apparently that's not standard. Is it in .config? Or maybe .<app name>? Maybe /etc/thing?
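(When the GUI fails, one brute-force way to hunt the file down, using kate purely as a stand-in name:

    find ~/.config ~/.local/share -maxdepth 2 -iname '*kate*' 2>/dev/null

Not elegant, but it beats guessing dotfile names.)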
If the Kate one is like the one shown for Ark (I hope they are consistent), there will also be a checkbox on the welcome screen to disable it from being shown again.
KDE is doing this to improve discoverability for new users, as a major piece of feedback from users to KDE is that it's hard to learn all the features it's capable of.
I get the rationale, but I've also seen a lot of software UX evolve into being annoying or useless for experienced users by catering exclusively to inexperienced users.
I hope KDE does not fall into this trap.
I think the next step is a global setting that says, "I am an experienced computer user, please don't try to hold my hand all the time, it's creepy"
KDE is genuinely really good now. As fully featured desktop environments go it’s right up there with the commercial offerings.
Unfortunately I’ve moved on and I don’t want that anymore. A minimalist tiling desktop like sway or i3 suits me much better. Particularly on a touchpad where dragging windows is a faff.
I would say KDE Plasma surpasses anything out there, even closed proprietary alternatives. From time to time I have to use Windows and macOS at work (to test on platforms other than GNU/Linux), and they are much worse when it comes to desktop use; macOS especially is quite terrible. And that's not even considering all the surveillance and ads they keep trying to add to those two now, and the constant dumbing down of the interface.
What I'm constantly wondering is why macOS is so popular, especially with tech people, like developers.
It's the most terrible platform, imho. It's more buggy than Windows, it has the same malware problem as Windows, it's spy- & adware as Windows, it's inflexible, it's completely dumbed down, and it's incompatible with anything (even its former self).
The usability of macOS is even worse than Windows, imho. Still it's very popular.
Why is the brainwashing by Apple's marketing so effective? I don't get it.
Because macOS has been the most consistent in the past 20 years and it works out of the box and has exceptional integration with exceptional hardware. Plus enterprise support. (Being consistent doesn't make it good or bad on its own, just being consistent can be a bonus for some).
A lot of developers are not tech-savvy enough to deal with Linux's hardware issues, dependency issues, etc., either. Many IT departments don't want to deal with Linux for the same reasons.
It's also easier to fix things if everyone has the same hardware and software. Many devs use brew/iTerm2 for package management and advanced terminal support.
KDE may be the best DE out there, but it doesn't matter as long as it is not available on typical commercial devices and doesn't work out of the box with 100% hardware integration. Businesses have a say in what hardware their employees can use, and it is often macOS/Windows because it has the lower overall investment cost and much better integration with their device-management tools.
> It's more buggy than Windows, it has the same malware problem as Windows, it's spy- & adware as Windows, it's inflexible, it's completely dumbed down, and it's incompatible with anything (even its former self).
And Linux has been the most unstable OS for me, with broken Nvidia drivers, bricked installs and so on, all from updating Linux a few times in the past 3 years. We can blame Nvidia for this, but that doesn't matter to everyone else; it is still Linux's responsibility to ensure it doesn't happen.
I never had to reinstall MacOS on any of my devices in the past few decades. Windows, I did have to reinstall because of bad Windows updates.
macOS is nowhere as bad as Windows with its spyware/adware, it's there a bit but it's not that big of an issue.
> it's inflexible, it's completely dumbed down, and it's incompatible with anything (even its former self).
That's the best part of macOS and why businesses want it.
KDE itself was hard to use for casual users because it is too flexible and comes with defaults that are not tailored to them. (I said was, because last time I tried it was a year ago and it had some weird defaults).
If KDE could optimize their initial defaults for the incoming new users (which are growing up with iOS/Android devices first), then they might have a better chance.
Stopping breakage on updates is one of the big promises of ostree-based systems (Fedora Silverblue and Kinoite), and declarative systems (Guix and Nix). Easy rollbacks, updates are installed in the background with no impact on the running system, and atomically switched on reboot.
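Concretely, on the rpm-ostree side the whole update/rollback story is a couple of commands (a sketch, not a full guide):

    rpm-ostree status      # show the booted and pending deployments
    rpm-ostree upgrade     # stage the new deployment in the background
    rpm-ostree rollback    # point the bootloader back at the previous deployment

The previous deployment also stays in the boot menu, so even a broken graphical session is one reboot away from the last known-good state.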
I see breakage on updates only on macOS and Windows. There it is a constant. People usually fear updates on those systems by now. It's always: "Let's wait and hear what others say what breaks, and whether there are workarounds", or "Let's wait for the first few minor updates to the major update; you know…".
OTOH my Debian Testing boxes almost never break, even though I update them every single day. (You get maybe one issue caused by updates every 2-4 years; most of the time you just need to wait a few days to get a fix. If it's urgent, a downgrade of the affected package(s) can remedy the issue instantly.)
So in my experience even a perpetual "beta version" of Debian Linux is more stable than "regular releases" of other systems…
I haven't had any non-trivially-solvable issues with my Linux boxes in the last 15 years. But I read about the major fails of either macOS or Windows more or less on a daily basis in the news… I really don't understand why anybody would use those systems therefore.
Of course YMMV… I don't use any exotic hardware, don't buy the most shiny stuff right away, and I only use Debian Testing—as almost all other distros are imho broken. The ones that aren't are imho on the other hand too complicated to handle (like e.g. NixOS).
> But I read about the major fails of either macOS or Windows more or less on a daily basis in the news… I really don't understand why anybody would use those systems therefore.
When the install base is over two billion users, we're bound to hear about issues. Just bitrot alone can cause some % of issues.
We're not hearing from 99% of the users, only those that have an issue and took the time to say something. There's no reason for the news or even people to go on the web every day to say Windows/macOS/Linux still works fine with updates with no software issues.
I'm not saying OSes are immune to issues, but there's no evidence that Windows updates are causing billions of devices to break or brick on a constant daily basis here. Yes, some software may break, and there are some incompatibility issues, but Windows is pretty decent with its updates. It's also fairly easy to roll back by uninstalling the most recent Windows update.
Plus, macOS with Time Machine is an exceptional backup mechanism (Linux with TimeShift is great as well). Windows is...questionable. I use Reflect but I doubt most people know how to back up properly.
However, I can't say that I speak for 99% either; all I can report is from people around me. I work for an employer with over 900 employees, most of them on macOS, and I can confidently tell you that it is not an issue at all. We rarely have an update issue. Although we do tell people to wait a few days before installing an update, mainly to make sure our custom stuff works fine first.
> So in my experience even a perpetual "beta version" of Debian Linux is more stable than "regular releases" of other systems…
Considering that one of the most popular distros is Ubuntu, which itself is based on the Debian Sid branch, and that a huge number of distros are based on Ubuntu, it's safe to say that Debian is usually stable across most of its update channels.
Debian is nice, but I struggled to get it to work on my laptop with an nVidia GPU. Some of its defaults were odd as well.
I can say Debian stable is extremely snappy; it honestly shocked me. I just wish it would work out of the box on my stuff without me spending hours figuring it out. I gave up and switched back to Pop_OS; the nVidia and Wi-Fi chips didn't work at all.
Hopefully, that might improve over time, as I hear Debian 12 is going to be more flexible about including "non-free" binary firmware and drivers in its images/installer.
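For reference, that change takes the shape of a new archive component, so a bookworm sources.list line ends up looking roughly like this (treat it as a sketch and check the release notes for the final wording):

    deb http://deb.debian.org/debian bookworm main contrib non-free non-free-firmware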
> I don't use any exotic hardware, don't buy the most shiny stuff right away, and I only use Debian Testing
That's the thing: as long as we do that, it'll be safer and less likely to break. We have to work harder to find the right hardware with the right Linux support. Casual users are just going to buy whatever's configured for them.
For me, if I want the best Debian experience, I have to find a laptop without an nVidia chip and with an older Intel or AMD chipset, and hope it works out of the box. Don't get me started on Wi-Fi chipsets either.
> Stopping breakage on updates is one of the big promises of ostree-based systems (Fedora Silverblue and Kinoite), and declarative systems (Guix and Nix). Easy rollbacks, updates are installed in the background with no impact on the running system, and atomically switched on reboot.
That would be wonderful, and by then, maybe, Linux may be considered, as long as it is all simple to use.
Until then, macOS/Windows/ChromeOS will continue to be the most commonly used PC OSes.
I was happy when my sister switched to using Linux (Mint), but they lasted 6 months before switching back to Windows, and they said they have no plans to try again because it ain't worth the hassle.
And modern macOS has no Nvidia drivers at all. Would Linux be a better OS if they simply removed the choice for the user to use Nvidia if they want like Apple does? I'd argue not and I'd also argue blaming Linux for a third party's support they don't really control is a bit ridiculous.
It has loads of weirdly missing basic features like the ability to adjust the sound of a monitor's speakers when connected by HDMI or DP. Updates are the absolute slowest of any OS regardless of whether it's a whole new version or a security update. And if your SSD kicks the bucket the whole machine is a brick.
Why would it? There is no modern Mac (excluding the Mac Pro) with an Nvidia GPU, and soon there won't even be one with an AMD GPU once they stop selling all the Intel-based Macs.
macOS is not an OS you can install on any other hardware. So, how does that apply to the reason why companies and developers are continuing to use macOS in the first place?
If companies and developers need nVidia support, they wouldn't buy Macs in the first place; they would've bought the devices and used the OS that supports it.
We're not talking about which OS is better or whatever, we're talking about why there is momentum and why many developers/users are still using macOS despite the obvious better solutions out there like Linux + System76/ThinkPad/XPS/etc.
> Would Linux be a better OS if they simply removed the choice for the user to use Nvidia if they want like Apple does?
They mostly already do (or did). Many distros refuse to let users install the closed-source Nvidia drivers from their installers, or to include them in the image, because they're not open source. I'm not disagreeing with them, but the fact is that I am only able to use Linux because Pop_OS had the Nvidia drivers built into their image. I even got a few other people onto Pop simply because their hardware uses an Nvidia GPU. I couldn't use Debian/Fedora 2-3 years ago because of this.
Many distros seem to be opening up to allowing non-free drivers into their image and installers; Fedora is getting better at this. Debian 12 sounds like it is going to open up to this as well.
Users would have an easier time working with Linux if more distros included support for installing nVidia drivers out of the box.
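For context, this is roughly what getting the driver looks like on Fedora today via RPM Fusion — an extra repo plus a package rather than a checkbox in the installer (commands from memory, so double-check rpmfusion.org before pasting):

    # enable the RPM Fusion free and nonfree repos for the running Fedora release
    sudo dnf install \
      https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm \
      https://mirrors.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm
    # then pull the packaged proprietary driver (built as a kernel module on each update)
    sudo dnf install akmod-nvidia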
If I bought a ThinkPad with an nVidia GPU and Linux didn't even boot on it because the nVidia GPU is too new or there are no open-source drivers, I'm still going to say it's Linux's fault and that I'm not bothering with it anymore; I'll just install Windows and move on, because it just works and does what I want it to do.
> It has loads of weirdly missing basic features like the ability to adjust the sound of a monitor's speakers when connected by HDMI or DP. Updates are the absolute slowest of any OS regardless of whether it's a whole new version or a security update. And if your SSD kicks the bucket the whole machine is a brick.
1. I don't use monitor speakers (they're horrible), but I can understand why that'd piss people off. Definitely agree that macOS should support adjusting the monitor's settings and audio out of the box instead of forcing people to use a third-party tool.
2. Meh, all updates are slow. Fedora updates are just as slow if you install them via the software app instead of dnf, which requires a reboot for any update. A Debian upgrade took just as long on my older laptop, so I'm going to say this is not a con of macOS because it applies to various OSes.
3. The SSD has nothing to do with macOS tho; this is a hardware lockdown by Apple. Yes, it is horrible, but it does nothing to slow down or pause companies and developers from continuing to use Macs. It is fairly easy to work around by using Time Machine to back up your data and swapping out the logic board at Apple, or getting a new Mac. Also, Macs don't really break that easily; by the time one might break, companies and developers are already buying the next models.
They still sell the Mac Pro. It costs $6000+ and its whole raison d'être is basically to be a box that has PCIe slots. And a big part of that is GPU upgrades. Isn't that why folks were so excited for the 2019 Mac Pro? You blame some random libre Linux distro (which is philosophically against you doing what you're doing!) for not booting on a random laptop with an Nvidia card, but who do you blame for a Mac Pro not booting with a GeForce card installed? I just think it's an insanely unfair double standard.
Yeah, the machine bricking when the SSD dies is a big problem. Waiting a week for parts or a replacement is ridiculous, and you can't even boot off an external backup while you wait. That's awful. Especially if you have any BTO config, because a wait is guaranteed.
macOS updates are insanely slow. I have timed several at 40+ minutes even for just a point update. In that amount of time I can completely reinstall Ubuntu and get set back up. I have never had a Linux or Windows update take that long even on slower worse hardware. Maybe I could get one that slow if I paired an ancient Atom processor with a 4200RPM HDD or something, not a relatively recent i7 with a fast SSD.
I also agree with this. Have to use all three OSes and Apple is a disaster. KDE Plasma on Linux is years ahead and only getting better, while the closed source OSes are just getting worse. Really too bad that KDE and Linux communities don't have the same marketing powers.
MacOS used to be better, before the phonification trend.
As a heavy terminal user, one thing I very much like about MacOS is that it doesn't steal ASCII control characters for GUI operations. Fortunately KDE/Qt key bindings are largely configurable, and this is a major reason I use it. (I think I blame CDE for bringing Windows-style key bindings to *nix.)
Finder also has some good things — I miss quicklook and column view. And while I'm picking on Dolphin, its information panel seems hardcoded to focus-follows-mouse even though I don't use that mode.
Around ~15 years ago iOS as a platform became a big thing. It's what powers the iPhone, and you can write apps for it. You can only write those apps on Mac hardware. There are hundreds of millions of iOS devices in use, all using apps sold to iOS users, developed by people running on Macs.
> It's more buggy than Windows, it has the same malware problem as Windows, it's spy- & adware as Windows, it's inflexible,
Citations needed please.
Yes, you can get malware on a mac. That's always been a potential, but malware problems on mac have never come anywhere near the scale of malware-on-windows. Not even close.
> The usability of macOS is even worse than Windows, imho. Still it's very popular.
So what we can conclude is that your 'ho' aren't very popular(?). So what?
That can't be it. It must be that hundreds of millions of people are 'brainwashed'.
I got tired of various windows laptops and every linux distro failing to 'sleep' properly when I closed a laptop lid. I got tired of not being able to have 2 apps both play sound at the same time under linux. Oh shit - pulse? jack? alsa? Oh - you chose the wrong version of the wrong distro - recompile your kernel to make that work right - just copy these 87 lines from a linux forum and paste (as root!) in your console and reboot - just be thankful you're not being 'insecure' like Windows.
"But none of those are problems now! You did it wrong! My mate Paul said it's always worked!"
I moved away from daily Linux-as-desktop in... 2008, because I'd spent a good 6 years on various laptops/hardware/etc trying to 'make it work right', and whenever something didn't work, it was always somehow my own fault for lacking some 'easy' piece of info - if I just keep searching/researching/testing - this time it will be better/fixed/good.
I lost a weekend in 2008 trying to get a new laptop set up. Video worked, but could not connect to an external projector. Hours later, eventually connected, but only mirrored, not two separate screens.
This was days after I'd been assured "oh no... yeah, I know you've mentioned problems before, but these days it's all just fine - I never have any problems with my Linux laptop!"
I decided ... 6-7 years of chasing/learning/compiling various options was more than enough.
Forgive me if I'm skeptical after many years of hearing "it works great!" and continually hitting walls where no one could ever help. Then... moving to a platform where... you close the lid, it sleeps. You plug in a monitor, it works. Plug in a printer, it works. Plug in a camera, it works. And then... being told somehow that all that end-user experience is somehow subpar to Linux... just doesn't ring true.
I do daily use Linux servers. And I've been tempted to give desktop linux another spin 15 years later. I have family running daily Linux/desktop/laptop - it's mostly working for them, but I still hear about weird issues I've not had to face in years.
If/when I find myself with a lot of free time, I might give it another spin, but... I'm in no rush.
I pretty much went the opposite and gave up on sway/i3 after many years. I just want my computer to work so I can focus on things that matter to me now (tinkering with my Linux OS used to matter to me, but I'm over that). Every little thing on i3/sway is a chore to set up.
I even went full click ops with OpenSuse Tumbleweed now. No more toiling over my NixOS/Arch configs - I don't need fully reproducible environments for 1 or 2 machines. I want to add a printer, I click YaST printer, add it, done. I want to install full Qt Creator IDE, I download it from the website, double click the rpm and I'm up and running in < 5 minutes. Want to auto hide my taskbar? Right click, set auto-hide, done in 30s. Plug into conference room HDMI cable and mirror? Go to monitor settings, set to mirror, done in 30s. Etc.
I would say that it’s not everybody’s jam, though. I’ve tried using it several times and something about its general UI/UX and design philosophy feels odd/grating to me, and no amount of twiddling with its numerous knobs fixes it. A lot of non-KDE Qt software also shares this problem.
I've used bspwm for many years now, and recently started experimenting with KDE on NixOS.
I have to say, the Plasma desktop is very polished and mostly works as I want it to. The amount of knobs I can tweak to set it up just as I want is not far off from minimalistic WMs.
For window tiling, Bismuth[1] looks interesting, though I've yet to give it a try.
There are some major issues, though, like random freezes and the compositor failing sometimes, which might be caused by Intel drivers, but I'm hoping that switching to Wayland might resolve it.
I'd really like this to be my main productivity environment, as when it works, the experience is pretty slick.
For me KDE 3.5 was still the best. But I use cinnamon now, it comes pretty close. Any attempt to try KDE 4 or KDE plasma over the years made me give up quickly again due to annoying issues (like showing a giant volume icon in the center of the screen whenever changing volume with no setting to disable it)
I was you until about six months ago. KDE 3 got nearly everything right and instead of continuing to evolve, they pretty much threw it all away and went in a radical new direction, promising to revolutionize the Linux desktop.
Note to aspiring developers: This is not how you keep users.
Continually frustrated by the "opinionated" nature of GNOME, I kept trying KDE 4 and then 5 every few years, always hoping for the best, and always getting disappointed by stability issues, show-stopping usability bugs, or half-working implementations. It's like they just kept adding features but never bothered to test any of them.
That changed recently. I gave one of the newer 5.x releases a try (on debian testing, if it matters) and have been very pleasantly surprised to see that all of my previous issues are just simply gone. I don't know what changed but they are certainly doing something right because the current releases are excellent.
I would suggest giving KDE another try. I also feel like you will get a better experience by staying away from any distros that try to layer on their own tools and customizations like Mint and Kubuntu.
In Plasma 5: System Settings -> Audio -> Configure Volume Controls... -> uncheck Audio Volume under Show visual feedback for changes to:.
I know it's not the only issue you have, and not all of them will be solvable, but I found this solution quite quickly (though I did search for volume in the notifications settings first, which didn't show me anything).
Also, if you like KDE 3.5, have you tried Trinity?
Has the stability improved in recent years? The last time I used KDE was on 20.04, and while the usability was amazing (with a few minor annoyances, and Bluetooth not working, because of course not), the shutdown menu eventually straight up broke and crashed every time it was opened, so I had to resort to poweroff from the terminal from that point onward, haha.
Not a major problem, but I wonder how the hell can something like that even make it into an LTS.
LTS means they take packages from Unstable and release them on day X no matter their actual state, and then you don't get any updates or fixes for the next 2 years… That's how LTS "works".
I'm using KDE on Debian Testing, and have been for many years now. It's the most stable and bug-free desktop by a large margin compared to any other Linux desktop or the two major commercial offerings.
Just don't use Ubuntu or any derivative. They always deliver broken KDE packages. That's been a known issue for at least a decade.
Depends on what you mean by "stability." The number of crashes is down but UI churn seems like it's at an all time high.
Settings reversion is a constant peeve. Seriously, how many times do I have to set the height of the task manager? I have one screen that runs at the same native resolution it's always run at. Tell me, how hard is it to preserve the fucking setting? That's just the most recent annoyance, but it's constant things like that, where an update reverts a setting or two. No, it's not a distro thing. No, my package manager is not walking all the /home/*/.config trees--I checked.
Less annoying is the constant "how to turn off new intrusive feature" quests that occasionally you have to do. Setting something once and forgetting about it is tolerable. It's having to do it more than once that sets me off.
File indexing, a feature that I never asked for, seems finally under control. It's finally not sending me looking for the off button. Although, the number of times I've had to turn it off again after a new release? I really wish there was more respect for my preferences. Why is it on again??????? Good job fixing it but, jesus christ, why is it on again?
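(The off button, for anyone else looking, assuming it's Baloo doing the indexing under the hood, also lives in a terminal:

    balooctl status    # is the indexer running, and how big is its database
    balooctl disable   # stop it and keep it off
    balooctl purge     # optionally throw away the existing index

No promises a future release won't flip it back on, of course.)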
I'd like to not have to constantly set my audio output back to "Analog Surround 5.1 Output", turn the Mic off, and then turn the subwoofer back down. But this is Linux. If the last 20+ years have taught me anything it's that audio will always suck on Linux. Always. Forever. When someone replaces pipewire, in a few years, it will still suck. (BTW, bluetooth touches audio... connect the dots.)
But crashes? For me, they seem rare now. Fantastic job there.
But honestly, I'm not sure those are valid points.
KWin remembers window settings. If it doesn't for you, maybe you used a switch that disables that (though even I would not know where that switch could be). If you have some apps that try hard to use their own placement policy (which KDE apps don't do), there is always the override in the window rules right there to be turned on…
Configured settings never change randomly on updates. It may be that new defaults are set, and if you've never touched the affected setting it will use the new default thereafter. But once you set something by hand, it will stay so. Forever.
Audio just works, and has for many years now. At least if you don't touch it too hard… It of course automatically remembers settings OOTB. I have a setup with Bluetooth speakers, built-in audio, and headphones, internal and external mics, and I haven't had issues in years. Powering the Bluetooth speakers on or off automatically switches profiles, and it also always restores the profiles flawlessly. It's still PA, as I'm not going to switch until Debian does so by default. (That's actually always a very good strategy regarding any new shiny thing in Linux! It will save you a lot of trouble.)
All the described broken things sound like "a distro thing", to be honest. If it's an Ubuntu derivative I would not even wonder…
Yeah, it used to be quite buggy about two years ago. But nowadays I have no issues at all. Well, almost none. Sure, some little hiccup comes up sometimes, but overall it just runs smoothly. Most probably the improvement also has something to do with my switch to openSUSE Tumbleweed; they really pay a lot of attention to KDE software integration, and it shows.
I've been on Ubuntu / KDE for the last 2 years, and I keep it all patched and up to date. The other week I was playing around with the task bar widgets and somehow managed to delete or hide the text menu from the top of every single app window. It drove me nuts; I could alt + f to bring it back for a second, but I still can't figure out what setting it is that I've screwed up. I've got a workaround in place, a little hamburger widget on each window which expands out to the full menu, but it's still annoying.
If I would need to guess I would say you've added the global menu widget to the taskbar. That would hide the "regular" menu bar in the applications. Removing the global menu widget should bring back the "regular" menu in the application windows.
OTOH, who really needs a menu bar? Nowadays you can have a "command palette"-like app menu in KRunner. Much better usability than remembering or searching a menu structure imho—in case you don't need the discoverability of a menu, of course.
Since KDE now ships with built-in telemetry, I can't recommend it to anyone. Opt-in or not, I've seen this game played before and it never ends well.
KDE telemetry is strictly opt-in. Some distros like openSUSE even disable it completely. We don't nag you into enabling it. KDE is a non-profit, we won't sell any data, and only a handful of developers have access to the data. We have a clear process for how to introduce telemetry in an application.
We have a clear privacy policy https://kde.org/privacypolicy-apps/ and privacy is something we really care about. See for example the initiative to develop KDE Itinerary to be able to track your travel in a privacy-conscious way, or all the work on GPG/Autocrypt in KMail and Kleopatra.
It doesn't matter if it is "opt-in". The problem starts with "built-in". KDE and Qt are very large and highly complex software. Telemetry can be enabled or disabled without my knowledge, one update away. The same is true for your policy, which makes it worthless.
When your machine metrics are recorded, it is much like a fingerprint. Whoever is tracking you can sell to you better, which by definition means you get a worse deal. You are also more easily profiled for how you can be used. You think KDE doesn't have people who do this?
IT-sort-of people on the internet: "Security is very important. We don't want to get hacked."
Also IT-sort-of people on the internet: "Why are you paranoid about a tech company collecting your data? You really think they are gonna hack you?"
Also others in the field: we need Intrusion Prevention Systems, IDS, a pfSense firewall, etc., so we can be safe from hackers, while our machines host extra automated services we don't need, among other things.
> When your machine metrics are recorded, it is much like a fingerprint. Whoever is tracking you can sell to you better, which by definition means you get a worse deal. You are also more easily profiled for how you can be used.
KDE e.V. is a registered non-profit; I don't think this would be legal for them.
> You think KDE doesn't have people who do this?
No, because it's not a commercial company. I do think they have an interest in collecting metrics because really it can be very hard to get useful information otherwise. In something of the scale of KDE it's going to be hard to determine which parts of it get the most use, and what crashes in some particularly weird circumstances.
In fact, I recall KDE upset quite a few people when at one point Kate lost the project functionality -- apparently the KDE devs didn't realize that the function was actually quite widely used. If you don't have data on this kind of thing, it can be hard to figure out a good way forward.