
Windows is not technically inferior to Linux. To the extent it has problems, it is mainly because of top-down anti-user behaviour mandated from corporate. But anyone capable of using Linux is capable of hacking out that BS and getting a generally superior experience. I use both literally side-by-side, two laptops with a KVM switch, and I still greatly prefer Windows for many reasons.

Some reasons: Even as a low-level programmer fully capable of resolving problems, I want to spend my time working on my programs, not working on making my OS work, and Linux frequently demands that I spend hours chasing down issues. Windows does a better job of managing memory/swaps, at least out of the box. Windows has a stable userland with 30 years of backwards compatibility. Windows makes good use of both GUIs and CLIs, letting you choose whichever is faster for the task, while Linux distros and devs have some kind of bizarre ideological purity culture and generally refuse to make good GUIs. Windows has a built-in tool for easily making full system images while the system is running, without requiring the image destination be larger than the system drive including unused space. Windows developers are not so in love with dynamically linked system libraries that dependency management becomes a pain in the ass. Windows generally has a polished UX with a lot fewer papercuts.



I think there are 3 major points that made Linux a much more viable option in the last 2-3 years.

1) Somehow both GNOME and KDE got much better in the last 2 years. It's a very smooth and polished experience that I now prefer to both MacOS and Windows. I only need to install 1 or 2 extensions and it's good to go for me.

2) AI! It's orders of magnitude easier to fix any Linux issue now compared to 3 years ago. The issues that would take a whole afternoon of fighting are now just a couple of back-and-forths with an LLM like ChatGPT or Gemini.

3) Valve and SteamOS. The large and mostly successful push by Valve to make Linux be the platform for gaming has cleared many Linux issues and hurdles on the way. I think this will have ripple effects in the industry. My prediction is that thanks to Valve and SteamOS we will see a viable, widely used Linux based phone in the next 3 years.


> we will see a viable, widely used Linux based phone in the next 3 years

Isn't there already a viable, widely used Linux based phone OS called Android?


The point isn't "run the Linux kernel on a phone", the point is "run a non-big-tech OS that respects the user's privacy and choices". See also Google's recent announcement regarding locking down Android app installation to "protect users"


That's not what was said here, though - and also not something that's going to become a viable major platform.

The folks who care about privacy can't agree on the definition of a privacy-respecting phone, so whatever you make, some of them will be unhappy. 99.9% of users care way more about price, availability of apps, and hardware, in that order. App developers will only write apps for the platform once it has sufficient users, and users will only switch if the platform has sufficient apps.


Linux the kernel, not Linux the desktop OS


Right, but you’re presumably not going to run Linux the desktop OS on your phone. You’re going to run a mobile OS and the Linux kernel.


Why not? It works fine on pine devices.


Every long term review of Pine devices I've read ends with ".... so I ended up going back to my previous platform"


I guess I’m not clear what the point would be. I suppose you could build a phone that uses more of the gnu ecosystem than Android does, but you’re not going to run desktop apps on your phone because none of them are made for the phone. It would be a bad experience. So it becomes (Desktop) Linux for (Desktop) Linux’s sake.


His comment stands?


Woooah that's a long way from 'gaming works on linux' to 'this is now suitable as a general purpose mobile OS'


Linux has been able to serve most non-gaming use cases for over a decade now (source: I've been running the OS longer than that). The one thing it used to not be able to do was play games ... and now it does that.


For non-technical users it simply isn't true.

The happy path has improved a lot. When Linux is working it's reasonably usable. But once something breaks it breaks HARD and recovery is still miserable.

For reference I've been using Linux since Red Hat 5.2 circa 2000. I cut my teeth debugging problems without internet access. I ran an LTSP lab at my high school. I remember the hell that was XF86Config (I was there, Gandalf, I was there 3000 years ago).

....and like the previous commenter I run Windows on my personal machines because I want to spend my free time using them, not debugging them.


The only app non technical users use anymore is a web browser. And since Linux has the same web browsers, non techies don't care. Also, it isn't spying on them or putting ads on their desktop, or breaking the mic randomly like Windows does. Big differences to most people.


I'm halfway between a technical and non-technical user, and half my time is spent in Chrome. But the other half is in Excel, Dropbox, and Everything. Do those run on Linux? Maybe there are equivalents, but I don't have the time to investigate. Access crashes too frequently these days, but I couldn't find a GUI equivalent for PostgreSQL. Spying/ads/breaking the mic aren't in my top 10 Windows issues.


> And since Linux has the same web browsers, non techies don't care.

....which is why Chromebooks took over the consumer market. Oh, wait.


I dunno, I thought about this before switching to Linux, when I gave my wife a Linux box I had sitting around in a pinch during the pandemic laptop shortage—a lot of people these days just need a browser, and there’s not really much to go wrong with that. If something does go wrong you can just nuke the whole thing and start over pretty easily.

I’ve certainly run into some odd situations on my desktop Linux machine over the past 6 years since I started using it full time, but I think most of them were related to the nature of how I use the machine more than inherent instability. I think I’ve spent many more hours of my life unwinding piles of malware and bloat from non-technical folks’ Windows machines than debugging this one.


My parents used Linux as their home computer for three years, regularly updating it and doing basic document writing with OpenOffice, as well as all of their banking etc.

They don’t know what Linux is, and know nothing about tech, they just know that we had a 30 minute lesson on “here’s Firefox, this icon means you need to install updates, here’s how you print”.

Oh and this was Linux Mint back in ~2016

Things have only gotten easier since then


I'm now also comfortable having my family members use Linux on a day-to-day basis. To the point where I use Linux to revive 10-year-old MacBooks, since the Apple hardware is just solid. I don't dual boot, I just fully flash with Linux.

When something breaks,

Fixing Linux means running some commands that an LLM suggests.

Fixing Windows means downloading some shady .exe that may or may not fix the problem and it may or may not backdoor your machine.

Fixing MacOS means paying $5 for some app that maybe does the thing.


I don't get this. People would put up with absolute nonsense on Windows. But when it comes to Linux, they want to experiment, mess around with the configs, copy/paste random commands from the internet and basically turn into l33t haxers, and then when stuff breaks, it's Linux's fault. Like how? Install Fedora, don't add any extra repos, don't install anything not in the Software Center, and let us see how many times your system breaks.

I have been using Linux since the 2000s as well. I do remember the rpm hell, dealing with X config issues, etc. It is NOT the same experience nowadays. I don't have the time or inclination to mess around, so I use Fedora + KDE and that basically stays out of my way. I don't rice my desktop or do any hacking around beyond basic automation, and I have had zero instances of the system just breaking.


Examples from the last 3 years:

* I wanted to update a Raspberry Pi from Ubuntu LTS 22 to LTS 24. Turns out this is basically impossible. Ubuntu themselves tell you not to do it and their recommended solution is to wipe the system and try again. I ignored them and tried to do it anyway and my Pi ended up refusing to boot. Great.

* I needed to update a Raspberry Pi to change the list of WiFi networks it knew about. Except apparently there are two different networking stacks for Linux with different config files and I edited the wrong one.

* I built a new TrueNAS server. Turns out that you absolutely cannot configure the networking from the GUI. There's a section there, sure, but every time it refuses to save the information until you "test the changes" and that fails to reconnect every single time. You have to locally plug a monitor into the machine, boot it, and log in with a keyboard to get to the config there.

* Not strictly a bug, but I installed Debian in WSL and it doesn't include `man` by default. So I get a command line and no help for it. Brilliant.
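The dual-stack confusion in the Wi-Fi bullet is checkable up front, for what it's worth. A minimal sketch, assuming systemd and the usual service names (NetworkManager, systemd-networkd, dhcpcd); it only probes and prints, editing nothing:

```shell
#!/bin/sh
# Report which networking service actually manages the link, so you edit
# the right config. Prints nothing for services that are absent/inactive.
probe() {
  if command -v systemctl >/dev/null 2>&1 \
      && systemctl is-active --quiet "$1" 2>/dev/null; then
    echo "$1 is active -> $2"
  fi
}
probe NetworkManager   "use nmcli or /etc/NetworkManager/system-connections/"
probe systemd-networkd "edit /etc/systemd/network/ (plus wpa_supplicant)"
probe dhcpcd           "classic Raspberry Pi OS setup: edit wpa_supplicant.conf"
```

(The `man` gap in the last bullet is smaller still: on Debian the real pager lives in the `man-db` package, so `sudo apt install man-db` fills it.)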


But, a Raspberry Pi isn't supposed to be a replacement for your desktop; it is meant as a device for experimentation.


The Raspberry Pi 500, essentially the Pi 5 inside a keyboard, is sold as a "refined personal computer". "A fast, powerful computer built into a high-quality keyboard, for the ultimate compact PC experience."

https://www.raspberrypi.com/products/raspberry-pi-500/


Then marketing struck again. Anyway, that isn't a device the average user would buy, so I'm not concerned about Ubuntu failing to upgrade on such a platform. I would take the complaint as valid if the issue existed on a consumer laptop, but this isn't the case.

Right. I'd like to see them do the Windows 11 upgrade on the same hardware...


Oh, and from a few days ago:

* I want to install jj

* Its docs say to use cargo-binstall

* How do I get that? With cargo, so sudo apt install cargo

* `cargo binstall --strategies crate-meta-data jj-cli` -> `error: no such command: `binstall``

* `cargo install binstall` -> `error: cannot install package `cargo-binstall 1.17.7`, it requires rustc 1.79.0 or newer, while the currently active rustc version is 1.75.0`

* `sudo apt install rust` -> E: Unable to locate package rust

* `sudo apt install rustc` -> `rustc is already the newest version (1.75.0+dfsg0ubuntu1~bpo0-0ubuntu0.22.04).`

Apparently the guidance is to manage your rust versions with a tool other than apt that you install with `curl ... | sh` because no one ever learns anything about security

.....yep, just as user friendly as I remember.
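Underneath the noise, the failure is just a version floor: apt's rustc (1.75.0) is below cargo-binstall's requirement (1.79.0, per the error above). A hedged sketch of checking that before installing anything, using `sort -V` for the comparison:

```shell
#!/bin/sh
# Compare the installed rustc against a minimum version using sort -V.
# 1.79.0 comes from cargo-binstall's error message; adjust as needed.
need="1.79.0"
have=$(rustc --version 2>/dev/null | awk '{print $2}')
have=${have:-0.0.0}   # no rustc at all counts as too old
if [ "$(printf '%s\n%s\n' "$need" "$have" | sort -V | head -n1)" = "$need" ]; then
  echo "rustc $have satisfies >= $need"
else
  echo "rustc $have is too old; rustup, not apt, is the supported path"
fi
```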


This would not be any easier on Windows?


Yeah like... on Windows that's the exact same steps you would need to take if you insisted on using binstall? You might have slightly different steps for installing rustup for Windows (e.g. you need to install Visual Studio).

The other path I can see (looking at https://docs.jj-vcs.dev/latest/install-and-setup/#windows) is that you could maybe instead use winget directly.

Though honestly, IMO this is more of a failure of the jj devs not to provide something that can be installed straight from apt, I guess (looking at https://docs.jj-vcs.dev/latest/install-and-setup/#linux). For Arch, for example, you just install it from the official repos.


The way Windows programs used to install was you insert a CD or download an .exe, doubleclick it, and then repeatedly press "Next" until "Finished".


> * I wanted to update a Raspberry Pi from Ubuntu LTS 22 to LTS 24. Turns out this is basically impossible. Ubuntu themselves tell you not to do it and their recommended solution is to wipe the system and try again. I ignored them and tried to do it anyway and my Pi ended up refusing to boot. Great.

"Ubuntu themselves tell you not to do it" - you do see it, right? Let us see whether you'd forgive Windows after breaking things by ignoring Microsoft's advice, or blame them anyway.

> * I needed to update a Raspberry Pi to change the list of WiFi networks it knew about. Except apparently there are two different networking stacks for Linux with different config files and I edited the wrong one.

Why? Why not connect it to the network you want so that it just connects to that going forward?

> * I built a new TrueNAS server. Turns out that you absolutely cannot configure the networking from the GUI. There's a section there, sure, but every time it refuses to save the information until you "test the changes" and that fails to reconnect every single time. You have to locally plug a monitor into the machine, boot it, and log in with a keyboard to get to the config there.

And TrueNAS's shortcomings are somehow Linux's fault, just like every Windows third-party software issue is Windows' fault?

> * I want to install jj * Its docs say to use cargo-binstall

No, they don't ask that as the first choice - this is what they say in https://docs.jj-vcs.dev/latest/install-and-setup/:

    Installation¶
    Download pre-built binaries for a release¶
    There are pre-built binaries of the last released version of jj for Windows, Mac, or Linux (the "musl" version should work on all distributions).

    Cargo Binstall¶
    If you use cargo-binstall, ....
You could have just used the pre-built binaries as per their advice. But if you didn't, you should have at least bothered to click on that cargo-binstall link to see that it is an add-on which has its own instructions - it is not bundled with cargo by default. Unlike you, I did follow the steps and was able to install jj without issues:

    $ > curl -L --proto '=https' --tlsv1.2 -sSf https://raw.githubusercontent.com/cargo-bins/cargo-binstall/main/install-from-binstall-release.sh | bash
    + set -o pipefail
    + set -o pipefail
    + case "${BINSTALL_VERSION:-}" in
    ++ mktemp -d
    + cd /tmp/tmp.8IdPJtQBlE
    + '[' -z '' ']'
    ...
    + case ":$PATH:" in
    + '[' -n '' ']'
    $ > cargo binstall --strategies crate-meta-data jj-cli
    INFO the current QuickInstall statistics endpoint url="https://cargo-quickinstall-stats-server.fly.dev/record-install"

    Binstall would like to collect install statistics for the QuickInstall project
    to help inform which packages should be included in its index in the future.
    If you agree, please type 'yes'. If you disagree, telemetry will not be sent.
    ...
    INFO resolve: Resolving package: 'jj-cli'
    WARN resolve: When resolving jj-cli bin fake-bisector is not found. But since it requires features test-fakes, this bin is ignored.
    WARN resolve: When resolving jj-cli bin fake-diff-editor is not found. But since it requires features test-fakes, this bin is ignored.
    WARN resolve: When resolving jj-cli bin fake-echo is not found. But since it requires features test-fakes, this bin is ignored.
    WARN resolve: When resolving jj-cli bin fake-editor is not found. But since it requires features test-fakes, this bin is ignored.
    WARN resolve: When resolving jj-cli bin fake-formatter is not found. But since it requires features test-fakes, this bin is ignored.
    WARN The package jj-cli v0.39.0 (x86_64-unknown-linux-musl) has been downloaded from github.com
    INFO This will install the following binaries:
    INFO   - jj => /home/xxxxx/.cargo/bin/jj
    Do you wish to continue? [yes]/no yes
    INFO Installing binaries...
    INFO Done in 7.549505679s

    $ > jj version
    jj 0.39.0-d9689cd9b51b4139d2842fcf6c30f65f4eed8cd1
    $ > 
Again, a) this is third-party software, b) just because you don't know how to follow the instructions doesn't make the OS bad. Hell, it doesn't even make cargo-binstall or jj look bad. By now, you should see that years of experience != knowing how to use things.

Having said all that, none of the stuff you mentioned even remotely resembles the workflow of an average user who just uses their computer for listening to music and browsing the internet, with some occasional document editing thrown in. Despite its warts and shortcomings, Linux does a much better job today than it used to.


> "Ubuntu themselves tell you not to do it" - you do see it right? Let us see how you forgive Windows for breaking things by ignoring Microsoft's advice and blame them anyway when it breaks.

Not giving a supported upgrade path between version N and N+1 of your operating system is unacceptable, user hostile, and not something a home user could deal with. "Install from scratch, wipe all your files, and set everything up again" is not OK. You can upgrade Windows from 1.0 through 11 without Microsoft saying "nah, this is impossible": https://www.youtube.com/watch?v=cwXX5FQEl88

> No, they don't ask that as the first choice - this is what they say in https://docs.jj-vcs.dev/latest/install-and-setup/:

"the binaries" are a tarball whose instructions refer back to the previous document, whose "Install > Linux" section starts "from source" and says "go obtain Rust > 1.88", so all of the previous problems still apply.


> "the binaries" are a tarball whose instructions refer back to the previous document, whose "Install > Linux" section starts "from source" and says "go obtain Rust > 1.88", so all of the previous problems still apply.

Again with the assertions without checking things. This is the path of "the binaries":

https://github.com/jj-vcs/jj/releases/tag/v0.39.0

I downloaded the file myself and extracted to see:

    $ > ls -l jj-v0.39.0-x86_64-unknown-linux-musl.tar.gz 
    -rw-r--r--. 1 xxxx xxxx 10373711 Mar xx xx:xx jj-v0.39.0-x86_64-unknown-linux-musl.tar.gz
    $ > tar xzvf jj-v0.39.0-x86_64-unknown-linux-musl.tar.gz 
    ./
    ./README.md
    ./LICENSE
    ./jj
    $ > ls -l jj
    -rwxr-xr-x. 1 xxxx xxxx 27122184 Mar  5 02:33 jj
    $ > file jj
    jj: ELF 64-bit LSB pie executable, x86-64, version 1 (SYSV), static-pie linked, BuildID[sha1]=70d48428bc2100069e6813aff97e3dce8d2bb4a0, not stripped
    $ > ./jj version
    jj 0.39.0-d9689cd9b51b4139d2842fcf6c30f65f4eed8cd1
    $ > 
It is overconfident low-skill users like you that bring a bad name to Linux.


From your link, at the very top: "See the installation instructions to get started".

Not "figure out how to extract a tarball, find somewhere unspecified on your path to put things blah blah" but "to get started go read this doc whose first step is to install rust, which your package manager isn't capable of".

This is a fairly standard Linux experience, not one reserved for developer tools.

On Windows, if you're not going through an app store you get an EXE or MSI installer that you double click and it does everything else necessary. Every time.


Yeah. Maybe just stop using Linux. You'll never be happy with it anyway. Most its-never-my-fault people aren't.


And this is why Linux desktop remains a ~1% marketshare OS, despite all of the vocal complaints about the corporate enshittification of Windows. Countless people say they're going to switch out of frustration, and then quickly meet reality and understand how good they actually have it with Windows when they try Linux, not at all helped by encountering the snobby community who will deride anyone for not knowing everything they know. The Linux ecosystem very much assumes you already have the knowledge of having always used Linux. For somebody who just started using it, "following the install instructions at the top of the page" is a perfectly reasonable thing to be doing. It is not the user's fault if those instructions are bad and you could totally get it working more easily if only you already knew what you were doing.

I note you also dropped the line of argument about the OS updating, where you were chiding them, saying they did need to follow instructions in that case. Of course, the instructions in that case are indefensible - you cannot seriously suggest an OS is production-ready for the real world if the instructions are "this cannot be updated. Seriously, don't even try.".


> The Linux ecosystem very much assumes you already have the knowledge of having always used Linux.

Yes, because as per the poster, they are not a novice:

> For reference I've been using Linux since Red Hat 5.2 circa 2000. I cut my teeth debugging problems without internet access. I ran an LTSP lab at my high school. I remember the hell that was XF86Config (I was there, Gandalf, I was there 3000 years ago).

No one is expecting a novice to know how to run curl, untar and compile. This is not that situation by the very admission above.

> For somebody who just started using it, "following the install instructions at the top of the page" is a perfectly reasonable thing to be doing. It is not the user's fault if those instructions are bad and you could totally get it working more easily if only you already knew what you were doing.

Did you actually go to jj's github which the poster mentioned? This is what is literally the top of the Installation page:

    Installation and setup¶
    Installation¶
    Download pre-built binaries for a release¶
    There are pre-built binaries of the last released version of jj for Windows, Mac, or Linux (the "musl" version should work on all distributions).
I demonstrated in this thread that if you download and untar the pre-built binary, it works perfectly. No curl command or compilation necessary. Again, I don't expect a novice to know this, but for someone proclaiming to have wrestled with XF86Config, this should be par for the course.

> I note you also dropped the line of argument about the OS updating, where you were chiding them, saying they did need to follow instructions in that case. Of course, the instructions in that case are indefensible - you cannot seriously suggest an OS is production-ready for the real world if the instructions are "this cannot be updated. Seriously, don't even try.".

I admit that I was shallow on this point. I did research further and the Raspberry Pi situation isn't great when it comes to upgrades. Most people are using separate SD cards to host the OS and doing a hard upgrade. I apologise to @Arainach for not checking further on this point and ignoring it.

Edit: I guess today was the day I couldn't ignore Linux bashing from an experienced user and got somewhat carried away. My tone could and should have been softer.


> You can upgrade Windows from 1.0 through 11 without Microsoft saying "nah, this is impossible"

Have you tried that lately? It was probably true for Windows 10, but not 11. There is no supported path to install 11 if you don't have the Microsoft-approved hardware with TPM etc, which would certainly include Raspberry Pis. Installing Windows 11 on non-Microsoft-approved hardware seems to require levels of jank at least as bad as anything I've seen in Linux. Advice is all over the place, usually involving full reinstalls, setting random registry keys, running Powershell scripts downloaded from a random Github repo as Admin, or something along those lines. And no telling which if any work at any particular time, since Microsoft is constantly fighting them apparently.


That's a lot of words to say "I didn't click on the linked video of someone doing it on physical hardware".


> Why? Why not connect [the Raspberry Pi] to the network you want so that it just connects to that going forward?

I'm not the guy who wrote that, but I had the same use-case myself. (Except that I happened to choose the correct networking stack so I didn't have a problem). I wanted to set up a Raspberry Pi in my parents' house that would run Tailscale so I could use it as an exit node. (With my parents' full knowledge and permission). I wanted to pre-configure it with their WiFi password so that when I showed up for Christmas, I didn't have to spend any time configuring the device, just plug it in and go have dinner. (Then they changed ISPs, got a new router with a new WiFi password, and I had to ask them to plug it into the wired network so I could connect to it remotely and change the WiFi password again, so I had to do that work twice. But thankfully, I didn't have to walk them through the steps, just say "Hey, please plug it into the router with an Ethernet cable until you get an email from me telling you I've reconfigured the WiFi".)
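For anyone replicating this, the pre-seeding itself is tiny. A sketch that only prints the wpa_supplicant stanza (SSID and password are placeholders; on a NetworkManager-managed image you would use `nmcli connection add type wifi ...` instead):

```shell
#!/bin/sh
# Emit a network block suitable for appending to
# /etc/wpa_supplicant/wpa_supplicant.conf on wpa_supplicant-managed setups.
SSID="ParentsNetwork"        # placeholder
PSK="placeholder-password"   # placeholder; wpa_passphrase can pre-hash this
cat <<EOF
network={
    ssid="$SSID"
    psk="$PSK"
}
EOF
```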


I think you are missing the point. Windows programs would install with "Next", "Next", "Next", "Finished". Just look at what you posted and compare the user experience.


There are more details that make me believe that Linux as a mobile OS is more feasible in the near future. Apart from the Valve & SteamOS push, the one notable phenomenon that I see happening is retrogaming. In the retrogaming handheld ecosystem there are now many devices with form factors close to phones (they often literally use screens from older iPhones, etc.) that run Linux or in some cases provide an option to easily switch between Android and Linux. For example, on the Anbernic RG35XX using Garlic OS there's a toggle in the UI to switch between Android and Linux. Similarly, the Retroid Pocket 5 allows switching in the bootloader menu.

As a separate point, it seems quite feasible to run Android apps in VM on Linux based phone and make the experience fairly seamless. Something like what Waydroid provides.


> As a separate point, it seems quite feasible to run Android apps in VM on Linux based phone and make the experience fairly seamless.

But why?

The premise of Waydroid seems to be to bring Android apps you want to your Linux desktop. But why would you want the phone in your pocket to run Desktop Linux so that you could then run Android apps on your Desktop Linux mobile phone instead of just running Android on your phone?

What desktop Linux features do you want on your phone that would justify this complexity?


I want to use pre-existing apps from the Android ecosystem, but I want the system to let me install and change anything I want. It looks like Android is going to heavily restrict installing apps that are not on the Play Store, and there are now ~5 apps that I use that don't exist on the Play Store, but only on Obtainium or Zapstore.

My hope is that installation of the Android apps on Linux phone could be made seamless.


It seems like an Android fork that supports the stores you want would be a lot simpler.

I’m sure Google would deny Google Play Services to any popular fork that didn’t follow their rules. But they would do the same to any Linux desktop or whatever that didn’t follow their rules, too, if it became popular.


Yeah, you have a good point that an Android fork would cover a lot of what I'm asking.


It looks like the combination of PostmarketOS (based on Alpine Linux) and Waydroid would seem to fit that.


Linux has all the pieces. Other commercial vendors like Jolla have put it together before: https://en.wikipedia.org/wiki/Sailfish_OS

There are open UI shells from KDE and GNOME, multitouch gesture support, Android emulation... it's all there.


I think this is an underrated comment. On (2) you're totally right. Fixing issues is so much easier now I can interrogate an LLM for the right changes to make. It's not for everyone, but it makes my life a lot less stressful.


+1 on the UI thing.

I don't know what it is, but UI on Linux always feels too disjoint from the rest of the system.

It's a bit like how Windows 3.11 was just UI-on-DOS. I get the same feeling.

Don't get me wrong - I love Linux for all its CLI use but for some reason I've never been able to primary drive it without going insane after a week.

Windows just seems to feel more put-together and I guess that's because the kernel probably has hacks to support Office, and Explorer probably has hacks to support the kernel, etc.

The only other system I've felt this level of unity in is FreeBSD with its userland+kernel harmony.

Maybe I need to try a Linux desktop again as I haven't done it in ~10y but the other comment here about Fedora not feeling production ready doesn't inspire much hope...

Any ideas?


I run Trinity Desktop [1] on Linux. It's basically KDE 3 kept up to date (and has been around as long as I've been running Linux), and has a more or less similar look and feel to Windows from the 98/XP days. I run it on Ubuntu (currently 22.04), but it works with most distros.

Many Linux users seem to like upgrading (if you can call it that) to the latest eye candy every time Gnome or KDE or whoever puts out a new release. I'm the opposite. I do think much of the UI work in Linux has done more harm than good. But that's the nice thing about Linux: I don't have to care, precisely because of the lack of such close coupling between the GUI and the underlying OS. I can't stand the GUI that comes by default with Ubuntu, but I just don't use it; I use something else instead.

[1] https://www.trinitydesktop.org/index.php


It's changed a lot in 10 years.

I felt the same as you, up until quite recently, although I was using Xubuntu, which uses a very barebones desktop environment. I've since changed to CachyOS + KDE Plasma late last year and haven't booted up Windows for 3 months other than to extract a few files. I'm a MacOS laptop user and Windows desktop user, but these days I much prefer CachyOS for speed, responsiveness, and easy customisation. You may still find you prefer Windows, but it's worth a revisit I think, and easy to try via a USB boot as you know (although running it off USB is way more sluggish, I find).


I’m daily driver Linux now after three decades of Windows usage. I have Bazzite-dx (Fedora based) on my desktop and Cachy (Arch based) on my laptop, both using KDE Plasma for the GUI.

I can’t place my finger on it, but Bazzite feels more “coherent” despite using the exact same GUI.

I had the misfortune of using a Windows 11 machine the other day and I didn’t even recognise it. They’ve taken a huge misstep with the Copilot rollout.


10 years is a long time, you should definitely try again. Go with one of the mainstream distros, like Ubuntu or Mint. Regarding Fedora, I heard the opposite, but as I wrote in another comment, users tend to have vastly different experiences with a given OS. If you like Windows' UI, Zorin could be the distro for you.


X11/Wayland is a userland application with no solid hooks into kernel space (that has both advantages and disadvantages), whereas Windows operates the Window Manager/GDI within the Executive [for performance]. That makes it feel disjoint. Mouse acceleration differences, too (they also impact macOS), but that's something you get used to, though I find gaming awkward on macOS because of the weird default acceleration.


>I don't know what it is, but UI on Linux always feels too disjoint from the rest of the system.

Right up until you try to access any settings menus.


Windows UI is the most disjoint though, with designs accumulated over the past 20 years still kicking around in various places.

You really should, yeah. I've given up Linux as a daily driver in favor of a MacBook, but I do have a work-mandated Windows machine and I hate that thing with a passion. I cannot think of a single thing that's better on it than on my MacBook or any Linux distro I've run as a daily driver.

In fact, for most tasks that are not directly Teams or MS Office related, I find it easier to just use WSL.


> Windows UI is the most disjoint though, with designs accumulated over the past 20 years still kicking in various places.

But every Linux distro has its own UI, and pretty much every distro makes it easy to configure it to look how you want, with tens of thousands of themes out there developed over the past 20 years by people wanting their os to look a certain way.

The most glaring inconsistencies are going to be user-inflicted. If I spend a weekend tweaking defaults to look just right I need to be ok with possibly tweaking any new software I download to fit my theme.

But even from a non-power-user perspective, if my mom runs into problems with her computer it's much easier to walk her through a fix over the phone if she's on Windows or a Mac.

My dad, who is very tech-literate, once tried Linux and all the troubleshooting guides required him to open a command prompt (because there isn't a consistent GUI you can use to fix things across distros). He never forgave it.


As if you don't get a jumble of UI frameworks on Linux too.

You can run KDE but depending on the app and containerization you open you'll get a Qt environment, a Qt environment that doesn't respect the system theme, random GTK apps that don't follow the system theme, random GTK apps that only follow a light/dark mode toggle. The GTK apps render their own window decorations too. Sometimes the cursor will change size and theme depending on the window it's on top of.


Sure, but it's not baked into the system utilities like it is in Windows.


I used to argue in favor of Windows with basically the same argument. But honestly, using Windows nowadays seems to require even more hacking than Linux. Particularly if you don't have the newest hardware made to Microsoft's standards. Or don't want to deal with regular full-screen ads to update to Windows 11, or don't want Copilot jammed into every app.

I installed Linux instead, Fedora specifically, and everything just worked. It actually cleared up some weird hardware issues I had on Windows that I could never manage to track down. I'm pretty sure I didn't need to do any CLI or config file tinkering for anything beyond getting an actual CLI app I wanted to use up and running. Beats the dozens of different registry hacks and PowerShell scripts downloaded off random GitHub repos that people kept telling me I needed in order to make Windows 11 work and not be too annoying.


Exactly.

I want to have a computer with a stable, vendor-supported OS so _I can do my stuff_, not tweak some OS-level configs.

I _don’t_ want to spend my time playing OS systems programmer.

The OS is a _component_. Like the wifi driver. I think it’s great that some people love developing wifi drivers, but personally I just want a network that-just-works, because there are a billion other cool things you can do with a computer.

Similarly, I want an OS that just works! Without asking me to do anything! Because _I don’t really care_. (I mean, I care that it works, but I expect the engineers actually developing an OS offering to have a far better idea than myself of what a good, stable default config for the system is.)


> I want an OS that just works!

This is exactly why modern Windows is problematic. MacOS is better. The right Linux distro (e.g. Fedora Silverblue) on the right hardware (e.g. a Thinkpad T series) also just works™; this is basically the same kind of limitation as with MacOS.

I wish they issued a Windows Rock Stable edition. Ancient as rocks (Win7 look, or maybe even WinXP look), every known bug fixed, every feature either supported fully, or explicitly not supported. No new features added. Security updates issued regularly. It could be highly popular.


MacOS today has the drawback that any software compiled more than x years ago no longer works.

That is an unforgivable sin in my eyes.


IMHO - I disagree, but it depends on your point of view, so this is not "you are wrong" but "in my view it's not like that".

I think it’s the role of the software vendor to offer a package for a modern platform.

Not the role of OS vendor to support infinite legacy tail.

I don’t personally ever need generational program binary compatibility. What I generally want is data compatibility.

I don’t want to operate on my data with decades old packages.

My point of view is either you innovate or offer backward compatibility. I much prefer forward thinking innovation with clear data migration path rather than having binary compatibility.

If I want 100% reproducible computing I think viable options are open source or super stable vendors - and in the latter case one can license the latest build. Or using Windows which mostly _does_ support backward binaries and I agree it is not a useless feature.


Software shouldn't rot. If you ignore the cancer of everything as a subscription service, algorithms don't need to be tweaked every 6 months. A tool for accounting or image editing or viewing text files or organizing notes can be written well once and doesn't need to change.

Most software that was ever written was written by companies that no longer exist, or by people (not working for a software company) who are no longer associated with the company they wrote the tool for. In many of these cases the source is not available, so there is no way to recompile it or update it for a new platform, but the tool works as well as ever.

Binary backcompat is incredibly important.


I didn't say backcompat isn't important.

There are lots of other ways to run old binaries than at your main OS level.

There are tons of other platforms that precede the current ones.

I would not like the requirements from those platforms to hamper the current gen os.

I do think it's valuable to be able to run the programs from those platforms.


Yes Apple should have kept supporting 68K software and have emulators for 68K, PPC and 32 bit x86.


false equivalence much?


So exactly how far should Apple go back?


On Windows I occasionally still run useful software compiled before 2000.

Mac works great out of the box. Linux can do whatever you want if you put some work into it. Windows sits kind of in the middle, and it turns out for a lot of people that's a comfortable spot even with its trade-offs.


Agree - there are variations to how much tweaking Windows needs.

The enterprise Windows config that comes e.g. in Thinkpads is more ready out of the box than the consumer OEM configs.


They would still need to develop new drivers for new hardware, which could cause issues. But yes, the situation you describe would be much more stable than Win11.


I miss win2k personally. the UI was decent, and I was able to install enough oss on it that it felt like I could do most of the things I did with linux, but with good font rendering. there were also, to my surprise, a couple of apps (winmerge and proxomitron) that felt like they should totally have been linux apps, but for which I have yet to see anything as good over on the linux side.


What were those applications about?


winmerge is an open source diff/merge tool with a really good UI. comparable linux apps are meld and kdiff3, but winmerge is more capable than meld and feels a lot more polished than kdiff3. I'm actually surprised no one has ported it to linux, though I presume a lot of the polish is due to focusing on look and feel in a way that is tied to the underlying windows gui libraries.

proxomitron is a rewriting proxy, which I always thought was a very nice approach to webpage filtering. again, I remember it having very good UI/UX as well as being fast and capable.


> meld and kdiff3

When I need to compare large text files I use diffuse instead of meld.


didn't know about that one! I'll check it out


I have been using Windows for about 20 years now, after being a FreeBSD user. I actually switched to Linux first, to be able to interact with my work environment, which used Windows tools (office, but also embedded development tools at the time), and I think virtualization/emulation was a lot easier there. But I bricked my laptop multiple times while traveling, because I always felt inclined to fiddle with everything in my OS, even during meetings; Linux was the perfect distraction (me also having latent ADHD). I eventually switched to Windows on my laptop when I needed to focus on productivity. What was meant to be temporary became permanent. I have actually stayed a console user and still use a lot of Cygwin; I used the POSIX compat stuff earlier and now use WSL. Lately I've even learned to like PowerShell a bit. Only the latest annoyance of trying to force me into the Microsoft cloud, plus the temporarily very unstable UI (taskbar hangs with a non-updating clock made me miss meetings), makes me seriously think about switching to a free OS again.


> Linux frequently demands that I spend hours chasing down issues

This is one of the points where people have vastly different experiences. I'm one of those who has fewer issues with Linux, and I definitely don't spend hours fixing problems -- and this despite the fact that I use Arch, which is supposed to be an unstable distro. Why different users report such different experiences, I don't know. I think this might be partly down to perception: we tend to be more forgiving of the OS we like. But your case doesn't seem to be just about perception. So I wonder how much the hardware plays a role here. I think Linux has quite good hardware support nowadays, but maybe I've just been lucky so far.


No, it has to do with how your system is set up. Without fail, someone who holds this opinion about Linux is using some ungodly bloated corpse of a distro because they genuinely do not know better. Hence they blame their system's problems on the kernel, despite ostensibly never having any actual problems with the kernel itself. It's not like Linux is particularly perfect, but how many complaints about using it as the basis of a desktop system include the mistake that is the devtree? Or the fact that nice values are complete placebo? Or the million quirks with its specific implementation of SIGALRM?

You don't have problems with Arch presumably because you've avoided building your system into a neutron star of corporate shitware, while that's the default state for most distributions.


> Hence why they blame their system's problems on the kernel

I'm not blaming anything on the kernel (other than memory management). The userland ecosystem is part of what makes an OS; a perfect kernel with no userland is of no value to the general populace. You don't really get to discount everyone's complaints about the Linux experience because they aren't complaints about the kernel, or at least you won't convince anyone by doing so. It is clearly possible to solve many of the issues I have on top of the Linux kernel, because Android used to be decent, but it seems the desktop ecosystem is just locked in to too many bad choices at this point.

The vast majority of complaints about Windows have nothing to do with the NT kernel, either, which by most accounts is actually quite good.


I mean, I literally opened my post talking about how it's the structure of your system, as in the components you choose to make up your desktop. The kernel is ultimately irrelevant, that's the point; I'm not sure how you managed to miss that. Linux does not have a canonical userland beyond the GNU Coreutils, no matter how much Redhat and its sycophants would like it to be the case. Windows does, however. When you're complaining about Linux, you're complaining about your specific choices in how you structured the system, usually boiling down to your choice of distribution (though not exclusively). You can certainly at any point pack up your bags and leave the dogshit behind; your pain is a self-inflicted gunshot wound. Such is not the case on Windows, where you are proffered absolutely no choice, because there is a canonical desktop experience. Nobody made you use Gnome or other such crap. But there are certainly plenty of people who will lie to you and try to get you to stay within Redhat's kingdom of sewage.


You forgot the part where they edited a bunch of config files (that they didn't understand) for no reason or are running some experimental UI extension that makes their mouse pointer have a trail of stars or something.


All of these points apply to every OS. Windows bricking itself after updates is far more likely to be caused by all the bullshit software installed by manufacturers and the user.


Me too! I daily-drive Windows as my personal, Mac for work stuff, and Linux for all server-like activity. And I have been for 20 years.


I hear you. Me personally, I only use Fedora when it comes to Linux. It is easy to install and has everything I need. I keep everything default -- maybe change the wallpaper and install my apps and IDE. That is it. From this standpoint it is MUCH better than a Windows box for games, which always has some ads and long updates, sometimes breaks, and has a lot of software writing its config into the "My Documents" folder, which bugs me a lot. AND I do not have to have a Microsoft account (actually I don't need any account, for that matter) to create a local account on my own computer. So yes, I would agree with the inferiority argument almost completely. If Linux got a bit more mainstream, the ecosystem and apps would follow very quickly, I am sure. But Microsoft is trying hard to make people try out Linux (which is good btw), so let's see. Myself, I am off to Fedora even for the games; I am done with Copilot and the inability to have an offline account.


Linux just isn't friendly to newcomers.

1. Lots of rough edges "yeah it almost works if you tweak it a little" yeah thank you but no.

2. CLI is better for doing the thing you want, but GUI is better for discovering what options you have in the first place. The fact that GUI is an afterthought on Linux says a lot.


Not only is GUI better for discovery, it's not even always true that CLI is better for doing the thing you want. Depending on the complexity of the task, building a tower of CLI commands/arguments can be a pain, and if it's something you do ~once a month, good luck remembering the syntax. A GUI lets you not even have to think about it, not have to memorise syntax or go out of your way to write a script to save it. And while CLI is great for things you do routinely... Windows still offers great CLI support, so you simply get the best of both worlds.


We're in the age of LLMs and this is exactly what they shine at. Just the other day I got tired of Libre office having some crappy custom file picker.

"Claude, change the libre office file picker to the system default"

"Beep boop it is done"

Linux has a big leg up over windows in this regard because all the GUIs are essentially wrappers around CLIs and text files that LLMs can deal with quite well.


> A GUI lets you not even have to think about it, not have to memorise syntax or go out of your way to write a script to save it.

Unless the GUI buries what you want to do in five or six levels of menus and options--and then changes where they're buried in the next release, so you have to re-learn everything all over again. That's happened to me with work computers more times than I care to remember.

By contrast, my collection of shell scripts on my home Linux computers is still serving me well after more than twenty years.


>Windows does a better job of managing memory/swaps

Not sure I can buy that one


100% of OSes are better than Linux in this regard as long as overcommit is the default.


I always turn off swap and that solves this problem. I really don't understand why it is on by default anymore. If you need swap, you are doing something very wrong somewhere else.


So, it depends on how much RAM you have. Also, with swap enabled, the system can swap out some very rarely used memory pages and cache some frequently used files instead -- so by disabling swap you rob yourself of this opportunity :)

I have 64 GB of RAM on my workstation, yet I still have swap enabled (but with a lowered swappiness value).
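For anyone wanting to try the lowered-swappiness setup mentioned above, it's a one-file sysctl fragment; the value here is purely illustrative, not a recommendation:

```
# /etc/sysctl.d/99-swap.conf -- illustrative value, tune to taste
# Default is usually 60; lower values make the kernel prefer dropping
# page cache over swapping out anonymous pages.
vm.swappiness = 10
```

Apply with `sudo sysctl --system` (or reboot); check the current value with `cat /proc/sys/vm/swappiness`.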


> anyone capable of using Linux is capable of hacking out that BS and getting a generally superior experience.

MS has made hacking out the BS harder and harder with each new version of Windows. Back in the Windows XP days, yes, I could avoid a lot of the BS on my home Windows computer (although I still had to deal with it at work because work computers are usually locked down so employees don't even have admin rights to them--if I have an issue with my work computer I have to put in a support ticket to the IT department). But even then there was enough friction on my Windows home computer to make me start using Linux at home. For a few years I was running both OSs at home, but even that got to be too difficult, and I simply stopped using the Windows computer at home, at least for my own use (see below). I've never looked back.

I do still have one Windows laptop at home, because some of the Python programs I write (I write them on the Linux computer) have to run on Windows, so I have to have a way of testing them. Even that is clunky compared to how easy it is to do things on my Linux computers. That laptop runs Windows 10, and if I am ever forced to upgrade it to Windows 11, I will probably just stop testing my programs on Windows (fortunately my livelihood does not depend on being able to do that), because Windows 11 is a nonstarter for me; the BS level has just gotten too high.

> Linux frequently demands that I spend hours chasing down issues.

As someone who has been running Linux at home for well over twenty years now, this has not been my experience. Back when I still had Windows computers at home, I spent more time dealing with issues with them than I have spent dealing with issues with my Linux computers at home.

I would make similar comments about the rest of your post.


It’s really unclear whether you have any Linux experience. Because it feels like someone who knows Windows very well, and spent some time with Linux here and there.


Windows USED to have a polished UX. Is it still better than most Linux distros? Maybe.

But it has regressed so far from where it was 25 (or more) years ago that every day I'm still infuriated and depressed that I've had to return to it for work. The idiotic UI blunders alone must waste hours of my life per week.

Aside from the absolutely baffling functionality removals, there are the hateful petty ones. Great example: the removal of Remote Desktop. I erred in getting my parents Windows laptops, thinking they'd benefit from the familiarity. NOPE. And when they encounter some defective bullshit that stops them from doing what they're trying to do, they call me for help but I can't log in from 2400 miles away because MICROSOFT REMOVED THAT ABILITY. Disgraceful.


You can use TeamViewer (free) for remote assistance for Windows. It even has a web client so you don't have to use Windows if you don't want to.


Parsec.app is also really good and free. Both are so much better than Remote Desktop.


TeamViewer is absolutely atrocious for incidental home use.

Endless chasing of version upgrades/matching, to the point of it rendering itself unusable.

For one-off I use something else.

For more persistent access needed I use either Tailscale with mstsc or *VNC.


What about Remote Assistance (built-in)?


It's not built in anymore. Microsoft took it away.

I can't remember exactly what the last name was that they gave the remote-assistance tool, but they removed it from all but so-called "pro" Windows... the one used by the LEAST-likely people to need it.

More anti-user BS from Microsoft. The baffling aspect is how Microsoft thinks it benefits from screwing users like this. I can't wait to buy my parents a Neo and shitcan Windows from their home forever.


Augh, annoying. I don't have non-Pro handy, so I didn't realize it was removed. I also see "Quick Assist", but maybe that's the other one you were alluding to that you also found to be missing.


Yeah... not sure. What I found, though, is that the people most in need of remote assistance are denied it.


What about Macs, you use those?


I have never tried one, I'm not interested in Apple's walled garden approach. Buying a Macbook and being unable to put a (properly working) alternative OS on it if I don't like the one on it is a non-starter to me.


Asahi Linux is definitely a "properly working" Linux distro at this point. MacBook hardware tends to be better than almost all the alternatives I've tried.


Only works on three generations old SoCs and doesn't support external displays still.

The hardware is great and it's impressive Asahi Linux works at all but it's not for everyone.


Agreed that the lack of m4+ support is unfortunate, but it'll get there. However, the m2 MacBook still outperforms any other kit I have (Dell developer edition shipping with Linux, etc.). Also, external displays work today as long as you use a USB-C DisplayPort cable; the other USB-C video standards are supposedly in the works, but DisplayPort dongles aren't exactly hard to find.


> Agreed that lack of m4+ support is unfortunate but it'll get there.

The question is when? How many generations behind will they be then? Will they have to skip support for some generations to keep up?

> m2 MacBook still outperforms any other kit I have

That's fair but you should consider what happens when Apple decides to lock down the boot process on newer hardware. I'm sure most people would give in and use macOS instead of going back to worse hardware.

If you don't see that happening just remember that all of Apple's other devices have a locked down boot process. They possibly only allowed it on Mac hardware to ease possible concerns people may have had during the Apple Silicon transition. Apple does not provide documentation for the hardware so Linux support is based on reverse engineering. If third-party operating systems do not support any of the recent hardware what is stopping Apple from cutting that unused feature from new hardware?

> preliminary m3 support just landed upstream in Linux this past week

Still way too early to actually use. GPU support is not there yet.


In all honesty, two generations behind is likely the best we'll ever get given the resource constraints on current porting efforts. I'm totally fine with that though. Buying prior gen laptops has been what I've always done since I was a kid, and I'm pushing 50 now. Buying current gen anything is a trap unless you absolutely need the fire power (and I certainly don't.)

Naturally, a hardware vendor can lock down whatever they want in their future hardware, that's their call, but specifically Apple hasn't. Maybe they will, maybe they won't, but in either case there are still millions of m1, m2 and m3 macbooks already out there, so why _not_ use them?

If/when Apple chooses to lock down future 3p boot options, it won't affect that pre-existing pool of hardware. Sure, eventually that pool will dry up, or maybe in a decade I'll need something faster than an m3 (assuming it takes less than a decade to get m3 to where m2 is at today, which is fairly conservative), but maybe not. My requirements are fairly modest with the personal dev work I do, and I'm sure plenty of folks are in the same boat. We might as well use what decent hardware already exists and is available now. If appearing to support Apple is the problem, just cover your kit in silly stickers; that's what I do.

After all, it's not like Apple will make a dime off me buying their shit second-hand anyways...


Also preliminary m3 support just landed upstream in Linux this past week


You can put Linux on the older Macs. I have a 2015 MacBook with Debian and everything works very smoothly.


macs are amazing, you can close the lid on a macbook and the computer will be mostly asleep and won't burn down the house; and the best part is it'll work when you open it again. amazing tech.

macOS though, that's getting worse each year.


I have the exact same lid behavior on my HP laptop running Arch Linux.


Trust me I’d happily take a similarly specced Linux-compatible laptop over this company-mandated MacBook, alas.


You do know the reason the Mac goes to sleep when you close the lid is because of the OS right?


It doesn’t matter why it works to anyone but geeks


> anyone capable of using Linux is capable of hacking out that BS and getting a generally superior experience.

Go ahead, try to delete the useless Microsoft Edge browser if you're not in a select few EU countries.

In my experience, you can't do it cleanly. Asking LLMs will get you the following:

1) Modify a certain registry key to enable deletion. Which I did, but the only thing that accomplished was un-graying the delete button in the Control Panel. Once you press it, nothing happens.

2) Windows will eventually reinstall Edge. So you're basically screwed.


The swap and vm tuning required on Linux to make it work even somewhat reasonably compared to Windows or macOS is batshit insane -- or brain dead, to quote Linus. You shouldn't need to do any of it, and yet if you don't, you risk hard-locking your system when you run out of RAM -- or even just write a lot to a disk -- with zero warning.
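For what it's worth, most of that tuning boils down to a handful of vm sysctls (often paired with a userspace OOM killer like earlyoom or systemd-oomd). A sketch of the commonly suggested knobs -- values illustrative only, not endorsed:

```
# /etc/sysctl.d/99-vm.conf -- illustrative values, not a recommendation
vm.swappiness = 10              # swap anonymous pages less eagerly
vm.dirty_background_ratio = 5   # start background writeback sooner
vm.dirty_ratio = 10             # throttle heavy writers earlier, so one big
                                # disk write is less likely to stall the system
```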



