> Free/libre OSS won the battle for the server, lost the battle for the desktop
I don't think it lost the battle on the desktop. If you don't rely on platform- (or interest-) specific software like the Adobe/Microsoft suites and some AAA games, then you'll be more than fine on the Linux desktop.
Shame that the battle for the desktop is actually (mostly) being fought on the laptop, and Linux sucks there. The default settings are usually extremely poor for power saving. Why don't powertop and TLP get installed, configured and activated by default when Ubuntu/Fedora detects it's installed on a laptop? Also, sleep still isn't reliable (if I close my MacBook with some work open, I can be damn sure it'll be exactly as I left it when I open it; with Linux, it's still not 100% reliable). Wi-Fi also still isn't nearly as reliable (packet loss, connection drops, etc.) as on either Windows or macOS.
The fact that TLP exists is a great step. It wasn't that long ago that I had a dozen different custom hacks in place to get better power usage while on battery; now TLP does all of them and more.
Still kind of disheartening that I get 10-14h battery life on my 13" 2015 Retina under macOS, but it drops to 4-5h on untweaked Ubuntu. I can push that to about 7-8h with the help of powertop and TLP. That's still a vast, vast chasm for Linux to overcome. Yes, I know MacBooks do non-standard things with Wi-Fi and Thunderbolt controllers (special ACPI sleep modes, etc.); that isn't a bad thing, and Linux should attempt those as well. To end on a positive note: I do agree TLP (and other power-saving tools) are good progress.
It's not you personally, but I see this a lot in tech, in particular from the FOSS crowd: 'Well, if you don't need x or y, then it's fine!'
FOSS will always be second tier for most people until the plethora of software available is at some relative parity, at least similar to that of macOS. (I'm talking strict availability.) The fact that there aren't more cohesive efforts to have big software vendors port their software to FOSS systems is, I feel, part of the problem.
I personally feel the FOSS community doesn't have good evangelism toward actual software developers who develop non-idealistic software. Adobe is a good example of this. So is Microsoft. So is Autodesk (with some exceptions, granted, but I think those come from the need to run in parallel-processing environments that Linux supports quite well, and Maya). Heck, so is Wunderlist. Or 2Do. 1Password. They don't make native apps, or even some semblance of apps, for any FOSS platform, and I see no attempts to evangelize coming out of FOSS communities to get these companies on their side. I'm sure there are exceptions I'm missing, or perhaps some of these examples have progressed, but my point still stands.
FOSS is great for users, and has really evangelized well to a certain subset of users in particular (mostly the tech-literate enthusiasts and developer crowd, myself included). I, however, will never accept that this is a good answer. We should be demanding better evangelism to get big-name products on the platforms. I do the best I can myself in this regard, but I feel that the entire idea that we should be more inclusive of these vendors/companies/developers is scoffed at in the FOSS/OSS communities, and I do not really understand why. Yes, they create code that doesn't meet the guidelines of FOSS/OSS, but if you don't create the best toolsets independent of them or have some semblance of balance, then the platform will stagnate at x number of users.
I think idealism colors perspectives but reality is much more grey than either side is willing to cede, perhaps.
It's an interesting point of view, but I'm not sure I agree with the underlying argument. Specifically, I have not seen the free software environment stagnating at all. If anything I have seen it accelerating fairly consistently since I first became aware of it in 1985. There was a time when I was forbidden from using free software in my day job. Now I am encouraged to use it. How far we have come.
This point of idealism comes up a lot and I think it's something of a red herring. Keep in mind that the FSF's goal is singular - to promote software freedom. It is not idealism to hold to that mission. It is their only purpose for existence.
Let me give you a real-world example of why a mixed scenario is not realistic. My wife has an Android phone that she bought from the phone company. It is on version 5.02. There is a bug in that version of Android where a log file fills up and the phone refuses to connect to Wi-Fi networks that have had a lot of activity. This bug has been fixed for a long time in the Android code base. The code base is open source. I can inspect it, compile it, etc. I can see the bug. I can see the fix. But I can't load the fix onto the phone because the phone is locked. I can't even get root on the phone. I can't fix the problem. The vendor has told me that since the phone is 2 years old (2 years!) no updates will be forthcoming and I should buy a new phone.
The problem is that no matter how free a piece of software is, it is really only as free as the environment in which it runs. We have seen over and over again that companies will collude to ensure that their interests trump those of the user. A world in which free software exists only as an extension of non-free software is a world in which it is marginalised to the point where software freedom is lost entirely.
It is not idealism that prevents software freedom from working well in organisations that do not want software freedom.
Where I agree with you is that we have a long way to go to allow users to connect the dots between the problems they have as consumers and the protection that software freedom affords. My own wife thinks it is completely reasonable to spend $1000 replacing a perfectly good phone simply because the vendor wishes it. Somewhat unusually, this is a consumer movement held dear by developers, but virtually unknown to the consumers it seeks to protect. This is clearly a problem. However, we won't fix that problem by abandoning the purpose of the movement. Will my next phone be an open source phone that denies me software freedom, or a proprietary phone that denies me software freedom? Does it matter?
I don't know if my point came across well, so I will try to reiterate what I mean here in some more detail.
To be perfectly clear: I share your motivations, and I cede that the FSF is a lobbying organization for promoting libre software. I want to note that I'm not specifically taking aim at the FSF here, just generally at the OSS/FOSS community in aggregate as I have interacted with it.
Now, to the argument! :)
My basic argument is as follows:
1. It's more important to have a libre/free OS than it is to have all applications ever be libre/free. This is a matter of pragmatism. Your wife's phone, for example, would be much better off if the phone OS itself were libre. This, in my opinion, is the best approach to take. It gives you the bulk of the freedom that one seeks without completely alienating developers who, in some cases, given the realities of today, rightly (or wrongly) don't want to or can't open-source their software. I'd rather have their software on a FOSS platform than have to leave a FOSS platform to run their software. That's just the economics of the world at work.
2. If one accepts #1 to be a justifiable position, which I think it is, the singular focus as far as the software part is concerned is to push that operating system forward with that set of ideals. Too often I find that the OSS/FOSS communities don't evangelize the platform, they evangelize the ideas, which, again, I completely get, and I think users, especially those of a technical bent, get it, use it, and benefit from it. However, I don't think the average user, or even the majority of technical users, can get away with the switch. In part, it's because the same logic that makes a FOSS/OSS operating system amazing doesn't particularly scale to applications. Again, I will reference the missing software: Adobe, Microsoft, even task apps like 2Do and Wunderlist, or the lovely Pixelmator, or even Zoho business apps are not available as native applications on FOSS/OSS platforms, and even Electron-type apps aren't scaling to Linux quickly. I think this is because the platforms aren't being evangelized. Everyone is concerned about the licensing, whether they can review an app's code, etc., or they get caught up in other mundane details, instead of unifying around a platform model that allows apps of all types to exist (which I realize they do now, but I'm talking about community voices).
3. I would say, to your point, that an open-source phone OS that runs non-free software is better than a non-free phone OS running non-free software. Why? Well, see above, but specifically because things like OS updates and OS changes would be viewable at a source-code level, so if any apps make modifications to the OS, you can see that.
4. The one thing, and this is my biggest point, is the lack of evangelizing as a platform for users: until recently, there was a huge lack of direction in the FOSS/OSS community. elementary OS is the first time I've seen a Linux distro actually putting users first and adopting modern design and aesthetics. Another recent example is KDE Neon, built on Ubuntu. Great desktop experiences. I feel the user experience suffers from a lack of focus on evangelizing a platform, instead of just ideals, that presents itself as a coherent whole and is accessible to develop for with some consistent standards. FOSS/OSS doesn't have to equal completely decentralized missions split between 2000 organizations. Where there has been unification lately, the better those products have become, and they actually entice users onto the platform. High idealism is well and good, but without focus on the platform, it's hard to argue for the switch. If you build a good platform and get developers to make apps for it that people want to use, you are, in my opinion, gaining more than you're losing with this approach.
I had never seen this approach until very recently in the FOSS/OSS community. I'm glad it's happening. It looks like huge steps in the right direction. Now if we could get some of the larger organizations behind a consistent, evangelized message for desktop and mobile, you could have some real change.
Thanks for clarifying. It definitely makes more sense to me than what I understood from the previous message. Personally, I think there is room for a variety of approaches. It's one of the reasons I'm happy that the OSI is around to champion open source methodologies as distinct from software freedom. Likely, as you say, there would be some benefit from people concentrating on building user-focussed end-to-end experiences.
Having said that, I don't think the FSF needs to get involved with that as the role of prioritising software freedom in an ever changing world is probably more important and also more difficult than ever.
One thing I would caution is assuming that encouraging non-free applications on a free operating system is always going to be win-win. I worked at Corel when they were doing their Linux distro. They misunderstood free software badly. They saw it as an opportunity to lock people into their proprietary applications by changing the underlying OS (without having to pay for developing it). This led to some pretty strange business decisions and ultimately wasted a huge amount of money.
There were a couple of spinoffs from ex-Corel employees who tried to maintain this way of thinking and it was always an uphill struggle. Probably the only one to make any kind of profit was TransGaming and I'll maintain that was substantially because Gav State is both a talented and genuinely nice guy.
I think in order to make this kind of thing work, you need to understand free software (and the goals of free software) at a very deep level. From my experience, it's just not compatible in a natural way.
I agree they misunderstood OSS/FOSS badly. The last thing a company like Corel needed (or needs) to do, and I know this is very opinionated, is to make their own Linux distro. This is kind of my entire point. A platform that is coherent and well evangelized (I know I keep coming back to this, but I was around for the 'we're going out of business in 90 days' Apple years; it took a huge level of evangelism to keep that ship moving) perhaps would have encouraged Corel to just make their software for Linux, which to me is the win. Yes, it's a proprietary word processor or image-manipulation tool, but the OS would still be free/libre, and Corel, at the time I'm assuming this happened, could have been a win for the platform.
I reject, as someone else mentioned, as an aside, that macOS is just free software with a non-free GUI on top. That is missing the boat completely.
Your story with Corel matches my experience with the majority of companies I've worked for: very few, if any, cared about FOSS for anything other than cutting costs without giving anything back.
This is why licenses like MIT are so loved by such companies, now with SaaS and Web UIs it is even better for them.
If you're into safety, then you should probably go with a functional programming language with low-cost abstractions. Rust isn't an expressive language; its only benefit compared to other general-purpose languages is its memory management.
I haven't developed this thesis fully, but I've been thinking that Rust's model is a better version of functional programming, and we lack a good term for it (since it's obviously not a pure functional language). The point of functional programming is to avoid shared mutable state by eliminating all side effects and using a rich type system that's hopefully easy for the programmer to use. Rust avoids unsafe shared mutable state, without requiring the avoidance of all side effects, and by using an even richer static-analysis system that tracks exactly what side effects are safe.
Of course, there are a lot of useful things a pure functional programming language gets you, like Haskell's implicit IO scheduling and threading, that Rust doesn't. But for many use cases where functional programming languages are great, they're great for specific reasons that Rust is also great at.
I've had similar thoughts. Rust does a really good job of asking, are you really sure you want that shared mutable state? It's possible to break out but I've found the friction involved has guided me towards better solutions.
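A minimal sketch of that pattern (the function and variable names are my own invention): mutation is confined to one exclusive `&mut` borrow, after which the data can be freely shared by any number of readers.

```rust
// Exclusive (&mut) access: while this borrow lives, no aliases may exist.
fn bump_first(scores: &mut Vec<i32>) {
    scores[0] += 5;
}

fn main() {
    let mut scores = vec![10, 20, 30];
    bump_first(&mut scores); // exclusive borrow begins and ends here

    // Shared (&) access: many readers at once, but none may mutate.
    let a = &scores;
    let b = &scores;
    println!("{} {}", a[0], b.len());
}
```

Trying to call `bump_first(&mut scores)` while `a` or `b` is still in use is exactly what the compiler rejects, and in my experience that rejection is the "friction" that steers you toward a cleaner design.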
It's worth noting that Haskell has "shared xor mutable" via the ST monad. It effectively provides mutable memory cells that aren't allowed to "escape" a scope, in much the same way as `&mut`.
Also having actual purity knowledge (something Rust punted on before 1.0) can sometimes be useful. Although honestly 99% of the time it's only for the benefit of the compiler, which isn't a big deal if you have the tools to write code that's efficient from the get-go.
e.g. list fusion is enabled by purity, but is largely uninteresting in Rust because lazy iterator chaining already orders operations and avoids intermediate lists just like list fusion.
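As a rough illustration (my own example, not from the thread): each iterator adapter below is lazy, so the whole chain runs as a single pass with no intermediate collection ever allocated, which is roughly what fusion buys a pure language.

```rust
// Lazy adapters: map and filter build up a pipeline but compute nothing
// until a consumer (here, sum) drives it, one element at a time.
fn sum_of_even_squares(xs: &[i64]) -> i64 {
    xs.iter()
        .map(|x| x * x)         // no intermediate "list of squares"
        .filter(|s| s % 2 == 0) // still lazy
        .sum()                  // single pass over the data
}

fn main() {
    println!("{}", sum_of_even_squares(&[1, 2, 3, 4])); // 4 + 16 = 20
}
```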
A common point I've seen is that "Functional programming sees the aliasing vs mutability issue and declares that the solution is to avoid mutability altogether. Rust goes in a different direction, saying that only aliasing XOR mutability is allowed".
It's solving the same problems, and the solution ends up having many similarities with functional programming.
When people keep writing "aliasing XOR mutability", don't they actually mean "aliasing NAND mutability"? And if yes, is there a reason to use XOR here? And if no, isn't it the case that if it's worth communicating, it's worth communicating clearly?
Can you elaborate on what makes you think Rust isn't an expressive language? And what features does your «functional programming language with low-cost abstractions» have that you miss in Rust?
In my experience Rust is really close to OCaml in terms of expressiveness. I don't know what features Scala or F# have that Rust doesn't.
The only major thing I can think of is higher-kinded types in Haskell. They are indeed cool, and there is work underway to add HKTs to Rust (though nobody is sure yet if it's possible).
I don't know a great deal about Scala or F#, but I know Scala has higher-kinded types as well as implicit function arguments, which are things that Rust does not have. OCaml has parameterizable modules and functors, tail-call optimization, and presumably other important features.
I'm much more familiar with Haskell, which definitely has a lot of things that Rust does not: not just higher-kinded types but sophisticated type-level programming, type families, functional dependencies, existential types, etc. Additionally it has lazy evaluation, a very sophisticated green-threading system, software transactional memory, and so on. Not to mention, of course, the purity. Haskell has a bunch that Rust does not; the reverse is true as well, but it's worth mentioning.
At the end of the day, though, Rust is meant to address use cases that no mainstream functional language does, and it is very successful at that, while also having a ton of really cool features, which makes it very exciting for FP enthusiasts.
Rust is very expressive. The main benefits of functional programming languages are captured by Rust's Iterator trait and the ad-hoc polymorphism that traits allow.
I don't think that means it isn't ad hoc. Type classes in Haskell were originally introduced as a form of ad hoc polymorphism, for example: http://202.3.77.10/users/karkare/courses/2010/cs653/Papers/a... (One of Wadler's great papers btw, worth a read.)
"ad-hoc polymorphism" is an informal term; traits are sort of both ad hoc and parametric.
I've sometimes imagined someone writing a paper about specialization and naming it "How to make parametric polymorphism less parametric" after Wadler's paper.
Here's an example: `Stdin` and `File` both implement the `Read` trait, so you can write a single function that accepts any type implementing the traits you need.
fn do_something_with<S: Read>(mut source: S) {
    // read from `source` here (`mut` because Read's methods take &mut self)
}
Then you can do the following:
let stdin = io::stdin();
do_something_with(stdin.lock());
let file = File::open(path).unwrap();
do_something_with(file);
You're free to narrow or expand the set of types that can be used with the function by adding more trait bounds.
As I understand the term, "ad-hoc polymorphism" refers to dispatch based on types (as opposed to "parametric polymorphism", where one code path works for any type). Nothing to do with type-safety.
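A toy contrast of the two (names invented for illustration): the trait method's behaviour is chosen per concrete type (ad hoc), while the generic function has one body that works identically for every type (parametric).

```rust
// Ad-hoc polymorphism: each type gets its own implementation.
trait Describe {
    fn describe(&self) -> String;
}

impl Describe for i32 {
    fn describe(&self) -> String {
        format!("the integer {}", self)
    }
}

impl Describe for bool {
    fn describe(&self) -> String {
        format!("the boolean {}", self)
    }
}

// Parametric polymorphism: one code path for any type T.
fn pair<T: Clone>(x: T) -> (T, T) {
    (x.clone(), x)
}

fn main() {
    let n: i32 = 42;
    println!("{}", n.describe());
    println!("{}", true.describe());
    let (a, b) = pair("hi");
    println!("{} {}", a, b);
}
```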
What are the benefits of the native package manager over something like vim-plug? As far as I know, it can't download/update/delete/snapshot/diff packages from within Vim.
> Thankfully there is a spectrum of software available, so other people's choices and preferences (whether proprietary or not) aren't forced on everyone.
The software stacks of the business and education sectors, mostly the ones close to users, are almost entirely proprietary software, which tends (and tries hard) to be incompatible with alternatives, especially with FOSS. The mentality of the regular user is to use what he or she has been taught to use; freedom has been presented as too "geeky" (catch phrases like "if you're different it doesn't mean you're useful"). The educational ministries literally sold those people to brands. It might sound quite dystopian, but it can get worse.
> Also, OpenBSD, DragonflyBSD, and FreeBSD can all be used as server or desktops without much fiddling. Their installers are better than anything in the Linux world. You can have one up and running with a nice wm, browser, etc, in 15 minutes.
When I wanted to try FreeBSD, the installation failed multiple times, and after the first success I couldn't install a DE for hours. I've also had problems with the package manager because I couldn't install anything before doing something with some configuration files. OpenBSD is in the same category, but at least it didn't fail at installation as much.
> I like FreeBSD, and I use it as a server OS and on a NAS box but you only need look at https://wiki.freebsd.org/Graphics to understand that if the "Linux Desktop" is a joke compared to MacOS and Windows then the "FreeBSD Desktop" is even more so.
I've never heard anything good about the "macOS desktop", and I'm pretty sure that the "Windows desktop" is far behind the "Linux desktop" (plugins, performance, menus, etc.). Have you tried something other than Xfce or Unity?
> far behind the "Linux desktop" (plugins, performance, menus, etc.). Have you tried something other than Xfce or Unity?
You're missing the point, GP is talking about a level below the DEs, namely the graphics stack, which if unavailable or inefficient makes any discussion about the DEs moot since the GUI may very well be unusable at all in practice.
In that regard the "Windows desktop" has plenty of favorable points and "macOS desktop" is just stellar because it has had a comparatively perfect track record of driver support including a compositor since forever.
With a VESA/OpenGL/Quartz software-render fallback, many people in the Hackintosh crowd have been using OS X on unsupported cards via software fallback without even noticing they weren't getting full Quartz QE/CI, which is nothing short of astonishing.
This is the signature attitude that points to the point being missed: what good is conky or awesomewm or ratpoison or xmonad or openbox if I can't make tearing and rendering artifacts disappear, nor get proper colour management or sane HiDPI support? (PS: Xinerama, I hate you.)
As for the DE experience on other platforms, Windows has had a kinda-tiling WM that regular people do use in the form of AeroSnap, while macOS is now getting tabs-in-the-window-manager, and Exposé+Spaces that evolved into Mission Control has been a positively brilliant experience for years.
Meanwhile on Linux, the very fact that you have to massage it into something useful ever so slightly at every level is, to me, a telltale sign of its deficiencies. Things are getting better (Wayland, DRI2), but they're getting better on the other platforms too. The best part is that nobody's standing still and things are moving forward (hopefully; from the outside it looks like Linux people are running in circles these days, so I hope the reinvented wheels are getting rounder).
> As for the DE experience on other platforms, Windows has had a kinda-tiling WM that regular people do use in the form of AeroSnap, while macOS is now getting tabs-in-the-window-manager, and Exposé+Spaces that evolved into Mission Control has been a positively brilliant experience for years.
> Meanwhile on Linux, the very fact that you have to massage it into something useful ever so slightly at every level is, to me, a telltale sign of its deficiencies. Things are getting better (Wayland, DRI2), but they're getting better on the other platforms too. The best part is that nobody's standing still and things are moving forward (hopefully; from the outside it looks like Linux people are running in circles these days, so I hope the reinvented wheels are getting rounder).
And then again, after almost 10 years of using xmonad at work and at home for everything, using Windows or OS X is definitely not a nice experience. I don't really care about GUI tricks or clutter on my desktop. I just want my terminal, my editor and my browser to appear when I press the key combinations, and I want my desktop to manage the windows so that they're always in the right place.
The other thing I don't miss from the Windows and OS X world is the updates. I don't want to spend a work day updating my system to the next version and possibly fixing things because something changed in Xcode and I need to reinstall it to get my toolchain to work, or something changed in the Windows Subsystem for Linux and again I'm spending time fixing stuff.
I like things to be simple and a very basic Linux installation with a good wm which you know by heart is miles ahead of everything else. And it doesn't change so you can focus on your work.
Most people, otoh, enjoy playing games, watching videos, that sort of thing. Some people even do work that's dependent on the graphics card. Just because you use your computer for a very minimal amount of tasks doesn't mean everyone else does.
Absolutely, I've been using it for a while (NixOS being my current distro), but I took what my parent was saying as that the graphics stack is irrelevant because they don't need to do anything with the graphics stack.
I do all of those things. My gaming PC is solely for games and Netflix, and it's connected to our television. I think of it more as a gaming console than a computer. It has Windows 10 on it just for Witcher 3 and to remind me never to install it on my laptop or workstations...
With my arch/xmonad laptop I can handle everything else, except that damn Witcher 3 :)
I think you are referring to the critique on television?
Perhaps you can pay more attention to the last sentence of vertex-four's comment. That being "Just because you use your computer for a very minimal amount of tasks doesn't mean everyone else does."
There are a million and one ways to use your graphics. Be it watching videos or gaming, which while you might have a problem with that, is actually done by people. It can also be used to smoothly decode and render a training video, or to make spreadsheets more pleasant to crawl through without the screen tearing and making everything look like a mess. There are even more minuscule applications you can do with graphics hardware and the software to drive it: transparency to subconsciously still remember what you're returning to when you click on another window, for example.
As nice of you as it is to warn the parent about their information-action ratio, you could use this opportunity to appreciate people's freedom of choice as well as understand there is more to Windows and Mac than Netflix and Call of Duty.
Wow, I am not sure how you managed to infer my lack of appreciation for people's freedom of choice merely from a title of a book. You may understand what I am trying to convey, if you ever read the book.
See, from my point of view, enough was added to the discussion with just a comment about a "niche book". The rest of the comments on it are mere reactions of ego :)
ChuckMcM's criticisms seem to be of the X Window System, not of any particular desktop environment. Architecturally, it really doesn't fit with modern systems and has a lot of legacy baggage. It doesn't matter what DE you like, X still powers it, and it's the issue.
Wayland is on the way, which may bring Linux graphics into the 21st century, but I haven't been following that very closely.
Apple's way more on the ball about GUI stuff. While OS X lags behind in OpenGL support, which sucks for games, they nail everything their desktop environment needs to function. But notably, they don't use something like X. They have their own software stack.
DRI2 is an improvement over DRI, of course (which was already about five to eight years behind SOTA for GPUs when it came out), but it's still a rickety, insecure mess--for one, any application can fire messages at any window in the X environment and read input sent to any window, which has some obvious implications for the usefulness of containers on the desktop--that is sorely in need of replacement. This isn't to criticize X when it was developed, mind you--it was fine then. It should have been defenestrated by 2002.
Full disclosure: I get the feeling from your posts that you're looking to win an argument, rather than learn anything, so I'm not likely to respond to you further unless your tone changes. HTH. HAND.
Thanks for the informative link on the technologies that comprise the graphics stack on a typical GNU/Linux system. I see the author also has a great explanation of the different code components that form part of the Direct Rendering Infrastructure (DRI): http://blog.mecheye.net/2016/01/dri/
It's not a security problem really. That's just a side issue. The problem is, today, if I want to watch a movie, play a game, do anything that requires using some slightly advanced GPU stuff, I'll get tearing, artifacts, or downright crashes.
This is mostly because X11 sucks. (It's also partly because the proprietary Nvidia drivers suck.) Nowadays I reboot into Windows to watch Netflix or play games (even Linux-compatible ones) because the experience is just much better. Linux has been constrained to work stuff.
Now Wayland is coming (apparently; Nvidia is still trying to pull some shit[0]) and should fix all that. That's great! I tested Wayland a few months ago, though, and my GPU's proprietary drivers weren't supported. So still no games, still no movies. :(
I think this honestly comes down to driver support rather than the Xorg stack. I dual boot with Windows 8.1 (Which is the OS supported by my manufacturer) and Arch Linux, and Arch is consistently more stable and faster for viewing videos and playing games than W8.1.
For example, I tried watching TNG from my external drive the other day (Using the latest VLC for both). On Linux this isn't a problem, the quality is good and there's no stuttering. On Windows it was hellish: The quality was extremely poor and it froze every (literal) 10 seconds to grab more data from the disk.
I've been using a Linux machine (Ubuntu+X11+nvidia+Kodi) as my HTPC for years. What's so sucky about it that I didn't notice during that time? No tearing, no artifacts, no crashes.
If you're using the nouveau drivers, everything's fine, but you don't get hardware acceleration. That means no games, but mostly tearing-free rendering, and no artifacts. The nvidia proprietary drivers is probably what's causing problems.
I'm using the proprietary drivers, 340.96-0ubuntu0.14.04.1 to be exact. The machine is 2009 vintage Atom 330 with Nvidia ION graphics, so I need VDPAU support for Full HD video decoding.
I have the suspicion that by the time Wayland is done, it will be as much of a mess as X is right now. Meaning that we would be better off just finding a way to treat every program as a root window (effectively killing the information leakage that exists right now).
Back in 2011, security researcher Joanna Rutkowska wrote a very enlightening article on the inherent security deficiencies of the X server architecture, which allows any windowed application to control any other one: http://blog.invisiblethings.org/2011/04/23/linux-security-ci...
Good to see she added a better graphics system at some point. I suggest looking at prior work, though, as it's by security engineers with knowledge of adding it to every aspect of lifecycle and intent to minimize TCB. For example, GenodeOS is using Nitpicker + a microkernel. The abandoned EROS OS, whose site is still up, combined a different windowing system with a capability kernel and new networking stack. All to minimize risk in system in fine-grained way.
Yup yup. Like, I recall a blog post by a Docker core contributor and employee saying, "Run your desktop apps in Docker containers to sandbox them! Just volume the X socket into the container."
(Of course, the Dockerfiles were also running apps--like, say, Chrome--as root, with no user namespaces, so it wasn't exactly great anyway.)
Sure it's breakable, but that still isn't bad advice. In practice, an application run with the evil bit set won't know how to break out of such a container, nor how to abuse the X socket.
It's only bad advice if you're relying on Docker to provide security against targeted threats. Know your enemy.
I.e., X11 works the same way as Windows, OS/2, BeOS, MacOS, OS X, etc.
No, it's not. OS X has GUI isolation; an application cannot read keystrokes or other applications' contents unless you explicitly give it permission to do so. This is the reason why applications that do more, such as Alfred, require you to explicitly enable these rights in the Privacy section of the Security & Privacy prefpane.
Windows also has GUI isolation (UIPI), but it's a bit murkier. As far as I understand, lower-privileged applications cannot read events from higher-privileged applications.
Windows 10 is already recording every keystroke/data of yours - what's the difference here? Also, Canonical's Snappy packages can kinda solve this problem.
Since about the time of Windows Vista, non-administrative windows aren't able to hook keyboard events for administrative windows.[0]
So while it's possible for a program to listen to keyboard events for other non-administrative windows (such as the password for a browser), it isn't possible for a non-administrative window to grab keyboard input for stuff like windows password prompts, or information typed into administrative console windows, etc.
Based on this thread, I'd say the problem is a lot of people haven't given them the time.
If you select your hardware, they work pretty flawlessly. Certainly right there with Windows or OSX or anything else. In a lot of cases you don't even need to be very selective, you just need to have a modern machine.
> Based on this thread, I'd say the problem is a lot of people haven't given them the time.
> If you select your hardware, they work pretty flawlessly.
Maybe the fact that I have to give them time is one of the major problems? You know how much time I invested in getting my Windows and OSX desktops to work flawlessly? Zero.
If that's your anecdote, here's mine: _Every_ time I install Windows I need some third party driver just to make graphics and networking go. Then I need third party software just to get a development environment or a decent shell.
Upgrading these are hit and miss, and it's more common than not that third party drivers do not support newer versions of the operating system.
Linux, meanwhile, integrates all of these components. As long as you stay away from non-supported third party drivers it just works, and upgrades are painless. (Until some desktop developers change the GUI again, but that's another story.)
I have been using Linux for a long time and while all the underlying criticisms of X and the desktop environments are valid I have to concede things have come a long way.
I think a modern Ubuntu 16.04 Unity desktop, for instance, is actually a bit of a revelation for long-time Linux users because it just works out of the box. I didn't have to install a single package or fiddle with anything for a change. It's fast, smooth, robust, works exceedingly well and, wait for this, it's even a delight to use.
Another thing: I am a long-time Gnome user and wouldn't even have tried Unity because of the (now, I think, inexplicable) bad press it has got, but it's been quite a revelation. I urge those who have not tried it yet due to the bad press to give it a shot and prepare to be surprised.
I compare it to my Windows machines and OS X laptops and do not feel a particular difference apart from preferences, and of course once you get into specific use cases like Adobe apps or gaming there is still ground to cover. But for a general productivity desktop with full acceleration I think it's there in many ways. If you need a rich ecosystem of dev tools it goes to the top. There is of course always scope for improvement, architecturally and otherwise, and I think that is happening with Mir, Wayland and Vulkan.
I do a windows reinstallation about once every second year, and I routinely get problems.
Network drivers are almost always missing. Looking at the device manager, about 5-10 different devices fail to auto-install, all part of the motherboard. The default graphics drivers tend to "start", but are limited in refresh rate and resolution, and moving windows around shows a noticeable stutter until I install the official drivers from the graphics card manufacturer. Sound normally works without issues.
On Linux, the problems are almost the reverse. I have yet to have network problems on a fresh Debian installation. Graphics are an all-or-nothing deal: either the X server starts up normally or it refuses to start at all. Sound is normally a pain, but seems to have better default behavior in the last 3 years or so.
> I do a windows reinstallation about once every second year, and I routinely get problems. Network drivers are almost always missing. Looking at the device manager, about 5-10 different devices fails to auto-install, all part of the motherboards.
What are you reinstalling, Windows XP every time?
Newer versions of Windows are getting better at finding and installing sane drivers for the hardware. It’s still not perfect, Windows Update doesn’t have all the latest drivers and OEMs ruin everything especially on laptops, but these days I install the latest Windows and everything usually just works.
Windows 7, desktop machine, and Windows Update doesn't really work without a functioning network connection.
But if you don't believe me, and choose to believe those reporting issues with Linux, just do a Google search on Windows install and network issues. There are plenty of people reporting the same issue. To cite Windows' own support page:
"If Windows can’t find a new driver for your network adapter, visit the PC manufacturer’s website and download the latest network adapter driver from there. If your PC can't connect to the Internet, you'll need to download a driver on a different PC and save it to a USB flash drive so you can install the driver on your PC"
At this point, Windows 7 is 7 years old. Do an apples to apples comparison here, and try getting your same hardware working on a copy of Ubuntu 9.04 or RHEL 5.3. How hard is it?
Why should I only look at the exact date on which Windows 7 was released, rather than the date a major release of Debian came out, say 2011 with Debian 6.0? On Debian: no network issues, occasional problems with the X server, problems with sound (as said above); the network works out of the box. On Windows: issues with the network, issues with graphics (until drivers were installed), no problems with sound.
A few months ago I bought a gaming laptop on its release date. Guess what: the network worked without issue. The X server did not start until proprietary drivers were installed. Sound worked.
Windows' release schedule doesn't match Debian's release schedule. Each year after a release, the default drivers get worse, but the general experience can still be judged by anyone who goes through the install process. What I have endured with Windows is issues with new motherboards and especially network drivers (and built-in RAID cards for the installer... good grief, that was a wasted afternoon trying to get the installer to accept the RAID drivers). Second to that, the fallback graphics drivers look crappy and are bad in every way, except that they're slightly better than a bare command prompt.
We're literally talking about a 15 minute setup on a variety of nixes on a wide-range of hardware.
Intel chipsets and their graphics drivers have excellent support out of the box. That goes for everything from refurbished Thinkpads to Chromebooks to used Macbook Pros to new desktops.
I run a Linux desktop for work (typing on it right now) as a dual-boot with my Windows setup. It is a "modern machine", at least if an i7-6700K and a 980Ti are modern, and I couldn't get X to start with Ubuntu 16.04 LTS. And yet it did with Ubuntu 14.04, which is bizarre. My time is valuable enough that I didn't debug it further than that--if I wanted to light my time on fire I'd just go set up a Hackintosh and have a better desktop experience, so for now I'm getting by with 14.04. But that sucks profoundly; it was my first attempt at a Linux desktop in probably three years and I was pretty disheartened by how things have gotten worse for my use cases in that time.
"If you select your hardware," you can definitely have a decent time. But I don't know many people for whom the operating system is more important than what you can do with it, and part of "what you can do with it" is "use your hardware."
Having had an issue with Ubuntu and nvidia in the past, you might want to google NOMODESET and setting it at boot, which should let you boot into X/Unity and get the latest drivers.
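For anyone hitting the same wall later, that boot parameter is typically set like this on Ubuntu (a sketch, not a guaranteed fix; nomodeset disables kernel mode setting so a fallback driver can start):

```shell
# Sketch: to try it once, hold Shift at boot, press 'e' on the GRUB
# entry, and append "nomodeset" to the line starting with "linux".
# To make it permanent, edit /etc/default/grub so the default kernel
# command line includes it:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
# ...then run "sudo update-grub" and reboot. Once X comes up, install
# the proprietary NVIDIA driver and drop nomodeset again.
```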
> But I don't know many people for whom the operating system is more important than what you can do with it, and part of "what you can do with it" is "use your hardware."
Absolutely. But if OSes aren't directly equivalent (and the hackability of a nix gives it more power than Windows can ever have), then it's worth sorting out those hardware issues (as frustrating as they are).
> Having had an issue with Ubuntu and nvidia in the past, you might want to google NOMODESET and setting it at boot.
Thanks; I later ran into something that hinted at that. Frankly, though, at this point I don't care. I have something that works and will be patched until 2019. I don't care about my desktops. Every second I spend debugging something stupid on a desktop is a wasted second. This is bad for me and I resent it.
> and the hackability of a nix gives it more power than Windows can ever have
Ehh--if you have to use that "hackability" to get something that is minimally usable, that's kind of a push. (Or, when you consider OS X, a serious negative, because the only thing I have to do to get OS X to where I want is install Homebrew and a short list of packages, none of which require configuration.) I don't care about desktop environments or tiled window managers, the extent of my interaction with my WM, which I could not name, is a sort-of reimplementation of Aero Snap. Again, if I wanted to throw a bunch of time away on an operating system, I would set up a Hackintosh and actually be able to use Photoshop. Which, in keeping with the theme of "Linux desktops are a fractal of unusability", would be a significant improvement. I tried to avoid a reboot and use the GIMP yesterday for something. I think I need counseling now. I ended up using Pinta, whose layered image format ("OpenRaster", which wants to be a standard but it seems like nobody uses it?) is so bonkers and edge-case that ImageMagick doesn't even support it, to say nothing of Photoshop or even Paint.NET.
It turned out, kind of to my surprise, that Linux on the desktop offers very little to me as a Linux developer, sysadmin, and long-time entrails-reading "power user". That's pretty damning, in my book.
Yeah...I went down a pretty bad rabbit hole trying to figure that one out. I mean, XCF already exists? And application support is marginal, rather than nonexistent.
Without a bug report, nobody can tell if your case is unique or general. You spent time ranting here, but with a bit of additional time you could have opened a bug report with the appropriate information and maybe helped people running into the same problem.
I think it should be obvious that I don't have a reproducible case anymore; this was a couple months ago now, after I built a desktop. Nor am I interested in expending multiple hours creating one, because that doesn't benefit me--my stuff works now, if suboptimally. I couldn't leave it in that state then, because I had work to do with that hardware; I couldn't leave it in a trashed state just in case somebody had questions and needed to autopsy it, now, could I? Expending more time than I did would be better served just biting the bullet and setting up a Hackintosh so I have an environment I like, rather than tolerate.
And this isn't a "rant". I promise you, when I'm ranting, you will know.
I'm self-employed, so yes. I've got four grand of Apple gear in my office, and if they sold an xMac I'd buy one tomorrow. I don't consider it to be a moral question, and in practice Apple doesn't seem to really care.
That's a fair position. From mine, Apple gets its vig from me in plenty of other ways. There's zero pirated software, music, videos, etc. in my house or my business; giving Apple another channel through which to sell me stuff, from developer licenses to apps, doesn't trouble me.
How are you supposed to run Adobe InDesign on a Linux desktop?
Or any other useful application, really.
Linux should just kill off its desktop. Mac OS X won the Unix desktop wars, and has the expected use models of a desktop, with a proper modern GUI API as well, instead of the ancient X hacks.
> Linux should just kill off its desktop. Mac OS X won the Unix desktop wars, and has the expected use models of a desktop, with a proper modern GUI API as well, instead of the ancient X hacks.
Mac OSX won the "desktop wars"? That's not even funny - it's one of the most awful DEs. You people have clearly no experience with any other OS besides the one from your favorite brand. "Hacker news"... more like "Noob Army News".
We've banned this account for repeatedly violating the HN guidelines. If you don't want it to be banned, you're welcome to email hn@ycombinator.com. We're happy to unban people when there's reason to believe that they'll only post civil and substantive comments in the future.
You have a longstanding pattern of abusing HN, including (I believe) with multiple accounts that we've had to ban in the past. We've given you many warnings and requests to stop. Since it seems you can't or won't stop, I'm banning this account as well.
I've never heard the opposite. I switched to OS X 4 years ago after being completely unhappy and irritated with Windows for over 12 years. Now, after months of using Ubuntu (Unity, Xfce) and Mint (Cinnamon), I'm convinced that OS X is best for the average (and power) desktop user by a huge margin. Although I would be much happier with something Open Source.
You can hear the opposite from me then. I have an OS X laptop and a desktop running Gnome (Fedora) and I find Gnome far better. Originally, I didn't like Gnome 3, but after I used it a lot, I got used to it; I added some extensions, and now I prefer it. In my opinion:
* Finder is garbage (alt tab is broken and non configurable. I can't figure out how to convince it that the built in display is always the primary display. It seems to struggle with nfs mounts which Gnome doesn't at all. I've had just as many, or more issues with OS X and projectors as I have with Gnome). Where is the centralised location for Finder extensions to tweak it to my preferences?
* The default command line tools that come with OS X are old and basic (reminds me of Sun's tools) and suffer a death by a thousand paper cuts (e.g. top defaults to sorting by pid, lol; they still ship bash 3.2.57; locales are all fucked up).
* brew, compared to dnf or apt, is not good. But that's Open Source people trying to shore up OS X deficiencies so I won't call it "garbage". (F)OSS people put in a lot of effort to keep it running and they deserve praise.
* The programs from Apple are all terrible. I literally don't use any except Safari sometimes since it allegedly doesn't use as much battery as Chrome or Firefox. I guess I sometimes copy paste stuff into Textedit and I sometimes use Calculator.
It's all certainly usable but there is no consensus that OS X is the best. YMMV. My feeling is that each time there is an OS X release, it jumps ahead of Gnome; but Gnome usually trundles along and surpasses it and maintains a lead most of the time.
It can be used for free until you decide to buy it.
> alt tab is broken and non configurable
Cmd+Tab? What exactly do you find annoying?
Try Mission Control (F3, or set a mouse hot corner in System Preferences -> Mission Control) to quickly see all windows (for all processes, or Ctrl+F3 for the current app, Cmd+F3 for the desktop) and just raise the one you want.
Cmd-` to change between windows within an Application is a behaviour I don't like and prefer Cmd-tab to cycle through all windows. Gnome defaults to the same thing as Finder but it's possible to change the behaviour.
> Which [of Apple's programs are terrible], and how, exactly?
Well since I stopped using a lot of these programs where possible, my opinions may be out of date. But Quicktime player doesn't play most files. It's slow. It hangs. It's all around worse than VLC and MPV. And you used to have to pay to upgrade it to do things that other players do for free. And why can't it play DVDs? Or can it? If so, why is there a separate DVD player? (Rhetorical questions; I don't care since I don't use either program).
App Store is outrageously slow to do anything. 14 seconds to check for updates? wtf? (server backend and not gui issue, but part of UX). I had XCode update fail at downloading the 3 or 5 GB ball of mud. Instead of resuming the download or checking the hash of the file to make sure it wasn't corrupt, it just tried to download 5GB again.
XCode takes up 5GB and seems to be required for some particular development work. It's not really appropriate for Mac Book Airs with their small ssds. So I just don't do that work or try to use a Linux VM.
Facetime just doesn't work 3/5 times.
Preview is alright for reading PDFs, but you can't look at an image in a directory and press a hotkey to see the next image in the directory (at least, I haven't seen how); eog does this.
Mail is an absolute pile of garbage. The threading is confusing as hell. It's soooo slooooooow. Then it hangs when pulling in sysadmin-style alert folders (with thousands of mails). Deleting mails is also very, very slow (i.e. cleaning out said mailbox takes days). Instead of pushing operations to a background thread where you can see the progress, like in Thunderbird or whatever, it just beachballs. And for some reason, if you want to attach Calendar to an online account you have to do it through Mail (wtf), and if you don't want Mail to be used for the mail then you have to configure the account correctly. Some people may be tricked into subjecting themselves to the pain of using Mail. :(
With Mail being so terrible, one ends up using Thunderbird or Outlook which have Calendars integrated. So Calendar becomes superfluous. Which is a shame because it works alright with online calendars, but it doesn't seem to have the integration to help plan anything with people.
I don't understand why Notes, Reminders, and Stickies all exist. Reminders should be rolled into Calendar. Notes allegedly integrates with Google but I don't see anything from my Google Keep account in my local Notes hierarchy. And is Notes supposed to compete with OneNote? They've a lot of catching up to do.
Then there's a lot of programs that I haven't opened in a decade since they used to be terrible (iTunes can't even play ogg out of the box, wtf) or I just don't have a use for them (photobox, game center, imessages). I used to have Pages and Numbers but I don't remember being impressed with them. From what I read, I haven't missed anything by not using them. But if they're free now, why aren't they installed by default for people who might want to write a document?
And if I've signed on to App Store with my icloud id (which needs a credit card to get the free updates, wtf!?), why is there no single sign on? Why do Game Center and Facetime and iMessages prompt me for iCloud credentials? I guess so I can sign in with different accounts (corp vs. personal?) But keychain access doesn't prompt me. So maybe keychain isn't backed up automatically to the cloud... :(
I used Linux on the desktop from 2001-2007 or so, and have been on OS X since then. It's no contest: OS X is smoother, more consistent, and lower maintenance. Linux desktops seem to have peaked with GNOME 2.x. I'm not even sure what's going on with Unity.
I tend to agree. My development workstation runs Linux (and it's fine, but on that machine I only need Sublime, a terminal, Chrome, and IntelliJ--anything creative or entertaining I do on my laptop or in Windows respectively) literally only because I'm not buying a Mac Pro. It works, sort of, after spending a nontrivial time figuring out kludging together decent replacements for common functionality (xsel garbage for pbcopy/pbpaste, etc.); it probably helps that I already install the GNU coreutils on my Macs because I know the GNU tools better.
In retrospect, a Mac Pro would probably not be that much more expensive than the time I spent getting this fairly minimally demanding environment set up, though on the other hand I did re-learn a decent bit of stuff about Linux in the process. On the gripping hand, things related to X aren't particularly important to my life, so that's kind of a push.
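(For anyone hunting for the same kludge: the usual pbcopy/pbpaste stand-in, assuming xsel is installed, is a pair of aliases like these.)

```shell
# Hedged sketch of pbcopy/pbpaste stand-ins via xsel; put these in
# your shell rc file. xclip works similarly if you prefer it.
alias pbcopy='xsel --clipboard --input'
alias pbpaste='xsel --clipboard --output'
```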
I agree that GNOME and Unity have jumped the shark — I think I started to notice when GNOME got rid of xscreensaver and replaced it with a screen-blanker, with an unfulfilled promise of adding screensavers back someday (cf. https://mail.gnome.org/archives/gnome-shell-list/2011-March/...).
Meanwhile everything went down the path of compositing and otherwise burning way too much CPU to do not a whole lot extra.
I switched over to a tiling WM years ago, and have been extremely happy since. It does exactly what I want; I can connect to a running instance with my editor and reconfigure it, as well as rewrite it while it's running. It's pretty great.
Then I'd encourage you to try KDE. 4, not 5; 5 is still being developed.
I don't know what's going on with the Gnome guys, but KDE is flat-out the best desktop manager I've used, and I'm including Windows and OS X in the comparison. It's not without glitches, but the glitches are mostly of the "Google can't be bothered to make Chrome cooperate, and no-one has updated Emacs' DM support in the last ten years" sort.
As a long time Mac user I find KDE hard to enjoy. The UIs of KDE apps are almost universally overpacked, messy, or lopsided (one side of the window has its controls crammed while the other has awkward white space). It might sound like a silly gripe but it really bothers me. The design of apps from GTK+ desktop environments (GNOME, Cinnamon, Pantheon) generally feels much better, but they have their own problems.
Would you mind explaining what you mean by outrage? For productivity, Unity beats any Gnome 3 setup I've tried so far. And we're still not talking about the maintenance/security nightmare that is the Gnome 3 plugin system.
> By "outrage" I've meant canonical has developed it because they didn't like gnome shell(they like the innovation-thing).
After watching the "we're the only ones who know best, so shut up" antics of the Gnome developers, I can understand that Canonical got cold feet.
> Unity has problems with multiple monitors, consumes more RAM and CPU and is also hard to customize.
In my experience, setting up multiple monitors was much less of a hassle under Unity than under Gnome. I don't have any recent data on the resources use, but I wouldn't run either Unity or Gnome on a low memory setup. Firefox and Chrome dwarf any RAM use for compositing anyway.
>But we should talk about it if we're here...
Ok, gladly. In Gnome 3, a lot of functionality comes from extensions. Even changing the theme (from a black top panel) needs an extension. Installing extensions is done over the web using your browser (this works to a varying degree out of the box). I don't know of any recent changes, but about a year ago I had a look under the hood, because I wanted to make my own extensions, and was shocked by how they work.
First of all, there seem to be pretty much no integrity checks, signing, or hashing to prevent malicious extensions. That's pretty much a no-go if you want to use Gnome in any kind of industrial setting (you will have to maintain them offline and manually on the file system level, sidestepping the supported way).
My second gripe is the stability of extensions. Since you're downloading from a website, the currently offered versions may not fully support your slightly outdated Gnome install. However, if you keep your Gnome up to date, expect random failures of your extensions.
From http://lwn.net/Articles/460037/ about the reasoning behind sidestepping distro packaging (emphasis mine):
"The second reason is that extensions are not working with a stable API, so have very specific version requirements that don't really fit into the normal assumptions packaging systems make. If a packaged extension says it works with GNOME Shell 3.2 only and not with GNOME Shell 3.4, then that will likely break the user's attempt to upgrade to the next version of their distribution. We'd rather just disable the extension and let the user find a new version when it becomes available."
So, you just updated your Gnome and your productivity extensions fail. Now you have to search replacements and, should you find them, configure them anew. Sometimes extensions just randomly fail. This was the main reason I finally gave up on using Gnome for work.
And speaking of stable APIs: another shock was to see that extensions don't operate against a plugin API but are basically Javascript code smudged into the existing running code. This makes gauging the effects one extension has on another pretty much impossible.
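For reference, the version pinning described above is declared per-extension in a metadata.json file, roughly like this (a sketch; the uuid and name are made up):

```json
{
  "uuid": "example-extension@example.org",
  "name": "Example Extension",
  "description": "Illustrative only",
  "shell-version": ["3.2"]
}
```

If the Shell updates to 3.4 and "3.4" isn't in that array, the extension is disabled, which is exactly the failure mode quoted from the LWN article.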
My final conclusion regarding Gnome 3 was that it is a wobbly work in progress, less configurable than Compiz (there just aren't that many extensions to choose from in the end), based on questionable design principles and taste. It's okay for hobby use, I guess. But I have yet to find a distro that provides as polished a DE setup with Gnome as Ubuntu does with Unity.
Don't get me wrong, I want Gnome to be great, since many distros use it as the default DE and I want an alternative in case Canonical mucks up Unity with their Mir transition and their focus on smart phones. There's also Cinnamon, which I kind of like, but it has the same problems regarding extensions as Gnome. I will give KDE a closer look in the future.
> After watching the "we're the only ones who know best, so shut up"-antics of the Gnome developers, I can understand that canonical got cold feet.
If only we could exile the GNOME devs, Canonical & the systemd devs to a desert island, the state of the Linux desktop would be … well, probably not as good but at least it'd be a much more collegial community.
To get back to the original topic: the attitudes of the GNOME devs, Canonical & the systemd devs reminds me of that of the OpenBSD devs, with the exception that the OpenBSD guys are generally right, just socially inept in how they convey their message.
> If only we could exile the GNOME devs, Canonical & the systemd devs to a desert island, the state of the Linux desktop would be … well, probably not as good but at least it'd be a much more collegial community.
I wouldn't judge them that harshly, but the kerfuffles had the effect of splitting the Linux community into those who embrace progress and those who shun it. Sadly, most of the experienced users went the latter way, because the progressive path was full of cranks.
> To get back to the original topic: the attitudes of the GNOME devs, Canonical & the systemd devs reminds me of that of the OpenBSD devs, with the exception that the OpenBSD guys are generally right, just socially inept in how they convey their message.
Most of this, in my experience, comes from sticking to a very strong opinion that is heavily based on ideals. The further apart from the real and existing world these ideals are, the more caution should be taken when implementing them. Otherwise you will be placing a huge turd on someone's desk. During work hours. On a deadline.
I think this is the main problem with the attitude that the Gnome and systemd devs have. The Canonical devs took their ideals at least closer to a working model (OS X), and they were (I assume) motivated by pragmatism.
The OpenBSD devs base their ideals probably closer to their own experience. That makes it more likely to be right.
I can recommend using Forklift. It has its own share of bugs, but you don't get the freezing and generally crappy experience of Finder when accessing network shares :)
> Now after months of using Ubuntu (unity, xfce) and Mint (Cinnamon) I'm convinced that OsX is best for average (and power) desktop user by a huge margin.
I recently set up Ubuntu and Mint computers for some folks, and can report that Mint is not bad, but Ubuntu is dreck (seriously, the change-user-password dialogue hangs: do Ubuntu users never change their passwords‽).
I've been running Debian with stumpwm for years now, and am convinced that this is the way of the future: a tiling window manager, extensible in a real language, capable of performing literally any task I ask of it. Most of the time, my main window is full-screen — that's odd to someone used to a classic desktop interface (as I used to be), but it's actually very much like a modern tablet or phone.
I use KDE and can confirm, unfortunately, that (at least on my setup) I have encountered many bugs pertaining to graphics. I also often get PulseAudio issues like the sound skipping when the system is under heavier-than-usual momentary load. I can't run Chrome developer tools without it crashing inside the driver every few minutes. Mounting an NTFS partition works on second try every time. I've used GNOME previously, which was too memory-hungry and often too unresponsive to use.
I love Linux as a developer platform and that's why I'm staying here. But if I could give up the shell, package management, understandable system architecture and the like, I'd move to Windows in a jiffy. Its desktop works flawlessly on my PC.
VirtualBox has a so-called 'seamless' mode, in which the X11 windows from a VM running on a Windows host appear as separate windows on the Windows desktop. However, they only appear separate; they don't have separate panel buttons and they are not directly reachable via alt+tab. One needs to switch to the VM first, then use a keyboard shortcut set up in the VM to access them.
If you alt-tab to the VM, you have to alt-tab again to switch between Linux programs.
Usually that's not a problem, because I only have Emacs and/or a terminal session running. With the right Control key, you switch back to your host OS.
To switch to a windows program from linux you simply press RCtrl+Alt+Tab.
Virtual desktops in Win 10 are very handy while using a VM. Ctrl+Win+d opens a new virtual desktop. Ctrl+Win+Arrow Keys switches between them.
> I've used GNOME previously, which was too memory-hungry and often too unresponsive to use.
Maybe you should buy a dev computer instead. Gnome consumes far less memory than the average DE. I always try every DE at every release and as far as I've experienced gnome-shell and cinnamon are the best at customization/plugins/performance.
Edit:
> I'd move to Windows in a jiffy. Its desktop works flawlessly on my PC.
I'm currently at work with both Windows and Ubuntu in VirtualBox - gnome-shell is pretty smooth, while Windows appends the "Not responding" text to every window's title, the search in the menu is much slower than it should be, and the apps are often frozen - it is far from "flawless" for me.
To me, it looks like a lie, because there's no information about hardware, distro, kernel, etc.
For example, PulseAudio has worked fine for me for at least 6 years on 6 notebooks and 2 workstations from various vendors (HP, Dell, Acer, Medion). But if it works badly under heavy load, then an obvious command will fix that: `sudo renice -n -10 $(pgrep pulseaudio)` .
Assuming that other people are lying because they have trouble with things you like is paranoiac behavior. The almighty Linux desktop is not important enough to lie about.
I am the leader of a Linux User Group in my country, so I am aware of the typical problems with the Linux desktop. The last problem with PulseAudio on a major distro that I heard of was years ago.
Linux on the desktop is still full of issues that were solved by Windows and Mac a decade ago. Fonts still look like crap (Ubuntu is an exception), and even with patches there are still problems with Java, old GTK libs, etc.
I couldn't care less about games, MS Office, and a lot of other things that are deal breakers for some, but I have tried Linux on the desktop and it is still lacking.
I agree that Linux on the desktop has issues but fonts isn't one of them. Linux has had excellent open-source fonts and font rendering for a long time.
My experience with fonts is the opposite: in Win 7 the letters always look tattered and show color distortions around the edges, no matter what I do with the ClearType settings, both in the browser and in other programs such as Outlook. In Debian with Xfce or Gnome, the letters look much better - smoother, with almost no color distortions apparent. I don't know whether it is because of a different font or a different rendering (hinting) algorithm, but I have never seen text rendering on Windows look as good as on my Debian.
My latest foray involved the Infinality patches on Fedora 23. Honestly, with the exception of Ubuntu, this is the best I've ever gotten with Linux. I think part of this is due to Google having released a lot of great fonts with open licenses (Droid Sans Mono, for example). But they still aren't as good as Windows 7. And it's not because I'm just used to Windows: I rarely use Macs, but when I do, I'm more than happy with the fonts (they're probably better than Windows'). On both, the display and sizes are consistent across all apps, whereas Linux has so many different toolkits that don't respect the defaults - Java, VLC, Chrome, etc. - basically like 50% of the apps I'm using most of the day.
My latest try with running Fedora 23 on my laptop outside of the VM seemed okay... until a recent Kernel / X update turned it into a hot mess - fan constantly running, processor overheating warnings in the logs, etc. Maybe it was my NVidia card or something. The problem is that I have a Dell Precision which is one of THE few models (along with Thinkpads) that are well supported on Linux.
But I just don't have time to deal with these things anymore. If Windows 10 doesn't clean up its act, I'll probably be moving to a MBP, even though I haven't had good luck with them in the past.
Honestly, I'm at the point where I'd pay serious money to Red Hat or some other company (maybe Dell or Lenovo) for a well-supported laptop/Linux combo: nice fonts, supported discrete video, working ACPI, upgradable parts, and no spying on or trying to monetize the OS user, nor dumbing down the interface a la Apple and MS.
> What exactly do you play i wonder, sure there are some games, some 1st party games from Valve come out to Linux, some 2K titles will get a Linux release on steam 3 years down the line (and usually be somewhat broken), and there are some indie games.
You need to do some research.
> But please let's not portray Linux as a viable gaming OS.
I've been playing on Steam+Linux for years and I don't really miss anything. There are 1-2 games I run under Wine+Steam, but all the others run on Linux.
But please let's not portray Windows as a viable OS.
kek.
Seriously, the fanboyism is getting a bit extreme.
What % of games that come out each year are playable on Linux?
What % of AAA games that come out each year are playable on Linux?
What % of games that come out from the big 3 are playable on Linux?
How come Windows is bad and evil, but a closed-source platform (one that constantly abuses its monopoly - go listen to what devs and small publishers have to say about Steam) that runs on top of Linux with always-on DRM is acceptable to the free folk?
P.S.
How come the Linux system requirements for Steam titles so often come with the warning "AMD and Intel cards are NOT supported"?
> So no the Linux gaming market is still pitiful it's not much better than it was 5 or even 10 years ago with Wine if anything it's worse since DX11/12 games pretty much do not run on Wine.
It's MUCH better than 5 years ago.
> On the indie side things are much better so if you are only a very casual gamer or really like some old school stuff you'll have things to play, if you want to play AAA titles Linux isn't where you find it.
Those "AAA games" tend to be casual, over-hyped walking simulators...