It's interesting that the problems are all with things that I actively dislike about the modern Linux desktop. I mean, I guess it's OK that it creates thumbnails of images... but the tendency to grind away for seconds whenever opening a big folder (Windows does it, too, I guess) is just annoying. I end up using the command line for most file-management tasks because the UI is too slow and cumbersome.
Also, it seems to be file types that would never be automatically parsed on Windows or Mac. I mean...a Nintendo music file? Why on earth would the desktop environment need to do that? (And, I say this as someone that composes chiptunes and enjoys listening to them, but I don't need my desktop environment to grok them).
And, I guess I like that Linux does things out of the box that Windows and Mac need third party apps for (much less so, today, but still a factor I notice when I reboot into Windows). But, maybe this is overkill?
And, yes, I think it's clear that Microsoft made a significant investment in security a decade or so ago, and it has paid off massively. Windows is remarkably more secure, stable, and reliable than it was a decade ago. I still prefer Linux, but the case for Linux over Windows is nowhere near as compelling and clear cut as it once was.
No, it generates thumbnails outside the main UI thread; sometimes it's a bit slow in so doing, but I've never seen it hang an Explorer window, regardless of file size or quantity. (Windows 7, but it would astonish me to learn that 10 displays a regression here.)
> No, it generates thumbnails outside the main UI thread; sometimes it's a bit slow in so doing, but I've never seen it hang an Explorer window,
As an example of the contrary, I've seen misbehaving third-party thumbnail providers cause Windows Explorer to crash entirely.
The only way to "fix" it was to uninstall the software which added the thumbnail provider, or go into the folder via cmd.exe and rename the file you "knew" caused the issue to a different extension while doing the operation you originally came to do.
That may have been on Windows 7, though. I don't know for sure whether this weakness still exists in Windows 10.
If the third-party thumbnailer is implemented as a shared library (DLL) that Explorer is configured to load, then sure, a segfault or similar in the library will kill the whole process. Not sure how that's Microsoft's fault. Sure, there's an argument that loading a library is the wrong model, but there's a performance tradeoff, especially given Windows' relatively slow IPC capabilities. The real surprise here is that Linux manages to be comparably slow, but when you complicate a simple, fast IPC model with dbus and a million other middleware layers, I suppose it starts to make more sense.
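For what it's worth, the safer model isn't hard to sketch. Here's a minimal illustration in C of running the risky parsing in a child process, where render_thumbnail() is a hypothetical stand-in for whatever call actually decodes the file: a segfault then shows up as a signal in waitpid() instead of taking down the file manager.

    /* Sketch: isolate a crash-prone thumbnailer in a child process. */
    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    /* Hypothetical stand-in for the real decoder. */
    static int render_thumbnail(const char *src, const char *dst)
    {
        (void)src; (void)dst;
        return 0;
    }

    int thumbnail_isolated(const char *src, const char *dst)
    {
        pid_t pid = fork();
        if (pid < 0)
            return -1;                    /* fork failed */
        if (pid == 0)                     /* child: do the risky parsing */
            _exit(render_thumbnail(src, dst) == 0 ? 0 : 1);

        int status;
        if (waitpid(pid, &status, 0) < 0)
            return -1;
        if (WIFSIGNALED(status)) {        /* a crash stays in the child */
            fprintf(stderr, "thumbnailer died on %s (signal %d)\n",
                    src, WTERMSIG(status));
            return -1;
        }
        return WIFEXITED(status) && WEXITSTATUS(status) == 0 ? 0 : -1;
    }

The performance tradeoff mentioned above is real, of course: a fork or IPC round-trip per file is exactly the overhead the in-process DLL model avoids.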
I'm not excusing the Linux implementation for being slow, which I agree it sometimes is (Hello Dropbox and my huge, flat "Camera uploads" folder...).
I'm just saying that I've seen bad things happen on Windows too. Not blaming Microsoft, just saying that in a typical end-user scenario with lots of randomly installed software, you will have Explorer break too.
That's not the problem: most DE file managers use an external process too (via dbus), but it's still slow, even on an SSD or when the cache has already been generated.
They're also buggy: XFCE's thumbnailer tumblerd (also used in LXDE and probably others) used to leak memory when it encountered video files with unknown codecs, and newer versions re-scan the entire cache every time you delete or move a directory, causing quite a bit of disk I/O.
I've found simpler file managers like rox-filer and pcmanfm to be much faster at thumbnailing.
There's a suggestion that this has less to do with inherent issues and more to do with wanting to display more information than can be obtained from whatever the Windows API calls stat(2). An indexer can actually help a great deal here, but I don't know if that's actually the issue - I should be getting a work-issued Windows 10 box pretty soon, and look forward to experimenting for myself.
That said, even the slowness under discussion doesn't actually hang the UI thread. That's something you have to be pretty special to get so badly wrong, in this day and age where even web devs are learning better than to do expensive work in the main thread, and have the tools available to avoid doing so.
In everything from XP through 7, you can. Select the view you want to make default, then press Alt to display and activate the menubar. In the Tools menu, choose Folder Options; on the View tab, click Apply to All Folders. Current Explorer windows may need to be closed and reopened to show the change; all newly opened windows from now on will take the view you chose as their default, but also remember any changes you make to a given folder's view.
I generally prefer Details view myself, but I know of no reason why this shouldn't work for List or any other. Enjoy!
Well, Windows 10 has become a form of spyware out of the box: it sends information about what one types and does to Microsoft. They use dark patterns so that users don't disable it and stick to the defaults.
macOS is also very chatty, but privacy is usually given more consideration. Not at all as bad as MS.
"macOS ... privacy is usually given more consideration"
Definitely not my feeling last time I upgraded macOS: I had to give my full name, address, *phone number* and *bank details*, even though the upgrade is free of charge.
I don't like having these details hanging around on a server somewhere on the Internet when it is not needed.
I felt like my profile was given a lot of consideration by Apple, not my privacy.
You do have to, *when installing a free macOS upgrade*: you can create an Apple ID without a payment method; you can probably remove the payment method after you enter one...
... But when installing or downloading macOS under the current Apple policy (tested this January, in Europe), the system will not let you continue with a 'None' payment method.
I was just thinking much the same - these things are a good reason to keep to minimal desktop environments and full control of the system.
My only interaction with "Tracker" has been to figure out how to disable it and get it off the system, as it was doing god-knows what and pegging processors. I can (and do) happily exist without apport.
I know this doesn't make me safe per se, but I do think that in becoming more windows-like and 'integrated' we end up with these unintended consequences.
Just a small comment on why the desktop environment might need to parse such a file: How about sorting music files according to their real length? Many desktop environments support such a feature, and users like those features, too.
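To illustrate why that requires parsing: even for something as simple as a canonical PCM WAV file, the real length lives inside the file (data-chunk size divided by byte rate) rather than in any directory metadata. A toy sketch, purely illustrative and not how any particular indexer is implemented:

    /* Toy sketch: why "real length" requires parsing. For a canonical
     * PCM WAV file, duration = data-chunk size / byte rate, and both
     * live inside the file itself. */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    static uint32_t le32(const unsigned char *p)
    {
        return p[0] | p[1] << 8 | (uint32_t)p[2] << 16 | (uint32_t)p[3] << 24;
    }

    /* Returns duration in seconds, or -1.0 on anything unexpected. */
    double wav_duration(const char *path)
    {
        unsigned char hdr[12], chunk[8], fmt[16];
        uint32_t byte_rate = 0, data_size = 0;
        FILE *f = fopen(path, "rb");
        if (!f) return -1.0;

        if (fread(hdr, 1, 12, f) == 12 &&
            memcmp(hdr, "RIFF", 4) == 0 && memcmp(hdr + 8, "WAVE", 4) == 0) {
            while (fread(chunk, 1, 8, f) == 8) {     /* walk the chunks */
                uint32_t size = le32(chunk + 4);
                if (memcmp(chunk, "fmt ", 4) == 0 && size >= 16) {
                    if (fread(fmt, 1, 16, f) != 16) break;
                    byte_rate = le32(fmt + 8);       /* bytes per second */
                    fseek(f, (long)(size - 16 + (size & 1)), SEEK_CUR);
                } else if (memcmp(chunk, "data", 4) == 0) {
                    data_size = size;
                    break;
                } else {
                    fseek(f, (long)(size + (size & 1)), SEEK_CUR);
                }
            }
        }
        fclose(f);
        return (byte_rate && data_size) ? (double)data_size / byte_rate : -1.0;
    }

And that's the benign case; the security problem in the slides is precisely that formats far hairier than WAV get parsed this way, automatically, on untrusted input.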
Sure, they're features. Some people like the features. I'm getting old and cranky. My thinking is that I have a program to index and play my music library...I don't need the desktop to know anything about my music, except which program to load when I click a music file. (I've been amazed at how awful this experience still is on Windows...every time you install an application, it'll seemingly try to take over as many file extensions as possible. Microsoft apps, in particular, are egregious about this practice.)
The standard KDE/Qt-based file manager (forgot the name) is much quicker, even when showing the details. Meaning, Nautilus is just very inefficient, as that other file manager is able to show things much more quickly.
That's KDE's Dolphin; it has been improved a lot over the last few years.
KDE in general has been in good shape, better than GNOME 3 in my opinion. I used GNOME 2 and GNOME 3 (up to 3.8, when they completely got rid of fallback mode) for about 10 years, and always felt that Nautilus as a file manager was basic and useless.
I switched back to KDE 4 (AFAIR it was 4.10 at that time). The desktop search and index implementation, NEPOMUK (what a bad name), had big performance issues when doing the initial indexing, and when it did its periodic indexing the desktop simply choked. It took a while to search for tips and best practices to settle it down. Later on I decided to completely disable it, as I don't need it...
Later on KDE switched to a new search and indexing engine called Baloo. It seems the design is better and the configuration more straightforward, but I still can't justify it: disabled ;-)
In short, having run Linux as my main desktop / workstation OS for 15 years, I've come to the conclusion that the out-of-the-box setup (of most distros) is not ideally optimized or secured, even for a Linux ninja; it takes time and effort to tune the system to suit your personal standard / taste (I now run Arch Linux on most of my devices).
I mean, the answer is unequivocally, without the slightest doubt, yes. The Linux Desktop is probably a good 5-10yrs behind Windows 10 in terms of defense-in-depth mitigations as well as exploits in common targets like file parsers etc etc.
Security without a threat assessment is not very meaningful. If we are comparing a default linux desktop installation and a default windows installation, what would the test setup look like?
Let's say we had two such machines, gave them each a reachable IP address, and let the first test just be them running unattended until unwanted software got in.
In the second experiment we had the same machines go to random websites (top 1k), clicking randomly, using the default web browser.
In the third, we let them click and run attachments from email spam.
In the fourth and final experiment, we hire pen testers to target the machines explicitly.
With the same conviction that xpaulbettsx wrote with, I have no doubt that the first 3 tests would show Windows 10 going down first. The number of threats targeting Windows users is just orders of magnitude larger than the number targeting Linux users. The fourth test might give different results, but users who want to defend against targeted attacks are generally advised to use extra security tools to defend themselves.
Just because Windows goes down first doesn't mean that Linux is more secure. It only means that it doesn't have a high enough market share to meaningfully exploit for a return on investment.
We should take the metric of "given a motivated party, how difficult would it be to exploit this machine?" I have no doubt people are already sufficiently motivated to exploit Windows. But maybe only the NSA gives a shit about Linux; do we leave them unchecked?
Security without threat assessment is meaningless because security is about preventing a negative outcome. If the environment is perfectly safe then no security is needed, and if the environment is infinitely hostile then no amount of security will prevent a negative outcome.
Example: being in an armored car in a war zone is still riskier than riding a bike in a peaceful countryside, even if the bike has significantly less security than an armored car. The advice, then, is not to tell people in war zones to get bikes, nor to tell people to get rid of bikes in favor of armored cars. Security needs to match the need, which depends on the threat level.
In my above post I included "targeted attack" as the fourth test, named by security theory as an attack by a motivated party towards a specific resource. If a motivated party wants to attack a specific resource, then the defender needs to raise security above that of general security. Many government agencies have policies based on such threats, and neither a default Windows 10 nor a default Linux distribution would qualify for such environments. SELinux, however, was designed for that threat level and is thus common in military organizations, banks, and similar high-risk environments.
You're ignoring that most Linux distros come with better defaults, i.e. no open ports. Reducing the attack surface is an important part in keeping the OS safe. Windows is remarkably bad in that regard.
Windows has come with its firewall turned on by default since XP SP2; most popular distros don't do that even today.
Windows 10 doesn't even respond to ping by default (which is a pita).
Not sure about other distros but opensuse comes with a pretty strict firewall out of the box. My logic says that this is probably common practice for many general-use distros.
I had a look at Windows 10. By default, assuming you clicked Private for the network connection, there are no ports that are open for any program to use. There are 33 rules for the All profile and 18 for the Private profile, some duplicates, each of which specifies what local program is allowed to receive data:
- 9 for Modern Windows apps
- 1 for ICMPv4, 12 for ICMPv6, 1 for IGMP, 1 for ISATAP
- 12 for TCP: Cast to Device, IPHTTPS, Network Discovery, WiDi
- 15 for UDP: 2 for Cast to Device, 2 for DHCP, 1 for Teredo, 1 for Delivery Optimization, 1 for mDNS, 7 for Network Discovery, 1 for Miracast
There is no unequivocal answer in the domain of security; "secure" is always relative to a threat model.
When your threat model includes Microsoft or US surveillance then no Microsoft OS can provide you the security you're aiming for.
Then again, desktop Linux is no OpenBSD or grsecurity[1]. A hardened Linux experience usually doesn't come out of the box with desktop Linux, but there are still options to explore[2] if you're so inclined.
Windows does have better technical defenses, but its attack surface is also much larger, due to the thousands of deprecated but still-supported older technologies kept for backward compatibility. So I don't know; it's just different problems.
The thing with the Linux desktop is that you can selectively enable SELinux, use PaX, etc. and have security comparable to, if not better than, Windows 10; plus the fact that Linux is a much more varied attack surface still applies.
Or you can do nothing, in which case you're probably less secure.
This is exactly the problem. You can buy a Windows 10 machine, and if it becomes vulnerable it's Microsoft's fault and you can count on them to fix it immediately, push out the automatic update, and go on with your life. If the default Linux desktop is insecure, then the consumer is supposed to figure out what exactly is insecure, be an expert in the alternatives, and know how to replace everything. The question isn't whether the Linux desktop can be made more secure, but rather why it isn't secure by default.
> you would count on them to fix it immediately, push out the automatic update and go on with your life.
Not only do you get security updates for your distro with the vast majority of Linux distros, but you also get them for all your third-party software, using the same system mechanism.
They may not push the updates automatically (you can of course set it up that way), but some of us still want to be in control of what gets installed on our machines.
That's not to say it can't be improved, but the situation isn't quite as bad as you are painting it, i.e. there are mainstream distros that come hardened by default, and security patches are regularly backported.
I'd argue that macOS is also technologically less secure than modern Windows, yet its users are in no more danger than Windows users are (despite the weaker security in theory), because security depends on a lot of factors, including user culture, market share, etc. Linux doesn't have a culture of downloading executables from random websites, for one.
I'd guess most distributions feel like they're providing an adequate level of protection for their users as of now, without introducing too much friction. Once that is no longer the case, it's easy to turn on a few more knobs, the software is already there.
The third-party software is a huge problem, actually. When you apt-get something or install it via the Ubuntu App Store, it gives a novice user a false sense that things are safe. This is even more problematic because there are so many things one needs to download on the default desktop to be on par with a default Windows install. The file manager UI, for example, lacks too many features, and the user must investigate alternatives and either assume that everything is all right or deeply examine the security vulnerabilities of each available option. The same goes for basic things like a text editor or calculator, and so on.
Why is getting the latest security updates giving a false sense of security exactly?
> The file manager UI, for example, lacks too many features, and the user must investigate alternatives and either assume that everything is all right or deeply examine the security vulnerabilities of each available option. The same goes for basic things like a text editor or calculator, and so on.
This is where you're being unfair: the notion that the default file manager is not good enough is subjective; it is plenty good for most people.
(Finder for macOS also lacks many features, yet many people never bother with alternatives).
Moreover, if you do need to find a replacement, if it is in the official repos, it probably means it is popular enough to be solid.
As for things like an editor, are you telling me that Notepad has more features than gedit?
> Why is getting the latest security updates giving a false sense of security exactly?
Because in quite a few distributions you don't get security updates reliably. For example Debian Stable excludes most WebKit-based libraries from their update policy.
So users of browsers like Midori and Epiphany, or e-mail clients like Evolution, on Debian Stable currently end up using a WebKit library that hasn't been updated in more than a year.
The same issue applies to Ubuntu and most of its derivatives as well. E.g. Ubuntu 14.04 users get a WebKitGTK+ library which hasn't been updated in almost a year.
Of course, the user doesn't get a warning dialog when he installs applications which rely on those outdated and insecure libraries.
Ubuntu users also shouldn't rely on packages from the Universe repository (which holds by far the most packages) if they care about security. Those packages are community-maintained and often don't get a single update in years. In the past they didn't even update Chromium reliably.
This is why I think rolling release is the only viable model of Linux distribution on desktops. Instead of overloading a team of security experts by expecting them to backport every security change to the stable version of a package, they simply follow upstream releases, build them, and make sure nothing is horribly broken (though they can't be sure the upgrade path is seamless for every possible configuration).
You. Are. Not. Supposed. To. Install. Debian. Stable. On. A. Desktop. Machine.
How many times will this have to be repeated? Stable is for servers. If you're running webkit based libraries on your server you have other issues.
For desktops, both Testing and Unstable are the valid options.
that's like... your opinion. The way you try to make it look as if it's a widely accepted best practice among Debian users is not cool. Debian stable is a perfectly viable distribution for desktop use.
Except... it is? Having spoken with quite a few people on #debian - hell, even a simple Google search agrees with that. Don't use stable unless you have a very old machine. Using stable brings you... stability and three-year-old packages. Using testing brings you stability and six-month-old packages. Using unstable brings you stability, sometimes fun things, and week-old packages.
For the Personal Anecdote Bonus Points, even sysadmin friends that swear by stable would tell me to use testing for a desktop distro.
People on #debian or your sysadmin friends are entitled to their own opinions, but they're not representative of Debian users other than themselves, and for every random recommendation against using stable you can find another random one about the dangers of using testing or unstable.
If you check the Debian web site, you'll see stable is the only release which is officially supported and recommended by the project, and there is no distinction between "server" and "desktop" use cases. Had stable been unsuitable as a desktop OS, you can be sure Debian developers wouldn't bother releasing and supporting thousands of desktop/graphical packages with the stable release.
In the end, stable, testing and unstable all have their pros and cons, strong and weak areas, and some are more suitable for some use cases. That Debian stable is only for servers and not a good desktop OS is a myth that needs to die. Stable is a damn fine desktop OS. Everyone is free to use whatever they deem best for their use, but when you post blanket statements like "You. Are. Not. Supposed. To. Install. Debian. Stable. On. A. Desktop. Machine." and "Stable is for servers" on a public forum, you are spreading misinformation and you should just stop.
My actual experience of post-release vulnerability patching is that desktop Linux will provide the required update in a timely manner and it will be installed transparently through the system update. The user often doesn't even notice there was a vulnerability in the first place. On par with Windows Update, though a bit less intrusive and annoying.
Linux Desktop is not secure by default for the same reason Secure Linux does not offer the best desktop experience: more secure means less convenient. Desktop Linux aims at being convenient.
Also keep in mind, we are comparing a single Microsoft OS to a variety of Linux distros, each with its own defaults. It would make more sense to compare all of those individually to see how they fit a couple of standard threat models, then put them through a week of everyday use by a user who doesn't know better and see how much damage the different OSes sustain.
Sure, extremely motivated individuals or governments might do that. But I'd still rate a distribution by its default security settings. Being secure by default is important if you are shipping to thousands of users.
Unfortunately, while SELinux itself is good, the tooling around it has to be the most atrocious, useless steaming pile of trash in this niche. From setroubleshootd randomly deciding to eat up 100% of CPU time (and no one being able to explain exactly what it does) to the endless fun of figuring out what policycoreutils-python & friends do and how, actually doing something useful with it is somewhere between "painful" and "frustrating". If Microsoft had published something like this, they'd have been the laughing stock of the whole Linux community.
I don't know anyone in my immediate circle of peers - not even people who use SELinux on servers or in products that they develop - who doesn't disable SELinux on their desktop. They're not idiots, either, nor re-booted Windows programmers that the IoT and DevOps craze has thrown into the Linux world, many of us have been using Linux since back when there was no E in RHEL.
I feel like there should be a way to write a new set of simplified tooling on top of the kernel API.
I've been running Fedora at home and on my laptop for about a year now, and don't need to turn SELinux off. I only needed to add one custom role myself, when trying to mount certain host directories as volumes in Docker. Which is fair enough.
Ubuntu comes with AppArmor enabled by default [1].
Unfortunately, its service will terminate at startup due to missing profiles [2]. This shows how much QA goes into security-related stuff in a distro of this size.
One of the slides also mentioned at what point each distro introduced ASLR (i.e. it might be worth upgrading your Ubuntu to 16.10 if you're on an older one).
I think you mean a new distro which is mostly, but not entirely, yet another Ubuntu derivative, which comes packaged with its own DE and related software.
And no, this new and perfect email client will still not try to beat Outlook by managing both email and calendar at the same time. Go away!
Seriously... What are Linux Mint and ElementaryOS even thinking?
Realistically, about the same. Many of the problems with Windows are rooted in being developed by a huge corporation with as much budget and manpower as it has.
I get your point, but I meant that the open-source community works because everyone can build on each other's work. The achievements would be multiplied by the potential reach.
> Maybe I'm not rational but I'm much more worried about Microsoft having access to my data than some random "Russian" hacker.
Russian hackers, past, present and future, routinely steal information and use it against the victim, whether we're talking about individuals or corporations. And I'm just talking about login credentials or credit card information, never mind more personal data like photos, or even intellectual property.
Microsoft hasn't done any of that, as far as I am aware.
Yes, that does not exactly strike me as a rational stance to take. Are you actually more worried about Microsoft blackmailing you about those photos from 4 years ago?
No, but I'd be worried about a government with the power to control Microsoft using such information in a negative way.
Basically this scenario comes to mind :
a) A power change occurs within a government. This power change facilitates the changing of laws.
b) A corporation with massive stores of information about individuals sits within this government's jurisdiction.
c) The new government doesn't like X people because they aren't Y people. The new government coerces the corporation, within legal boundaries, to fess up data on X people.
d) The new government does horrible things using the list provided by and facilitated by the corporation, through no direct fault of that corporation other than the happenstance of existing within a country with ever-changing laws and regulations.
Presumably the commenter is physically out of reach of the Russian and Chinese governments, and being the victim of regime change in the US is more likely than being invaded by the other two.
You may be aware that since Trump, when you're not a US citizen, you have no right to privacy. Everything Microsoft collects about the rest of the world is fair game. IIANM this was introduced with the Patriot Act update.
Yup! Since Trump[1]. As I said, IIRC this was introduced with the Patriot Act. At least outside the US, privacy-oriented online companies put forward that they are outside Patriot Act jurisdiction.
Well, I'm worried about Microsoft knowing who I am. It's hard to get legitimate Windows installs that aren't linked to non-anonymous payment methods. Not impossible, but hard.
Don't post online either, unless you are able to alter your writing style often. Past writings linked to your profile will be used to try to identify you.
It all depends on your threat model and how wanted you are as a target.
Why? Surely using multiple languages competently is a fingerprinting characteristic? Unless you manage to use them only in different contexts on different hardware...
Government and the police state are already abusing access to our personal information in awful ways. Sometimes that means cops stalking people whom they have a grudge against, or even a love interest in(1). CEOs of major companies leave their cell phones outside of important meetings.
When this metadata indexing was introduced in GNOME/KDE, many users complained, because it pegged their CPU and was really unasked for. But some felt that this was something Mac OS X had, and therefore some developers felt it was a good default. I'm not convinced, partly because of the increased attack surface.
The desktop environment itself is but a small part of the complete desktop. Some important differences between those specific desktops are:
1) Clicking a file both runs the code and opens the file, and the difference is hidden from the user.
2) Mail clients start pretty much any software automatically to open attachments.
3) Office software runs code embedded in documents with just a user prompt.
4) A lot of plugins are active by default. Flash and ActiveX used to be, but this is better now.
5) Code is run automatically on removable media insertion.
6) Users download software from random web pages instead of vetted archives.
These things are not technical but behavioral in nature, and they make desktops ownable. I hope the Linux desktop never emulates them. Web browsers have gotten so much better, but one simple thing they could do is stop downloading things automatically. That save dialog won't scare anyone, and users will stop having lots and lots of unknown files in their download directory.
I mean, competently implemented background indexing shouldn't be a security risk or a performance issue.
Microsoft had those problems, too, when they introduced their own indexer with XP. Doing desktop support for slow XP boxes, you rapidly learned to disable the indexer first. But by roughly mid- to late Vista, it had ceased to be a general problem. (Maybe earlier; I had ceased to be closely involved with support by then.)
The other thing about an indexer like this is that you need it well integrated into the UI to get the benefit. macOS has Spotlight, which is excellent. Windows has Start search, which is OK for programs if you don't misspell the name, and tolerable for documents if you use the MS default home directory structure. I haven't used desktop Linux since Ubuntu 8 or so, so I don't know what it has, but if it is indeed a Spotlight-like experience they're shooting for, file indexing is just the start.
> I mean, competently implemented background indexing shouldn't be a security risk or a performance issue.
Except of course in the 100% theoretical, never-ever-seen-in-the-wild case of bugs in file-format parsers. It's not like Linux's "file" or "strings" utility[1] has had local exploits in the past or anything.
Uhm... So yeah... About that....
Back in the real world, this is a very real attack vector. Especially when it runs in the background on a large batch of files, automatically and unasked for.
Note: I'm not saying I'm against indexing content for easier access and help locating files. I'm just saying that the security risk can't simply be dismissed, because it runs in the background.
Showing dialogs is not a solution. Various studies have already shown users click any dialog which pops up without actually reading the dialog.
Loads of browsers do download automatically. Making things inconvenient and delegating security decisions to the user isn't good enough. Make it convenient and secure!
PS/Edit: BTW, under Windows 10 loads of things are indexed. It makes things very convenient. You use your PC like Google: instead of knowing exactly where things are, you just "google" for it. By that I mean it has a well-working search that is also really quick at giving accurate results.
>Showing dialogs is not a solution. Various studies have already shown users click any dialog which pops up without actually reading the dialog.
I can't count the number of times that I was in the middle of writing a sentence, a dialog showed up, I accidentally pressed space bar and I was left wondering WTF just happened.
locate doesn't do what Windows 10 does with search (locate just does filenames). Plus, it's slower than Windows 10, and it doesn't give the most relevant results first.
Windows 10 experience: you press Start, type in a few letters, and you already get good, relevant results. This is completely different from locate!
Yes, and there were a few implementations from different groups, as you would expect. I'm typing this on a current Arch and locate, mlocate and slocate are all available from the mlocate package.
I've used locate ever since I started using Linux; my first distro was Slackware, downloaded onto oh so many stiffies (the hard plastic successor to floppies), circa 1994. I certainly won't die in a ditch insisting it was in there, but I would be very surprised if it wasn't.
I'll put it somewhere in the middle of my "mildly interesting to maybe know" research list.
Indexing for search - it depends upon what you use your computer for. Some of us want to do stuff with our processors and memory and get very annoyed when the OS decides to use resources we were hoping would be used for our programs...
Sadly the big ones are. Because they consider this behavior "user friendly".
At the same time they think they can contain the threat by wrapping everything in sandboxes. Effectively infantilizing the owner/user of the personal computer.
Feeling mighty smug about my preference for tiling window managers and minimal distro choices.
But I shouldn't: they found bugs in software I use daily (ffmpeg, for example), and it would be relatively trivial to make me execute something with it, since my brain is trained to see exes as threats, not mp3s.
Me too... It's usually a bad idea to apply one principle to everything, but I'm finding less and less reason not to just throw minimalism at everything. Complexity is the source of so many problems in software.
I believe in sandboxing, I hope it gets better and easier to use.
I work on several C programs. I wish for the day when we have an easy-to-use, cross-platform method of setting up a small set of open files at the start of a program, then being able to say "no more file access, no more network connections".
I know this hides a whole bunch of complication, which is why it's hard and why there are so many ways to do it -- I view it the same way as the move to distinct virtual memory spaces for each process. Once we have it, we'll wonder why we ever allowed every program free access to the whole file system for its entire life-span by default.
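The closest existing thing I know of is OpenBSD's pledge(2). A minimal sketch of the open-first, drop-later pattern (OpenBSD-only; the "stdio" promise keeps I/O on already-open descriptors working but forbids new opens, sockets, and exec):

    #include <stdio.h>
    #include <unistd.h>

    int main(int argc, char *argv[])
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s file\n", argv[0]);
            return 1;
        }

        FILE *in = fopen(argv[1], "r");   /* acquire resources up front */
        if (in == NULL) {
            perror("fopen");
            return 1;
        }

        /* From here on: no new file access, no network connections. */
        if (pledge("stdio", NULL) == -1) {
            perror("pledge");
            return 1;
        }

        int c;
        while ((c = fgetc(in)) != EOF)    /* work with reduced privilege */
            putchar(c);
        fclose(in);
        return 0;
    }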
That is one thing I've looked at, and it looks great.
Hopefully someone (and it won't be me :) ) will write a library which looks like pledge but wraps all the various mechanisms in different OSes (I hear words like seccomp on Linux).
On Linux, the low-level building blocks that can achieve something similar are seccomp and namespaces, but the only abstractions I am aware of involve separate launcher processes like runc[0] or firejail[1].
A library providing similar functionality to pledge that could be added during application startup or when doing fork+exec would be great.
"Better", but similar, would be to move to OS with the Object Capability model. Applications don't get access-by-default with security bolted on afterward, they get access to the objects they're initially granted and no way whatsoever to access anything beyond that.
Sadly that's a huge change in programming and security model for most and wouldn't be an easy change to make.
CloudABI https://nuxi.nl/cloudabi/ lets you run capability based apps side by side with traditional full POSIX apps on your OS. Out of the box on FreeBSD, patches exist for Linux and NetBSD, userspace support for macOS. So you get one binary that runs on multiple operating systems as a bonus :) The ABI itself is basically "FreeBSD, plus Capsicum always enabled from the start, minus any stuff that doesn't work under Capsicum".
That's a nice step in the right direction, thanks for bringing it to my attention. It does mean a fair bit of rewriting though which is mostly what my initial comment was trying to get across, it's a different world over there :)
> "No more file access, no more network connections".
You could potentially use setrlimit on RLIMIT_NOFILE to limit your number of open files.
Although... you probably still want to display something to the terminal which means you still want stdout and stderr, so an attacker could just close stdout and stderr before doing whatever they wanted with their 2 remaining fds.
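A rough sketch of that idea, subject to exactly the caveat above (and note the hard limit is lowered too, so it can't simply be raised back):

    #include <sys/resource.h>

    /* After opening everything we need, forbid any new descriptors
     * above the ones already in use. An attacker running code in the
     * process can still close stdout/stderr to free up those slots. */
    int lock_down_fds(rlim_t highest_needed_fd)
    {
        struct rlimit rl = {
            .rlim_cur = highest_needed_fd + 1,
            .rlim_max = highest_needed_fd + 1  /* unprivileged: irreversible */
        };
        return setrlimit(RLIMIT_NOFILE, &rl);
    }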
I used to be part of a team writing a large C++ application with Lua bindings. We had two Lua environments, and since you can specify the exact libraries available to Lua, we'd start up a Lua environment without filesystem or network access once we were set up.
Just thought it was an interesting approach I'd share.
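Roughly, the setup looks like this (a from-memory sketch against the Lua 5.3 C API, not our actual code): instead of luaL_openlibs(), you require only the libraries you're willing to expose.

    #include <lua.h>
    #include <lualib.h>
    #include <lauxlib.h>

    /* Build a Lua state with no io, no os: scripts can compute but
     * can't touch files, the network, or spawn processes. */
    lua_State *new_sandboxed_state(void)
    {
        lua_State *L = luaL_newstate();
        if (L == NULL)
            return NULL;

        luaL_requiref(L, "_G",     luaopen_base,   1); lua_pop(L, 1);
        luaL_requiref(L, "string", luaopen_string, 1); lua_pop(L, 1);
        luaL_requiref(L, "table",  luaopen_table,  1); lua_pop(L, 1);
        luaL_requiref(L, "math",   luaopen_math,   1); lua_pop(L, 1);
        /* Deliberately omitted: io, os, package, debug. */

        /* The base library still exposes file loaders; drop them. */
        lua_pushnil(L); lua_setglobal(L, "dofile");
        lua_pushnil(L); lua_setglobal(L, "loadfile");

        return L;
    }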
Hmm. If I look at the slides, the article should be renamed "gstreamer, and some stuff browsers on all platforms do, are insecure"?
Is it easier to change your media player on Linux, or to trust Microsoft?
Say, does a default Windows install still enable 20 networked services that don't belong on a home computer and can be exploited without the user downloading anything?
To answer the second question, no. A fresh install of Windows connected to the Internet will not be infected automatically (of course, assuming no new 0-day)
This title is pure sensationalism. The written piece says clearly: "security vulnerabilities in the GStreamer multimedia framework. A combination of the Chrome browser and GNOME-based desktops creates a particularly scary vulnerability." Somehow this very specific combination was inflated to become "Desktop Linux".
My default installation came with VLC, Firefox and KDE. No GStreamer or GNOME installed, and Google products including Chrome are not welcome. Though I'm pretty sure Manjaro is part of the Desktop Linux family.
Too bad this misrepresentation is hurting the message the OP is trying to carry to the world. Then again, the message is hardly news; the guys at grsecurity have been at it for 15 years, providing hardening security patches for the vanilla kernel.
Baloo, KDE's indexing engine, is apparently hit by the same vulnerability that hits Tracker (except they didn't fix it, according to the article). Also, are you sure Firefox and none of the KDE applications are using GStreamer as the backend (e.g. for HTML5 video)?
I dislike this idea that the Linux desktop is all Gnome and systemd, too, but the situation is pretty disastrous. Things that have an X in them, from X11 to (especially...) XDG shouldn't be trusted too much...
GStreamer is more a codec library than a media player. There is a multitude of (GTK-based) media players on Linux that tie into GStreamer. Heck, even Firefox calls upon GStreamer on Linux to play embedded media...
Important bit is on the later slides: Issues on most codecs/parsers can be prevented by sandboxing. An exploding parser should never affect other processes, files, etc.
seccomp (the BPF version) has only been available since 2012, really, but I hope more apps will start picking it up. It's pretty simple; it should become a shameful thing not to use it in new apps.
You can apply the generic sandboxes to the whole process, but that's not the same as a targeted seccomp. For example, you can use one of the external jails to stop your media app from using the network, and that's great. But what if you want to stream content from the internet? Without changing the source, you can't apply the no-network rule only to the decoding part. That's what still needs work from the maintainers.
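To make that concrete, here is a hedged sketch of what such a targeted restriction could look like inside the app itself, using libseccomp; decode_buffer() is a hypothetical stand-in for the decoding entry point, called only after all streaming is done:

    #include <seccomp.h>
    #include <stddef.h>

    /* Hypothetical stand-in for the actual decoder. */
    static int decode_buffer(const unsigned char *buf, size_t len)
    {
        (void)buf; (void)len;
        return 0;
    }

    int decode_sandboxed(const unsigned char *buf, size_t len)
    {
        scmp_filter_ctx ctx = seccomp_init(SCMP_ACT_ALLOW); /* default: allow */
        if (ctx == NULL)
            return -1;

        /* Kill the process if the decoder tries the network or new files. */
        seccomp_rule_add(ctx, SCMP_ACT_KILL, SCMP_SYS(socket), 0);
        seccomp_rule_add(ctx, SCMP_ACT_KILL, SCMP_SYS(connect), 0);
        seccomp_rule_add(ctx, SCMP_ACT_KILL, SCMP_SYS(open), 0);
        seccomp_rule_add(ctx, SCMP_ACT_KILL, SCMP_SYS(openat), 0);

        int rc = seccomp_load(ctx);
        seccomp_release(ctx);
        if (rc < 0)
            return -1;

        return decode_buffer(buf, len);   /* risky parsing runs restricted */
    }

A deny-list like this is weaker than the allow-list a real sandbox would use, but it shows why the filter has to live in the source: only the code knows where streaming ends and parsing begins.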
I am a big fan of grsec, RBAC and sandboxing stuff. But let's be real here, people! Those are good features on servers, where there isn't a giant security black hole called X, where any local exploit of an app can turn into a compromise of the entire GUI system.
Look at the hoops that adversary resistance focused distros like SubgraphOS have to jump through just to mitigate the giant attack surface that X opens.
Until Wayland becomes the usable default standard, "Linux Desktop Security" is an oxymoron.
Security should be multi layered. So if one thing fails there's still yet another layer of defence. This because everything will have bugs anyway, so it should be assumed none of the layers will ever be fully secure.
systemd offers various methods to restrict daemons in their abilities. That's hardly used. Only recently did Tracker start sandboxing its indexers. Why block adding other security layers while waiting on Wayland? There's no need to wait, nor do these layers depend on one another.
I didn't argue that there's one fix; I said that there should be multiple layers. If you'd read what I wrote, you'd have known this. Further, just being negative and calling names instead of making an argument isn't helping your case.
Sorry, but the French Gendarmerie does not run on Ubuntu. They use a custom-made distro called GendBuntu, which is based on Ubuntu as Ubuntu is based on Debian.
I seriously doubt it's like you say. It's probably a lot more like a "custom made distro" that's based on Ubuntu as Linux Mint is based on Ubuntu: all the core stuff exactly the same, and a few things on top different (namely the DE in the case of Mint).
> ASLR: Debian: Work in progress (Stretch / 2017).
From the dpkg-buildflags manpage:
> Additionally, since PIE is implemented via a general register, some architectures (most notably i386) can see performance losses of up to 15% in very text-segment-heavy application workloads; most workloads see less than 1%. Architectures with more general registers (e.g. amd64) do not see as high a worst-case penalty.
Is this the reason why the adoption of PIE is so slow? Does Rust enforce hardening techniques?
I've read the OpenBSD developers poke fun at Linux by saying the same thing. During the 2000/XP time frame, security problems became a serious threat to Microsoft's market dominance. Since then Windows has kept up with the best security practices and technologies better than most. It's very impressive considering I can still run Windows 2000 binaries on Windows 10.
It is important to decouple distributions from Linux itself. Some distributions do not place security as their top-most priority, but rather ease of use.
Then, there is no "one" Linux desktop. You have different X servers, different window managers, different desktop environments...
In Windows there's only one of everything, the configuration is less flexible in terms of what things you can disable, and once something is vulnerable that's it.
e.g.: a vulnerability in fonts being rendered in the kernel? What can you do about it, exactly? Nothing but wait for updates... but then the Flame malware installed itself via Windows Update. It's fantastic.
If anything, Linux may benefit from its relatively varied installation states in security scheme (SELinux, AppArmor, etc.), included libraries, and desktop environment. It is perhaps a bit harder to pull off one-size-fits-all attacks.
Things like data-at-rest protection seem to work better on Linux; as far as I know, there isn't an out-of-the-box solution for pre-boot authentication on Windows, for instance.
Edit: To the latter point, it looks like BitLocker has the mode to allow that, if you have Professional/Enterprise with TPM...
Windows RT was the playground for Windows security ideas. Every Windows RT device connected to a Microsoft account has device encryption backed by the TPM enabled for example...
Ironically, I didn't pay attention to the "[pdf]" part of the title, and as soon as I clicked the link, the PDF file got downloaded.
I have a pretty strict AppArmor profile for Evince (AKA Document Viewer on GNOME-based DEs), so I thought that automatically downloading PDFs and opening them in Evince instead of in the web browser would be safer. I hadn't even thought about this kind of attack surface.
Why are they comparing bugs from Ubuntu 12.04 "Unity" to Windows 10 and calling it "Linux security"? This is like shitty statistics... narrow your field enough and you will contrive the result you were looking for, like a numerologist.
I'm irritated by the initial example, which targets "Ubuntu 12.04". Why this 5-year-old version? Is it already fixed in newer ones? Because it will never be fixed for 12.04, and people are still using it?
You can make things more secure or less secure no matter the user.
Also: if one desktop doesn't check SSL certificates and the other desktop does, then one desktop doesn't even enable the user to be secure. Checking SSL certificates is a pretty recent thing, btw; e.g. various mail clients silently accept any self-signed certificate.
A random exe file you download won't be running through sudo; you'll see elevation requests. The elevation request is ~very spooky~ unless the exe is code-signed by a reputable certificate issuer. W10 turns the whole screen red and sometimes warns you that the exe itself is actively unsafe.
Of course, end users will just click OK on the elevation request, but regardless, it's not a fair comparison. Downloading random exes off the internet is more like 'curl | bash' ('still bad' level) than 'curl | sudo bash' ('are you insane?' classification).
No, his objection is something like "I'm a poweruser. I can shoot myself in the foot, please neuter me."
To elaborate a bit: most of the time, you don't have to sudo-install. If you're someone savvy enough to use a command line and understand these commands (if you enter them without this knowledge you're just plain stupid and nothing will save you), then there's no problem: you are the firewall, and you apply the level of caution appropriate for how much you trust your source.
Outside of architecture astronautical web development (usually done on a Mac while sipping some kind of coffee and milk blend) no it is not acceptable one bit.
I translated that to "the family of Linux distros that aim to provide a desktop experience out of the box", e.g. Ubuntu, Mint, elementaryOS, PCLinuxOS, CentOS, Zorin, Manjaro, Slackware, ...
Apport, GStreamer, Tracker/Baloo, Chrome/Chromium/Epiphany, ASLR: it isn't just GNOME. It seems the investigation started on Ubuntu (Unity 7). It affects multiple desktops and, as others noted, also Baloo.
There's security and there's safety. The Linux desktop may well be less secure, meaning that it could be successfully attacked by an experienced attacker. At the same time it's far less likely to be attacked, so it's safer, for the same reason as macOS: less market share, so fewer people are motivated to learn/research attack vectors.
They just don't have the money. They're behind because it never got the inertia of accumulated capital infrastructure, and it's finally starting to show. I bailed out to OS X late last Christmas.