I have always been a big fan of Unity. The only changes I don't like are the ones introduced under pressure from users.
Namely, removing the "dodge" option for the launcher and disabling the scroll on it[1][2]. It now allows you to switch between windows of the same application on the same desktop which has to be in the running for the most useless UI function ever devised.
I hadn't really paid attention to the HUD for quite a while, but a few days ago, frustrated with GIMP's cleverly hidden sub-menus, I tried it and was blown away by how easy it is to jump straight to the desired function if you know (or can guess) the name. Invoking "Arbitrary Rotation..." doesn't even take a second. I felt like a magician; I wasn't using the system, it was an extension of my being.
I'm curious - as a long-time Ubuntu user who is totally resisting any effort to switch to Unity - how does one 'learn' the menu functions available for a new app with the HUD? I don't know anything about it, but is it still possible to see all the menus/submenus with the HUD somehow, or does one have to poke around the app to see the menus? As a naive outsider looking at Unity, I'm not sure I see the appeal of the HUD... maybe as an experienced user you could bridge this gap for me?
As others have said, you can look through the normal menus - the HUD is an addition, not a replacement. You can also guess: e.g. in GIMP, if I want to crop an image, I can just hit Alt, type 'crop', and see what's available.
In most applications I don't use the HUD very often, but for editor applications with complex menus - like GIMP or Scribus - it's very useful, and it works well.
Personally, I find the HUD useful in those situations where there is no keyboard shortcut associated with a particular menu item and one can't be configured either.
Plus, I find it easier to just type a few characters than to have my fingers perform a meaningless dance involving multiple modifier keys.
The HUD is activated with the alt key (by default). Holding down the alt key for a while reveals the classical menu. That's what you would do to discover the options available to you.
I actually like the window switcher, but it would have been far, far better if Canonical had made it so that alt-scrolling on the appicons would have had that effect. Scrolling the launcher should have been standard.
> It now allows you to switch between windows of the same application on the same desktop which has to be in the running for the most useless UI function ever devised.
You don't use file managers much, do you?
Or work with more than one spreadsheet or written document at a time.
I quit Ubuntu because the latest versions of Unity use 3D acceleration, which consumes a lot of battery in a VM. I didn't find a solution for it (except using it natively).
I followed some suggested solutions that either didn't work or made the window manager very unstable. I really liked Unity.
> I quit Ubuntu because the latest versions of Unity use 3D acceleration, which consumes a lot of battery in a VM. I didn't find a solution for it (except using it natively).
I don't use Ubuntu, but I have to ask -- why did you quit it because of Unity? You can install anything else you want.
That's true; I use Awesome on my Ubuntu work machine. That said, if you're not going to use the spiffy UI, you're better off using a simpler distro, in my opinion, without all the crap that Ubuntu insists on putting on top of Debian.
Fair enough :-). Can you install Unity on anything other than Ubuntu, though?
I'm really not flamebaiting, I swear. I actually used Ubuntu 4.10 a long time ago (come to think of it, hell, I can't believe it's been almost nine years), but I long since switched to Debian and Arch. I kinda fell out of touch with the Ubuntu status quo.
You can install Unity on Arch[1], but you also need to install several libraries with many Ubuntu-specific patches[2], since Ubuntu loves to patch stuff downstream.
I'm also not sure whether it will work flawlessly, as I haven't tested it. For example, Ubuntu also ships a patched Mesa, which that project is missing and which is supposed to contain some Unity-specific patches.
There were some projects to bring Unity to OpenSUSE and Fedora[3], but they seem dead now and aren't available for the latest versions of those distros (last versions supported: OpenSUSE 12.2 and Fedora 17).
Running Ubuntu in a VM, I've been unable to update to 13.04 due to them dropping Unity2D. My VM doesn't have enough video memory to run Unity 3D, and that's not going to be changing anytime soon, so I'm stuck on an old version.
If you're not too attached to Unity, try Lubuntu or Xubuntu. Both work rather well in a VirtualBox. The last Ubuntu that worked lightning fast in a VM was 11.04.
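If it's just VirtualBox's video-memory cap that is holding Unity 3D back, it can usually be raised while the VM is powered off. A rough sketch only - the VM name is a placeholder, and the guest additions still need to be installed inside the VM for 3D to actually work:

    VBoxManage modifyvm "Ubuntu1304" --vram 128           # raise video memory (in MB)
    VBoxManage modifyvm "Ubuntu1304" --accelerate3d on    # let Unity use 3D acceleration
    VBoxManage showvminfo "Ubuntu1304" | grep -i -e vram -e 3D   # confirm the settings took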
I moved on to Crunchbang. It's noticeably faster than even Lubuntu in a VM. I don't think even the older Ubuntus were this fast. I dedicate 32 MB of video memory to Crunchbang and have had no problems.
> It now allows you to switch between windows of the same application on the same desktop which has to be in the running for the most useless UI function ever devised.
Simply asking... WHY on earth would you find this useless?!
Agreed, you wouldn't need it if you just had the ability to select a "don't combine taskbar buttons" option (last time I checked, Unity didn't have this) - the first option I select on Windows 7/8. To me, different windows of the same application are technically different apps/tasks, and I want them to show up separately on the taskbar, or at least to have an easy way of switching between them. The whole "task-based interface" trend is the biggest UI/UX fail in recent history imho, and a misnomer on top of it - applications aren't tasks. If I have two different web apps in two different windows (or simply two different sets of task specific tabs), they are two completely different tasks to me; I don't want them grouped by application goddamit!
Even philosophically: in the "UNIX way", tasks emerge when one combines a number of small applications to get a particular task done, while in the "Mac/Windows way" of big, feature-full applications, a single application can be used to work on multiple tasks simultaneously in different windows. A 1-to-1 mapping of apps to tasks and the "combine windows on taskbar" philosophy only apply to very naive and inexperienced users who don't understand even 10% of the potential of the "tool" sitting in front of them. They should not be catered for; they should be pushed and stressed to learn more whether they like it or not, because they can learn more, and they will inevitably get better at whatever they do by learning more. Even Microsoft gets this (they know that if you are forced to learn more about their products, and by learning more about them you become more productive with them, you will grow more dependent on them, whereas if something is "very easy" but limited, you'll be able to switch to a competing product that gives you even a minor increase in productivity at any time), even if it would be suicide for them to admit it and put it in any marketing material, since they market themselves as "easiest to use" :)
...overall, I feel like Unity simply cherry-picked the worst features from the Windows 7 and Mac OS GUIs. At least the Windows GUI can be very easily customized in all relevant aspects and allows me to be very productive with it, while the Mac OS one is well polished and tested and, after I get used to it, has a nice "flow". If you want to build an "opinionated" GUI, first make sure you are at least as competent as Apple's UI guys; if not, just drop it and make something clean and easily customizable with a Right Click -> Properties on any element, so I don't have to hunt down a control panel that, for Unity, doesn't even have the options I care about!
> if I have two different web apps in two different windows (or simply two different sets of task specific tabs)
I would continue the argument and say that having "tabs" be a concept related to grouping together behaviors of specific applications into self-contained task units was an anti-pattern that we got used to only because window managers failed to innovate, and application-specific tab containers were then a simple fix that could evolve from multiple-document interfaces: there is no good reason why I shouldn't be able to take a Terminal tab and a Chrome tab and have them be part of the same window.
XMonad has a 'Tabbed' layout that does just that. I'm planning a tiling window manager for GNOME 3 that takes the idea further, treating switching to a tab and opening a URI with a UI similar to the Firefox smart bar.
Maybe you should look at i3, which allows arbitrary nesting of tabs and tilings. (I myself use XMonad with tabs grafted onto it, but that's mostly because, as a Haskell fanboy, I can't not use XMonad.)
I have tried a lot of window managers (including i3), but the best one, in my opinion, is still ion3. Now discontinued, but I still use it. I think it is years ahead of other window environments.
The best features of ion3 are in my opinion:
- dynamic tiling with vertical/horizontal splits and tabs
- easily extensible using Lua, including an information panel (I used it to display useful info, e.g. bitcoin prices)
- a query module allowing you to very quickly switch to other windows/workspaces or bring any window to the currently focused region
- the query module can be extended to query different sources, e.g. bookmarks or locate. I can even imagine adding HUD-like abilities to the system
- the query module also supports a vim input mode
- very good command support; one can easily configure keyboard and mouse behaviour, including Emacs-like combos and actions for moving and dragging windows
- can be used effectively with both keyboard and mouse
- this model of dynamic tiling makes it very easy to move/resize windows using the mouse. One can press a key and resize a window with the mouse without trying to hit the few pixels at the edge of the window. A window can be resized instantly and precisely without any mental effort
- both tiling and floating modes supported
- scratchpad support - panels with many application tabs in them can be shown with a single key combination, accessible from everywhere
- session support - one can start the WM with applications already running on preconfigured workspaces
There are probably other features I have forgotten. I am really sad that Tuomo Valkonen (the wm creator) left the project.
There are some other programs that, in my opinion, didn't get the credit they deserved (like Ecco Pro), and then other projects have to painfully reinvent them.
> if I have two different web apps in two different windows (or simply two different sets of task specific tabs), they are two completely different tasks to me, I don't want them grouped by application goddamit!
I don't really use webapps in that way but you can integrate them with Ubuntu directly and then they get their own icon[1]. You can even use system controls to manipulate options within those apps to a certain extent.
I started using Ubuntu because it was "Debian with a nice GUI". I started using Debian again when Ubuntu demonstrated that they don't care about privacy. I was pleasantly surprised to see how far a Debian Wheezy desktop has come. Out of the box, it seems as nice as Ubuntu nowadays.
I did the same thing, but I'm using xfce, default configuration, since both GNOME and KDE have decided to break Alt-Tab. (Alt-tab is really the only fucking thing I want from my fucking window manager. Well, and Alt-F4.) My only gripe about xfce is that I have to open a file-browser window to get it to mount USB pendrives.
They fixed Alt-Tab when I had never before realised it was broken. The new paradigm just makes so much more sense: Alt-Tab switches between applications and Alt-` switches between individual windows of that application.
When I have to use a desktop without this functionality, I am now painfully aware of how cumbersome and slow the old behaviour is when I have plenty of windows open.
Listen, I don't know what you're using your computer for, but I have five terminal windows open. One of them is IRC, one of them is reorganizing my videos, and one of them is running mtr so that when I notice a network problem I can see how long it's been going on and where it is. These windows have nothing to do with each other. It is not beneficial for the mtr window to be lumped in with the IRC window. If I'm switching back and forth between IRC, reorganizing my videos, and my browser, I do not want to deal with mtr. I do not want to deal with the window where I was running the compiler yesterday. I am not interested in the fact that the window where I'm reading a man page is "one application", the window where I'm reading Server Fault is "another application", and the window where I'm looking at a PDF about LVM is "a third application"; I'm interested in what I'm trying to get done, not what program happens to have opened the window that I'm doing it in. If I'm switching back and forth between the LVM PDF and the pvs man page and the shell window where I'm actually running LVM commands, I do not want to have to deal with the other fifteen unrelated terminal windows I may have open. I don't want to have to think about which kind of windows I'm switching between every time I switch windows, especially on this ten-inch netbook, where I run everything fullscreen.
In short, for the way I use my computer, the Alt-`/Alt-Tab dichotomy doesn't make sense. It turns what used to be a zero-effort action — switch back to the previous window — into an error-prone action that is likely to take me to something unrelated to what I'm trying to do. And now it's a zero-effort action again, since I stopped trying to use GNOME or Unity.
How do you use your computer that this makes sense to you? I don't mean to imply that you're insane, since I recognize that people do not all use their computers the same way, but I can't imagine how you're using your computer that enforcing this dichotomy is actually helpful to you.
(Note: I do use tabbed browsing, because, sad to say, Firefox still takes five seconds to open a new window. That's seven billion clock cycles, one clock cycle for every man, woman, and child alive today.)
For what it's worth, GNOME is designed to make this choice easy for you. https://extensions.gnome.org/ hosts a large number of JavaScript extensions which change the functionality of the desktop.
Oh, thanks for that! I don't use Unity -- I'm too prone to RSI to really use anything with a mouse -- and I'm too stingy with my screen real estate (I run xmonad). But either way it's nice to be aware that there are extensions... the result page(s!) for alt-tab was both encouraging and a little scary...
It correctly switches and changes order based on your alt+tab history, and it splits apps on other workspaces off into an end section which switches workspace for you, so 'shift + alt + tab' becomes 'go to the last app I used on another workspace'. Very nice.
I do the same on my netbook, but use urxvt for a tabbed terminal.
That lets me use shift + left/right to move between tabs, ctrl+left/right to shift tab ordering, and ctlr+d to close one.
So that's one desktop pane, then one for chrome, one for emacs, and one for pgadmin. I also make very liberal use of fluxbox's 'sticky' button for dragging a terminal between desktop switches.
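For anyone curious, the tabbing here is just urxvt's bundled 'tabbed' perl extension - nothing exotic, roughly:

    # one-off:
    urxvt -pe tabbed
    # or persistently, via X resources:
    echo 'URxvt.perl-ext-common: default,tabbed' >> ~/.Xresources
    xrdb -merge ~/.Xresources
    # defaults: Shift+Down opens a tab, Shift+Left/Right switch, Ctrl+Left/Right reorder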
I'd just like to point out that using ` for your user interface seems a recipe for problems, as that key jumps all over in different keyboard layouts. In fact, both in Spanish and German layouts it is close to the backspace key, but in different positions.
Yes it is, more or less. This functionality is badly broken (by default) for international users. As the poster above you said, the ` key changes place in different keyboard layouts.
I'm using an ISO/Spanish layout, and the day I discovered that Cmd+` hotkey I just couldn't figure out why the hell that key would be mapped to "switch between windows of the same app". After googling for a while, I found out what the ANSI/US keyboard layout is, and only then did the hotkey make sense. Just for reference, the backtick key is right next to the letter "P" in the ISO/ES layout.
Now the first thing I do when sitting in front of an OS X computer is remap that hotkey to Cmd+º, which is the same physical key in ISO/ES as ` is in ANSI/US. I seriously doubt that most international OS X users know about this hotkey, though. I have a few (non-techie) friends using OS X now, and none of them knew about it...
Wow, I would've thought at least OS X mapped that to "key over tab" rather than literally alt-`... (Which is the key left of backspace with the shift modifier on a Norwegian layout...).
Incidentally, I've several times considered modifying that when writing shell scripts ... on the other hand, it does make one try to use the $(...) over `...` as much as possible, which when it makes sense, makes for easier/safer handling of backspaces etc ... so not every inconvenience is a disadvantage...
> ... I have to open a file-browser window to get it to mount USB pendrives.
Navigate to Applications Menu -> Settings -> Removable Drives and Media; tick checkboxes for "Mount removable drives when hot-plugged" and "Mount removable media when inserted" (and, optionally, "Browse removable media when inserted"). Problem solved?
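If you'd rather script it than click through the dialog, the same switches should be reachable with xfconf-query. This is a sketch only - the channel and property names below are from memory and may differ between Xfce versions, so list them first:

    xfconf-query -c thunar-volman -l                                 # see what's actually exposed
    xfconf-query -c thunar-volman -p /automount-drives/enabled -s true
    xfconf-query -c thunar-volman -p /automount-media/enabled -s true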
> If we obey stubborn users set in their ways, user interfaces would never advance and evolve.
But neither would they catch genetic defects - the mutation rate would just go to 0. To be an advocate for evolution, at least to be one that I'll listen to, you need to point me to strong selection pressures in favour of a desirable outcome. How does this improve information management, how does it improve workflow? That sort of thing. What are you using to vet? And I'm not seeing that with unity.
Some hand-waving about how it's 'largely ineffable' just isn't going to cut it. Let's be honest here: they stuck the launcher, which other OSes allow you to move anyway, on the left-hand side of the screen. I can get all my options on the same side of the screen just by dragging the launcher there. And yet... most people don't choose to keep their launcher on the side of the screen when given the option.
Talking that up as some big advantage is just... yeah. Not buying it. It seems to me like forcing on people a way of using the computer that they've already elected not to use; the capability has been there all along, so if it were superior it ought to have arisen naturally as the way they'd use it.
You are right that the "largely ineffable" feeling needs to be cashed out. But I use that phrase largely because the psychology of UI interactions and usability that would justify my liking of Unity is a new, inexact, and developing science. So, I'm starting with the basic observation that after giving Unity a try, it grew on me, as it did on others, to the point where we actually prefer something new. Part of it is increased simplicity. Another part is more screen space. Another part is that the OS doesn't get in your way (like KDE does). And another part is the tendency to place things of importance in the upper left, which is, in our Western culture, where we tend to start looking for things (as when reading a book).
As far as evolution is concerned, my point is that I often notice, say, when Facebook changes their interface that users complain because they have to learn something new -- they are lazy, but if you force them into something, they may end up preferring it later. It is true that we shouldn't redesign things if they work well. But the way things are designed the first time may not be the best possible way.
The new features in Unity might not be molded by some process like natural selection. But there is some argument to be made that they are. Creating one platform for all devices and levels of users would have enormous advantages, and is a response triggered by changes in the computing environment -- mainly, the widespread use of touch phones and tablets.
Aside from that, all I can say is that Ubuntu does do some user testing, and I hope they are listening to their users. They also have to take some chances and try out new things and see what users respond to. But I agree that their interface changes should respond to real needs, not imagined needs.
Haha, no, but English is becoming the global language, I should have also mentioned. So Western culture is dictating global usability standards - be it good or bad, it's happening.
I'm bored, so I'll bite. Yes, officially, Mandarin is the native language of the vast majority of China's 1.3 billion people. Unofficially, however, at most a large minority in China speak it natively. What is called a "dialect" in China often differs as much from standard Mandarin as, say, Romanian from Latin. Speakers of these dialects who live in large cities will often know enough standard Mandarin to get by, but usually only barely. Outside of mainland China and Taiwan the Chinese diaspora usually speaks either Cantonese or Fujianese languages like Hokkien. These languages are related to Mandarin, but certainly not mutually intelligible with it.
It's also a myth that the writing system is the same across Mandarin, Cantonese, Hokkien and all the other "dialects". While all "Chinese" languages use similar characters, written Chinese does not, in contrast to what is often thought, consist of pictographs. The written Chinese of a Cantonese speaker will therefore differ greatly from the written Chinese of a Mandarin speaker. Sometimes one version will merely seem "somewhat off" to a speaker of a different variety of Chinese, at other times it will make no sense at all.
With that in mind, the number of speakers of Mandarin has to be revised way, way down, to the point that it's no longer a serious contender for global language status. English is still supreme in that regard and will be for the foreseeable future, with Spanish coming in second.
There is a non-zero chance of China becoming a larger economic player. Also, China exports a lot of immigrants to other countries. I don't have any data to back this up, but it does not seem like a totally 'out there' idea.
I notice a tendency with new interfaces that you need to know "secrets" in order to use them effectively. Things that aren't visible in the user interface. E.g., in unity, to open two instances of an application, you need to shift-click the launcher. It seems to me that the more minimalistic the interface tries to become, the more you need to know to be able to use it.
It also seems to take more effort to get what you want. It's easier to pick a program from a menu than to open a search panel, type the name (if you remember it), and find it in the search results.
I use xfce on Ubuntu. My particular dislikes about Unity include the shifting of window toolbars so that the buttons are nowhere near the window they relate to, and the launcher, which doesn't function well either as a launcher (it doesn't have every program I may want to use) or as a way to quickly switch between open windows (without combining mouse clicks with keyboard modifiers).
You can middle-click to open additional instances of applications.
I think it takes longer to find something when one has completely forgotten its name and description, but this isn't a problem 99% of the time, at least for me.
Ubuntu makes much more sense when I'm using my laptop without a mouse available, primarily due to the super+# application switching, whereas I am usually more efficient with Xfce when there's a mouse around; I have both installed.
> you need to know "secrets" in order to use [the interface] effectively
I remember finding out, years after I bought the original Master of Orion, that you could use the F8 key to see which planets have incoming enemy fleets.
It totally changed the game for me, and also made me decide that UIs suck if their "secrets" aren't discoverable just by moving around the UI and looking at tooltips and the like.
The thing that bugs me the most about the moved toolbar buttons is that, with the scrollbars, they didn't feel the need to move them and were instead able to come up with a solution that works and preserves our understanding of the interface.
By all accounts, Unity continues to improve and actually innovate across releases, but the Canonical guys need to fix one simple thing - the out of the box appearance of Ubuntu is frankly hideous. They are doing some very interesting things w.r.t. human interaction, but the design is nearly an immediate turnoff to everybody I've shown it to.
I'm well aware that it's infinitely customizable, but that's not a solution. When the very first action a large fraction of your user base takes after installation is to change the color scheme/theme of your OS, you should go back to the drawing board with your designers. The actual paradigm they present to the user is fantastic, but the color scheme is repugnant.
In this regard, 10.04 was the pinnacle of the appearance of Ubuntu Desktop. Everything since has been a downgrade IMO and that tends to be the consensus on HN as well as other tech portals. Unity is a disaster, as is the forced Amazon search stuff (as you said, opt-out is not a solution).
Ubuntu were the folks whose printer test page consisted of giant filled boxes of colour - apparently whoever created it has never purchased printer ink.
I'm with you - I'm running a VM of 13.04 full screen 1920x1200 on a dual monitor machine. The Ubuntu desktop can fit about 40% of the content that the host MacOS install can in the same space. It's frustrating - I shudder to think what it's like in lower resolutions.
If Ubuntu hadn't abandoned Gnome 2, I might never have switched to XMonad. So thank you, Ubuntu.
I'm also a bit surprised that the author of the article finds Mir to be a good idea. If they think a new display server is needed, they should contribute to Wayland, IMHO. Or at least fork Wayland if there would be irreconcilable differences, but as far as I can tell that's not the actual problem, they just want to do their own thing, which for a display server seems like the worst idea. Display servers are pretty fundamental low-level building blocks of a system, and for that reason one of the few that has resisted fragmentation in the Linux world. (I don't think fragmentation is bad per se, I just think it's bad for display servers.)
I have a Macbook Air and when it first arrived, I thought I'd give Mac OS X a try. One day I lasted! I missed dear XMonad so, so much. Made me appreciate it a whole lot more.
I originally switched because I was so sick of using the mouse to arrange my windows all the god. damn. time.
I decided to give OS X an honest try after Ubuntu started crashing for no apparent reason.
I wouldn't have lasted much either, if a friend hadn't pointed me to a neat little application called BetterTouchTool [1]. It lets you re-arrange windows using keyboard shortcuts.
Ubuntu 13.04 is quite good. A bit buggy, but quite good. What I really want from Ubuntu is to be able to run it on my phone. I want Linux, real Linux, on my pocket computer.
I, too, am anxiously awaiting an Ubuntu that I can run on a mobile device, especially an x86 tablet, so that I can have a tablet to take on client visits and be able to use it for code editing and debugging if needed.
"Every time I came back to Unity I grew to like the interface more"
This is telling. I empathize with the author. Here is what he is saying: I really, really want to like Unity. I'm giving it a huge benefit of the doubt, and yet I simply cannot stomach it. I am compelled to go elsewhere. It seems to be getting better but I still cannot use it.
This entire blog post reads like someone trying desperately to convince themselves of something they don't truly believe.
I think we can separate the design and the performance when evaluating Unity. The design is innovative and gets a lot of things right. That doesn't mean it's the desktop environment of everyone's choice, but it can still be an effective one. Then there is the performance, which has been getting better, and it may catch on.
Sometimes I've been annoyed with Unity, but I do like using it most of the time, and some comments here have shown a similar reaction. A lot of the time, the reasons I don't like using Ubuntu are potentially temporary ones, like bugs or a lack of polish and maturity. But the reasons I do like it aren't as temporary; they are a set of good decisions already made. The other reasons I like it are the software availability, the stability and speed under the environment, using it as a web server, and my own history of knowing how to use it -- all more established pros for Ubuntu. So I can use that, or Gnome, or the Fallback, or any other environment really, but in the switching back and forth it was ultimately Unity that seemed the nicest to be on. I will, however, try out Cinnamon again.
I like using other environments as well, but there are negatives to them all. The ol' Windows 95 panel arranges a user's open app buttons (small icons and window titles) by when the apps were launched. But they should be pinned icons that don't vary their location -- and launchers shouldn't be separate icons from these buttons, that's superfluous. That's my intuition now, especially after using Windows 7, Macs, and Unity. I have my issues with Docky and Cairo, and AWN is dead. So I think Ubuntu gets a lot of things right, adapting some usable alternatives to the old Windows 95 desktop I discovered 15 years ago. It has a good dock, good global search, more screen space. But I like other environments too; I'll give it a fuller exploration when I have the time. The good thing about Unity is that, not being too customizable right now, I don't waste time trying to customize my desktop like I would on KDE.
As a long-time user, I must say they've come very far already. I'm one of those who always appreciated Unity and never got the whole hate thing. I really like the experience on my HP Elitebook 8740w.
But I recently thought about installing Steam, for which I needed to use the proprietary graphics drivers to be able to play its more demanding games.
I thought my entertainment to be just another 'apt-get install' away. I have never been so wrong. Canonical should start handing out medals to people who survived replacing their graphical drivers with those provided by AMD/ATI. It's a hellish experience, taking your heart and mind on a rollercoaster of emotions, only to see the tracks end right in front of you when the installer fails and then get catapulted into the dark void that is a system with a broken graphics driver.
Digging about in the root shell was the only option left. Adding insult to injury, their uninstall scripts usually only work halfway, breaking any further installs because the driver never fully uninstalled. Which pretty much leaves no other option than to start 'forcing' installs and hoping for the best.
And lo and behold, after wrestling with some packages, purging everything that remotely looked like it knew fglrx, I had my desktop back. But what a chore it was.
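For anyone stuck in the same hole, the sequence that finally got my desktop back was roughly the following - the package names are the ones from that era, so double-check against `dpkg -l | grep fglrx` on your own system first:

    sudo sh /usr/share/ati/fglrx-uninstall.sh        # if the AMD installer left its own uninstaller behind
    sudo apt-get remove --purge fglrx fglrx-amdcccle fglrx-dev
    sudo rm -f /etc/X11/xorg.conf                    # the installer's xorg.conf still points at fglrx
    sudo apt-get install --reinstall libgl1-mesa-glx xserver-xorg-core
    sudo dpkg-reconfigure xserver-xorg
    sudo reboot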
So I would really love it if those hardware manufacturers provide a better and less-volatile experience when Mir launches. Perhaps the oft-rumoured SteamBox could accelerate said process even further...
Installing a graphics driver shouldn't be a fight between life and death for a system.
I'm a big advocate for the Linux desktop, but at this point, as long as Canonical has Ubuntu ship off desktop search results to Amazon by default, I would even advocate for users to use other operating systems, even Mac OS X. It's no bastion of freedom-respecting software, but at least it doesn't ship off your desktop search results to a third party for profit.
No longer freedom-respecting? I disagree. It's no longer hardcore-hacker-respecting, but it's still free. Canonical has been walking this fine line between being a serious Linux distro and an OS that has the potential for average people to adopt. At this point it seems they're finding they can't walk that line without some sacrifices, and they seem to be going after the "average person" market in a lot of ways.
As it relates to freedom, Ubuntu is still open source and licensed under a free license. Technically it does qualify as being free software. The Amazon results may not be something a lot of us like (including myself) but anyone is free to remove that functionality from the OS.
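For the record, on 12.10/13.04 the Amazon results come from a separate lens package, so stripping it out is a one-liner; later releases also grew a gsettings privacy switch (key name here is from memory, so treat this as a sketch):

    sudo apt-get remove unity-lens-shopping
    # and/or, on releases that have the privacy toggle:
    gsettings set com.canonical.Unity.Lenses remote-content-search 'none'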
If you choose an OS based on philosophy then I'm not sure that Ubuntu has been on your side for some years now. I don't agree that Ubuntu isn't freedom respecting but I do agree that it probably isn't for you and people like you.
Their (Mint's) modifications are much harder to disable as well. If privacy is a real concern use another distro like debian or arch or ... etc. but don't use Mint which has had questionable browser modifications for a long time.
With products you pay for in advance, you are the customer, so the companies making money off you tend to try to please you. In Apple's case, sometimes they make decisions for the good of the user base, and those can annoy users with specific needs, but at the end of the day spying on their users is counterproductive to being a profitable business.
With ad/search-sponsored products, i.e. Google and now Canonical, the business has two customers: the consumer and the people they are selling ads to. And they aren't even making money off you directly, so it doesn't take much logic to work out where the balance of interests lies.
Not really... pretty much every aspect of OS X is removable, it just isn't as straightforward as Ubuntu and may break portions of the system. You can remove Spotlight and Notification Center with ease, and I doubt it would be difficult to disable recording.
Although I highly doubt Apple would risk the backlash by putting a malicious feature like that in without an option to turn it off.
> Although I highly doubt Apple would risk the backlash by putting a malicious feature like that in without an option to turn it off.
Really? Wasn't Apple the one tracking iPhone users wherever they went? [1]
I get the concern about Ubuntu sending data to Amazon by default (even though it can be disabled), I really do, but going out of your way to recommend something as proprietary as either Apple or Microsoft out of spite? This is what really bothers me about the OSS community. How about helping Mint/Fedora out and recommending them instead? I know some do, but even the OP here suggested that an OS X recommendation seemed more viable than Ubuntu. As much as I would want not to be tracked, at least with open source SOFTWARE we can be more aware of it, or even have a say in it; that's not always so with proprietary stuff! So no, I do not believe either Windows or OS X is better than Ubuntu in this regard.
Not to mention that Canonical isn't a big company, so if they need help bootstrapping by doing some stuff with Amazon for now, then so be it. I would hold off judgement on their actions until they're actually in a position of real financial power, because right now it just seems they had to resort to such ad-placement deals because the generous donations weren't cutting it. It's really not clear whether it's actually that big of a deal right now.
Much of OSX is closed source. The only way you know it's not sending out your information is if you continuously monitor and analyse every packet leaving the machine.
Consider the flip side of the coin. The money from Amazon gives Canonical the ability to employ more developers, improve Ubuntu and add features faster, etc.
At the cost of privacy? No thanks, instead I will gladly donate to Canonical. Sending it to Amazon doesn't help either. Amazon is the Google of shopping. The more data they have, the better they can guess as to what you want. Search for X on Amazon while logged in and the next day you will receive an email recommending similar items. They have also admitted to storing your watching habits. Every play, pause, rewind and fast-forward is logged while watching a prime video. The higher the analytics, the better the recommendation.
> The higher the analytics, the better the recommendation.
Sure, in a concave down horizontal asymptote kind of way. My impression from things like the Netflix prize is that improving predictive accuracy for recommendations is an extremely high-cost, low-yield endeavor. I remember one day being shocked at the quality of Amazon's recommendations for me (since they'd always been crap), and looking to see how they'd picked such winners. The result? All of those products were ones I'd personally added to my old forgotten amazon wishlist.
Well, sqrt(x) has diminishing marginal returns, but it's still unbounded. I'm pretty sure there's a hard limit to how good recommendations can get at all (think more like arctan(x)).
Why is it so terrible that Amazon know your shopping habits? Honest question. I've heard of cases where Amazon knew a woman was pregnant before her family and such things but I can't think of anything that would apply to most people that would be bad and come from Amazon knowing your shopping habits. Heck, even you yourself make a good case for why it's helpful for them to do so.
Sometimes I feel like people are against this as a matter of philosophy, knowing it'd be rare for it to have a negative effect on them. And I'm not saying that's bad or knocking it. I'm just trying to get a sense of why this is so distasteful. I've never seen an Amazon result when I search using the latest Ubuntu, by the way. Then again, I'm only searching for applications or files on my machine. I've also never seen an ad on my desktop. I know these things exist but don't know why I'm not seeing them.
> Why is it so terrible that Amazon know your shopping habits? Honest question... I can't think of anything that would apply to most people that would be bad and come from Amazon knowing your shopping habits.
I think you are putting the onus on the wrong party.
It's easy to imagine a scenario in which Amazon might be nefarious or incompetent with my data, or in which they may share my data with another party who is. However, I don't need to demonstrate that these scenarios are likely in order for me to desire privacy: If I am merely undecided about who I want to share my data with, then the rational thing for me to do is to not share my data.
To my mind, it seems immoral to release a product that shares search data by default and without notice.
> Sometimes I feel like people are against this as a matter of philosophy...
Absolutely. Are "matters of philosophy" a poor way to inform actions?
In my experience, you will enjoy life more if you think along the lines of "what's the worst that could happen?" Otherwise there are too many things to worry about or be unhappy about.
Amazon might even knowingly share some of your shopping habits with some organization you don't want them to, but what's the worst that could happen? Is it worse than the effort you'd put into ensuring that doesn't happen?
If you can't imagine anything bad happening, you lack imagination.
Assume Amazon will lose your data. Assume everyone else (Facebook, Twitter, G+, Flickr) will lose your data or give it up -- or that it is already public.
Assume someone will correlate all that data and sell it as a black-market service that lets anyone track you and see whether you're in or out of your house. Assume your government will put you on a no-fly list if you're buying books on military tactics or left/right-wing political opinion. Assume your government will become oppressive in 15 years and mark all your kids for "special attention" based on your shopping habits today, etc etc...
I'm in exactly the same position, having switched to Ubuntu almost a decade ago.
There is just no way I can stick with a distro that a) abuses my privacy by selling my search strings by default, and without informing me, and b) thinks it is okay to sell advertising real-estate on my desktop in the first place.
For now I've switched to debian on my main machine, and am gradually migrating others.
Heck, even Windows 8 started siphoning local desktop searches to its servers. Sure, in both cases it can be turned off, but it is sad to see all OSes converging to a point where privacy is no longer a top priority.
Why do we see all OS makers as the enemy? I don't think they're trying all that hard to destroy privacy. What I see happening is that the average person doesn't care about their privacy to begin with. So OS makers give the people what they think they want. This isn't necessarily a bad thing. And beyond that, these features can be turned off. There's such a fine line to walk between things like privacy and convenience, security and ease of use, etc. There are no right answers either. These issues all come down to what each user personally prefers and OS makers are catering to the majority of their users. It's up to each one of us to pick the best OS for what we prefer, make sure we know what can and can't be disabled, and make your voice heard when you see something you don't like. There's never any guarantees you'll get what you want but if enough people speak up...
I have tried and failed. The simple bottom bar, with the app menu on one side and the system tray on the other, is an unbeatable desktop paradigm. I don't think the desktop, especially with large screens, should be forced into full screen and one app at a time.
Ubuntu has done commendable work on 13.04. Power consumption and the overall experience are phenomenal. But it is hard to get excited about Ubuntu anymore. I was reading a bug report on the Transmission torrent client bug tracker where they were discussing whether they should support Ubuntu's appindicator, Gnome's message tray, or just the simple system tray. This divide is only going to increase, putting both developers and users in an impossible position. Given that we free-software guys don't give Android a hard time, there is no reason to create noise around Ubuntu.
I used to use Xubuntu, got frustrated with the ways that it hasn't really kept up integration-wise, and tried to switch to straight Ubuntu with 13.04 and... couldn't stand it.
I'm glad I found Mint after that. Cinnamon is by far the best experience I've had with the Linux desktop yet.
I've also switched to Mint with xfce. My laptop got terribly slow somewhere between 11.04 and 12.04 and I just couldn't stand waiting for Unity anymore. The same laptop is blazing fast again.
I haven't tried just using the basic cinnamon packages from an installed Ubuntu system, but I have tried it from an installed Xubuntu system and it definitely doesn't work as well as a full Mint install.
At any rate, they're all Ubuntus in the end, but the whole package of Mint+Cinnamon is what I was referring to.
To design responsive web applications, you need to see the (browser) window's content as it resizes. This is not the default in Ubuntu/Unity; you cannot see a window's contents while you are resizing it. You can enable it, but it's slow and not very useful. The content in a window needs to re-flow fluidly and quickly, especially in the browser.
It's as simple as that for me. Until Ubuntu has hardware-accelerated window resizing, I simply cannot use it as a development platform. Responsive web design isn't going anywhere anytime soon, and responsive apps/websites are a chore to effectively debug on Ubuntu compared to other OSes.
Sounds to me like you have a particular and isolated use case. You describe a web developer's problem rather than a normal use case.
Reflowing to the window as you drag a corner seems computationally expensive and a waste of time IMHO. It can occasionally have its uses. I've always thought the default should be to resize and reflow on resize release.
If people really cared about responsive design and window resizing, they would probably happily forgo tabs and instead use windows. I'd prefer that, but the window manager needs to be competent. (I'm not sure how much lighter tabs are compared with windows, resource-wise?)
In Xfce I'm happy to have the browser not draw the window as I move it. Just outline the window.
You can resize and reflow under Gnome3. (Alt+F8 is the keyboard shortcut.) Can you really not resize windows under Unity?
What version of Ubuntu? The last time I tried was 12.04, and even with a proprietary binary graphics driver properly installed, it was very slow and unusable.
If this is something they've fixed in more recent versions, color me excited to try Ubuntu again.
I'm running 12.04.1 at the moment. On a Thinkpad t60, which came out in 2006 or so. The screen redraws as fast as I can move it. Perhaps this has something to do with the fact that I don't use Unity or any other desktop environment.
I'll have to check these out, but I'm assuming these tools just help you view various anchor/media query points, it's not exactly what I need.
I need 100% fluid reflowing of every single pixel of width, not just where the media queries are. My designs always come out much better when I optimize this way.
I liked Ubuntu and Unity until I found this release to be so unstable on a Thinkpad T410 that I was forced to look for something else. First I tried Xubuntu, but I quickly began to miss many of the niceties of a more featureful desktop environment. Feeling a general dissatisfaction with Ubuntu, I decided to try Arch Linux. It was quite a task to configure a base system with video drivers, networking and other hardware working at a level similar to what you get out of the box with Ubuntu. But in the end I'm extremely pleased. I went with the Gnome 3 desktop because it handled a lot of things for me, such as drive mounting and managing the network. I actually prefer Gnome 3 to Unity. It's more visually pleasing and minimal. The only thing I miss is the HUD for accessing deeply-nested menu items. The best thing about Arch Linux is that most packages are the absolute latest stable versions. You don't have to wait 6 months for an update to the official package. And if there's no official package, the AUR system for building packages from source is better than Ubuntu's PPAs, which require you to trust binaries compiled by 3rd parties.
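For comparison, the whole AUR flow is only a few commands - the package name and URL below are placeholders, since at the time the AUR served plain tarballs:

    curl -O https://aur.archlinux.org/packages/so/somepackage/somepackage.tar.gz
    tar xzf somepackage.tar.gz && cd somepackage
    less PKGBUILD        # it builds from source, so read what you're about to run
    makepkg -si          # -s pulls repo dependencies, -i installs the built package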
Just picked up a refurbed T410 with integrated graphics and installed 12.04 LTS. So far so good, but I'd like to drop an SSD in here, so maybe the distro will change again. But I figured that LTS would be a good place to start. 13.04 did seem a tad buggy.
I'd like to try FreeBSD, but am impatient at the moment, and not sure how good wifi control and hardware support is. Would love to know if anyone's running it on their notebooks.
While it had lots of cosmetic issues that I'm sure could be improved and will improve, what infuriated me was that it seemed to revolve around the assumption that I would only use about five well-known GUI apps. This might be a point of, uh, Unity for Newbies and programmers who "live in the shell", but it completely excludes me, and I think it excludes some portion of Linux users.
Nah, you can have as many apps pinned to the launcher as you like. I have 18, and some of them probably have dozens rather than thousands of users. Plus, you can easily search for applications from the dashboard, though that's a bit laggy.
While this piece seems to be reasonably balanced, well informed and above all honest about the Ubuntu-experience, this one sentence about launcher-support for web-apps stood out as plain wrong:
> These are not only innovative but great ideas that are being offered from the world of Linux, not by Microsoft or Apple, in a new way never seen before the Unity experiment.
The ability to pin web applications to the launcher was really introduced in Windows 7, with the ability for the website to add custom actions or shortcuts to the icon, accessible through a right-click context menu.
Chrome allows you to create application launchers without all the browser chrome, which to the OS look like normal apps. AFAIK this was even based on some third-party app which did the same thing (although with WebKit and Safari) on OS X.
We've also had this ability to add shortcuts to web-apps as "normal" apps on our mobile devices for quite some time.
To me his statement basically seems very uninformed. Or am I missing something crucial here?
Unity is a lot of things, but an "innovative experiment" is not one of them. Unity is more a collection of various ideas and concepts from other modern UIs.
Unity runs terribly on my laptop that runs Windows 7 perfectly fine. It's probably more important in a technical sense that I can just use a different desktop environment, and Cinnamon works brilliantly for me. In terms of a less technical audience though, that won't be an answer and performance woes really should be a priority.
I love Ubuntu. I've been a fan from, let's see, 5.04. It's been a long, sweet experience. But lately there are many areas where I vehemently disagree with them.
I guess the most important issue with Ubuntu nowadays is the one of privacy. With the recent revelations of mass surveillance, exposing desktop searches to a 3rd party by default is a catastrophic decision. I can't understand how the folks at Canonical could accept such a feature. They could have made it opt-in, with flashing popups and shit, asking you to sign up. But making it the default is a nightmarish situation. I used to be able to recommend that non-technical users (like my parents and relatives) install Ubuntu by themselves, arguing there was nothing to change that would need someone with know-how. But not now, with stuff like this. I don't think the major chunk of the user base (home users) would understand the implications of this feature.
Canonical doesn't want me to use Ubuntu, because they won't allow me to put the launcher where I want, which has been a feature of umpteen different window managers for at least the last 10 years. I don't care that it can autohide. That feature has been around for well over a decade. I care that it isn't where I want it to be, and that they won't give me the option to put it where I want it.
I use Ubuntu 12.04 for servers; I don't get why the desktop is that important anymore. 90% of my time these days is spent in a browser and vi; the rest is in VLC and a little OpenOffice, and that's about it.
I just cannot stand Unity. I understand the whole tablet-experience thing, but I'm not using it on a tablet. The first thing I did with Ubuntu 12 was remove Unity and install Cinnamon (http://cinnamon.linuxmint.com/); for people coming from a Windows platform, it's like a nicer Start menu.
Sadly, this article is stuck on GNOME vs Unity, when the base system runs so much better without all that extraneous UI stuff.
I'd rather start off with a minimal install, and build it up as I want, rather than tearing apart someone else's mess, in the quest for my preferred balance of frugality and productivity.
I just tried the latest daily build. For some reason the Unity dash stops working smoothly once I change the resolution from 1024x768 to 1680x1050. Might be because of my dated graphics card (ATI FireGL).
[1] https://bugs.launchpad.net/ubuntu/+source/unity/+bug/1173623
[2] https://bugs.launchpad.net/ubuntu/+source/unity/+bug/1149092