"Hard" is subjective, and part of good UI design is reusing visual cues and affordances that the user has already learned elsewhere. A good UI should be discoverable. Emacs is not that. Nor, for that matter, is FreeCAD.
Discoverability is a huge problem in lots of FOSS interfaces. Most end users will read exactly zero lines of dedicated software documentation in their entire lives because they don’t have to, and that is a good thing for end users. I find most developers don’t even notice the great interfaces in their lives. Compare how clunky online shopping was in the 90s with a modern food ordering terminal. How many people learn to use Slack without consulting the manual? Now how about IRC? Keep in mind that Slack is vastly more complex and, for many things, uses the same command technique and notation as IRC: slash commands, hashtag channels, etc. The designers knew that was a brilliant part of IRC, so they embraced it, knowing advanced users would be right at home and new users had alternatives but would probably come around to the advanced way of doing things eventually.
One of the reasons chefs rarely have anything to do with the cookbooks they write past the initial set of recipes is that it’s really hard to see things from an inexpert perspective. People ask us things like “how long do I cook [something]?” and we often have no idea how to answer that question. Knowing how much that can change depending on the heat source, initial ingredient temperature, how long it’s been unrefrigerated, the water content of the pieces you’ve got, the shape of the pieces, etc etc etc, we just say “uh, until it’s done?” But it takes a lot of skill and experience to recognize when most things you need to cook are done, so recipe developers and cookbook writers do a ton of testing to figure out about how long it takes to get you 80% of the way there and then give some simple ways to approximately gauge doneness in that context. The tempting thought is: if they’d just learn a few simple things that “aren’t that hard,” they’d have precise, bang-on results like I do, every time. But unless you cook the same things all the time, you’d need to repeat that learning across all of the different cooking scenarios that require specialized knowledge. Chefs run into that because people want us to tell them how to cook things all the time, so the skill gap is apparent, and we see the value in someone who knows how to address it. It was never really shown to me like that as a developer, so I see why so many get stuck in the “come on, it’s not that hard” mindset, generally.
Interface design is conceptually harder, because you need to really consider many skill levels that have different needs. The answer isn’t developers reading some article to “make nicer looking interfaces” or “dumb things down,” which will just piss people off in the end, and many of them will be developers assuming it’s an interface designer’s fault. The answer is to deliberately enfranchise designers into the FOSS process to figure out who would benefit from the software and make an interface that can serve everyone’s needs: inexpert and advanced users alike, if that makes sense. You do not have to remove advanced functionality to make it useful to non-developer users.
So the first step is to put aside the dev nerd machismo for a minute and recognize that designers serve a crucial purpose that isn’t “dumbing things down” or “making things look nice” and that most developers have no idea how to do it themselves. Once that’s a thing, figuring out how to enfranchise designers into FOSS will be the next step.
>recognize that designers serve a crucial purpose that isn’t “dumbing things down” or “making things look nice” and that most developers have no idea how to do it themselves.
That's true, but also remember that not all designers are actually competent, or in agreement with your preferences. I'm definitely no MS fan, but look for instance at Windows 7, or better yet, Windows Vista. IMO, Vista was the peak for the Windows UI (forget about the performance problems it had at the time): it was pretty, but pretty easy to use most of the time too, with a highly discoverable interface. Now look at modern GNOME, which its backers tell us is created by "UI experts". It reminds me of Scientologists calling themselves "mental health experts". Calling yourself an "expert designer" doesn't make you one, and even within any professional field, there's frequently wide disagreement. A lot of design is subjective anyway, so you can't just point to one self-appointed "designer's" opinion and use that as gospel. Even back in MS land, somehow the "experts" went from the highly attractive Vista UI to the butt-ugly flat "Metro" UI in Win8, and that's at the very same company!
The problem is that "UI expert" isn't enough of a specifier. Someone who makes an interface aesthetically pleasing is not necessarily someone who is capable of designing according to known HCI laws. There are absolute truths in UI design, and there are things which, when you spot them, are a massive tell that functionality has taken a back seat.
Moving the start menu from the corner, or otherwise wasting corner or edge space, is one of them.
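(For what it's worth, the HCI law people usually have in mind for corner and edge placement is Fitts's law, so take this as a rough sketch rather than a full treatment: the time to acquire a target is roughly T = a + b * log2(D/W + 1), where D is the distance to the target, W is its size along the axis of motion, and a and b are empirically fitted constants. Because the cursor stops at the screen edge, edge and corner targets behave as if W were effectively unbounded, which makes them some of the fastest things to hit on the entire screen. That's why giving that space away, or moving the start menu out of the corner, is such a tell.)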
> That's true, but also remember that not all designers are actually competent,
And not all developers are competent developers, and not all bus drivers are competent bus drivers. That doesn't say anything about the profession itself.
> or in agreement with your preferences.
These things aren't art projects. Using research and testing within your core user bases to remove as much 'preference' as possible is a core tenet of interface design. I see lots of developers scoff at that idea, citing a million popular interfaces that they hate. But look how many UX Researcher (as opposed to UX Designer) positions there are out there, which usually require a data-focused graduate degree: their entire field is devoted to basing design decisions in reality rather than going with the whim of the designer.
Having a working mental model of software changes the way people look at interfaces, and the sorts of interfaces most developers like, most non-developers absolutely hate. So unless the core audience is software developers, an interface probably shouldn't cater to developers' unusual usage style the way FOSS often does. You can design for both. It's a lot more difficult, but if that's your audience, your only other option is to be less useful to some of them. As we can see from adoption, very few end-user-facing FOSS applications get any attention from non-technical users, for that exact reason.
> Now look at modern GNOME, which its backers tell us is created by "UI experts".
Last I checked, GNOME was being designed mostly by one guy who didn't have any formal interface design training, but even if they did get a bunch of experts for a later version, it's not like they have carte blanche to make changes. I outright refuse to contribute design work to FOSS projects because you spend 80% of your time justifying your existence and defending every tiny contribution from a ton of misguided technical design criticism from people who don't realize they don't know the technical concepts in design. It's like a bunch of copy-and-paste WordPress plugin developers 'reviewing' an architectural refactoring by an experienced software engineer based on two minutes of a priori thought experiments and some stuff they read in articles over the years. Good design always comes from feedback, pushback, and iteration, but that doesn't work if most of the people involved have no idea what they're talking about. If you asked me to revamp the GNOME UI, I'd run away as fast as I could.
> People ask us things like “how long do I cook [something]?” and we often have no idea how to answer that question.
Yes. Answers like "until it sounds different" just cause more questions while being the actual answer. How the hell am I supposed to explain "that different sound"? After some time you just start to feel it.
Yeah-- it's genuinely hard. I went to culinary school, where we discussed this topic at length, and one of my classmates (who already had an English degree) is now the managing editor for a household-name recipe website. I worked professionally as a chef, I write professionally now, and I aced a college food writing course taught by a long-time head restaurant reviewer for the Boston Globe, and I still have a really hard time writing recipes for people who don't have my knowledge. At first blush, it feels like a trivial task-- just like making interfaces did before I went back to school to study design-- but it requires a lot of specialized knowledge and experience that I don't have.
I get why developers are perplexed by everyone's insistence that designers take the lead in creating interfaces, but leaving Gimp aside to look at another problematic FOSS UX, consider Mastodon: folks around these parts were proudly and rightfully touting Mastodon, as an interface for ActivityPub, as an amazing piece of software and technological achievement, but incorrectly claiming its UX was polished enough to replace Twitter. When I'd bring up the near impossibility of non-technical users being willing to figure out how federation worked when there are free options, the dismissals were fast and furious: "I explained federation to my [toddler/grandmother/nontechnical coworker, etc]: it's not that complicated," "there's a (ten million word) beginner-friendly onboarding doc that explains it all," "users don't even need to know how federation works if they just pick an instance and sign up." I'll bet a lot of friends of developers did a lot of polite smiling and nodding in those weeks, and Oprah's tweets(!) about not being able to find any of her friends on Mastodon were all the evidence you need to prove that most people's most basic use cases-- connecting with and keeping up with friends over the internet-- aren't easily satisfied by Mastodon's UX.
If my grandparents had decided they were going to branch out from "the Face Book" to try that hot new Mastodon a few years ago during the spike, the likelihood of their progressing through the "beginner friendly" wall of text on the onboarding website would have been about zero. If they did, the first time some mind-bending hentai popped up on their screen, they'd have taken their computer outside and burned it. They would not have taken it as an opportunity to get the prerequisite knowledge they needed to even understand what instance shopping was. My parents would have gotten further, but given how frustrated my engineer father gets with much less confusing ideas because he's used to a whole different sort of technology, I'd say they'd last about 5 days.
You don't get a much simpler task than making a classical French omelet. It's got three ingredients. I can do it in my sleep now-- it all seems very simple. But it was YEARS after culinary school before I got my perfect omelet success rate past like 70%. Being blind to our existing knowledge is natural. That's a good thing when we're working by ourselves, but it's murder when you're figuring out how someone without it approaches the same problem.
I really think these rants (and italics) are totally unnecessary. Most people see beautiful software every day of the week, and value it.
Having said that, no one owes anyone anything. If Emacs wants to be a power tool, let it. I don't expect a construction crane to be safe and easy for me to operate.
I'm curious: what about the italics, specifically, bothers you? Those are important points or modifiers and italics are a way I draw attention to them.
Everyone takes many of the interfaces they use for granted. I certainly do-- I regularly take my phone's mail client for granted despite my first mail client being mutt; it's really, really clean and gives far more features than mutt ever could. (And a phone is one of the few places I would not prefer using vim-style editing.) Having had hundreds of discussions with dozens of developers about interface design, I can assure you that they are no different. It's just that developers often have competing needs in software projects, so it turns into a point of contention. In a regular development organization, the product manager or whoever has the final say. In FOSS projects, the maintainers do. If they're not keenly aware of how much value a good, expertly designed interface brings to a piece of software, they're going to make a worse piece of software, unnecessarily. I've served in both roles professionally and have seen these issues from both sides, many times. An organization controlled by designers would probably be even worse.
So with the amount of shit I've had slung at me when acting as a software designer, I think what I wrote is a pretty measured addition to a conversation that is very important to me. I see most end-user-facing FOSS software getting absolutely no adoption among non-technical users because it sucks for them to use, and I see a lot of developers not entirely sure why that is, coming up with a bunch of folk explanations like "commercial software has advertising" or "their version is dumbed down and looks pretty and end users are dumb and refuse to read docs." I've got important insight into the real reasons why, and as someone who's probably contributed 10k hours of my time to writing FOSS code (admittedly, some of it paid), I think I'm entirely justified in sharing it. If you disagree, I'd be interested to hear why, but even if you're not interested in explaining why, I respect your stance on the matter.
I apologize if I bothered you with my wording or even just with what I was expressing (seriously), but I failed to bring attention to these ideas when I presented them meekly. The Mastodon situation is exactly the sort of thing I was trying to prevent. Not only did Mastodon repel a giant influx of users with its usability problems-- big and small-- it gave many, if not most, of those users the impression that FOSS software is weird, confusing, and exclusively for nerds, and that's the sort of thing that's going to keep open source alternatives alternative instead of having commercial alternatives to the default, free option.
> Everyone takes many of the interfaces they use for granted.
For example, I've seen many people in the HN comments over the years cite the HN interface as evidence that interfaces are better without designers involved, or something similar. In reality, HN has some of the best interface design out there. It does exactly what it needs to do, simply and intuitively; it's much easier to visually orient yourself in and navigate than a paged forum discussion; despite being compact, it's completely obvious where individual elements begin and end, using implied lines, type hierarchy, and gestalt rather than a bunch of boxes and lines; it works great on multiple screen sizes; and it needs nearly no documentation beyond policy docs... it's a brilliant, clean, polished interface. It even keenly takes its audience into account-- many end users would be fatigued by the loooong lines of text when you fully expand the window and generally prefer a limited width for text blocks, but users that look at computer text all day long prefer the flexibility of stretching out the lines horizontally to reduce the vertical footprint and get more on their screen at once. That doesn't just happen-- it was deliberately designed that way. Good interface design is very different from good branding and identity design, or similar-- it should be invisible. The choices should seem natural, or even obvious... but how many sites were set up like HN before HN did it? Compare it to Slashdot, the gold standard when HN got up and running. HN is incomparably cleaner and more focused on its core use case, even though it has fewer features and visible controls, which is usually what developers lament the lack of. HN is a great mix of newsgroup/email-chain visuals worked into a forum format, and I hadn't seen anyone else doing that. Many of the interfaces we use that look obviously designed and function poorly were either made by purely visual designers, or by developers trying to make something look "designed" using inspo from Dribbble or whatever. Beautiful-looking is quite distinct from optimally usable, and interface designers generally concentrate almost entirely on the latter.