Not Smart Is Not Stupid (tedunangst.com)
158 points by tbirdz on April 15, 2016 | 83 comments



I am in favour of software that tries to make your life easier, but I hate, with a passion, developers who are so in love with their helpful feature X that they ram it down your throat (You WILL use this feature and LIKE it!) and give you no easy way, or often no way at all, to bypass or disable said "helpful"/"smart" feature.

Case in point: I recently bought an excellent large(ish) Dell monitor. Lovely bit of kit, but after a few days had me wanting to throw it out of the window. The reason was the power saving mode. The monitor is supposed to have the ability to "intelligently" understand you are not using it, switch the screen off and go into a sleep mode. All good and well until one discovers that the specific combination of graphics card, multiple monitor setup and video connectors one is using confuses the poor beast and it will happily shut your monitor down while you are actively using it.

The kicker that was driving me crazy was, of course, there is no way to disable this particular functionality!


Another monitor example is adaptive backlight (or whatever it's called), which might work well with slowly changing scenes, but when I open a bright menu on a dark terminal, the screen seems to pulsate. All hell breaks loose when I'm in a dark virtual console with a single flashing cursor. And just like with your misfeature, this one is permanently on as well.


Personally I really hate YouTube's "annotations", which made me think about when a feature is a good one. My rule of sorts is: if this was disabled by default, would you turn it on?

A surprising number of times the answer is "No, I would not only not turn it on, I would also not miss it".


Annotations and autoplay. 2 features that I literally never want. It might make sense when watching a series of videos, maybe. But 9 times out of 10, it's "here's some other random video that has nothing to do with this one that you watched." Why would I want that?


Welcome to pure data-driven product development. Data shows that more people use our feature if we make it hard for them not to use our feature.


While I understand not wanting a feature "rammed" down your throat, there are some cases where using and understanding a feature is one of the largest factors in being an expert user. In this case, the program is trying to get you to do the "right" thing, and you should probably just comply.

One example of this is Photoshop. The single biggest differentiator between a Photoshop bad-ass and a noob is a deep understanding and copious use of layers. Because of this, Photoshop tries to ram layers down your throat at every opportunity, and that is the correct thing to do in my opinion. When I first upgraded to the version of Photoshop that went all-in on layers, it infuriated me because I felt like I had to do more work to achieve my goals; now I get infuriated when I try to use an image manipulation program without rich layering functionality.


Photoshop uses layers to cover over a lot of missing functionality. Ideally you should be able to undo any individual change while keeping all others and change the order or location of your changes.

Instead they cram layers down your throat, which does work, but it's still a compromise.


> Photoshop uses layers to cover over a lot of missing functionality.

In the scenarios above, both Photoshop and Dell are wrong, but they're in a market position to get away with being crappy to the user.


Could you elaborate on the use case that confuses this feature?

I used to plug my notebook into a big monitor on the side, and then I'd switch some things to one or the other. For example, the "external" monitor had better colour rendition, and so it was the prime candidate for photo editing, while the "internal" monitor was usually closer to me, and better suited for reading, or opening terminals.

Would this kind of use trigger your "bug" and make the Dell monitor turn off? Can you think of any use case where the feature works well? Some of us are asking in order to avoid said beast in future purchases, and while you do have a point, if the use case works well for the user then this feature wouldn't be a deal breaker (and for other use cases, like yours, it definitely is).


> developers who are so in love with their helpful feature X that they ram it down your throat

There is a maturity problem in our industry. As in, the developers and product managers are a bit on the emotionally immature side, and have less ability to put themselves in another person's shoes, when that other person has different goals.

I also use the Netflix streaming service with a Roku. As part of their ongoing campaign to convince me this is a mistake, the Netflix app auto updates itself from time to time.

This feature shuts off the screen without the user asking for it to shut off. What could possibly go wrong?


So, you RMA'd the monitor I hope?


Which model? I'm in the market for a couple of new displays, and if they've come out with a lemon, I'd like to avoid it.


I have a similar problem with a monitor right now. It's rare, only happens on startup, and is fixed by a 30 second reset with cables unplugged. So I still have the monitor for now.


Dell screens all across our office do this sort of crap.


Yep -- proud owner of a DAC + speaker combo that decides to turn itself off because the "signal volume is too low" and then never turn itself back on again until I cycle the power. Oh yeah, and "volume too low" == "volume is at a level respectful to people in adjacent rooms"...


Please don't tell me it's the 24" 4k monitors, I was about to order a couple.


Nope, Dell U2415, although I would check whether this permanent power-save "feature" is on any new Dell monitor, because when it goes wrong, like it did in my case, it's very frustrating.


> They really double downed on making the list feature a pain by autoplaying every movie upon selection. Now I can’t add or remove a movie without playing the beginning. Fortunately, I don’t have a data cap at home, but it still causes my receiver to switch audio modes, leading to annoying clicks and pops as it settles in. Not appreciated. My thumb was already on the OK button. If I really did want to watch the movie, pressing it twice isn’t so hard. In exchange for convenience nobody could possibly need, I’m forced to deal with aggravation I can’t avoid.

Boy isn't this annoying in Netflix too? I notice if you go to look at a show's information more closely (selecting it from the main screen), it automatically starts playing the first episode. I don't want that! I can manage that, really! Thank you!

I think we're hitting a critical mass of UX, where we're trying to dumbly pre-suppose behavior. There's ways to do this correctly (via machine learning), but blanketing behaviors for all users is just downright stupid.


> There's ways to do this correctly (via machine learning)

Please no. The actual users using an account may change over time (and rather frequently), e.g. in a family situation.

Also, in general, I like my systems to be predictable, and I prefer if they don't try to be smart.


> > There's ways to do this correctly (via machine learning)

> Please no.

Indeed, that's the second coming of Office 2000's "adaptive menus". It was an awful idea then, it's an awful idea now.

Even if the account has a single unchanging user and the feature actually works correctly and it doesn't impair initial discoverability, changes in software behaviour will break muscle memory and at least annoy.


Gmail does the Windows 2000 "adaptive menu" by hiding the folders you don't use often behind an expando.

I hate it. Hate hate hate hate hate it.

Naturally, there appears to be no way to override and say, "just show ALL my folders ALL the time! If I didn't want to see a folder, I'd delete it!"

But I don't expect a lot from Google, which seemingly has zero UX experts, and I'm not even 100% sure its products are developed by human beings. (See: Buzz, Wave.)


Also, for reasons I don't understand, when I search Google, the links to "images, maps, shopping" etc. are in a completely random order. They will literally switch between searches. WHY?


There isn't 1 button for that, but you can either drag the separator or just set all folders to "Show".


Fair enough. I'm not partial to anything, but UX designers gotta eat.


...by justifying themselves to their bosses by annoying users.


> Boy isn't this annoying in Netflix too? I notice if you go to look at a show's information more closely (selecting it from the main screen), it automatically starts playing the first episode. I don't want that! I can manage that, really! Thank you!

This is a horrible feature. I was a Netflix subscriber for years (an early adopter in the mail-order DVD days), and after weeks of fiddling and trying to disable this mis-feature, I finally unsubscribed because of it.


Microsoft tried that with the Ribbon; it turns out that having the machine change behaviour/move stuff around based on how much it is used is also a bad idea.


Microsoft tried that before the ribbon (with adaptive menus dynamically showing/hiding items based on use).

The ribbon is not fundamentally adaptive. IIRC the initial Office 2007 implementation was neither adaptive nor customisable, although it was partially contextual (some ribbon tabs would only appear when they made sense, e.g. formatting when selecting text or cells).


On Netflix it depends on where you click. I think the info symbol is in the bottom right corner.


The user should not need to learn that (on the internet) - it should be amazingly obvious. "Start Playing" and "More Information" are clear. "My Show" (it starts playing) is not obvious considering we have been using Netflix for years, clicking on a show or movie for more information.


I don't know what platform you're on, but on the Xbox app there's no info button, you select the show and it takes you to the info page and also begins playing the episode.


So then there was a gratuitous UI inconsistency across platforms? Management problem.


I assumed the usual browser. Sorry.


> If we naively squish the curve, as if by pressing down with a finger, we end up with a bimodal distribution, where the task is now easy for one group of users and impossible for another. This is not an improvement.

This resonates with me. Sadly, I think this happens all too often.


Darn, I feel this way often - I guess this makes me an old curmudgeon, but I find Spotify on mobile, and Snapchat, and many other apps, very confusing and hard to use.


In your defense, the Spotify app is a mess, and it seems like they're adjusting their own UI often enough to suggest that they don't quite have usability down yet. Don't get me started on how often it hangs or crashes, or how the timing is eerily close to when it tries to transition into/out of ads...

Snapchat just has a bad habit of introducing a feature you need to perform a gesture for and only telling you once and nowhere else.

I wouldn't blame it on your age; I just don't think a lot of these popular apps have particularly good design, but they get by because the service is solid and hits "good enough" for your average (usually non-technical) user.


I think one of the failures is that UX is designed by people who are comfortable with the abstract. We (usually) know what is happening in the background and can figure out how action A leads to result B, but most people are just not good at this stuff. For them, the UX has to be way more concrete, since they just do not have a clear picture of what is going on.

This also leads to some tech anxiety, an example of which was an acquaintance not long ago being unable to save a missed call to her contacts list on a Samsung/Android phone and worrying that she was somehow too dumb to use the (very costly) "smartphone". Just a huge collective failure of the "smartest" people around to account for other people who are very different.


I'm a colourblind, left-handed and hard of hearing interaction designer. Oddly enough, this is almost a benefit in this context, because I have a much easier time noticing "intuitive" redesigns that fuck up accessibility in favour of the latest graphic design fad.

More generally, any UX design that does not involve user testing (ideally with both new and experienced users) inevitably leads to the designers missing things. If there's one field where co-design is crucial to decent results, it's IxD.

https://en.wikipedia.org/wiki/Participatory_design


I bring up these issues in every design session I'm involved in, but people just don't seem to get it. How can you make people care about colorblindness, etc?


8% of males are colourblind [1]. If my maths is right, that's roughly the population of the USA. A group worth worrying about, I'd have thought.

[1] http://www.colourblindawareness.org


We had almost that exact discussion at work yesterday. The question from one of my colleagues in customer service was: how is the customer supposed to know if the product is in stock when they're on the checkout page?

Apparently our UI designer decided to indicate that with red and green dots on the order line. If he had thought about color blind users, normal users wouldn't have issues either.


> If he had thought about color blind users, normal users wouldn't have issues either.

That's what you meant, right?

Anyway, that's exactly the argument I always bring up: if you design for the colour blind, the deaf, don't assume right-handedness, etc., and you do it well, the interface will end up more user-friendly for everyone.

In your example, adding a hint based on shape/position/lightness (or all three even) as well as a colour is easier to read for everyone. Similarly, using some version of Cubehelix[0] is the more readable option for heatmap scales, and again not just for the colourblind but for everyone.

[0] http://www.ifweassume.com/2013/05/cubehelix-or-how-i-learned...
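
To make that concrete, here's a minimal matplotlib sketch (my own illustration, not from the linked post; the toy data array is made up) comparing a hue-cycling colormap with matplotlib's built-in cubehelix colormap:

    import numpy as np
    import matplotlib.pyplot as plt

    data = np.random.rand(20, 20)  # toy heatmap data

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4))

    # 'jet' cycles through hues without a monotonic change in lightness,
    # so the low-to-high ordering is lost for many colourblind viewers
    # and in greyscale prints.
    ax1.imshow(data, cmap='jet')
    ax1.set_title('jet')

    # 'cubehelix' rotates through hues while lightness increases
    # monotonically, so the ordering survives both colour blindness
    # and greyscale.
    ax2.imshow(data, cmap='cubehelix')
    ax2.set_title('cubehelix')

    plt.show()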


> That's what you meant, right?

Yes, exactly.


Law or strong corporate policy tantamount to a law. They're too small a market otherwise.


Well, in my experience designing my app (an app store for renting expensive software), on both the UI and the programming side (my cofounder is an expert architect, but it was hard for him to be in every standup as it was a side project for him), most programmers are too centred on the internal workings. It's hard for them to see the picture from far away, as a user. So after talking at length about the behaviour of the app in certain situations, I still found that some UX pathways that were obvious to me were not obvious to them, because they were thinking in terms of internal functions and the database. This is not a critique; programming is hard, and while learning to code myself I've learned how different the mindset needed for programming is from anything else. Also, UI and UX are very hard, harder than I would have guessed. Especially hard is keeping an app light in steps and options, removing all the unnecessary guessing from users while keeping a familiar structure. I think trying to refine UX is the first time I found myself tired of actually "thinking".


Right, it's up to the designers to communicate a useful mental-model for the user, and to provide whatever conventions/affordances help them find the controls to manipulate that model.


> A minor time saving feature designed to spare users the indignity of [...] now meant they were unable to browse the web at all. That’s some serious time savings!

Although this was meant ironically, there's a lot of truth in it.


That link to the Xerox copier that changed digits in your copy gave me chills. It's like the dark dystopian hacker future, brought to us today by incompetence.



This tendency to show suggestions more prominently than other, more useful, UI is most likely because this enables Netflix, Amazon, et al. to make more money, i.e., payola.


It's not payola if you're paying yourself for placement. That's just a placement decision.

On a tangential note, I've never understood why payola was supposed to be a bad thing.


Bought and paid for "rankings" are noise, not useful information.

Payola is a feedback loop that rewards cash, not free consumer choice, and is more or less guaranteed to enable consumer-hostile monopolies.

At the very best you'll get a race to the bottom like the one that happened in online advertising.


So use the rankings that you approve of. Music rankings aren't, and cannot be, subject to American law, and they exist in infinite variety.

However, I'm most curious about the consumer-hostility that you envision resulting from payola. What is the harm that consumers will suffer?


Um, because it's corruption, and in the USA's case organized crime was involved.


The same argument proves that labor unions should be illegal too.

Grocery stores accept money from food producers for placement within the store. As far as I can see, this is precisely equivalent to payola. Nobody thinks there's a problem there.


Actually, lots of people do think that's a problem. Especially since the food producers paying for placement are usually placing unhealthy things.


And bullying their suppliers is a well-known problem.


Because the airwaves are public and regulated by the FCC.


I don't follow you. So what?

If the airwaves weren't regulated by the FCC, why would payola suddenly stop being a bad thing?


Paying someone to promote your work isn't bad on its own. People do it all the time, and no laws are broken.

The reason it is bad over public airwaves is because the airwaves are communal property. The spectrum is licensed by the FCC so that all of society can benefit from the limited resource. The FCC therefore sets fair usage rules for the use of those airwaves, and one of the rules is that you can't pay to get airplay.


You can pay to get airplay. That's the entire business model of radio stations. We call it "commercials".

Assuming you're not in the business of producing music, you can even pay to have your stuff included in public broadcasts as part of the main programming rather than the commercials. This is called "product placement". (If your stuff was music, it would be "payola".)

You have yet to advance any argument that payola is a bad thing; you've limited yourself to "it's against the rules".


This is the new design paradigm that I've been seeing for a long time: Make a minimal UI, and take away the "advanced" features (/me glares angrily at chromium and FF). There should always be a way to revert settings you don't want. Sure, make the user type "I understand that I'm disabling $important_feature, and the developer thinks that's a really bad idea, but I need to anyway", but at least have the option.


That mention of VMware sent little bubbles of rage burbling up through me. On a MacBook Pro, VMware forces use of the discrete GPU if available. This destroys any hope of using the virtual machine without your computer plugged in, and worse, some programs I need (Gazebo) don't play nice with Nvidia cards (on Ubuntu at least). I could fix this with some hacky solution (you have to do some weird stuff involving gfxcardstatus), or I could just execute my script over and over and hope that the powers that be smile on me, and this won't be the time it decides to shit the bed permanently. Oh, and also, wtf Gazebo? Is the idea of an Nvidia card so shocking you suddenly have a stroke 5 out of 6 times?

Sorry about the rant. I realized I'm going to have to work with gazebo again soon and preemptive frustration is already building.


See Jevons' Paradox: https://en.wikipedia.org/wiki/Jevons_paradox, A.K.A. Wirth's Law, Gates' Law, Page's Law, or May's Law.

Fun fact: there's a non-linear relationship between the growth of compute speed and the growth of network speed. Network speeds do increase, but they are not doing so as quickly as compute and storage.

I have observed that interns at corporations are usually tasked with things nobody wants to do or thinks can't be done. I wonder if there is a correlation between Wirth's Law and interns (the process of learning).


Excuse my ignorance, but is the second "not" in the title left associative or right?


I read it as "(not smart) is (not stupid)" and thought the article was going to be about how intelligence is overrated.


I found the title to be difficult and unintuitive to interpret. After starting to read it, I figured this was the intended meaning.


I agree with that premise as well.


    !smart != stupid


Now that you mention it, I don't know. I originally read it as left associative because otherwise it becomes somewhat of a blanket statement.
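
To spell out the two readings in the same pseudocode as the snippet above (my gloss, not from the article):

    (not smart) != stupid          # second "not" binds left, to "is": the article's reading
    (not smart) == (not stupid)    # second "not" binds right, to "stupid": the blanket claim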


What caused you to see that: concerns for other English dialects, internationalization/translation, or something else?

I'm pretty ignorant, but I enjoy trying to misinterpret my own text, which is often easy.

It was either Kernighan or Ritchie who mentioned that every few years they reread Strunk & White's Elements of Style to increase their ability to be understood and to understand others.


I didn't want to, but I'll admit: I have been spotting ambiguities a lot more since taking NLP classes :) This one actually bothered me enough to ask.


What do you mean by left or right associative, and which would mean which: `!= stupid` or `= !stupid`?


Overall a good article. I especially hate features that are aggravating and that I can't turn off. Auto-play is somewhere high on my list, although I think it's there to get more ads shown rather than for convenience.

The fun part, though, is that similar arguments could've been made against the tech behind his "plastic circles" or the logistics of receiving them in the mail. ;)


> If we naively squish the curve, as if by pressing down with a finger, we end up with a bimodal distribution, where the task is now easy for one group of users and impossible for another.

How do you convince people not to be naive though?


> Apple screwed up Universal Links something fierce

One note that I'm not sure how many others hit - iPhones have been especially effective at DDoSing websites for apple-app-site-association.


Many of these "smart" devices are actually very dumb, are of bad quality, and actually harm the end user's privacy. Some negative examples from recent news: "smart meters", "smart thermostats", "smart refrigerators", "smart TVs".

On the other hand, devices and products that are not advertised as "smart" are often better and actually use very sophisticated algorithms.


And, by transposition, Stupid is Smart!


Why do all the comments use language indicating that people are forced to use software features they don't like? Are people truly so weak willed that the mere existence of something causes enslavement to it? There is always the option to walk away. In light of that fact, all this whining is just unseemly.


Because in many of the examples people are, in fact, literally forced to use software features they don't like. The ability to turn off broken/misguided features only exists if the developer included a special option for that (rare) or if the software is open source (also rare).


A wise person would take the lesson and ignore the whining. You can just ignore whining, right?


The bugs that arise in most software I use daily seem like the result of feature bloat. I post some idiosyncrasies I stumble on to my Twitter feed (@shkesar) as a reminder that I shouldn't follow the same misdirected belief.


Features and bugs is a much better situation than no features and no bugs, which seems to have been the unfortunate trend in recent years. Post-minimalism can't come soon enough.



