I am in favour of software that tries to make your life easier, but hate, with a passion, developers who are so in love with their helpful feature X that they ram it down your throat (You WILL use this feature and LIKE it!) and give you no easy way, or often no way at all, to bypass or disable said "helpful"/"smart" feature.
Case in point: I recently bought an excellent large(ish) Dell monitor. Lovely bit of kit, but after a few days had me wanting to throw it out of the window. The reason was the power saving mode. The monitor is supposed to have the ability to "intelligently" understand you are not using it, switch the screen off and go into a sleep mode. All good and well until one discovers that the specific combination of graphics card, multiple monitor setup and video connectors one is using confuses the poor beast and it will happily shut your monitor down while you are actively using it.
The kicker that was driving me crazy was, of course, there is no way to disable this particular functionality!
Another monitor example is adaptive backlight (or whatever it's called), which might work well with slowly changing scenes, but when I open a bright menu on a dark terminal, the screen seems to pulsate. All hell breaks loose when I'm in a dark virtual console with a single flashing cursor. And just like with your misfeature, this one is permanently on as well.
Personally I really hate YouTube's "annotations", which made me think about when a feature is a good one. My rule of sorts is: if this was disabled by default, would you turn it on?
A surprising number of times the answer is "No, I would not only not turn it on, I would also not miss it".
Annotations and autoplay: two features that I literally never want. Autoplay might make sense when watching a series of videos, maybe. But 9 times out of 10, it's "here's some other random video that has nothing to do with this one that you watched." Why would I want that?
While I understand not wanting a feature "rammed" down your throat, there are some cases where using and understanding a feature is one of the largest factors in being an expert user. In this case, the program is trying to get you to do the "right" thing, and you should probably just comply.
One example of this is Photoshop. The single biggest differentiator between a Photoshop bad-ass and a noob is a deep understanding and copious use of layers. Because of this, Photoshop tries to ram layers down your throat at every opportunity, and that is the correct thing to do in my opinion. When I first upgraded to the version of Photoshop that went all-in on layers, it infuriated me because I felt like I had to do more work to achieve my goals; now I get infuriated when I try to use an image manipulation program without rich layering functionality.
Photoshop uses layers to cover over a lot of missing functionality. Ideally you should be able to undo any individual change while keeping all others and change the order or location of your changes.
Instead they cram layers down your throat which does work, but it's still a compromise.
Could you elaborate on the use case that confuses this feature?
I used to plug my notebook into a big monitor on the side, and then I'd switch some things to one or the other. For example, the "external" monitor had better colour rendition, and so it was the prime candidate for photo editing, while the "internal" monitor was usually closer to me, and better suited for reading, or opening terminals.
Would this kind of use trigger your "bug" and make the Dell monitor turn off? Can you think of any use case where the feature works well? Some were asking in order to avoid said beast in future/near purchases, and while you do have a point, if the use case works well for the user then this feature wouldn't be a deal breaker (and for other use cases, like yours, it definitely is).
developers that are so in love with their helpful feature X, that they ram it down your throat
There is a maturity problem in our industry. As in, the developers and product managers are a bit on the emotionally immature side, and have less ability to put themselves in another person's shoes when that other person has different goals.
I also use the Netflix streaming service with a Roku. As part of their ongoing campaign to convince me this is a mistake, the Netflix app auto updates itself from time to time.
This feature shuts off the screen without the user asking for it to shut off. What could possibly go wrong?
I have a similar problem with a monitor right now. It's rare, only happens on startup, and is fixed by a 30 second reset with cables unplugged. So I still have the monitor for now.
Yep: proud owner of a DAC + speaker combo that decides to turn itself off because the "signal volume is too low" and then never turns itself back on again until I cycle the power. Oh yeah, and "volume too low" == "volume is at a level respectful to people in adjacent rooms"...
Nope, Dell U2415, although I would check whether this permanent powersave "feature" is on any new Dell monitor, because if it goes wrong, like it did in my case, it is very frustrating.