This looks like the exact opposite of what I would do if I were to redesign operating systems. The job of an operating system is just to run applications and get out of the way, and most importantly to be deterministic and reliable. I don't need my OS to "predict what I might need", I know what I need myself.
It pretty much embodies everything I hate in modern OSes. I know what I want to do and I know how I want to do it, so get out of my way. Meanwhile, modern OSes can't even handle a simple "maybe don't steal fucking focus from the window I am currently writing in just because some other app decided to open another window."
God, the fact that stealing focus still exists in 2023 is maddening. Please just let me open windows and apps in the background while I continue to work uninterrupted in the main focus window in the foreground. Even web browsers let you open tabs in the background, why haven’t OSes caught up to this yet?
That said, I like the concept of this OS design. It reminds me a little of the theoretical concept behind Star Trek’s LCARS interface, where the interface adapts to the user and the task at hand. As a power user I doubt I’d use this myself, unless it could be installed on top of NixOS, but it might be more intuitive and accessible to a more general user base.
It will be hard to get rid of files and folders though, even for general users. As messy as that can get, it’s a simple abstraction burned into just about every human’s brain at this point.
I have the opposite problem. If I open too many windows, the Windows window manager starts glitching out. When I open an app, it doesn't always come to the front and I have to go alt-tab through dozens of windows to find it. (Sometimes it ends up at the bottom, so alt-shift-tab switches to it... but sometimes it ends up in the middle...)
Also if I open too many windows, opening Windows Explorer takes 1-2 or more seconds. Something about exhausting GDI handles?
How many are we talking about here? I'm curious how many windows is too many for Windows.
I don't often have more than 30 open, and at that point I start thinking it's excessive and that I should probably tidy up to make sure I actually complete a task rather than get distracted.
Asked ChatGPT to write me a script to count the open windows :) I have 164. Although I have quite a bit of RAM left at the moment. The OS getting sluggish is usually what prompts me to close a few open apps. So having lots of RAM is bad for my productivity...
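For the curious, here's a minimal sketch of what such a counter can look like in Python with ctypes (this is my own guess at the approach, not the script ChatGPT actually produced; it counts visible top-level windows that have a title):

    import ctypes
    from ctypes import wintypes

    user32 = ctypes.windll.user32

    # EnumWindows callback signature: BOOL CALLBACK (HWND, LPARAM)
    EnumWindowsProc = ctypes.WINFUNCTYPE(wintypes.BOOL, wintypes.HWND, wintypes.LPARAM)

    def count_open_windows():
        count = 0
        def callback(hwnd, lparam):
            nonlocal count
            # only count visible top-level windows with a non-empty title
            if user32.IsWindowVisible(hwnd) and user32.GetWindowTextLengthW(hwnd) > 0:
                count += 1
            return True  # keep enumerating
        user32.EnumWindows(EnumWindowsProc(callback), 0)
        return count

    print("Open windows:", count_open_windows())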
The annoying part is just that Windows Explorer gets really sluggish -- to a comical degree. For example, I'll do a file operation (copy, rename etc) and it'll take that same Explorer window 5 seconds to register the operation that it itself performed. Meanwhile, Sublime Text open in the background detects it within milliseconds.
What are you doing with them all? Even with browser tabs I usually stop when I have a full tab bar (I have a vertical tabs addon, so it's on the side of the screen).
Why are you still relying on Alt+Tab in 2023? I haven't touched a Windows machine for more than 5 minutes in a long time; isn't there a function to show all windows like in most Linux DEs and macOS?
I just looked it up, there's a thing called Task View (Win+Tab) which is this big scrolling view of open window thumbnails.
Curiously, Alt+Tab itself is very slow (half a second of lag?) but if I kill explorer.exe I get a different Alt+Tab screen which is extremely responsive.
(And Win+Tab is, of course, much slower than even the slow Alt+Tab...)
I wish I could just leave explorer.exe killed, but it makes it a bit harder to do things. Maybe I should do that, to force me to program my own alternatives in Python. (They'd still be faster than what Microsoft made...)
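If you do go down that road, a bare-bones switcher is surprisingly little code. A rough sketch (again ctypes against user32; note that SetForegroundWindow has some OS-imposed restrictions, so treat this as a starting point rather than a finished Alt+Tab replacement):

    import ctypes
    from ctypes import wintypes

    user32 = ctypes.windll.user32
    EnumWindowsProc = ctypes.WINFUNCTYPE(wintypes.BOOL, wintypes.HWND, wintypes.LPARAM)

    def list_windows():
        # return (hwnd, title) for every visible top-level window with a title
        windows = []
        def callback(hwnd, lparam):
            if user32.IsWindowVisible(hwnd):
                length = user32.GetWindowTextLengthW(hwnd)
                if length > 0:
                    buf = ctypes.create_unicode_buffer(length + 1)
                    user32.GetWindowTextW(hwnd, buf, length + 1)
                    windows.append((hwnd, buf.value))
            return True
        user32.EnumWindows(EnumWindowsProc(callback), 0)
        return windows

    windows = list_windows()
    for i, (hwnd, title) in enumerate(windows):
        print(f"{i:3d}  {title}")

    choice = int(input("Switch to #: "))
    hwnd = windows[choice][0]
    user32.ShowWindow(hwnd, 9)        # 9 = SW_RESTORE, un-minimize if needed
    user32.SetForegroundWindow(hwnd)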
I disable that functionality on Gnome. Alt-tabbing to a terminal and alt-tabbing back to the editor is much faster than displaying all windows and clicking on the one I want to go to. Even two or three alt-tabs are faster than that. Furthermore, alt-tab doesn't move the screen. It only raises a window on top of the others. Everything is still. The Gnome way is to move everything on the screen and then move it back to its original place. I don't suffer from motion sickness, but windows should stay put where I placed them.
>God, the fact that stealing focus still exists in 2023 is maddening. Please just let me open windows and apps in the background while I continue to work uninterrupted in the main focus window in the foreground. Even web browsers let you open tabs in the background, why haven’t OSes caught up to this yet?
The i3 tiling WM handles this reasonably enough; I use it pretty much in an "app per workspace" model (with config auto-placing each started app on its designated workspace), and if an app on another workspace wants focus it doesn't get it, it just lights up that workspace and window; there are also config options to explicitly prevent all or some windows from taking focus.
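For anyone who hasn't used i3, a rough sketch of the kind of config I mean (the window class names here are just examples; check yours with xprop):

    # pin each app to its own workspace
    assign [class="Firefox"] 2
    assign [class="Slack"] 9

    # when an app requests attention, only mark its workspace/window urgent
    # instead of switching focus to it
    focus_on_window_activation urgent

    # don't give focus to newly opened windows from this app
    no_focus [class="Slack"]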
> That said, I like the concept of this OS design. It reminds me a little of the theoretical concept behind Star Trek’s LCARS interface, where the interface adapts to the user and the task at hand. As a power user I doubt I’d use this myself, unless it could be installed on top of NixOS, but it might be more intuitive and accessible to a more general user base.
That idea seems to only work where you either build the interface for yourself or build it for a specific, narrow set of tasks.
Focus stealing 100% has a place, just not on app launch and while you're typing. Since switching to Wayland, which doesn't allow focus stealing, I keep thinking apps have frozen when a popup shows up in the background.
Not for me. It's by far the most annoying thing any OS can do, especially since (I think?) every single fucking one of them has failed to figure out that it is NEVER desirable to steal focus from the app the user is currently typing into.
In i3 at least it just highlights the window and workspace if it's not on the currently active workspace, instead of switching focus, so you know the app wants something but you don't get diverted.
The only thing that should be able to steal focus is the app you're currently using opening another window. There is no case where focus stealing by a different app is desirable and couldn't just be a notification.
Totally agree. I'm sick and tired of apps "helping me", constantly interrupting with tips, "did you know..." popups, rearranging my windows and so on. I want to get stuff done, not start a journey and relationship with new code. And nobody knows better than the user what stuff they wanna do.
I think this is the best argument against "telemetry" in apps. If there is no way to measure people's reactions to such annoying prompts, there would be no way for a product manager to justify it by saying "metric X went up by Y% when we showed this popup".
It's like the corporate OS developers think they know better and need to constantly remind you how pathetic and insignificant your existence would be were it not for their omnipresent magnificence.
It's like a perpetual chase after the "new, inexperienced user", at the cost of people who have been using the OS for the last 5-10-20 years, with no actual benefit to anyone using it seriously to do work.
In KDE Plasma, the KWin window manager lets you configure "focus stealing prevention", as well as activation and raising of windows on click or on hover, in a fairly precise way.
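For people who prefer editing config files over the GUI, the same knobs live in kwinrc; a sketch from memory (double-check the exact keys against your Plasma version):

    [Windows]
    # 0 = no prevention ... 4 = extreme
    FocusStealingPreventionLevel=3
    # ClickToFocus, FocusFollowsMouse, FocusUnderMouse or FocusStrictlyUnderMouse
    FocusPolicy=FocusFollowsMouse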
Still, that is fairly far from the OS layer. Microsoft and Apple sell the "OS" as a bundle of everything (a window manager, file explorer, etc.) that you cannot really configure much, but that is a problem with their products.
How does it embody stealing focus when this design goal is specifically about focus?
"Focused.
The clutter we take for granted in today’s operating systems can be overwhelming, especially for folks sensitive to stimulation. Mercury is respectful of limited bandwidths and attention spans."
I think the entire concept of "non-techies" is condescending to the people you're referring to. Computers exist, the cat is out of the bag -- computer literacy is now simply a facet of literacy in general.
I would much prefer a world where we empower people to actually learn about the tools they use, rather than one in which we jump through hoops in order to try and design around predictions about what we think "they" want when, in reality, it's like trying to hit a billion different targets with a single dart.
We should be reaching for designs that give tools to and enable people to achieve that which they can think up with their own free will and imagination, not predictive trash that more often than not simply results in annoying the user with something that is, at best, close but not close enough to what they want to actually be useful, leaving them with the frustration of not being able to do anything about it.
That thinking is why Linux failed as an OS for the desktop. The UI part is harder than the hardware part, because it deals with diverse, non-logical and unpredictable meatware instead of simple and predictable silicon.
You've shared quite a number of misconceptions there (not least of all your definition of "failed" -- Linux is used by millions of consumers every day so if that's a "failure" then you have some rather unrealistic expectations of what is considered a success. But ultimately "Linux" isn't an OS anyways so it isn't really fair comparing it as one).
> That thinking is why
What kind of thinking? Acknowledging that writing an OS is a massive undertaking? I have written hobby OS kernels so I'm not speaking hypothetically here. I wasn't suggesting that the UI isn't important either (which is what I believe you were alluding to with your "that thinking" snub), just that the UI is literally just the surface layer in a much deeper stack of technologies -- all of which is important.
> The UI part is harder than the hardware part
Writing device drivers is much harder. The failure modes are much more extreme. No amount of UI polish will improve the user experience if your OS frequently trashes your file system because it doesn't handle buggy storage devices correctly. No amount of UI polish will improve the UX if your platform frequently crashes, or even just doesn't support your hardware at all. And to compound matters, the vast majority of hardware isn't documented either.
So yeah, UI is important but your OS isn't going to win any fans if you don't nail the stacks beneath too.
> [UI] deals with diverse, non-logical and unpredictable meatware, instead of simple and predictable silicon.
Hardware is often anything but predictable. The fact that it seems that way is a testament to just how well written most operating systems are.
Having lived through 20+ “the year of the Linux desktop”s I can tell you the goal was clearly to become the dominant desktop OS that everyone used.
About that last part… the OP was talking about people, not actual hardware; you seem to have missed that. Hardware is in fact predictable, unless it’s failing or just that poorly designed.
> Having lived through 20+ “the year of the Linux desktop”s I can tell you the goal was clearly to become the dominant desktop OS that everyone used
Again, Linux isn't an operating system like Windows, macOS or even FreeBSD. Even the broader definition of Linux has it as multiple different collections of individuals sharing different groups of packages. What one team might aim to do with Linux is very different to what another team might want to do. And "The year of the Linux desktop" wasn't a universal goal by any means. Case in point: I've been using Linux (as well as several flavors of BSD, Macs, Windows and a bunch of other niche platforms that aren't household names) for decades now and I've never once believed Linux should be the dominant desktop. I'm more than happy for it to be 3rd place to Windows and macOS just so long as it retains the flexibility that draws me to use Linux. For me, the appeal of Linux is freedom of choice -- which is the polar opposite of the ideals that a platform would need to hold if it were aiming for dominance.
So saying "Linux failed" isn't really a fair comment. A more accurate comment might be that Canonical failed to overtake Windows, but you still need to be careful about the term "failed" given that millions of regular users would be viewed as successful by most people's standards -- even if it didn't succeed at some impossible ideal of a desktop monopoly.
> About that last part… the OP was talking about people, not actual hardware; you seem to have missed that.
No, I got that. It's you who missed the following part:
> Hardware is in fact predictable, unless it’s failing or just that poorly designed.
Don't underestimate just how much hardware out there is buggy, doesn't follow specifications correctly, is old and failing, or just isn't used correctly by users (eg people yanking USB storage devices without safely unmounting them first). The reality is that hardware can be very unpredictable, yet operating systems still need to handle that gracefully.
The market is flooded with mechanical HDDs from reputable manufacturers which don't follow specifications correctly, because those devices can fake higher throughput by sending successful write messages back to the OS even while the drives are still caching those writes. Or cheap flash storage that fails often. And hot-pluggable devices in the form of USB and Thunderbolt have only exacerbated the problem, because now you have devices that can be connected and disconnected without any warning.
Then you have the problems that power saving introduces. Your OS now has to power hardware on and off gracefully, even hardware that was never designed to be connected and disconnected on the fly (otherwise your OS is borderline useless on any laptop).
...and all of this is without even considering external conditions (ie the physical nature of hardware -- the reason it's called "hardware"). From mechanical failures, hardware getting old, dusty, dropped etc. Through to unlikely but still real world problems like "cosmic bit-flips".
Now imagine trying to implement all of this via reverse engineering - because device manufacturers are only going to support Windows, maybe macOS and, if you're lucky, Linux. And imagine trying to implement that for hundreds of different hardware types, getting each one stable. Even just testing across that range of hardware is a difficult enough problem on its own.
There's a reason BeOS-clone Haiku supports FreeBSD network drivers, SkyOS's user land was eventually ported to Linux, and Linux (for a time at least) supported Windows WiFi drivers. It isn't because developers are lazy -- it's because this is a fscking hard problem to solve. And let's be clear, using another OS's driver model isn't an easy thing to implement in itself.
Frankly put: the fact that you think hardware is easy and predictable is proof of the success of Linux (and NT, Darwin, BSD, etc).
I wasn't making any comment about the predictability of humans. However there have been plenty of studies that have proven humans are indeed predictable. If we weren't then dark UI patterns for email sign ups, cookie consent pop ups, and so on wouldn't work. The reason UI can be tuned for "evil" is precisely because of our predictability. But this is a psychological point and thus tangential from the discussion :)
To come back on topic. I wasn't saying hardware is unpredictable per se -- just that it often behaves unpredictably. And I say that because there are a set of expectations which, in reality, hardware doesn't always follow.
However the predictability of humans vs hardware is a somewhat moot point, because that's only part of the story for why writing hardware-interfacing code is harder than human interfaces. I did cover a few of those other reasons too, but you seem fixated on the predictability of the interfaces.
"Operating system" is a frequent term in the news. I don't know where you are from.
My point is that you produced dozens of paragraphs about terminology but added zero value to the actual discussion and all that while totally understanding what others actually meant.
> “Operating system" is a frequent term in the news. I don't know where you are from.
Tech news, sure. But not in average people's news. Or at the very most, only on special tech segments. Either way, definitely not frequent.
> My point is that you produced dozens of paragraphs about terminology but added zero value to the actual discussion
I was discussing the complexities of writing hardware drivers after it was stated that it’s easier than UIs.
Where’s the value in your comments here? You are adding nothing, just arguing for the sake of arguing.
> all that while totally understanding what others actually meant.
“Misunderstanding” I assume you meant (given the context of your comments). Either way, and as I said to the other commenter (though I’m not convinced you two aren’t shill accounts for the same person given your writing styles are the same), I did understand. I just disagreed and cited some experience I had to back up my points of view.
I don’t really understand your problem here. This is a tech forum and we are talking about operating system development. Yet you’re begrudging me having a technical conversation about operating system development.
I didn't understand how to interact with anything using Mercury OS. One of the first things I'll do today is open a terminal, git pull, edit some files in my editor, start a Vagrant VM with Django, and check the result. Nothing of what I saw on this site gives me an idea of how to do it.
That may be why I thought "this might be cool as something a video game character interacts with on a tablet in a futuristic-themed RPG, like for the menu screen or something."
I tried it, but it is just not there yet. My use case is switching between 3D CAD design and printing apps (browser, slicer, notes) and coding (JetBrains IDEs, terminal, browser, draw.io).
But in the spirit of the parent article, it would be nicer if the OS knew I wanted to do 3D CAD printing and then organized my windows with all the CAD/3D-printing-related modules. Let it create my perfect workflow, because it gets my intent.
And when I switch from design into the production stage, it should make the slicer module more prominent. The CAD design after that point is usually just tweaks.
+1 to this, it's crazy to lose all (or most) contexts across a reboot or when switching between machines.
Each app has some sync capabilities (e.g. PowerPoint, web browsers, Google Docs, Chrome tabs, Firefox tabs), but there is still no universal solution.
macOS is actually pretty good about this. It restores applications on restart, and its provided applications pick up where they left off.
I don’t use the workspace feature on the Mac, I assume they’re recovered as well.
Obviously applications need to be restart-aware as well. I really like how the provided Mac apps (Pages, Numbers, etc.) work with the first-class document model in the system. I have dozens of Untitled documents across apps, some are years old. Never “saved” them. They just exist. Across reboots, app upgrades, and OS upgrades.
I wish more apps embraced my lazy housekeeping.
I don’t know how the documents sync across devices, if at all.
I wanted to mimic the OS document model in Java with my own app, but that’s easier said than done.
Unfortunately not. After reboots, which are annoyingly frequent, macOS dumps all my carefully organized windows and their layout into a messy pile on the first/primary space, and I have to spend a significant amount of time getting everything back to the state I want. This is also a huge pain in the ass when moving between the external and internal monitor, where my windows and their sizes end up not how I want them, every single time.
I presume you're referring to KDE Activities. I use Activities all the time, but they're still far too much of a static thing, and way too High Ceremony to create/curate/maintain.
I'm sure it's "well-designed" from a theoretical perspective, and unimpeachable from the perspective of Professional UI/UX Development, but it's so inflexible and non-extensible it comes off as something between a toy and a jail cell. "You will enjoy being productive in this environment focused on channeling productivity to Approved Ends. You will enjoy using Approved Technology in Approved Ways. You will ignore everything beneath the interface, for what the Approved Technology is doing is Approved, and therefore not for you to interfere with."
> This looks like the exact opposite of what I would do if I were to redesign operating systems
Me too, but to be fair, we would probably call this a desktop environment, not an OS. It's a shell that could run on top of Windows, macOS or Linux. Data could be stored in a standard filesystem or a database, managed by the shell, and presented to users as "content and actions [...] fluidly assembled based on your intentions"