
That's because this OS is a speculative puff piece by a UI designer looking to drum up some business.



I don't know why you people are so touchy about somebody's attempt at redesigning an OS.

Here's the thing: You are not the target audience of this OS and you are by no means required to use it.

In my opinion, this is a solid attempt at creating an operating system for non-techies. Way fewer concepts to learn, way easier interaction.


I think the entire concept of "non-techies" is condescending to the people you're referring to. Computers exist, the cat is out of the bag -- computer literacy is now simply a facet of literacy in general.

I would much prefer a world where we empower people to actually learn about the tools they use, rather than one in which we jump through hoops in order to try and design around predictions about what we think "they" want when, in reality, it's like trying to hit a billion different targets with a single dart.

We should be reaching for designs that give people tools and enable them to achieve whatever they can think up with their own free will and imagination, not predictive trash that more often than not just annoys the user with something that is, at best, close to what they want but not close enough to actually be useful, leaving them with the frustration of not being able to do anything about it.


> I think the entire concept of "non-techies" is condescending to the people you're referring to

It's not, stop being offended on behalf of others. "non-xxx" is as neutral as it can be to refer to a population.

And it does make sense to give different end products to techies and non-techies, because the needs are wildly different.


They’re not redesigning an OS. They are describing a theoretical UI.

This is about as close to designing an operating system as drawing a front cover is to writing a book.


That thinking is why Linux failed as a desktop OS. The UI part is harder than the hardware part, because it deals with a diverse, non-logical and unpredictable piece of meatware, instead of simple and predictable silicon.


You've shared quite a number of misconceptions there (not least of all your definition of "failed" -- Linux is used by millions of consumers every day, so if that's a "failure" then you have some rather unrealistic expectations of what counts as a success). But ultimately "Linux" isn't an OS anyway, so it isn't really fair to compare it as one.

> That thinking is why

What kind of thinking? Acknowledging that writing an OS is a massive undertaking? I have written hobby OS kernels so I'm not speaking hypothetically here. I wasn't suggesting that the UI isn't important either (which is what I believe you were alluding to with your "that thinking" snub), just that the UI is literally just the surface layer in a much deeper stack of technologies -- all of which is important.

> The UI part is harder than the hardware part

Writing device drivers is much harder. The failure modes are much more extreme. No amount of UI polish will improve the user experience if your OS frequently trashes your file system because it doesn't handle buggy storage devices correctly. No amount of UI polish will improve the UX if your platform frequently crashes, or even just doesn't support your hardware at all. And to compound matters, the vast majority of hardware isn't documented either.

So yeah, UI is important but your OS isn't going to win any fans if you don't nail the stacks beneath too.

> [the UI] deals with a diverse, non-logical and unpredictable piece of meatware, instead of simple and predictable silicon.

Hardware is often anything but predictable. The fact that it seems that way is a testament to just how well written most operating systems are.


It definitely failed.

Having lived through 20+ “the year of the Linux desktop”s I can tell you the goal was clearly to become the dominant desktop OS that everyone used.

About that last part… the OP was talking about people, not actual hardware. You seem to have missed that. Hardware is in fact predictable, unless it’s failing or just that poorly designed.


> Having lived through 20+ “the year of the Linux desktop”s I can tell you the goal was clearly to become the dominant desktop OS that everyone used

Again, Linux isn't an operating system like Windows, macOS or even FreeBSD. Even the broader definition of Linux has it as multiple collections of different individuals sharing different groups of packages. What one team might aim to do with Linux is very different to what another team might want to do.

And "the year of the Linux desktop" wasn't a universal goal by any means. Case in point: I've been using Linux (as well as several flavors of BSD, Macs, Windows and a bunch of other niche platforms that aren't household names) for decades now and I've never once believed Linux should be the dominant desktop. I'm more than happy for it to be 3rd place to Windows and macOS just so long as it retains the flexibility that draws me to use Linux. For me, the appeal of Linux is freedom of choice -- which is the polar opposite of the ideals a platform would need to hold if it were aiming for dominance.

So saying "Linux failed" isn't really a fair comment. A more accurate comment might be that Canonical failed to overtake Windows but you still need to be careful about the term "failed" given that millions of regular users would be viewed as successful by most peoples standards -- even if it didn't succeed at some impossible ideology of a desktop monopoly.

> About that last part… the OP was talking about people, not actual hardware. You seem to have missed that.

No, I got that. It's you who missed the following part:

> Hardware is in fact predictable, unless it’s failing or just that poorly designed.

Don't underestimate just how much hardware out there is buggy, doesn't follow specifications correctly, is old and failing, or just isn't used correctly by users (e.g. people yanking USB storage devices without safely unmounting them first). The reality is that hardware can be very unpredictable, yet operating systems still need to handle that gracefully.

The market is flooded with mechanical HDDs from reputable manufacturers which don't follow specifications correctly, because those devices can fake higher throughput by sending successful write messages back to the OS even while the drives are still caching those writes. Or cheap flash storage that fails often. And hot-pluggable devices in the form of USB and Thunderbolt have only exacerbated the problem, because now you have devices that can be connected and disconnected without any warning.
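
To make that concrete, here's a minimal C sketch (POSIX; the file name is hypothetical) of the gap between "the OS accepted my write" and "my data is actually on disk":

    /* Sketch: a successful write() does not mean durable data. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void) {
        /* "journal.log" is a made-up example file. */
        int fd = open("journal.log", O_WRONLY | O_CREAT | O_APPEND, 0644);
        if (fd < 0) { perror("open"); return 1; }

        const char *rec = "commit\n";
        /* Success here only means the kernel buffered the data in its
           page cache; nothing has necessarily reached the device yet. */
        if (write(fd, rec, strlen(rec)) < 0) { perror("write"); return 1; }

        /* fsync() asks the kernel to push the data to the device and
           tell the device to flush its own cache. A drive that ACKs
           writes still sitting in volatile cache defeats even this,
           which is exactly the misbehaviour the OS has to survive. */
        if (fsync(fd) < 0) { perror("fsync"); return 1; }

        close(fd);
        return 0;
    }

Filesystems add journals and write barriers precisely because that last flush can't always be trusted.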

Then you have the problems that power saving introduces. Now your OS has hardware that was never designed to be connected and disconnected, yet it still needs to power that hardware on and off gracefully (otherwise your OS is borderline useless on any laptop).

...and all of this is without even considering external conditions (i.e. the physical nature of hardware -- the reason it's called "hardware"): from mechanical failures, hardware getting old, dusty or dropped, through to unlikely but still real-world problems like "cosmic bit-flips".

Now imagine trying to implement all of this via reverse engineering - because device manufacturers are only going to support Windows, maybe macOS and, if you're lucky, Linux. And imagine trying to implement that for hundreds of different hardware types, getting each one stable. Even just testing across that range of hardware is a difficult enough problem on its own.

There's a reason BeOS-clone Haiku supports FreeBSD network drivers, SkyOS's userland was eventually ported to Linux, and Linux (for a time at least) supported Windows WiFi drivers. It isn't because developers are lazy -- it's because this is a fscking hard problem to solve. And let's be clear, using another OS's driver model isn't an easy thing to implement in itself.

Frankly put: the fact that you think hardware is easy and predictable is proof of the success of Linux (and NT, Darwin, BSD, etc).


Today I learned that humans are predictable and consistent, but hardware is not....


I wasn't making any comment about the predictability of humans. However, there have been plenty of studies that have proven humans are indeed predictable. If we weren't, then dark UI patterns for email sign-ups, cookie consent pop-ups, and so on wouldn't work. The reason UI can be tuned for "evil" is precisely because of our predictability. But this is a psychological point and thus tangential to the discussion :)

To come back on topic. I wasn't saying hardware is unpredictable per se -- just that it often behaves unpredictably. And I say that because there are a set of expectations which, in reality, hardware doesn't always follow.

However, the predictability of humans vs hardware is a somewhat moot point, because that's only part of the story for why writing hardware-interfacing code is harder than human interfaces. I did cover a few of those other reasons too, but you seem fixated on the predictability of the interfaces.


This is yet another thing that non-techies sincerely don't care about. For them, it is the OS.

Now you can walk around telling every grandma that they should not refer to it as the OS, but you are unlikely to succeed in that mission. Accept it...


Non-techies wouldn't even have heard the term OS before, let alone know whether the post above is an OS or not.

But we are technical people talking about OS design. So I don’t really understand your point.


"Operating system" is a frequent term in the news. I don't know where you are from.

My point is that you produced dozens of paragraphs about terminology but added zero value to the actual discussion and all that while totally understanding what others actually meant.


> “Operating system" is a frequent term in the news. I don't know where you are from.

Tech news, sure. But not in average people's news. Or at the very most, only on special tech segments. Either way, definitely not frequent.

> My point is that you produced dozens of paragraphs about terminology but added zero value to the actual discussion

I was discussing the complexities of writing hardware drivers after it was stated that it’s easier than UIs.

Where’s the value in your comments here? You are adding nothing, just arguing for the sake of arguing.

> all that while totally understanding what others actually meant.

“Misunderstanding” I assume you meant (given the context of your comments). Either way, and as I said to the other commenter (though I’m not convinced you two aren’t shill accounts for the same person given your writing styles are the same), I did understand. I just disagreed and cited some experience I had to back up my points of view.

I don’t really understand your problem here. This is a tech forum and we are talking about operating system development. Yet you’re begrudging me having a technical conversation about operating system development.


I didn't understand how to interact with anything using Mercury OS. One of the first things I'll do today will be opening a terminal, running git pull, editing some files in my editor, starting a Vagrant VM with Django, and checking the result. Nothing I saw on this site gives me any idea of how to do that.


Not sure about the non-techies part. He wants to replace icons with a command line interface and voice recognition.


That may be why I thought "this might be cool as something a video game character interacts with on a tablet in a futuristic-themed RPG, like for the menu screen or something."



