This is a fantastic example of how many resources we waste with modern software. Sure it's not packed full of the kind of eye-candy we're used to, but lots of the basics are there, and it's thousands of times smaller than even OSs used on smart phones.
iOS 8, for example, takes up 5 GB; Windows 10 x64 is 11 GB.
It's definitely a very neat project. It kind of reminds me of QNX Neutrino, which also fit on a single floppy disk and had a GUI, web browser, etc. It's also Unix-like, with a microkernel design.
(I don't know the exact size of the OS today, but back in the early 2000s it was still extremely small and looked almost identical to the OSNews screenshots. I want to say it was around 80MB, and came with GCC and a bunch of other goodies.)
I could actually see myself running that instead of a Chromebook or on a server through VNC.
Unfortunately, QNX never really went beyond the embedded world: it's almost exclusively used in car navigation systems and such these days.
Hopefully at some point Menuet will grow as QNX did from its floppy disk days. The tiny tech demos are wonderful, but with a bit more polish they could see real-world production use.
QNX is an excellent example of OS design: minimal TCB, efficient, self-healing to a degree, and it supports popular runtimes/libraries. You don't need to imagine what a consumer version is like: Google a BlackBerry PlayBook demo to see its power. To be accurate, that OS is a combo of QNX and BlackBerry add-ons. Most competing tablets couldn't browse the web and run two games at once without lag. ;)
I had a friend named "tommy" who hated bloated software. He followed guides like this [1] to experimentally delete files off his box. Backup, delete stuff, see if everything works. If it works, take a new incremental backup and repeat. If not, restore and make a note to keep that file. The result over a few years: a WinXP install with Firefox, AV, an office suite, etc. that backed up fully onto a 650MB CD. Mindboggling.
Note: It was a blessing and a curse. Despite his preference for security, he stayed a Windows XP holdover through all its security issues. He wouldn't transition. The reason? He had invested too much time into his XP box to give it up. Plus, looking at the Win7 default install, he figured the process would be less fun next time.
I wonder if it would be possible to do this semi-automatically. Boot it in a VM, check what is actually accessed during boot / a couple programs starting up / running. ("Just" automate this.) Remove the files / parts of files, check that everything still works. If yes, repeat. If no, try binary searching until you get something that does work. Repeat.
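The "binary searching until you get something that works" step is essentially delta debugging over the file set. Here's a rough, hypothetical sketch of that loop; `works()` is a placeholder for "restore this file set into the VM snapshot and run the smoke tests", and the approach only finds an approximate minimum when files have interacting dependencies:

```python
def minimize(candidates, required, works):
    """Return an (approximately minimal) subset of `candidates` that,
    together with `required`, still passes the works() check.

    `works(file_set)` is assumed to rebuild the VM image from `file_set`
    and report whether boot + the chosen programs still succeed.
    """
    if works(required):
        return set()                # none of these candidates are needed
    if len(candidates) == 1:
        return set(candidates)      # this single file is required
    mid = len(candidates) // 2
    left, right = candidates[:mid], candidates[mid:]
    # Which files in `left` are needed, assuming all of `right` stays?
    need_left = minimize(left, required | set(right), works)
    # Which files in `right` are needed, given the kept part of `left`?
    need_right = minimize(right, required | need_left, works)
    return need_left | need_right
```

With a fake `works` that requires, say, a kernel and a libc, this correctly keeps just those two files and drops the rest; a real run would be dominated by the cost of each VM boot, so caching results of `works()` would matter.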
Someone got a Linux image that could run `ls` down to 6.12MB by intercepting file accesses and deleting anything unused... See "How I shrunk a Docker image by 98.8%" (https://news.ycombinator.com/item?id=9438323)
It's an overly simple method but a good start. It will work if the set of dependencies is fixed and everything actually gets loaded during the traced run. I don't know enough about the tool to say whether it handles more dynamic applications.
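The "intercept file accesses" idea boils down to logging every successful open() during a representative workload and keeping only those paths. As a sketch of the log-parsing half (assuming strace-style output, e.g. from `strace -f -e trace=open,openat`):

```python
import re

# Matches the first quoted path argument of open/openat calls in strace
# output, plus the return value so failed opens (fd = -1) can be skipped.
_OPEN_RE = re.compile(r'\bopen(?:at)?\(.*?"([^"]+)".*\)\s*=\s*(-?\d+)')

def accessed_paths(strace_log):
    """Return the set of paths successfully opened during the traced run."""
    paths = set()
    for line in strace_log.splitlines():
        m = _OPEN_RE.search(line)
        if m and int(m.group(2)) >= 0:   # fd >= 0 means the open succeeded
            paths.add(m.group(1))
    return paths
```

Everything on disk but not in that set becomes a deletion candidate, which is exactly where the earlier caveat bites: code paths you didn't exercise (error handlers, rarely used config options) never show up in the trace.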
Nice thinking, but I'm not sure it's a great idea. The reason, as I warned tommy, is that there's plenty of code in there for significant situations that are uncommon or even rare. You'd have to exercise just about every way you'd configure, run, and maintain a binary program to cover all its relevant functionality. You'd also have to hit it with various errors to be sure the handlers live in the libraries you kept. Otherwise, you risk leaving out something critical, and by the time trimming is done it will be hard to figure out what.
For this reason, I thought about using Windows Embedded as it has configuration tools to strip out most unnecessary things while being compatible with whatever you want. That plus stripping guides. Alternatively, stick with tommy's stripping-style method. Either way, you eventually have a set of files you turn into an image with proprietary or open tools along with configuration scripts to make it unique. That has many advantages in addition to size in terms of administration, backup, and even security.
Something to consider is whether HLLs are inherently unable to compete with assembly on efficiency, or whether the common mantra that "the compiler can always do better" is ultimately true. Projects like this certainly call such claims into question.
Put another way: could the same results be achieved with C, or even something much higher-level like Haskell or Lisp, if only compilers generated better code?
I've looked at a lot of disassembled code over the years, and it's extremely easy to tell whether something was generated by a compiler or hand-written by a human; the "texture" is quite different.
Remember that BASIC, Pascal, Modula, Oberon, Ada, and LISP were all used in the past for OSes and system software, on machines with almost no hardware by today's standards. Ada, Java subsets, and Astrobe's Oberon are still used in embedded systems today.
What makes today's software bloated is the cruft built up over time, standardization, security, reliability, a trend toward easier maintenance/productivity over raw speed, and so on. Here's [1] a simple program that got trimmed down to mere bytes. You can see how much overhead the aforementioned items add to C code, which by itself produces very efficient assembly. For people wanting a middle ground, there are High Level Assemblers such as Hyde's HLA [2], and I've speculated we could do something similar with LLVM's bytecode.
"Sure it's not packed full of the kind of eye-candy we're used to"
This is like arguing that a car is a "waste of resources", because look at how much more metal it uses than this little red wagon. Sure, the wagon is missing "eye-candy" like "providing its own motive power" and "carrying passengers", but lots of the basics are there and it has hundreds of times less metal!