I recently removed all animations from my Android phone and the zero latency blows me away; it's like it instantly got way faster. Wish I could do that for the web.
Most of the sins of the modern web can be easily undone
What has happened to desktop UIs since the 90s has been tragic. We've made some minor specific advances in the last 20 years, but dropped the ball on some really fundamental concepts. It hurts whenever I have to slog through a shitload of monochromatic nondescript icons with no tooltips until I finally find what I need five minutes later. Mobile phones are slathered in useless animations that make using them feel like molasses, and this is starting to show up in desktop software too. Finding software that has an information density level appropriate for somebody over the age of 12 is becoming pretty hard.
Most of this could be easily undone, but it won't be. Rest assured that whatever is going to cause you the most pain is what will happen.
The lack of chrome is such a frustrating design trend. In Windows 10 in particular I find it so difficult to find the... well, windows, because everything just kind of blends together. There's something to be said about the workaday utility of how things used to be. They weren't necessarily beautiful in a conventional sense, but they were so much more intuitive.
The real problem is the lack of configurability. You could change the window border width to anywhere from invisible to some insanely huge number on the older versions of Windows, and everyone would be happy. Now you're forced to use whatever some stupid "designer"'s idea is.
The funny thing is back in Windows 7 and earlier I would tweak my theme to have the thinnest borders possible yet I still didn’t have the whole “where is my window” issue. All those customization options went away starting with Windows 8 when some UI designers decided they knew better than me how my windows should look.
Thanks for those links - I’ll play around with those options to see if I can restore some sanity to my desktop.
It turns the "just shoving windows around and resizing them" from something normal into an act which only a surgeon could feel comfortable with. The mouse is not a scalpel, it's a tool, and if I need to hit a 1px wide border edge (or even worse the corner) in order to resize a window, then it's a terribly bad UI.
I loved SGI IRIX's brutally bold borders, but those of Windows 7 are still just as good.
Truth is, we had knobs and switches, then typewriters, which naturally transitioned to the command line; then we had window managers, which are like a desk(top) covered in papers. Then we had... well, that's all we have. I guess the next thing is AR/VR and 3D UIs, but until then...
A mobile phone or tablet UI I would say is like a spiral bound pocket notebook, as compared to the proper desktop metaphor of full laptop/desktop computers.
It’s like trying to reinvent the book. It _can_ be enhanced, but most of the time it doesn’t change the overall reading experience.
Animations on mobile are not always useless. They act as a feedback mechanism on a medium like the touch screen, where there is none otherwise. They can also work to compensate for delays: the perception of delay can be reduced by showing an animation while doing another task in the background.
There are better feedback mechanisms. Like, flash the button if it's being touched. At no point should an animation make the task take longer than it would otherwise.
How about just not doing any of that and making the task itself go faster?
I have a problem with this "animations hiding delays" pattern, because it's straight-up dishonest and disrespectful. If the computer has to take time, let me know; if it doesn't, just give me the results immediately. Stop running what would be a piece of background feedback about the state of my machine through a low-pass filter!
Adding animations to software tends to make it slower (a good potential for a self-fulfilling prophecy here). Often enough, the animations are of fixed "developer's best estimate" length, and can take longer than the task itself, slowing the user down.
The whole idea behind "reducing perception of delay" seems to be borne out of paternalistic and exploitative thinking, "how to make the user feel better about our app, despite its shortcomings". Trying to make the user like you more than they should. Instead of that, why not focus on ensuring the user is maximally effective in getting maximum value out of the application - that they're not confused, and not waiting for the machine?
For the web, if you have the right accessibility settings in your operating system or browser, then the prefers-reduced-motion media query should tell sites not to show you animations.
Not many developers explicitly add support for this, but it's included in e.g. Bootstrap.
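If a site's animations are driven from script rather than CSS, the same media feature is readable through matchMedia. A minimal TypeScript sketch, assuming a hypothetical movePanel helper (not from any library):

    // Read the user's OS-level reduced-motion preference from script.
    const reducedMotion = window.matchMedia("(prefers-reduced-motion: reduce)");

    // Hypothetical helper: reposition a panel, animating only if allowed.
    function movePanel(panel: HTMLElement, x: number): void {
      if (reducedMotion.matches) {
        panel.style.transform = `translateX(${x}px)`; // jump straight there
      } else {
        panel.animate([{ transform: `translateX(${x}px)` }], {
          duration: 150,
          fill: "forwards",
        });
      }
    }

    // The preference can change while the page is open (e.g. the user
    // flips the OS setting), so listen for changes instead of reading once.
    reducedMotion.addEventListener("change", (e) =>
      console.log("reduced motion:", e.matches)
    );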
Oh how I wish I could do this on macOS. Yes, there is a "reduce motion" setting, but there is no remove-motion setting, and a whole bunch of animations remain: most notably minimizing/maximizing windows (I can at least change from the default Genie effect to Scale, which is faster), applications still "zooming" into the screen when opened, etc.
I would happily pay for a third party app to nuke every single damn animation if I could but Apple make it impossible (or at least that is what is claimed).
And of course this doesn't even touch on how laggy resizing many applications on macOS is (yes even on the shiny new M1 models). The Microsoft Office suite, Chrome and a whole bunch of other applications are horrible when resizing. It isn't often an issue with Apple's applications for what that's worth.
Oh I know every single tweak out there, sadly many no longer work in Big Sur. But even before that you have not been able to fully disable every animation in macOS for as long as I can remember.
Windows XP, and perhaps newer versions of Windows, included a setting to only redraw a window after the user has finished moving/resizing it. You would instead see an outline of the new window position/size. The same applied to (un)minimizing. I don't know of such a setting in macOS.
Even in Windows 10 you can still go to the "old style" (Windows 2000 days) settings screen and adjust for "best performance" which disables close to every UI enhancement Windows supports. Drop shadows on windows and desktop icons, drawing windows on resize (as you mentioned), taskbar animations, scrollbar animations and even small visual effects such as the subtle animations when you mouse over a control in a window.
Obviously not many people wish to disable everything, but having the option to do so is great, and something I wish macOS would let me do, as there is way too much motion in macOS. Sure, it was cute 15 years ago, when the idea of a beautiful and fluid UI was impressive, but in 2021 they annoy me or, worse, make the whole system look crap when, for some reason, the animations lag.
If your OS has a "reduce motion" option (I know windows and mac do), many websites are starting to obey this as an accessibility requirement.
I actually noticed this when I accidentally disabled animations on a few websites I visit by disabling them at the OS level. Unfortunately the website animations actually made things easier to use, so I had to turn the painful OS animations back on to get the website animations back.
...in a browser that supports user stylesheets, which unfortunately means only Firefox and IE by default. It's possible to work around that with extensions, however.
Thanks for this post. Just turned off all animations on my android phone and you weren't kidding, the difference is pretty incredible.
For anyone who doesn't know, the setting is in developer options, on Android 9 at least. There are several animation scale settings; they can be increased, decreased, or turned off. Turning them all off removed the animations.
Careful, some apps seem to rely on those animations. Uber, I've found, has issues displaying the location of the vehicle if I set them all to 0.0. No problem if I set them to 0.5.
"Battery Saver" mode also automatically disables some of the animations, so it would be interesting to see if you got a boost from removing the animations without enabling Battery Saver.
Even though we're eons ahead in tech, especially mobile, the 90s seem to have been the peak for a lot of ideas that we've slowly iterated on ever since. Not many leaps since then, if any, outside of mobile.
Endless browser popups just take a new form nowadays: cookie setting bullshit, websites asking for permissions to shit they don't need, newsletter signup popups, subscription popups, ads that relocate all the content you're trying to look at, ads that obscure the content until you tell them to fuck off...
This is a personal pet peeve that drives me nuts. Shifting content is irritating and just makes me want to avoid the site. The thing is, ads don’t HAVE to shift content - the site/app designer can account for these kinds of things with various techniques such as giving the ad area a fixed height so that when it loads it’s only filling in that dead space.
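A sketch of that reservation in TypeScript; the "ad-slot" id and the 250px height are made-up examples, not anything standard:

    // Reserve the ad slot's space before the ad loads, so a late-arriving
    // creative can't push the surrounding content around.
    function reserveAdSlot(id: string, heightPx: number): void {
      const slot = document.getElementById(id);
      if (!slot) return;
      slot.style.minHeight = `${heightPx}px`; // space held even while empty
      slot.style.overflow = "hidden"; // an oversized ad can't spill out
    }

    reserveAdSlot("ad-slot", 250); // hypothetical slot and banner height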
Ads that obscure content deserve a special place in hell.
The 90s were a time of ideas in tech. Not all of them were good, but it does seem like there were a lot more of them. Now it just feels like rearranging the same furniture in the same room over and over.
I think "being ahead in tech" can mean two things: we've refined and improved iteratively in things like reliability, distributed computing, and processor tech, but I don't think we've really had a real novelty since the rise of the iPhone.
That's what I meant. The feeling is that there was this Cambrian explosion of ideas that only then became possible to materialize, because the tech had finally caught up to them. Right now I can't think of much, if any, tech that's outside the realm of possibility with what we have. Take VR as an example: it's still not there today, yet it was already there in the 90s. Crude, with various approaches, but there.
Since then, we've had the iPhone/mobile and, recently, ML. I can't think of other seeming leaps. I might be wrong, of course, since it's all based on a feeling.
I agree, although I don't know that ML is actually a "leap": my general impression is that a certain subset of "AI" techniques became practical because of GPU advances, which led to an explosion of practical applications: the technique itself isn't particularly new, afaict.
Props to this dude for actually getting most of the details right. Though I don't think the font is correct.
A lot of folks try to do 'retro' but they screw it up by getting key details wrong... maybe the simple bezels have the wrong light angle, maybe the colors are too high a bit depth, and so on. A lot of games mix pixel sizes or use free rotation without quantizing it to the pixel grid (especially annoying!)
If you're going to take the retro route, don't cheap out on the details. Do it all the way or don't bother.
As much as I love the authenticity the site would probably be a lot less readable/usable if it exactly simulated the rendering of the original bitmap MS Sans Serif. Old bitmap font glyphs were manually drawn per-size and were generally expected to be viewed on a CRT where the inherent characteristics of the display gave a kind of ClearType-esque blurring that would be expected by font designers.
MS Sans Serif was replaced with an OpenType version of itself (now known as "Microsoft Sans Serif") in Win2k but then you have the copyright issue of redistributing it: https://en.wikipedia.org/wiki/Microsoft_Sans_Serif
For what it's worth it does match the original bitmap exactly if anti-aliasing is off and the font size is 11px. I found this out just a week or so back; I just did a similar project emulating Windows 9x.
> A lot of games mix pixel sizes or use free rotation without quantizing it to the pixel grid
So happy I am not the only one with this pet peeve. Never understood why so-called retro games don't simply render to a higher-resolution backbuffer and then scale it down to a low-res one.
I agree with you on that one. The only 'retro' styled game that I've played that used low res pixel art and high res rendering to its advantage was Hotline Miami. Every other one has irritated me.
> If you're going to take the retro route, don't cheap out on the details. Do it all the way or don't bother.
That's quite passionate of you. I like how the OP added their own things. It's quite creative. For example, Paint has multiple themes, rendering of the undo history as a GIF, and multi-language support. I don't remember Windows 98's Paint having any of that, and it's all awesome.
That's cool. I did a bit more poking around and found a few more inaccuracies, but they are not really visible from the surface so it's cool. (Namely... there's an Access Denied dialog box that uses forward slashes when it should be using backslashes for a path)
When I do a "retro style" game, I draw everything to a reduced size bitmap and then scale that up (without interpolation) when I render to the screen for nice chonky pixels.
Of course then I'll probably get complaints about the palette size, or the lack of per-scanline sprite limitations...!
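If you were doing the same thing in a browser, it would look roughly like this TypeScript/canvas sketch; the 320x180 logical size and 4x scale factor are illustrative, not anything I'd insist on:

    // Draw the scene at a low logical resolution on an offscreen canvas,
    // then blit it up to the visible canvas with smoothing disabled so
    // the pixels stay crisp instead of getting interpolated into mush.
    const LOW_W = 320, LOW_H = 180, SCALE = 4;

    const low = document.createElement("canvas");
    low.width = LOW_W;
    low.height = LOW_H;
    const lowCtx = low.getContext("2d")!;

    const screen = document.createElement("canvas");
    screen.width = LOW_W * SCALE;
    screen.height = LOW_H * SCALE;
    document.body.appendChild(screen);
    const screenCtx = screen.getContext("2d")!;

    function frame(): void {
      // ...game drawing goes here, at 320x180; a flat fill as a stand-in:
      lowCtx.fillStyle = "#223";
      lowCtx.fillRect(0, 0, LOW_W, LOW_H);

      screenCtx.imageSmoothingEnabled = false; // no interpolation on upscale
      screenCtx.drawImage(low, 0, 0, screen.width, screen.height);
      requestAnimationFrame(frame);
    }
    requestAnimationFrame(frame);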
It crashed for me. The images weren’t loading on a blog post, so I chose “turn off content blockers” on my mobile Safari browser, and the page reloaded showing only raw source code. Turning content blockers back on didn’t fix it. Revisiting the site now gives the same result. Very weird! I was half way through an article, too. Maybe I should reboot my phone.
LOL. You know, I was just expecting that to happen. But then, everything was running so smoothly with zero load times. I clicked the media player and it popped up already loaded with a dozen videos on it. This alone, back in the day, would be something like a click-and-go-fix-your-lunch kind of experience.
Well if it makes you feel better, I did find a bug: Go Start -> Shutdown, then click "Cancel" instead of the "X", and it remains in the taskbar without a window.
Is there an HTML5 "make my cursor be good" library?
Because when I do stuff like this, it looks so appealing to just declare a cursor in a CSS rule. Then HTML5 is all like:
* ha ha. There's a default set somewhere else in the DOM that's overriding your cursor (as on the author's website, where the titlebar triggers the text editing cursor).
* ha ha. I'll show your desired cursor for the DOM element, but when you drag the element with the mouse I'm going to flicker every time the mouse beats Javascript to a location that lies outside that element's bbox.
Especially when making a UI to drag SVG rects, this gets annoying. I doubt there's a pseudo class for "this thing I am dragging right now using a mousemove event listener," but maybe I'm wrong.
The approach I have seen is to apply the cursor you want to the entire body using JS, so even if your cursor momentarily escapes the element the wanted cursor is shown. You just need to make sure to revert back to the default cursor when the user stops dragging or the window loses focus.
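Roughly like this, as a minimal TypeScript sketch (the grabbing cursor is just an example):

    // While a drag is active, force one cursor on the whole page so it
    // can't flicker when the pointer outruns the dragged element.
    // Call beginDrag() from your mousedown handler.
    function beginDrag(): void {
      document.body.style.cursor = "grabbing";
    }

    function endDrag(): void {
      document.body.style.cursor = ""; // fall back to per-element cursors
    }

    window.addEventListener("mouseup", endDrag);
    window.addEventListener("blur", endDrag); // window lost focus mid-drag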
Thanks, jaflo. I use the JS body approach, but I was hoping I was missing a declarative option.
JS at least has PointerEvent, so you can encapsulate behaviors like dragging inside a single DOM element without having to set state on a parent element like body. I was hoping CSS had something analogous (though I can't currently think of how it would work).
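For completeness, the pointer-capture version looks something like this; "handle" is a hypothetical draggable element, assumed to be absolutely positioned:

    const handle = document.getElementById("handle")!;

    handle.addEventListener("pointerdown", (e: PointerEvent) => {
      // Route all further pointer events to this element, even when the
      // pointer leaves its bounding box -- no body-level state needed.
      handle.setPointerCapture(e.pointerId);
    });

    handle.addEventListener("pointermove", (e: PointerEvent) => {
      if (handle.hasPointerCapture(e.pointerId)) {
        handle.style.left = `${e.clientX}px`;
        handle.style.top = `${e.clientY}px`;
      }
    });

    handle.addEventListener("pointerup", (e: PointerEvent) => {
      handle.releasePointerCapture(e.pointerId);
    });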
I miss the persistence of the phosphor and the simple elegance of the pre-PC fonts. I miss the cadenced hum of the teletypes, that 10 Hz suspense between silence and the racket of the type element hitting paper.
When using Windows 95, I realized I was living inside what would have been science fiction for most of my life.
This is really solid work (kudos, Ash), but I'm an older audience who pines for a different past.
One project I really want to build is an Apple II card that allows a Pi (or something similar) to use the II's keyboard and screen as a virtual console.
Ideally, one would boot the Apple II, type a "PR #1" to initialize the card's firmware and boot the Pi with the II as a console.
A terminal would have better fonts (the II has a 7x8 matrix) but would be limited to what the terminal's serial could do.
We were promised vacations on the Moon, space stations, and rocket belts. All that while keeping the simple life of the period.
What we got is a dystopian nightmare that could have been written by Charlie Stross (I know you lurk here) or William Gibson, except that the dialogues and plot are destitute of any interesting feature.
Nostalgia is a feeling; retro, in this context, is a style. In the sense that they both refer to the past, maybe they overlap. Here, to be nostalgic is to opine in the present about the retro style. Generally speaking, though, one can feel nostalgia for a smell, a feeling, a colour, a moment, etc.; nostalgia doesn't have to be retro (the style), unless retro specifically means retroactive, and that isn't the case here.
I'm not sure I agree, but a lot of people classify my work under the retro nostalgia umbrella. Personally, I'd prefer to classify it as neo-retro nostalgia.
I was following along on fast cash money plus; at what point do you tell users to stop and that it's a scam?
I mean, it's kinda clear, but do you let them go all the way?
They can go as far as they want. I think I've made it difficult enough for people to follow along that if they actually get to the end then they're discerning enough to know what they're doing.
I wonder why icons and GUI elements look weird when scaled (I have 125% scaling enabled in Windows 10) whereas Windows 95 screenshot images look perfect in the same browser (Chrome).
The website seems to use images for Close, Minimize, Maximize icons so it's probably not a scaling issue.
A couple of behavioral tweaks could make it more authentic -- for example, when dragging windows I believe the default behavior was to show only the outline in the new position, not the window contents as it's being moved. I may be mistaken though.
For some reason, I had to close the website quickly after browsing for a little while. For a moment it felt like I was looking through someone's personal desktop (which had been left unlocked) without their consent, while at the same time being well aware that it's just a personal website made for the public. Weird feeling :)
Edit: On second thought, one thing it did really well was refresh my own memory of Windows 9x (personally, I have good memories of Windows 98, before I switched to a Linux-based desktop -- Ubuntu specifically).
Really love the look and responsive feel. Also pretty easy to find content (blogs on desktop, games in start menu, etc).
One odd thing I noticed: when I navigate "back" to your site after clicking a link, sometimes the page renders HTML as text. For example, I somehow ended up on this page, which just shows the page's HTML for me.
Windows 95 booted into DOS first, loaded any DOS device drivers in CONFIG.SYS and then automatically ran WIN.COM to launch the Windows 95 shell.
At that point the system dropped into protected mode and was a 32bit native operating system. However, there was still 16bit DOS code and 16bit Windows code running. Windows 95 could easily be thought of as a very advanced DOS application.
I disagree with the author when he says "we didn't know better". Windows 95, as it was designed, was necessary. It had to be able to run on the hardware (and with the drivers) of machines at the time and run existing DOS and Windows 3.1 software.
> Windows 95 could easily be thought of as a very advanced DOS application.
"Easily"...perhaps, but not accurately. It would be like arguing that Linux is a very advanced GRUB application or that DOS was a very advanced BIOS application.
The fact remains that while DOS was an essential component for bootstrapping Windows 9x, Windows did still manage its own memory, resources, etc independently of DOS. Sure, Windows managed its memory in a way that was sympathetic to DOS applications (basically using DOS layouts wholesale for 16bit applications), but that was a requirement of the era. But you see similar examples of standards being used as a base (for compatibility) and then expanded upon in the Linux/UNIX world, with Linux and macOS having POSIX components but support for so much more added on top.
> I disagree with the author when he says "we didn't know better"
I disagree with the author there too, but for very different reasons than you. In the 90s we already had OS/2 (whose DOS emulation was so sophisticated that it could run Windows 3.1). Not to mention a whole boatload of better home computers than IBM-clones, such as the Acorn Archimedes, Apple Mac, Atari ST, and Amigas (funny how they all begin with 'A'). By the late 90s there was BeOS too. Windows 9x always felt like the shittiest desktop offering to pretty much anyone who'd used any other computer that wasn't an IBM-clone.
So yeah, some of us did know better. But that didn't stop the IBM/Microsoft partnership from monopolising the desktop market. :(
Windows 2000 was the first (and, frankly, only) time I ever felt impressed by a Microsoft OS. Then they went and ruined it with all the bling in XP (for those who weren't aware, the first release of XP was basically just a skinned version of Windows 2000 that required literally twice the system resources to run. It wasn't until somewhere around Service Pack 2 or 3 that XP became the platform everyone identifies with these days).
> The fact remains that while DOS was an essential component for bootstrapping Windows 9x, Windows did still manage its own memory, resources, etc independently of DOS.
Windows 95 would round-trip some calls through DOS so that it was possible to use DOS device drivers in Windows 95. There were also many 32bit applications for DOS that used DOS extenders (which Windows 95 also was) to manage memory and resources and call back into DOS as necessary. They would also do their own hardware calls, etc. If those count as 32bit DOS applications, then I would still argue it is accurate (if a stretch) to call Windows 95 one as well.
> Not to mention a whole boat load of better home computers than IBM-clones, such as Acorn Archimedes, Apple Mac, Atari ST, Amigas (funny how they all begin with 'A').
In this era, many of these machines didn't even use protected memory. There is no comparison -- these proprietary systems didn't stand a chance.
Well yeah, DOS drivers are clearly going to be run through a DOS subsystem. That shouldn’t be a surprise to anyone. But you could easily run Windows 95 without DOS drivers (and particularly if you don’t plan on running DOS applications).
You have to bear in mind that Windows 95 was the first time Microsoft had asked desktop developers to switch away from DOS. Given that Windows graphics and sound API calls were still pretty unoptimised, and that the platform as a whole was new, it's understandable that it took some developers a few years to turn their backs on DOS (games developers in particular). But please don't mistake developer pragmatism for Windows being a DOS shell. Windows 95 was actually a very clever design evolution and an OS in its own right.
And yeah, some of the early 90s competitors didn't have protected memory, but neither did Windows 3.x. Some did have preemptive multitasking back when Windows didn't (Win 95 introduced that), plus other features Windows sorely missed. I'll agree that Windows 95 did up the game somewhat, but frankly Microsoft needed to.
> Not to mention a whole boatload of better home computers than IBM-clones, such as the Acorn Archimedes, Apple Mac, Atari ST, and Amigas
I loved the ST and Amiga too but in fairness Windows 95 had to be able to run on any random PC whereas the superior systems tightly controlled the hardware. That’s why it needed the DOS underlay, almost like a HAL.
Your second sentence is the lie. It's actually a hypervisor that runs MS-DOS VMs, and thus quite close to Win3.x in enhanced mode, but the "system VM" is mostly still 16-bit code.
See the series of books by Andrew Schulman for details.
Seems like a staple for any frontend-focused person to redo their portfolio/personal site/blog in a Windows OS for aesthetic reasons. Luckily most grow out of it within a year or so