It's hard to remember, but even though Windows 3.11 was extremely dominant at the time, it was by no means assured that Windows 95 would be the success that it was. The very first version missed wildly in some big ways (MSN was a folder integrated into the desktop, for example, and no TCP/IP support [*Edit: yes there was - I misremembered.]), but the core, underlying redesign of the GUI was so profoundly good it propelled Microsoft into a new level of ubiquity. Compare it to other GUIs at the time, like CDE, IBM's Presentation Manager, or even Mac OS 8, and there's no comparison. Windows 95 solidified Microsoft's dominance, but could just as easily have eroded it had they dropped the ball.
Even though I've used a Mac daily for the past decade or so, I still miss the task bar and window-oriented GUI of Windows. I still get frustrated on OSX when I minimize a window and have to hunt around for it. I wouldn't switch back because of the underlying crap that is the Windows OS and file system, but I still miss the interface.
Edit: Found this fantastic PDF "Chicago Reviewers Guide" which goes over all the new stuff in Win95. So much stuff I had forgotten - TrueType fonts, Plug and Play, registry settings, right-click properties, long file names... Basically everything that makes Windows what it is today.
The one thing I don't understand that Mac has never adopted is being able to use open and save dialogues as mini file explorers (move stuff around and rename, specifically). Having to switch to Finder to move or rename a file that has the same name as the file I'm trying to save is ridiculous. Of course I never need to do this anymore since I only work on text files under revision control, but it still seems odd to me that it was never introduced. I really just miss Windows Explorer A LOT since moving to Mac. I don't hate macOS, but Finder is a bit of a joke.
To be fair, Windows' file open/save dialogues are so far ahead of everything else that the competition seems like unusable garbage to me. I'm glad KDE/Qt chose to emulate these very closely on Linux. Wouldn't want a desktop where my only choice is Gnome's take on this.
(On the flip side, Windows' select-a-directory dialogue of the same vintage is such an utter piece of garbage that I can't imagine there being any overlap of designers between the two dialogues.)
@blattimwind: "Windows' file open/save dialogues are so far ahead of everything else that the competition seems like unusable garbage to me. I'm glad KDE/Qt chose to emulate these very closely on Linux"
I hadn't realized that KDE was copied from Windows 95. I'm surprised no one here has mentioned NeXTSTEP. Here's a demo by Steve Jobs from 1992: https://www.youtube.com/watch?v=gveTy4EmNyk
It’s awesome that his goal has always been to allow “mere mortals” to be able to use computers. I can’t believe it’s from 1992. Why is it still so hard to build a database powered app in 2018?
To remedy this:
File > Options > Save > Save to Computer by Default (yes, agreed that this is ridiculous).
A habit that I've developed since my earliest computer classes in elementary school is to save the file in the location you want as soon as it's named, so that ever after Ctrl+S saves it with no hassle.
Word 2016 also has "Don't show backstage while opening or saving files" in the same options dialog, which basically hides the Backstage UI[1] unless you specifically invoke it from the File menu, and shows a plain old Open/Save dialog instead.
Absolutely true. I use a MATE desktop too; I can't even rename files in the save dialog. And I don't want to look it up either, because it is so inconsistent all the time...
I would extend your argument to Windows Explorer in general.
Gnome's save dialog allows you to rename or delete files. Maybe MATE should take some inspiration from the newer version of their desktop environment...
I don't really care whose fault it is, or which DE allows this. Such basic stuff MUST work.
Sometimes I feel like the developers don't want stuff to work similar to Windows. Because that would be admitting that Microsoft actually did something right.
Windows 3.0 didn't have any standard system dialogs; these were introduced in 3.1, but that is not the main point. In Windows 3.1 the directory selection dialog was exactly the same as the file open/save dialog (Figure 8 in the article) but didn't let you choose files (and there are still places in Windows where the standard open dialog is used for choosing a directory).
What is typically used as the "standard directory selection dialog" actually isn't even a documented standard WinAPI dialog, but originally an internal dialog of the Explorer shell (and it would not surprise me if it was not present in the original Windows 95 RTM but introduced in some slightly later version). Also, it does not select directories (i.e. a pathname as a string), but shell folders (i.e. an ITEMIDLIST).
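For illustration only (a minimal Win32 sketch of my own, not from the comment above): the shell folder picker works in terms of an ITEMIDLIST rather than a plain pathname, which then has to be converted separately -- and the conversion can fail for virtual shell folders.

    /* Minimal sketch of the ITEMIDLIST-based folder picker (ANSI APIs for brevity). */
    #include <windows.h>
    #include <shlobj.h>

    int main(void)
    {
        CoInitialize(NULL);

        BROWSEINFOA bi = {0};
        bi.lpszTitle = "Pick a folder";
        bi.ulFlags = BIF_RETURNONLYFSDIRS;

        /* Returns a shell item ID list, not a pathname string. */
        LPITEMIDLIST pidl = SHBrowseForFolderA(&bi);
        if (pidl) {
            char path[MAX_PATH];
            /* Conversion to a plain path can fail for virtual shell folders. */
            if (SHGetPathFromIDListA(pidl, path))
                MessageBoxA(NULL, path, "Selected folder", MB_OK);
            CoTaskMemFree(pidl);
        }

        CoUninitialize();
        return 0;
    }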
Because that is extremely non-intuitive. It's an "Open" and "Save" dialog. That is what it should do. Joe Public is not going to know it does anything else, yet it does.
Actually, you completely missed my point. The Windows dialogue is superior not only because it allows power users to do what they need, but also because it's actually usable for regular users.
For example: I literally don't know how to save a file using Gnome's dialogue in the general case. E.g. if the dialogue opens at /foo, and I navigate to /foo/bar using the dialogue, but then go back to /foo ("bar wasn't the right place after all"), I can't save the file there any more. "bar" will be selected. Clicking "Save" while a directory is selected will not save, but navigate instead. Now I'm in "bar" again. I go back to /foo and try to click something else, say a file. This changes the to-be-saved-file's name to the selection.
After I asked someone who uses Gnome as their main desktop they told me "that's easy: you just have to ctrl+click on the selected directory to de-select it, then you can save in the current directory".
That, my dear readers, is indeed unusable garbage.
As a long-time Gnome user, I've only realized how bad it is after reading this comment ;) (No irony here, I agree it's truly horrible now.)
But, I only realized it now because almost none of the apps I regularly use make use of Gnome's default file open/save dialog. Linux's extreme inconsistency has some benefits ;)
Oh, and talking about Gnome, the only reason anyone uses it is that after configuring a few basic things, you can completely ignore it, just run your apps, forget that there's an actual OS with a GUI somewhere beneath them. Heck, if even the applications running on it ignore Gnome and its "standard widgets", its biggest strength is that it can be easily ignored!
> I go back to /foo and try to click something else, say a file. This changes the to-be-saved-file's name to the selection.
That is the most annoying part to me. Windows does this as well. It's such a rare thing to want to overwrite a file for me, I find it so irritating that if I accidentally click on a file name instead of a folder, suddenly I lose the file name that I wanted to save as. Usually I just cancel and start again. So stupid.
Whereas I would find it irritating if it didn't have that behaviour. Sure it's not the most common operation, but it's still quite common to want to overwrite an existing file.
This is the excuse that's always used, but I rarely see it play out in practice. Many people don't notice advanced features that are present, that's true. But they don't notice them, so it doesn't matter. For the people who do, they can easily discover advanced features through exploring the interface.
Apple have a tradition of being really bad at this. Many of the (slightly more) advanced features are completely hidden behind undiscoverable key combinations or very hidden features. The slide-to-reveal pattern on iOS (now mostly fixed with augmentations) is a good example. Middle clicking the titlebar in Finder to reveal the directory parents is another.
In Windows's save dialog, right-clicking on a file gives me the same list it does in Explorer. In Mac's save dialog, right-clicking on a file does nothing.
It's not like the Mac way saves screen real estate or anything. It's like the way Firefox lets me close a tab by middle-clicking on it. Most users don't know it's possible, but it doesn't actively harm them to have the feature there.
I have similar problems with the Finder. Right-click a file on the Desktop and you have the option to compress it. Open the Finder to the Desktop folder, and that option isn't present -- it's grayed out on the Finder file menu as well.
I really like the fact that the Windows save dialog is basically Windows Explorer, and I really miss those features when an application uses the older file save API (which gives a more Windows 95 / 3.11 interface).
OSX Finder is horrible. Classic MacOS Finder was far better - but still had open/save dialogs.
RISC OS had it right - the Open dialog didn't exist - you would open the folder and double-click the file. And the Save dialog didn't exist - the document would reduce to an icon which you would then place into the correct folder. (You can sort of do this on the various versions of macOS by dragging the icon from the title bar, but it's inconsistent.)
A few years ago, I actually searched around for an implementation of the classic finder, so this is very amusing for me to see actually implemented!
What I was interested in back then was the idea that there was a direct correspondence between the folder window and the data structure on the hard disk, and that to the user these concepts should be indistinguishable. One part of the illusion is that a folder always appears in the same place with the same window size, to give the sense that the folder is a tangible thing with permanence.
It's got a loooonnggg way to go (I never realized how many small but vital details are in a file explorer application), but it's strangely exciting to see it draw itself on modern macOS, especially on a retina screen!
And yes, the spatial UX! I'm still working through getting all of that implemented (I just completed persisting window locations/positioning days ago). I have recently been reading through some of John Siracusa's turn-of-the-Mac-OS-X-era writings and that's been hugely insightful and helpful. The level of detail (both of the original Finder and his writings) is impressive and surprising.
I also used to love the fact that the equivalent of kernel extensions (totally forgotten the name) could be disabled by moving them to a different folder. The file system IS the computer.
Coming from the other direction, it kills me when I'm using windows and have a save dialog open, then drag the file I want to open into it from the explorer. On windows it moves the file to a possibly new location, while on OSX the save dialog selects the file you dropped on it (without moving it).
I have the opposite experience WRT Finder vs Explorer. I use a Mac as my primary machine, but have a PC for games. Trying to use a file browser without Miller columns[0] drives me crazy.
I love being able to drag a file up one level in the tree without cutting and navigating to the destination to paste, or having 2 windows open to almost the exact same location.
> I love being able to drag a file up one level in the tree without cutting and navigating to the destination to paste, or having 2 windows open to almost the exact same location.
While Miller columns certainly support that, so does the tree view in the left panel in the default Windows explorer layout.
Honestly I didn't think of tree view before seeing this (and neither did the friends I asked who are more regular Windows users than I am). I'm gonna give it a shot.
I've been using bitCommander, which is a Windows file manager that supports (among other things) Miller columns.
Compared to the tree view, Miller columns use more horizontal space, but have less vertical overflow. Which is superior probably depends a lot on your usage patterns, but both will support “drag to (grand, etc.) parent of current folder” fairly simply.
Another good approach might be to do it the other way around: Get rid of the pseudo-file-manager dialogs entirely and let applications integrate with the regular file manager.
I think RISC OS (?) did this. Open documents in applications had icons representing them which you could drag to the file manager to save. (And perhaps to other applications to open?) Mac OS also has (or had?) this to some extent – many document-based applications show an icon in the title bar, which is for dragging and dropping the document in question.
Even Windows has an example of catering to this way of working, in Explorer, where the folder icon in the location bar represents the current folder and can be dragged and dropped. They even have some custom behavior to prevent the window from being raised when you drag from it, so that it works more like classic Mac OS and lets you drag to an overlapping window.
Windows Explorer can be extended; the example I remember is browsing a folder of email message files, with the metadata attributes displayed in list view and a functional content pane for viewing the message itself.
BeOS did this, too; I got the impression that Windows was inspired by that, both for BeFS (NTFS) and the desktop filesystem UX.
You can use most open/save dialogues to do this - a right-click is all it takes in most places. But I definitely agree with your sentiment. Finder feels like a badly-designed toy. The latest things they've added - tabs and labels iirc - require more steps to do the same tasks in the new way versus the old way.
While a fair point, it’s important that feature-rich OS components be designed correctly (including security) and this wasn’t the case with Windows.
How many exploits were just a matter of tricking an extra-fancy OS dialog into popping open something it’s not supposed to, escalating permissions alongside it?
I would normally agree with you here, except that I've accidentally renamed files on Windows so many times only to lose them because something was in focus that I didn't intend to be. I use a combination of the mouse and keyboard when browsing, and the fact that I can accidentally click on a file (which renames my current file to that file) and save, or press Enter thinking that I'm saving, which then just starts to rename the file, is utter garbage in an Open or Save dialog box. I feel like the only things I should be able to do are Open a file or Save a file, not rename other files. Neither is a good solution, so it comes down to preference, but I definitely see the reason for it because it happens to me on Windows all the time.
From someone traveling in the opposite direction at the moment: the Windows folder selection dialog is the crustiest open/save relic of them all, and "open files are locked even to administrators" is a UX catastrophe. Lesser pains: it hurts to lack consistent "jump to enclosing folder" semantics everywhere (command-click on Mac -- I only get an equivalent context menu 30% of the time), and to not be able to snap open/save dialogs to a particular file or folder with drag and drop.
The door swings in both directions, in other words :)
Windows Explorer, on the other hand, still has no "expand folder" functionality in the right pane. I have to go back and forth to sort garbage trees, opening multiple windows or temporarily pinning all folders into favorites. Given that their names may intersect, the problem gets worse. In Finder you just expand the interesting folders and DnD until it's done.
What really is a joke is a left pane that combines favorites, libraries (die, die, DIE) and disk trees. It was never usable, except for favorites.
Haha, damn ok, it's been so long since I've had to do this, I didn't realize it is now possible (I used to want this all the time and it was the bane of my existence in the late 90s / early 2000s when using a Mac). You still can't seem to move files into sibling directories, though, only to directory links in the sidebar. Anyway, I LEARNED SOMETHING TODAY (about macOS dialogues and about testing things before I open my mouth).
There absolutely was TCP/IP support in Windows 95 from the very beginning. It was not installed by default but it was trivial to add via the Network applet in Control Panel. SLIP/PPP was also supported and you had basic utilities like Telnet and FTP included so you could connect to the Internet right out of the box.
No web browser, though. Internet Explorer 1.0 shipped with the optional Plus! pack.
Since a large part of my work involved building networks of pre-DOCSIS cable modems at the time, I can tell you that the first Windows to support TCP/IP was Windows 3.11 (Windows for WorkGroups). This was carried into Windows 95 and by Windows 98 there was even Internet Explorer.
But in Windows 3.11 the TCP stack was not part of the installation or even on the installation media; it had to be installed separately together with Win32s. In Windows 95, IIRC, everything you needed for TCP/IP was on the installation media.
...and it had preemptively multitasking 32-bit drivers; we built a Mac (AppleTalk) server product on top of that with pretty good performance. 3.11 was a huge step forward.
Probably inertia. Windows 3.0 and 3.1 didn't ship with a TCP/IP stack. You had to either have something like Trumpet or install Internet Explorer 2.1 (which, AFAIK, was a separate purchase) to get Winsock. Or you could try to use DOS drivers. Even Windows for Workgroups 3.1 only shipped with NetBEUI and IPX/SPX. It wasn't until Windows for Workgroups 3.11 that the OS shipped with TCP/IP.
> Compare it to other GUIs at the time, like CDE, IBM's Presentation Manager, or even Mac OS 8 and there's no comparison.
Well, I agree that there was no comparison with System 8, but not in the sense you mean. I think that the Mac back then was head-and-shoulders a better system than Windows. It might still be, but they're both so painful to use now that it's very difficult to pick a winner.
The Macintosh system was very understandable, very clean. Extensions were an easy-to-understand way to extend one's system, and easy-to-disable too. The window system itself was better-thought-out and less-confusing than Windows's was. The Finder was much more straightforward than the Windows equivalent (was it called the File Explorer back then?). The way that the Mac associated programmes to files (with an application code & a file code) was much better than the extension-based naming of Windows. The way that the Mac used its files' resource fork was great.
Programming a Mac back then was very clean & straightforward. I don't think there's anything today as nice, except maybe Cocoa, maybe. Certainly not the Windows 95 API!
I was using Macs back in the late '90s, and none of the things you say ring true.
Extensions could easily bring down the entire system because there was no memory protection. Full OS crashes (what modern macOS calls kernel panics) were a daily occurrence for the typical Mac-using professional who ran complex software.
The window system was often difficult to understand because apps tended to use a plethora of little panel windows that could overlap even from different apps. Windows preferred large windows that contained the entire app UI, and users typically maximized them. The Windows 95 Task Bar was much better for actually keeping track of your tasks than whatever the MacOS 8 thing was.
File extensions were always a hack, but one that Apple adopted too for Mac OS X. The days of Mac's file-specific associations were numbered when the Internet happened, because Unix servers wouldn't keep track of that metadata, so you needed file extensions anyway.
Besides, the file-specific associations were often super annoying because they were created by the editor app even for exported files. You saved a JPEG file from Photoshop, and it forever insisted on launching the full Photoshop when you double-clicked on it, instead of your preferred lightweight image viewer. This would happen even when you copied the file to someone else because the association was in the file metadata.
Windows NT 4 and its next version Windows 2000 were just heads and shoulders above MacOS 8 and 9 in terms of performance, stability and usability.
(And programming in Mac OS 8... Ugh. No memory protection, no multitasking, APIs originally designed in Pascal.)
> Extensions could easily bring down the entire system because there was no memory protection.
Yes, they were quite unstable. I didn't say that they were stable; I said that they were easy-to-understand and easy-to-disable, which they were: each extension had a distinct icon displayed at system boot; disabling one was as easy as dragging it to another folder; disabling all was a matter of, IIRC, holding down Command as you booted.
> Windows preferred large windows that contained the entire app UI, and users typically maximized them.
As a Mac user at the time, I much preferred the multi-window mode: it meant that I could customise my desktop as I liked. The Windows single-window mode was terrible, as it meant that I couldn't layer windows properly.
> You saved a JPEG file from Photoshop, and it forever insisted on launching the full Photoshop when you double-clicked on it, instead of your preferred lightweight image viewer.
We considered that a plus at the time: it meant that different files of the same type could be opened up by different apps by default. One could always, IIRC, Save As if one wanted to change the file type — or use ResEdit.
> Windows NT 4 and its next version Windows 2000 were just heads and shoulders above MacOS 8 and 9 in terms of performance, stability and usability.
Stability, probably. Performance, maybe. But usability? Never! That was back when Apple cared about UX.
Windows NT 4.0 had the same GUI as Windows 95 and was released in 1996. That's the version of Windows that sealed the deal for professional applications.
The late '90s I remember had a mishmash of NT, NetWare, and some (or maybe several) variant(s) of Unix on the bits that lived in the room with all the air conditioners.
Windows 9x and maybe OS/2 were far and away the dominant OSes on actual workstations. Macs and even aging Amigas at some creative shops, some more Unix workstations at places where most people could recognize and identify the purpose of (if not actually use) a slide rule. But what I essentially never saw on any desktops was Windows NT. Lack of driver support and inability to run many business applications kept it out of that space.
I suppose it depends a lot on geography and business. I remember Windows NT/2000 rapidly displacing Macs in the creative fields, and being widely used as developer workstations.
The funny thing I have always thought about Mac vs Win is that Mac was so set on doing it "their way" (think different) that they ignored UX that really was intuitive and "just worked".
Mac was so set on being different that they eschewed UX tropes that were natural... and had to spend ridiculous amounts of resources trying to convince people that their way was the right way, but clearly it was not.
This, IMO, is where the “fanboi” concept evolved.
Brainwashing.
Who cares - in the long term - where the UX and UI elements came from; the point is to make machines immediately accessible to humans' creative desires, not to mold workflows to a corporate ego...
So, Apple figured out how to develop products that were managed through the extension of the desire of the user, but they are still struggling with the requirement from Jobs to be "different" - and Ive's perception of a common user is skewed toward "Ive has stated that this is how it should be done" type design, which I find completely ironic given that the whole "think different, so long as it's exactly how I am designing you to think" campaign is a hypocrisy that goes up to 11
What Apple have done with both first and second generations of the Macintosh -- System 1 and OSX -- is pick a basic GUI metaphor and stick with it for at least fifteen years.
The original Mac was released in 1984 and was produced through 2000, sixteen years. OSX was released in 2000 and is now in its eighteenth year. Whilst each system has seen some evolution over time, the general metaphors and interfaces have remained consistent.
Apple have realised and internalised a core concept of GUIs: change is bad. There is a far higher cost to changing interfaces than can be gained through efficiency, and the retraining and unlearning costs are exceedingly high relative to benefits.
This is a message apparently lost on Microsoft and most of the leading Linux desktops.
Mind: I write this as someone who, whilst using a Mac presently, doesn't much care for the interface. My preferred desktop remains WindowMaker (itself based on Aqua's predecessor, NeXTSTEP), which has a key advantage of having changed almost not at all in the 20+ years that I've been using it. It's also configurable in ways I find useful, and I schlep around a configuration directory to new systems as needed.
Microsoft hasn't been too different in that regard. In terms of interfaces with any real userbase to speak of, the only real Microsoft UI systems I can come up with could be characterized as
* MS-DOS/Windows 3.1-like
* Windows 95-like
* Windows 8-like
Windows 8 basically was born and died in a couple of years, to be replaced with Windows 10, which is very much the same basic set of metaphors as Windows 95. There's a start menu, a little dock of pinned icons next to it, a taskbar, a clock and some little icons for what are basically background processes. There's a maximize and minimize and close button on windows. There's a File/Edit/Whatever set of menus. You can right click for context items, many of which are consistent with Windows 95 20+ years ago. That basic system is now something like 21 years old.
You can maybe argue that Microsoft forgot your message five or so years ago, but they obviously rediscovered it.
I'm largely familiar with the DOS -> Win2K period, and have made little use of Microsoft operating systems since.
Windows 3, 95, NT, and 2K each saw significant changes in where and how major system functionality was presented.
During the same period I was using numerous Unix and Linux platforms (and still do). Those have largely seen far less substantive change at the shell and system level, with a few notable exceptions.
I'm not discussing Linux GUIs, which have been all over the goddamned map. I've used twm, fvwm, fvwm2, VUE, CDE, WindowMaker (my preferred option), GNOME and KDE through multiple generations, Enlightenment, various of the 'boxes (black, open, flux, ...), ion, xfce4, ... And those are the ones I've trialed to some significant extent. I've at least fired up and looked at virtually all the options mentioned on the XWinMan page: http://www.xwinman.org/
There are several fairly central components which have changed fairly markedly. The shift from telnet to ssh, multiple iterations of firewalling, various scripting languages of preference (bash, perl, python, an oddment of others), mailers (sendmail, qmail, anything reasonably sane, mostly exim and/or postfix now), and of course, the whole init replacement clusterfuck.
But the notional concepts of files, filesystem, shell, utilities, pipes, etc., have remained consistent, and even across several utility / server replacements (particularly ssh and mailers), command-level compatibility has been preserved to a remarkable extent with previous options (e.g., rsh and sendmail syntax).
If Microsoft can only be relied on for five-year stints of "having learnt this lesson" then they have not learnt this lesson.
I fear though that command level compatibility is under attack these days, as fewer and fewer see shell scripting as something positive (never mind trying to do more and more via dbus rather than pipes and such).
And already with 8.1 the "start screen" could behave like a very large start menu.
Hell, you can today configure Windows 10 to behave much like 8.1. The one thing I see some people miss with the 8.1 to 10 transition is the charm bar, in particular that it gave easy access to printing and such.
Seems to me that Apple and MS focus on different kinds of change.
While Apple may retain the UI across time, they are more than willing to change APIs etc on a whim.
MS on the other hand may change the UI (though outside of 8.x, the core layout and behavior has remained much the same, and even 8.x could to a large degree behave like the older UI) but they bend over backwards to maintain APIs across time.
I get the feeling that stable APIs are undervalued as a user retention element.
Being able to get a new computer but install from the same software library (I can hear the _sec people getting hissy already) as was used on the old one makes people more likely to pick the same "platform" over time.
Right, I see that point and would have acknowledged it more explicitly had I time earlier. That's the interesting part of this.
The counter is that Apple caters to a smaller software development community, though several of the tools also see extensive use and support (particularly Photoshop). But there's a heck of a lot of fundamental functionality on Apple's platforms that you can get without relying on third-party software, or at least, third-party proprietary software. Given the dynamics of proprietary software markets, particularly toward adware, nagware, and malware, this seems a possibly positive development.
(I've made much the same observation in recent years about the Android marketplace, which I see as a growing cesspit, and of the Windows application space, particularly at the peak of its crapware / spyware / adware period in the decade of the 2000s.)
Linux solves the software compatibility problem by allowing for recompiling of software for which the source is freely available, for the most part. This isn't a perfect solution, and there are complex systems which tend to not be particularly forward-compatible. One possible argument is that such complex systems are themselves inherently problematic and ought perhaps be avoided. You may not agree with the argument, but I'd expect you'd admit to its existence.
Microsoft was addressing a different space, and one in which there was a massive focus on desktop-distributed client software, much of it aimed at very specific business applications. This is a major application area for computers, though it's also one that's shifted significantly toward client-server Web-based solutions (or app-based, now). Which presents its own set of features and limitations.
And again, all this is what I was hinting at earlier with noting that you'd presented a very interesting point. I'll be thinking about this for a while.
Windows 95 was no paragon of stability when you started loading it up with tray apps and running big nasty applications or trying to use the damn printer with its piece of shit driver. We are spoiled these days by how stable our computers are.
I agree with you that keeping the file metadata in a separate fork is far superior to keeping the file metadata in a three character extension, but sadly the world zagged on this one when Apple zigged. I especially like that you could have two files of the same type associated with different applications, so if you created a text file in your IDE it would open back up in that IDE instead of launching the word processor. And you could change it (somewhat clunkily) at will.
Win95 did have some advantages though. The Start Menu was a better organization system than Apple's Application folder for example. Macs of that era were also slow and badly overpriced.
> The Start Menu was a better organization system than Apple's Application folder for example.
These were not actually that different. The original Start Menu was just a menuized view of a folder hierarchy (which mostly contained shortcuts but could contain any document).
The resource fork idea is great if everyone agrees on it and everyone preserves them, even across OSes.
Or if you're going to Very Deliberately Ignore the Other OSes and go do things your own way. VDIing like that seems to be a very Apple trait.
My point is, I'm not sure how well the resource fork model could have ever survived prolonged and sustained contact with the Internet and modern pervasive networking.
> I agree with you that keeping the file metadata in a separate fork is far superior to keeping the file metadata in a three character extension
Since we're talking about Win95, the 3-character limit doesn't apply (long filenames were a major new feature of this OS, after all). .jpeg and .html were relatively common at the time, for example, and worked fine.
I find the extension system still kludgy, but arguing that it was worse in part because of the limited pool is incorrect starting from Win95.
Yet for the same compatibility reasons most people stuck with .jpg and .htm, at least in the Windows world. Even now it's very unusual to see a .jpeg extension on a filename.
The metadata that Macs kept in the resource fork went way beyond file type and creator too. It included things like the file's icon, creation/modification information (so it would survive a trip over the Internet!), loads of stuff for applications (menus, graphics, sounds, etc...), formatting for plain text documents (so they fall back to plain text on unsupported systems), and so much more.
Fun fact: NTFS supports the concept of a resource fork on files, but almost nothing in Windows uses it. I think I've seen more malware hiding stuff in there than legitimate uses in the wild. Worse, even in the obvious case of loading a Mac file on a Windows machine, it usually fails and falls back to creating a clunky separate directory instead.
NTFS does not have a resource fork in the MacOS sense, nor extended attributes in the Unix sense. Instead it allows a file to have multiple named contents that are accessible by the same file IO API (in essence the file can behave like a simplified directory). There is no distinction between data and metadata stored this way. In the late 90s MS even intended not to use the OLE compound storage file format (i.e. what Office 97/2000 formats are built on) on NTFS drives and instead write the objects into separate streams (reportedly it was not implemented because then Windows would have to somehow transparently reconstruct the compound storage when you copy such a file to a non-NTFS drive or upload it to the internet). Today, apart from malware hiding, the only major use of multiple streams is the "this file was downloaded from the internet, are you sure you want to open it?" prompt, which stores the internet-ness of a file in a secondary stream.
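For illustration (a minimal sketch of my own, not from the comment above; the file name is hypothetical): that downloaded-from-the-internet marker lives in an alternate stream named Zone.Identifier, reachable through the ordinary file API by appending ":StreamName" to the path.

    /* Minimal sketch: read the Zone.Identifier alternate data stream on NTFS. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* "setup.exe" is a hypothetical file name used for illustration. */
        HANDLE h = CreateFileA("setup.exe:Zone.Identifier", GENERIC_READ,
                               FILE_SHARE_READ, NULL, OPEN_EXISTING, 0, NULL);
        if (h == INVALID_HANDLE_VALUE) {
            printf("No Zone.Identifier stream; file is not marked as downloaded.\n");
            return 0;
        }

        char buf[256];
        DWORD n = 0;
        if (ReadFile(h, buf, sizeof(buf) - 1, &n, NULL)) {
            buf[n] = '\0';
            printf("%s\n", buf);   /* typically "[ZoneTransfer]\r\nZoneId=3" */
        }
        CloseHandle(h);
        return 0;
    }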
Conceptually the MacOS Resource Fork is basically a directory where all of your filenames have to be exactly 4 characters long. The only difference is that each "file" might be a stack of "files". So you might have a CODE resource that has multiple CODE segments in it.
One thing I loved about old MacOS apps is opening them up in ResEdit and seeing so much of how the thing was built.
> Conceptually the MacOS Resource Fork is basically a directory where all of your filenames have to be exactly 4 characters long. The only difference is that each "file" might be a stack of "files". So you might have a CODE resource that has multiple CODE segments in it.
Sort of. It would be more accurate to say that each filename was a 4-letter type code and a 16-bit ID. Each resource could also have a name, but that was less frequently used (and didn't have to be present, let alone unique).
More importantly, resource forks didn't exist in isolation. They were loaded into a chain of active resource files -- for instance, while working with a Hypercard stack, the resource chain would include the active stack, the Home stack, the Hypercard application, and the System suitcase. A stack could use resources (like icons or sounds) from any of those sources.
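For illustration (a rough classic Toolbox sketch of my own, not from the comments above): resources are addressed by a four-character type plus a 16-bit ID, and GetResource searches the whole chain of open resource files rather than a single fork.

    /* Classic Mac OS Resource Manager sketch (old Toolbox headers). */
    #include <Resources.h>

    void LoadAnIcon(void)
    {
        /* 'ICN#' is the 4-character type code; 128 is the 16-bit resource ID.
           GetResource walks the chain of open resource files (document,
           application, System) and returns the first match it finds. */
        Handle icon = GetResource('ICN#', 128);
        if (icon != NULL) {
            HLock(icon);          /* pin the handle while the data is in use */
            /* ... draw or copy the icon data ... */
            HUnlock(icon);
            ReleaseResource(icon);
        }
    }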
The filename extension is the bare minimum of metadata for a file and not easy to extend.
Even then, Unix systems will skip that minimal metadata and force you to messily search for magic numbers at the start of the file and make a guess.
> I think that the Mac back then was head-and-shoulders a better system than Windows.
I completely disagree.
First of all, Windows 95 had preemptive multitasking (the Amiga was the only computer that had this at the time), Mac OS was single-tasking and used terrible scheduling, and it would be years before Mac OS gained preemptive multitasking because of terrible architecture choices that made this extremely challenging.
From a GUI standpoint, Windows 95 was a total revolution and made Mac OS look completely antiquated: rendering, scrolling speed, font types, menu items and dialogs, etc...
The Amiga was exotic hardware, like the BeBox. If you want to include those, you might as well include SGI and Sun hardware. Preemptive multitasking was common on non-PC hardware.
Cocoa is pretty painful compared to web UI alternatives like Electron. I feel stupid for mastering it considering there’s no jobs there and the Mac App Store is probably impossible to make any money off of anymore.
Windows has always been head-and-shoulders above the Mac and that's why it won the 90s desktop wars and that's why just about every business on earth uses Windows and not Macs.
You are in a tiny minority if you think Mac window and file management are any good. File extensions are the most pragmatic way of dealing with associations and that just about sums up the difference between Mac and Windows: Microsoft made Windows to be practical whereas Apple has always focused on style above all else.
> I still get frustrated on OSX when I minimize a window and have to hunt around for it
Windows user here. Honest curiosity: does anyone know why the minimize / maximize works on the Mac the way it does? I mean, what's the rationale to design it like this?
Different ways of working, mostly stuck in our own ways of doing things.
I rarely, if ever, use Minimize on the Mac. Minimize comes from Windows (and other windowing systems) where the window minimizes to an icon or button on the task bar.
"Maximize" also comes from Windows (and other windowing systems.) As others have noted, in newer macos (which I don't use,) I think it oddly makes the window go full screen. Full screen is a recent macos feature -- I once asked about making my Application go full screen and Apple developer support said that going full screen did not follow their Human Interface guidelines. Something seems to have changed at Apple since I asked about that decades ago.
There was no Maximize on the Mac; it was called "Zoom." The idea is that the window has two sizes and you zoom between the two: one size is the size the user has resized the window to (often with much difficulty) and the other is an ideal compact size ("optimally fit content") without hiding anything and hopefully where scroll bars do not appear -- it is a UI feature that is/was rarely, if ever, done very well by applications other than the Finder.
By the way, Command-Tab to switch tasks was once an add-on from Microsoft for Mac OS. Go Microsoft! (Mac fanboy here :)
All correct. In daily use, I almost never use those buttons - I end up manually sizing/placing windows.
The MacOS scheme of doing this leads to a sort of organic, emergent window layout - I always end up with windows staggered to display the relevant bits. With Windows (and with most X window managers), things always end up either strictly tiled or stacked.
I'm very used to the Mac way and prefer it, but that could just be the result of long use. It doesn't waste space (Windows apps always seem to have lots of dead space to me and make me work to be able to see parts of other windows) and forces me to switch windows far less. But it is a fairly subtle thing.
The amount of tedious manual work required to resize and arrange several windows on MacOS (6+ to latest OSX) bothered me all the time I had a misfortune to have to use it. It does not even have something like edge snapping.
My WM of choice is now Xfwm, which lets me rearrange windows easily without gaps, make windows fill available space (vertically and horizontally separately), or tile them by dragging to corners.
On OSX, things like Spectacle and Magnet sort of help.
The lack of window and edge snapping are exactly the reason I prefer MacOS behavior. If it did that, partly revealing lower windows would be far more tedious/impossible.
For me, the goal is not to maximize available space to the frontmost windows; it is to maximize the use of the monitor to display what's currently relevant. It allows me to do things like keep a Finder window open with just a bit peeking through to drag things to, keep an eye on a few lines of a terminal tail -f, see the mailbox pane and room pane in Mail/Slack so I can see if anything new happened I should respond to, etc. all while working on whatever I'm working on.
With the Windows-ish "maximize the window", that is replaced, almost invariably, by useless window background.
Again, I expect this is mostly what one is accustomed to, and the Macish approach is more idiosyncratic to the user. Works for me.
Edge snapping does not prevent the behavior you describe; sometimes it even makes it easier to do. You can still rearrange and resize the windows manually as you wish. But when you want two edges to fit snugly, it's easy to do.
Also, you can maximize a window for a moment, look at it, and then unmaximize it back to its previous size.
Interesting. Now that you've made me think about it, I've realized that I use Windows more like a single-threaded operating system. I only ever flip between applications (ignoring my second monitor). I never, ever, combine multiple applications on the same monitor - this is crazy because this was the Windows 95 promise. Not that it's a bad thing, I'm used to the workflow and love it.
I absolutely agree with your guess as to why this is.
In the Maximized state, the window is set to the maximum available size (so you are not wasting any part of the screen) while you are still provided with fast and easy access to relevant OS UI elements.
In full screen, you explicitly tell the system that you don't want to be distracted by OS UI elements (typically in a situation where you know you won't need them for an extended period of time, or if you REALLY need every single pixel of the screen).
I feel your pain. I was using BetterTouchTool to remap the default behavior of that green +, but eventually decided it was silly to use an add-on for something I should be able to change via `defaults`, so I just trained myself to hit Option when I wanted to maximize.
I came from ~18 years of Windows & Linux usage and everyone always told me macOS is THE OS with the best usability.
But I can't confirm this.
Minimization of windows is shitty and maximization even more so.
When I maximize, often just the height is changed; when I go back to "normal", the height and the width are changed, so I always have to adjust the width manually.
When I minimize a few windows, it's impossible to get back the right one without luck.
I know what you're talking about. Since I've switched back to Linux I can't imagine working without workspaces - each one dedicated to a specific task/app - and always in a specific order, e.g. 1st workspace: Browser, 2nd: Code editor, 3rd: Terminal, 4th: File explorers, etc. I'm so used to it that I automatically use shortcuts to access them immediately - switching between minimized windows using alt + tab is a nightmare.
Windows 10 actually supports this feature, still can't get used to it though... win+tab -> switch between virtual desktops, or ctrl+win+left/right arrows.
And if you have many windows open (in Win 10 at least) you don't have to press alt+tab 10 times in a row to choose your desired program; you can hold alt+tab and use the arrows.
Yes, you can; it was added in the Creators Update. Hit win+tab and right-click the window; you can choose to show that single window across all desktops or all windows from the app across all desktops.
You can accomplish (mostly) the same thing with Spaces on Mac. Granted, I don't believe you can really script any of it. All manual, but it still works pretty well for me.
Some days I still really miss dwm, but having Photoshop, Ableton, and several other things Just Work™ makes it worth it.
The problem with spaces is that any time you cmd-tab between spaces, those spaces will be repositioned relative to each other to be adjacent. This also happens whenever a window opens a dialogue that forces you to switch to it. This means that you can't reliably keep spaces in a strict order.
Yes, this is something I do on Linux too, and it's great, particularly when you're doing something that gets quite messy with lots of windows open - when you're working with lots of files, or the program (like GIMP) opens up several windows. If you need to do something else, it's so nice to just leave it all and move onto a nice clean workspace without having to minimize everything.
You do have multiple workspaces on Windows 10... Though if you mean "automatically assign application X to workspace Y", which many Linux WMs are able to do, you're out of luck.
I used to use VirtuaWin[1] on Windows, which adds virtual desktops to it. It's possible it doesn't work in the most recent Windows (although, given Win compatibility, it could just as well work), but until Windows 7 (when I stopped using Windows) it was a life-saver. I used Enlightenment (E16) on some of my computers back then and after working with multiple desktops I just couldn't live without them. I mostly use 3x3 layout, with the main application I work with at the center, and other applications to the sides. Works great for me!
After using a mac for ~6 months now, I finally understand what happened to gnome3/unity, clearly designed by mac users.
I mean, it's perfectly usable once I got the basic gestures down and installed the app that lets me independently set the track pad and mouse wheel scroll directions, but I will go so far as to say that both xfce and MATE are objectively better at window management.
Partly I think their hands were tied by the too-late-to-change decision for the always-there contextual top menu bar. Or this is just 25 years of using Win95 clones talking and I'm set in my ways.
I wouldn't say Gnome is by or for Mac users. In fact, in many ways it is more similar to Metro than to OSX. Gnome 3 just broke the common Windows 95 workflow; however, many others did so before it.
Personally I love it, but it took me way more than 10 minutes to realize why :) IMO it is highly underestimated, mostly because you need to change your workflow, which takes time, but when you do it feels super productive.
You wouldn't judge i3 or other completely different approaches after only a few minutes.
I feel absolutely the same about Gnome 3. One of the biggest things, I think, is that it puts workspaces absolutely in your face, so using them is a much more natural part of the workflow than in Gnome 2. It's also more keyboard-friendly than Gnome 2 (though it still could use some work in this area). Despite being rather large (gnome-shell on Wayland is typically the second-biggest RAM user on my laptop), it feels minimalistic, and is almost always fast, and stays out of the way of whatever I'm working on.
> it feels minimalistic, and is almost always fast, and stays out of the way of whatever I'm working on.
absolutely :)
What i like the most is you are just a super key tab away from basically everything so focus on a single thing feels natural and right. There is no way to lose anything either, its all there. Always.
I'm going to have to give it another go then, though I do love my MATE desktop.
The last time I tried it, I got frustrated when working with a lot of PDF sources - hitting the super key just presented me with a myriad of white rectangles for the open windows, which would frequently rearrange, requiring a slow manual search to find the file I was looking for. This can be less of a problem with a taskbar, as the filename is the main identifier, and being one-dimensional it is easier to scan and preserves its position better.
What's strange on a mac is somehow having the ability to completely lose windows.
Another frustration is trying to view two apps together on the screen at the same time, if one of the apps itself contains multiple windows. I'm not at my desk to try this, but let's say you have multiple Chrome windows open, all with their own tabs, and you want to view your current Chrome window overlaid on a window from another app. To do this you have to manually minimise all of your Chrome windows one by one so they will all move out of the way, to allow you to switch between apps and view them both at the same time.
Four fingers up, expose.
Drag the windows you need up top into a new desktop.
Command (or control? Or option? Or a combination?) Plus the right arrow to switch desktops.
Try all the combinations until I get to the right desktop or throw the damn thing out a window.
Somehow it is able to open up the device manager and THEN give the window in the background focus when I switch to my editor and back to Xcode with cmd+tab.
So I have to move the Xcode window down to grab that background thing that seems to be part of Xcode.
Classic Mac OS (System 7 was contemporary to Windows 95) didn't really have any concept of minimisation or even maximization as such.
There was no task bar or dock or anything else really to minimise to. IIRC there were addons for 7.1 that added "window shading": a button on the window title bar that reduces the window to just the title bar.[1]
The closest thing to a maximise button in classic Mac OS was more like a size-to-fit button: the application gave the window manager a hint which was the appropriate size for the document displayed, be it a file folder, a word processor document or whatever. Having a single window fill the entire screen wasn't as common as it was on Windows.
None of this was particularly strange to me back then.
Sure it did, namely Windows 1.0, 2.10/2.11 (actually Windows/286 and Windows/386), but very few people used them.
Windows 3.0 was the first one to have some diffusion, but it had very limited capabilities, and its adoption was slow because of the increased specifications for the PC, and in any case not comparable with the later wide adoption of 3.1.
> Having a single window fill the entire screen wasn't as common as it was on Windows.
Yup! For the longest time I liked to work with a half-width browser window to match my half-width editor & word processor windows. It drove me crazy the number of websites which set their body text to some fraction of the window width, which looked good with a fullscreen window but terrible with a halfscreen one.
Eventually I just gave up. The whole point of the web was device-independent information transfer, but somehow we allowed device-dependence to sneak in.
MacOS System 7 did have a function like minimize for Apps (not individual windows). In the finder menu on the upper right, you could "hide" or "show" a program and use the same menu to switch apps, similar to the task bar in Win95.
macOS has Hide & HideOthers in addition to Minimize. I go weeks at a time without using minimize because of those.
IMO there's a whole generation of people who did their early computing on MS Windows (including myself) and so internalised that that is how GUIs are "supposed to work". When moving to something else later in life there's a feeling that it is "wrong", but it (e.g macOS)'s way of doing things is also correct and is just a divergent evolution to MS Windows. Research, open-mindedness and experimentation are necessary when using something different.
I'm new to macOS. Thank you, I need these tips. The differences are really annoying, but I'm sure there are more hidden things that are useful. I suspect I am not alone.
Thanks, this is the first answer to this question that sort of makes sense from a Windows user point of view. I mean I still prefer the "Windows way", but this is at least viable reasoning for the "Mac way".
Depending on how you've configured your Dock, Minimize is pretty simple. Either the window just moves to the Dock, or it minimizes into the application icon (then you can right-click on the Dock icon to view a list of the windows that are minimized, or click to open the last minimized window)
Maximize on Mac was NOT designed for the window to fill the whole screen, but rather to resize the window to optimally display its contents. E.g. maximizing a Preview window with a PDF document changes the width of the window to the PDF page width. (Good tip for getting along with your Mac: stop trying to maximize everything.)
In a recent macOS version, Apple changed the green "maximize" button to full-screen, which is very different from maximize. Now double-clicking on most window chromes will execute the old maximize behavior.
> Maximize on Mac was NOT designed for the window to fill the whole screen, but rather to resize the window to optimally display its contents.
The problem with this approach is I definitely do not need someone else making the decision of what is "optimal" for me.
I've been using macOS for about as long as I used to use Windows now, and at this point, macOS seems to have largely abandoned the concept, which is great. Applications get either a full screen in their own isolated context, or option-click for a full-screen in a regular windowed context. The options now are a very windows-95/98 like: "either go completely full screen, or resize to whatever you like," which gives me full control of what I find optimal for any given application.
The difference is that Windows never really embraced universal drag and drop and the Mac did.
Macs used drag and drop for file management between windows representing separate locations on disk. Windows users tended to select files and choose cut or copy then navigate to the second location and paste.
The same held true for moving content between documents in an application or moving content between applications. Mac users preferred to use drag and drop, while Windows users relied on copy and paste.
The problem with keeping every window maximized is that you're giving up system wide drag and drop as the primary user interaction method.
That's a great distinction I hadn't thought of before, and it definitely makes sense - if you're focusing on drag-and-drop, you want as many windows visible somewhere on the screen as possible to maximize possible destinations.
Personally, I find drag-and-drop handy sometimes, but it's very constraining. You have to go through non-standard motions to complete any move that is more than trivial, always holding down the primary mouse button and thereby losing your primary way of interacting with the interface. In other words, sure, if you have a clear view of your destination, then yeah, drag and drop is fine, but in all other instances it becomes clunky.
Cut/paste is incredibly quick and doesn't sacrifice usability of your interface or input methods between the two ends of the transaction. Windows seemed to balance this out well, where you could drag and drop most of the time, but you could also ALWAYS cut/paste. I despise that I can't cut/paste in finder. Which is why I use PathFinder instead.
The danger cut/paste DOES pose is that it fundamentally unlinks the start of the transaction and the end. In between, you can do literally anything, which may mean losing track of what's in your paste. Still, I'd call this a fair trade-off, specifically because it is non-destructive for files. You won't lose a file by forgetting to paste; it just stays put.
> In between, you can do literally anything, which may mean losing track of what's in your paste.
On Windows Ditto, and on Linux CopyQ (among others, and there has to be something like that for Mac) solve this problem, by giving you a preview of what's in the clipboard as well as the history copies you made.
I've seen users cut files from one location and forget what they are doing before they manage to find the destination they intended to move those files to.
Then they are shocked later when they paste those files into some random location and can't figure out where they went.
Dragging and dropping does not have that issue. Users find it much easier to learn.
Open the source window. Open the destination window. Drag.
>The problem with this approach is I definitely do not need someone else making the decision of what is "optimal" for me.
And that's the crux. The whole feature (in its original incarnation) rested on the false assumption that there's a singular "optimal" state at any given time.
The Apple Human Interface Guidelines have been a state-of-the-art reference for good UI for a long time, but the part about the zoom button always baffled me, as it directly contradicted several Core Principles laid out in Part I of the book.
The reason this is done is of course that most applications don't have content that fills the entire screen, so maximizing, in most cases, is meaningless - and hinders the usability of the system.
It makes more sense to leave some space over for other apps than have a big empty area on both sides of the screen.
Interestingly, I noticed that it's only Windows-switchers who complain about this. People who've used Mac for a long time don't give this any thought.
> most applications don't have content that fills the entire screen, so maximizing, in most cases, is meaningless
It's a matter of where you place responsibility. It's like saying, "most websites aren't responsive, so naturally it makes sense to restrict the size of your browser window and leave space for other apps." But most would laugh at this and say it's the responsibility of the website/webapp to build a responsive layout. Why should we hold desktop applications to a different standard?
I completely agree that it is probably almost entirely Windows-switchers who complain about it. I'd, obviously, self-aggrandizingly suggest it's because we've tasted something better. People don't complain about the taste of food they've never tasted ;)
>It's like saying, "most websites aren't responsive, so naturally it makes sense to restrict the size of your browser window and leave space for other apps."
I think that's a misrepresentation. It's not a static size. It's not an artificial limit. If the document that's open has content to fill the entire screen, the window will fill the entire screen.
For minimize I can't say. But maximize works the way it does because (and this is according to the platform ideology, not some general truth) you are not supposed to maximize windows in the Windows sense of the word.
The macOS interface is based around floating and overlapping windows. If you put a window over the whole screen, then it might as well be maximized. This gets a bit hairy on smaller screens but really shines on huge monitors. In general macOS is more optimized around having one big screen rather than a multi-monitor setup.
I do not know how the interface looked back then (I was not even born), but I can imagine that at that time a lot in interface design had yet to be discovered.
But seriously, minimising windows is a reflex learnt from Windows. In Windows you often need to minimise one thing to find something else. Especially the desktop. On a Mac you can usually find something more quickly in the dock. In Windows you’re far more likely to have an app maximised by default, and minimising is a natural way to switch tasks. On a Mac, minimising is not a natural way of task switching.
It drove me nuts when they updated the maximize button behavior to full screen.
I use ShiftIt, a neat little Open Source tool that help me manage windows sizes and positions (including minimizing and maximizing): https://github.com/fikovnik/ShiftIt
I've recommended it to pretty much every Mac user I've met.
Haven't used ShiftIt, but I've been using Spectacle to accomplish the same thing. They seem pretty similar, overall. Works pretty well and I've had no issues.
I used Spectacle for years, then was forced to switch to ShiftIt at a new employer. ShiftIt's default hotkey combos didn't conflict with other apps the way Spectacle's did.
No idea... And a few versions ago Apple changed the maximize button to go full screen, which made it useless for the 95% of us who don't use one app at a time. Minimized windows also used to show up at the bottom of the screen when using "Expose", but then Apple changed the name to "Mission Control" and removed them. Just one more example of them slowly but surely driving its Mac customers away.
It depends on the definition of 'at the same time', but I think that many people in a work environment at the very least use an e-mail client and a productivity application at the same time. Also throw in a calendar for good measure.
I think that Apple thought that many regular users would switch to full screen apps on the Mac, combined with Launchpad (it's just like an iPad/iPhone). But virtually all non-tech-savvy Mac users that I know do not use Launchpad, nor fullscreen apps.
I think the problem with Launchpad as with Spotlight search [1] is that they are not very discoverable on the Mac. Having search in an application menu (like recent Windows versions and some Linux desktops) is far more discoverable.
I guess people don't use fullscreen apps because they equate desktops/laptops to the 'WIMP' interface paradigm.
[1] If I received a penny every time I see even experienced Mac users launch applications by clicking on a Dock icon or by navigating to the Applications folder in Finder, rather than using Spotlight, I would be rich.
Only after reading your comment did it occur to me that I should set up applications in the Launchpad the same way I have them organized on my iPhone (or like Win 3.1's "Program Groups" or Start Menu folders) and stop using the Dock to hold the applications I use most often. Maybe that would help keep track of which applications I have open, and where my minimized apps keep going off to!
You sure you are not confusing it with iOS? That's where you always have strictly only one app in fullscreen!
(Yes, yes. It is a lame joke. But who uses only one Window at a time? What is the point of that? Though I remember when I used OS X you could swipe left (or right) to switch back to all other apps. So I think that was good enough.)
That's a very un-Classic-Mac-like way of using windows.
In the olden days, Mac users would tend to have lots of overlapping windows. Dragging anywhere on the window edges would move the window, so it was easy to arrange them as you wanted - almost like shuffling bits of paper on your desktop (strangely enough) and if you wanted to move something out of the way, you could fold it up (window shade). There was no need for maximising or minimising and "zooming" just meant "make this as large as makes sense for this particular document", not "expand to take up all the space on my desktop"
As OSX/macOS has developed, all that document/desktop-style behaviour has been lost.
They're talking about a window sized to the maximum space as if you click and dragged its edges out as far as they can go or just used a tool like Divvy.
You're talking about the "fullscreen" feature, which I always found very weird. For example, you forget your video was "fullscreen"ed, try to alt-tab to it, and only then realize it takes a 4-finger swipe to pull it back up. Making the user have to differentiate will always be bizarre to me.
Speak for yourself. Spaces in macOS is one of the best window management features I've encountered. Of course, it's been common in Linux window managers, but since I use a Mac for work, it's a godsend that Spaces and multiple desktops are implemented. I'm a developer, and being able to organize my windows and split them (fullscreen isn't limited to one app per screen anymore) is so freaking useful that I feel lost whenever I try to use Windows again.
> Minimized windows also used to show up at the bottom of the screen when using "Expose", but then Apple changed the name to "Mission Control" and removed them.
You may not recall, but when Win95 came out there were lines blocks long waiting to buy it, much like when the iPhone came out.
I recently sold several shrink-wrapped original copies of Windows 95 on eBay (the OS came on 32 3.5” floppy disks) as original pieces of computer history.
Win 95 was monumental and great. Aside from Outlook and Excel, the greatest product MS ever made.
I'm predominately a Linux user and I switched away from Mac OSX specifically because of the underlying file system. I'll admit that it was annoying finding windows sometimes but the case-preserving, case-insensitive file system made no sense.
Case-preserving, case-insensitive is good for less sophisticated users: it prevents accidentally misplacing or duplicating files through capitalization, and through this constraint it simplifies media listing and sorting (no worrying about jpg vs JPG).
When a file refuses to rename because the OS thinks it is the same name. When you have multiple files that differ in case, but they overwrite each other because the OS thinks they are the same.
Really, the only place case-insensitive filenames makes sense is when you are searching. It makes no sense for any other reason.
Hmmm... You're right of course. I didn't describe that well. I meant Unix file names and way of working with files vs. Windows. In other words, forward slashes, symlinks, mounts, sane permissions, etc. I hate dealing with drive letters, etc.
What's a good name for this?
NTFS/Windows actually has all of that stuff you want, too. NTFS's permission system, for example, is extremely feature-rich and integrates nicely with the user system (ACL support by default rather than as an add-on, for example). The octal user-group-all permission model you're probably used to is pretty crude by comparison.
It's more likely that you're just unfamiliar with it than that it's actually missing anything.
But if you want, you can just pretend C:\ is equivalent to / and mount all your other drives at C:\mnt\whatever; that's completely doable (with a GUI to configure it if you want, even).
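For anyone curious, here's roughly what both of those look like from an elevated command prompt (a sketch only; "Alice", the file path and the volume GUID are placeholders you'd swap for values from your own machine):

    rem grant a user modify access to a file through the NTFS ACL
    icacls D:\share\report.txt /grant Alice:M

    rem list volumes with their GUID paths, then mount one onto an empty folder instead of a drive letter
    mountvol
    mkdir C:\mnt\data
    mountvol C:\mnt\data \\?\Volume{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}\

The same mount-point assignment is available in Disk Management under "Change Drive Letter and Paths...", which is the GUI route mentioned above.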
>Even though I've used a Mac daily for the past decade or so, I still miss the task bar, and window-oriented GUI of Windows. I still get frustrated on OSX when I minimize a window and have to hunt around for it. I wouldn't switch back because of the underlying crap that is the Windows OS and file system, but I still miss the interface.
OT but have you tried Witch[1] as a task switcher? It switches between windows, which made my life SO much easier
Wow, looks like a very polished Witch and then some. The addition of search looks interesting. Any area or use case where Witch is better? Is it as snappy? I like that Witch is instant.
I remember using Witch, but a very long time ago. I remember stopping using it, though the reason escapes me.
I've been using Contexts for what feels like a year, and I keep using it. It's that good.
Not sure why you’d have to hunt for minimized windows because there is a separate area on the dock reserved for them, and the animation clearly shows the window moving towards it.
> I still get frustrated on OSX when I minimize a window and have to hunt around for it.
OSX took the NextStep/OpenStep interface and dumbed it down to great detriment, then added new things back (spaces, zooming) which were logically inferior but required less 'thinking' about how one works and looked more 'shiny' to potential customers.
IMHO, hands down the best mouse-oriented window management paradigm to exist to date is the NextStep/OpenStep style, over and above Windows and OSX, though I will admit Windows has improved things with their sort of hybrid 'classic' Windows + 'Mac-ish' updates, and some of the newer UI things (e.g. window thumbnails) haven't made it into the current flagship of that lineage, which is the open source WindowMaker.
Since the 'official' lineages are dead, I'm hoping the WindowMaker people continue to innovate and move this paradigm forward, as they have been doing for the last N years...
it was by no means assured that Windows 95 would be the success that it was
I remember these times well. It was considered a huge break. People were whining about how stupid the Start menu was compared to just seeing your apps in front of you all the time :-D I love that we eventually came full circle to a Windows 3.1 Program Manager-esque approach with iOS nowadays!
To me Windows is bloatware. But I also make the OSX Dock as small as possible and auto-hidden. I launch everything through Spotlight, though, as I abhor unnecessary point-and-click (the equivalent of hitting the Windows key and typing a couple of letters of the application to launch).
I remember Windows 95 as a complete disaster of crashes, data loss, failing installations, incompatible applications, missing drivers and countless other problems, which were only fixed with the release of Windows 98 (maybe even SE), which was much much better. I have memories of people sticking with DOS and 3.x, only having 95 as a non-default boot option in case they wanted to watch the Buddy Holly video or launch the new Encarta CD-ROM.
Most of the above are talking about the UI which is arguably separate from the underlying OS. You can like a UI even if it's a UI to a system that crashes a lot :)
> still get frustrated on OSX when I minimize a window and have to hunt around for it.
Learn to use the Dock? The taskbar on Win 95 was evolutionary rather than revolutionary, and the Dock from NeXT was one of the influences. Which is the same Dock we have today in macOS.
You can still have personal preferences, of course. But if you have trouble using macOS to find minimized windows, that's because you haven't learned to use it, not because it's not possible.
"not understanding how folders could exist inside of other folders" -- My mom is 70 years old now, and I easily get frustrated whenever she's stuck with seemingly simple tasks with her computer. I usually scold her and yell at her, "This is so obvious, how come you don't know?" -- I always regret doing that afterwards.
After I'm calm, I ask her why, trying to understand it from her perspective. Every time I do this, I'm always surprised, because she gives valid points, and I end up cursing the developer :D
So, whenever I design UI/UX for an app, I ask my mom to test.
Rant: In my opinion, there should be an option in Mac/Windows to disable file drag and drop. Every time I check her computer, I always find misplaced files simply because she accidentally dragged them.
>So, whenever I design UI/UX for an app, I ask my mom to test.
I have an informal rule that I will try to get someone at my job that has never seen or used the application to be the one to test out new features or UI changes. Generally just asking when they have some time, handing them a phone or laptop, and asking them to do a task in the app (with a small amount of background about the task if needed).
There has never been a case where this hasn't monumentally improved the application. Questions like "what do I do here?", "how do I get it to start?", and "did it work?" were extremely common for quite a while before we managed to get the UI in a good state. You just don't see the implicit assumptions you make at so many places.
Sadly it's hard to "formalize" something like this (at least in my experience), because the benefits seem to be greatly reduced if the person testing has seen or used the application before, and I found it works best the "further away" someone is from software development.
This isn't the only thing we do for UI/UX stuff, it's just one that I've found can really help, and most developers should be able to do it in some fashion.
We have plenty of analytics and user testing, but they tend to miss the case of users that are unable or unwilling to learn the application, and end up perpetually confused.
I don't think it has anything to do with "humbleness".
It's that there isn't really a way to formally gather and have people that haven't really ever used the application before, but know enough about the problem domain to be useful in testing. Not to mention that it's a non-renewable resource, once I've done it with someone for a specific part of the app, they are "burned" (at least for a few months). (that last sentence came across really... shitty sounding, but I can't figure out how to reword it to get the point across without sounding like I'm treating people like old computer parts, so I'll just add this disclaimer...)
Any kind of "formal" process ends up just looking like QA, and they end up making the same kinds of assumptions that the developers and designers do since they work and know the application just as good if not better than them.
Once you've learned something, you can no longer remember what it's like to not know it. Not fully, anyway, and certainly not without deliberate effort. It takes quite a lot of mental effort to stop knowing something, and approach a task with the mindset of someone who's never known it.
Good QA people should be able to do that, but it must be hard even for them.
I used to get frustrated explaining what I thought were simple computer tasks to my mother, but as I've gotten slightly older I am getting just as stuck as she was with some basic computer tasks! I totally get it now.
She also reminds me that she used to work at Digital (DEC), on the cutting edge of tech, in the 80's, but left her job to raise me- so I'm the reason she's so far behind, technically!
There's a world of difference between VAX and today's modern operating systems. Twenty years in the future, I predict user interaction will largely be predicated on artificial intelligence. I can certainly see an older version of myself, frustrated at trying to figure out what sentences will trick the machine into doing what I want it to do rather than simply speaking to it.
OT but related to Drag&Drop: my IntelliJ at work is set up with the "drag files with ALT pressed" setting, mostly due to my "lazy" clicking, which kept trying to move classes across packages and disrupting my flow.
Would love to see that option in any file manager as well.
Also I hate with a passion the "drag selected text" feature
The two metaphors are not as tightly coupled as you suggest. Even pre-computing, a "file" meant a collection of information and a "filing system" was a way of organizing it - that didn't necessarily involve paper folders. In fact, card files, from which we ultimately derive our computer term, were more usually stored in boxes.
I too prefer the term "directory". Apart from all UNIX tools using this terminology, it also more accurately reflects what it actually is - not a physical container for information, but a list of references to it - an association of names to addresses. A paper file cannot be in more than one folder, but it can easily appear in more than one directory. On a filesystem that allows hard links, this is a more useful conceptualization.
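This even holds on Windows/NTFS; a quick sketch from a command prompt (both paths are made up, and hard links only work within a single volume):

    rem create a second directory entry for an existing file; no data is copied
    mklink /H D:\projects\notes\budget.txt D:\archive\budget.txt
    rem both names now point at the same file; deleting one leaves the other (and the data) intact

A paper file can sit in only one folder, but a directory, being just a table of name-to-file mappings, can happily list the same file twice.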
>Apart from all UNIX tools using this terminology, it also more accurately reflects what it actually is - not a physical container for information, but a list of references to it - an association of names to addresses.
Yes, but "directory" doesn't describe the way the UI appears to behave in a way that's more intuitive to the average non-technical end user than "folder." A "directory" could also describe a map or a phone book for most people.
>"directory" doesn't describe the way the UI appears to behave
That rather begs the question though, doesn't it? To a nontechnical user, the ship has sailed - everyone calls them "folders", the icons are ubiquitously pictures of file folders, and it would be terribly confusing to try and change it now. That doesn't mean it was the right choice though. I could imagine a world where the icons were little phone books, and the concept of a hard link didn't require a half-hour explanation.
I don't understand that; the folder was picked for the filing cabinet metaphor.
But in a filing cabinet system you have different cabinets, different drawers, different hanging files and various lower level arrangements - non-hanging folders (or drawer dividers), paper clips, staples.
Perhaps they never saw a filing cabinet? Or never put something in a box, then in another box?
When they put food in a cupboard do they empty it out of the current container first?
<<Your house is a drive, every room is a "folder", every container in every room is a "folder", every container inside another is a "folder", every object is a "file".>>
I've come across it, just assumed that no-one explained the metaphor to them.
The places where the metaphor gets stretched can be frustrating, especially in Windows. You can put Folders on the Desktop, but there's also a "Computer" icon on the desktop which allows you to browse into other folders.. including the desktop?
Then there are your Personal folders.. which are in fact collections of other folders themselves, but appear transparently to be single folders - even though somewhere on the file system they are actually separate folders. And without the benefit of breadcrumbs to help you keep your place in the structure, where did that file actually go?
Don't get me started on shoehorning OneDrive (or was it OneDrive for Business?) into the mix.. where'd I save that file again?
>You can put Folders on the Desktop, but there's also a "Computer" icon on the desktop which allows you to browse into other folders.. including the desktop?
That's a specific failing of Windows, not something inherent to the folder metaphor, though.
It's one of those absurd things that happen when nobody pays any attention to coherence. That particular thing has driven me BATS for years.
I'm encouraged, though, that my Win10 machine here doesn't seem to continue this foolishness. WinExp used to always show the Desktop as the "root" of everything, but it doesn't now. The quick access pane shows This PC as the root, with quick links below it for Desktop, Documents, etc. It's almost like MSFT is learning.
Well, yes. Windows' implementation of the metaphor is my complaint. In its current form, they may have cleaned up the "desktop" but also made it more confusing to tell what is a special folder/collection/Library. The way it appears now, there's a "Library" called Documents, which includes a folder called Documents but may include others as well. Where am I at a given moment, not sure.
"Folder" makes perfect sense to me. At my first job, we used a paper-based filing system, and we'd sometimes put folders inside of other folders. It's perfectly legit. You wouldn't nest them 5 levels deep, but that's simply a physical limitation: you wouldn't put 100 files in one folder, either.
"Directory" I don't understand at all. The only thing I can think of is the felt letter boards at the entrance to office buildings that say "DR SMITH - OFFICE 555". Not only are those a more obscure metaphor, but I've never seen them nested even one level deep. I don't have any idea how I'm supposed to use that metaphor. "Create directory": am I creating a new building? Is my building going to have two directories in the lobby? How is one 'inside' the other?
I was completely lost on Unix until it was pointed out to me that "directory" was just its weird name for folder.
I agree with the downvotes in so far as it is silly to still get upset about this (instead of silently admitting defeat), but I am 100% guilty of that folly myself.
It's highly relevant to the discussion because it illustrates just how deeply ingrained these basic UI expectations are. UI (re-)designers take note, in some cases even decades of forced retraining won't make everybody accept gratuitous change.
However, in the Windows shell a "folder" is not necessarily a "directory" - it's just something that may contain something. Control Panel, for example.
And indeed I remember that as one of the more confusing things when I got my first Win95 PC as a kid. I had used Amiga and some DOS before, and I knew what a directory was, but now all of a sudden there was something called a folder that seemed sort of similar?! To add to the confusion, the computer manuals written in my mother tongue used both the word 'folder' and its literal translation in parallel. It took me a while to understand that it all means the same thing.
I like that folder icons on most GUIs are still that light brown Manila colour, but plain brown folders went out of style with leg warmers and mullets.
OP didn't want to remove the functionality, just provide a way to disable it for users that often perform drag & drop operation randomly.
Older people especially have problems with their motor skills, so instead of clicking on a file they often perform a drag & drop (they can't hold the mouse steady enough, so the system registers movement, and at the same time their "click" is too slow, so the OS recognizes it as holding the button down).
Cut & Paste for when you really need to reorganize? But usually it would probably be enough to just list the most recent files somewhere and provide full text search for everything else.
Seeing how my elderly parents use the computer: you don't manage files. At all.
Right click is evil, keyboard shortcuts are evil, drag and drop never occurs to them. Mom usually just reads her emails (and attachments open in the browser nowadays), so it's not a problem, but father saves files.
If he finds something important then he saves it multiple times, so in the file list view there will be multiple similar items, that's how he knows it's important (visually it's very distinctive). When he has to copy a file somewhere else he usually fires up his CAD software, opens the file and then saves it elsewhere.
I just gave up "educating" them (after countless attempts). It's pointless. I remove file duplicates every half a year, copy stuff to a USB key when he asks me (I visit them once a month) and avoid the machines like the plague, unless they explicitly ask me to do something (I installed Ubuntu for mom, which works wonderfully for her, one less problem to worry about).
There's something different about cut & paste vs drag & drop, though it won't impact most people.
If you're using RDP and copying files from the host computer to the remote computer, it's natural to cut/copy the file on the host and paste it on the remote. Drag/Drop isn't as natural because you have to mess around with the RDP window size and scroll position to be able to see Explorer on both machines.
The file copy occurs very slowly, and requires a lot of memory. Copy/Paste uses the clipboard, so the entire file gets read into memory before it can be written to the remote filesystem. It goes slowly because there is apparently an un-optimized loop copying bytes.
Instead, on the remote system you should open two Explorer windows, one for the destination and one for your local filesystem, which RDP adds for you. Then you can drag & drop the file you want to copy. This skips the clipboard and also seems to use a much better optimized byte-copying loop. The speed difference is very noticeable.
Pro-tip: if you're copying lots of files, zip them first and copy the zipped file. That can reduce a multi-hour copy operation into a couple of minutes, even if the compression ratio is awful. Windows is really, really bad at multi-file copy operations.
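To make the zip step concrete, something like this is all it takes (a sketch; it assumes a Windows 10 build recent enough to bundle tar.exe, otherwise any zip tool works, and the folder name is a placeholder):

    rem pack the whole folder into one archive so the copy is a single large transfer (run from the parent folder)
    tar -a -c -f bundle.zip reports

    rem after dragging bundle.zip across the RDP session, unpack it on the remote side
    tar -x -f bundle.zip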
Yup, I ran into this issue enough as a kid that I stopped using cut/paste/undo in explorer. If I needed to copy or move files around, I’d use the command line—which also had the effect of making files copy much faster (seconds instead of minutes) for some reason.
> How do you manage files in windows at all without drag and drop? Do you just never reorganize anything?
Ctrl-X, Ctrl-V.
Are you a Mac user? Apple disabled Cut in Finder ages ago, basically for aesthetic reasons. It's a constant irritation with my Mac that I have to manually change the folder view just to move a file up one level; on Windows you can do it in less than a second with Ctrl-X, Backspace, Ctrl-V.
It's a UI design decision to avoid Cut in the filesystem. Cut would behave differently from Cut in every other app -- namely, the file is not deleted immediately after Cut. It seems better to avoid calling this "Cut" than to have it behave inconsistently from other apps.
> How do you manage files in windows at all without drag and drop?
Most people I've seen use Ctrl-X/Ctrl-V, or right-click-Cut/right-click-Paste. It has the advantage that you don't need both the source and destination visible at the same time.
You can actually drag a file from one window to the other without having both showing (such as between two maximized windows). While dragging a file, you can still manipulate windows around using the keyboard, like with alt-tab.
For example, you can begin dragging a file, then alt-tab to another window or tab or even open a new program or whatever using the keyboard, and then finally release the file into the destination window.
It's not obvious that this would work, but it is convenient if you find yourself wanting to drag something with the mouse for some reason.
I know it's not exactly the same but the developer tools on all modern browsers will display the current state of the DOM in the "inspect" tab. It's true that SPAs will have a lot more clutter in the form of JavaScript hooks and hidden elements but it's all there in glorious <html/>. :)
The fact that the basic elements of Windows 95's GUI have survived for so long shows, I think, how well designed it was.
For its time it was a great design that was intuitive to understand, relatively lightweight and did not get in my way. About the only changes I think improved things notably were the search field in the Start menu and Aero Snap.
The replacement of actual Search with this Cortana-based attempt at a "smart" search or Q&A interaction is absolutely agonizing to experience as a user and just a completely boggling decision to me in general. Why on earth did a search feature that will return something different from what you typed in, with the exact thing you typed in the secondary results, ever make it out of testing for public release? I have faith it'll improve over time, but right now Cortana in general feels like a step directly backwards in every category "she" touches. The Edge browser is actually pretty slick on the Surface, but instead of being able to right click selected text and search it, I have to "ask Cortana." I don't want to ask Cortana, I want to search that in Google.
I have not had much contact with Windows 10 so far. But a former coworker had a Windows phone, and there, Cortana was actually pretty good. We once had a "competition" where we asked Cortana, Siri and "OK Google" to "tell a dirty joke".
Cortana won hands down by actually telling a joke (it was about dirty laundry, but it was ambiguous enough that I think Cortana actually "got it").
OTOH, I feel really weird literally talking to a computer, and I don't think I will ever overcome that.
Search got the google bug, so that it will change the results in the xx milliseconds while you are pressing enter or clicking. This is the new boundary for all interfaces, and I know it's just being stupid - sometimes it really makes me angry. But snap works just fine, yeah
> Search got the google bug, so that it will change the results in the xx milliseconds while you are pressing enter or clicking.
I only use Windows at work, and my work computer is sufficiently slow that I have not had a problem with that. Maybe it is the version, too, my work laptop runs Windows 7 - and will hopefully continue to do so until Microsoft stops supporting it; as far as Windows goes, Microsoft really nailed it with Windows 7.
(I have to admit, though, I have only very little experience with Windows 10; I really disliked Windows 8/8.1 and the corresponding server versions because Microsoft butchered the UI.)
I think it's beautiful. And, compared to Windows 10, Windows 95 was at least somewhat consistent design wise.
After 12 years of macOS I recently got a Windows 10 machine. There's plenty of Windows 10 bling on top of the OS, but you don't have to dig deep before you encounter the embarrassing remnants from very early versions of Windows. Running the latest version of 10 it still feels very unfinished which I hope Microsoft intend to do something about.
I don't have too high hopes though, considering it was released two and a half years ago.
There are things from earlier Windows I wish they'd kept. Like the perfectly functional (actually more functional than their replacements) settings panels.
But I think the biggest thing I miss is the start menu being just a view of a folder hierarchy. The Windows 10 Start Menu is tied into the appstore and uses some kind of database that can easily get corrupted and cause it to stop working seemingly randomly. Sometimes performing the right voodoo magic can fix it, but usually it means an in-place upgrade (aka reinstall). Go on, look for "Windows 10 start button doesn't work" on google. Fun reads.
It's an example of how complexity meant to make things easier often just makes them rigid and unfixable when shit inevitably goes wrong. The original start menu was so stupidly simple that it was almost impossible for anything to go wrong in the first place, and if it did, it was easy to reason about because it was simple. I miss that kind of design in my software.
Except for three humongous improvements that the Windows 10 Start menu has over 95:
* It is flush with the bottom-left corner of the screen, making it a true "mile-high" button. In 95, there was a tiny non-clickable border around it, so if you just slammed your mouse into the corner, it would not work; someone who worked on Mac OS described MSFT as "narrowly snatching defeat from the jaws of victory" compared to the Mac's menu bar, which was perfectly flush with the top of the screen and thus a "mile high bar". I think this was actually fixed in XP?
* After clicking it, you can immediately type the name of the program you want to run, instead of having to click the "Run" entry. I switched off of Gnome onto Windows after learning that this aspect of my workflow wouldn't need to change. I think this was actually done with Vista?
* Subfolders are opened by clicking instead of hovering. Nothing makes me more frustrated than accidentally mousing over the wrong part of the menu and closing a sub-sub-folder. [hyperbole]Whoever invented mouse-over menus should be shot.[/hyperbole]
XP has a hack where if you click the bottom row of pixels on the screen, your mouse cursor is magically moved up enough pixels so you hit the buttons instead of the dead area.
> immediately type the name of the program you want to run, instead of having to click the "Run" entry
Why click anything at all if you're going to be typing anyway? Pressing Win + R has worked since the beginning, taking you directly to the "Run" dialog.
> mousing over the wrong part of the menu and closing a sub-sub-folder
The classic start menu (since Windows 98, or Windows 95 with IE 4) allows you to easily rearrange the entries by drag and drop. If you organize it such that it doesn't have any sub-sub-folders, it works much better. :)
On the other hand, if you just let things stay where the installers put them, it's pretty terrible. Each application takes something like 5 clicks: Start -> Applications -> SomeVendor -> SomeApplication -> SomeApplication. I guess this is the way most people had it, so that's why Microsoft gave up on structure and focused on search.
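The nice thing is that the structure is just a folder of shortcuts, so you can also impose your own layout with ordinary file operations (a rough sketch for current Windows; in the Win9x days the same thing lived at C:\Windows\Start Menu, and "SomeVendor"/"SomeApplication" are the placeholders from the example above):

    rem per-user entries that feed the Programs list (all-users entries live under %ProgramData% instead)
    dir "%AppData%\Microsoft\Windows\Start Menu\Programs"

    rem flatten a vendor folder by pulling the shortcut up one level
    move "%AppData%\Microsoft\Windows\Start Menu\Programs\SomeVendor\SomeApplication.lnk" "%AppData%\Microsoft\Windows\Start Menu\Programs"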
> Why click anything at all if you're going to be typing anyway? Pressing Win + R has worked since the beginning, taking you directly to the "Run" dialog.
Even fewer keystrokes, and less typing with Win 10: hit Win key, start typing Word (or settings, or mouse, or whatever), hit enter. The Start menu pops up, and is searched/filtered as you type. Compare this with the previous option (which still continues to work as well): Win key + R, type msword, hit enter. Not much in it with this example, but it's a neat way to access Start menu items rapidly.
For reasons I haven't yet explored, it occasionally fails to find an item, which is more puzzling than annoying. An item can be right there, and the keystroke search fails to find it. I miss the simplicity others have mentioned of the 95 Start menu simply being nested folders.
I found that Windows 10 search is not quite as good as 8.1. I know the problem you're talking about, and it never seemed to happen to me on 8.1, only 10. They jacked the program search indexing somewhat in 10, it was perfect in 8.1.
> Running the latest version of 10 it still feels very unfinished which I hope Microsoft intend to do something about.
Might depend on what particular parts of the OS you come across. I use Win10 on my work notebook [1] and besides the long startup times [2], I haven't noticed anything "unfinished". In fact, Windows very subjectively feels more polished than macOS which I used before (until December).
Then again, I don't use much of it: I only run Firefox, Slack, Outlook, and VirtualBox where the actual work is going on in a Linux VM.
[1] Probably not the "latest" version though. I get to use whatever our corporate update server hands out.
[2] Compared to my private Linux machines. macOS also took several minutes from turning on the machine to everything having fully settled in.
Pretty sure he is referring to the configuration structure. You get these fancy 8.1 sidebar configuration screens that may solve 80% of an average user's tasks, but once you start digging you'll find configuration windows in the style of Win 7, and if you dig even harder you'll even find stuff that looks straight out of 98.
I am a Linux user myself; configuration inconsistency gives me nightmares.
> You get these fancy 8.1 sidebar configuration screens that may solve 80% of an average user's tasks, but once you start digging you'll find configuration windows in the style of Win 7, and if you dig even harder you'll even find stuff that looks straight out of 98.
Yes. And another really mind-boggling example is the Control Panel. There's actually two of them (or more, depending on how you see things). We have both the old Control Panel and the new Settings user interfaces for configuring various settings in the operating system. Some configuration options are in Control Panel and Settings, and some options are only available in one of the interfaces. That's a UX fk up if you ask me.
Imagine that situation in macOS, there being two System Preferences apps with completely different looks. They would both have some commonalities, but many options would only show up in one of the apps. Would. Not. Happen.
To be honest, my impression of Mac is not too different. You have GUI-only and terminal-only config settings, and some that mix both concepts in a painfully unobvious way (like Xcode license agreements).
However, it is still a lot cleaner and more consistent than Win 10. Agreed.
Honestly I much prefer the old control panel and use it whenever possible on Windows 10. There is very little you actually need the new Settings dialog for. But I completely understand why it exists, the control panel is way too complex for the average user.
Some comparisons to the contemporary version of OS/2 (2.1 was the current version when Win95 was announced, and 3.0 was released immediately before Win95):
* Applications minimized to a special folder, which was located on the desktop.
* No start button or task bar (they were added in OS/2 4.0).
* Shredder on the desktop (did not offer restoring files like the Mac Trash or Windows Recycle Bin).
* Hierarchical folders on the desktop that could contain either shortcuts or files.
* Shortcuts couldn't get "broken" as long as you did all of your file management through the Workplace Shell.
* Folders and file types could be subclassed in various ways to change their behavior and appearance. Simple changes didn't require programming.
* You could mark a folder as a project, and all the programs and files associated with the folder would open/close/hide along with the folder.
At the time, I felt that the Workplace Shell was immensely superior to the Windows 95 desktop. But it probably was quite a bit less friendly to new users.
Windows 95 and its immediate successors had a lot of problems regarding stability, memory usage, performance in certain cases, etc. That and the comparison with the NT OS line is a separate discussion however. What is important about Windows 95 was the design of the GUI. At the time, it was a huge leap forward in desktop computing.
Even though I jumped on the Mac OS X bandwagon from the very first moment in 2001 and was happy to leave the Windows world behind, the fact remains that for a few years time in the mid 90s, Microsoft showed a strong ability to design GUIs that were easy to use, relatively consistent, and flexible enough to suit a large array of first and third-party application designs. It's a shame that, IMHO, Windows XP took things in a highly negative direction after that, and Microsoft never fully recovered. With the possible exception of Windows 7, every OS release since XP has been a mishmash of competing ideas and confusing discrepancies, and macOS has continually outpaced Windows in usability.
I still hold out hope that there's a solid future for Windows when it comes to UX/UI design, if only because I want macOS to have real competition on that front.
Microsoft in the 1990s had a strong penchant for striking the right balance between power users and the average user. Their products remained full-featured, but most of the things you didn't use day-to-day were quietly tucked away in areas only power users would care about. There wasn't any real notion of compromising usability for simplicity. If you wanted to customise finer aspects of Windows 95, all it would take was a registry entry. If you wanted to use an advanced feature in Office, all it would take was a deep trip inside the menu bar. If you didn't want to do either of these things, they stayed out of your way.
Apple later pioneered the notion of making products so radically simplistic, they sacrificed functionality on the pretense that power users wouldn't be bothered by the absence of features. There was almost no customisation. You got what you bought and liked it. Finder, the worst file manager I've ever used as a developer, endlessly gets in your way in the name of not letting grandma mess up her system.
Microsoft later badly copied Apple, and the result is the horrid and unusable "Metro" style applications.
The Task Bar and Start Menu metaphors have held up well. At the time on the Mac, it was harder to see an overview of running Applications and windows at a glance.
True. I didn't have much experience with the Mac back in the Mac OS classic era, but that was one thing I found very annoying coming from the Windows side. While it took a while to get the Dock right in OS X, it was a big improvement over the previous Mac UI.
I disagree that OSX completely beats the Windows GUI in all areas. I find Windows' window manager, Explorer and taskbar more usable and capable than the Mac equivalents. Not to mention everything is more configurable; after many years using OSX I still get confused by the unchangeable, low-contrast, light-grey-on-light-grey window theme where you can't tell which window has focus. I also miss on OSX how, in Windows, you can tab over to everything without having to use your mouse.
Ah, remembering Win95 makes me yearn for a simple, clean GUI again. All of the major operating systems have been in a downward slide in terms of UI/UX since the early 2000s.
As far as UI/UX is concerned:
Windows peaked with Windows 2000.
MacOS with OS9. (Why didn't they just throw the classic GUI on top of Darwin?)
At least with *nix you have choices and can go with one of the several variants of Gnome 2 (Xfce, et al).
Win7 was still internally consistent (as opposed to the newer tablet / mobile / PC / washing machine UIs of later versions) and provided few additional UI enhancements over 2k.
The most internally consistent design was that of NT 3.1 - it was a true classic in many respects. As far as general usefulness, performance, and versatility nothing can compare with Windows 10 (except Linux, of course).
I am not arguing about NT 3.1, but what inconsistencies were there in Win7 UI?
Starting with Win8 you have basically a random choice if some setting is set in "classic style" or new fancy "settings" dialogs. You can have all the performance in the world if your users spend most of the time just looking for the right place to do something.
Windows 7 has basically the same problem with settings. You have the dumbed-down, XP-style control panel that doesn't expose all the settings, and the hidden classic versions of the settings.
For example, the useless user settings in the control panel and the more useful old version in "control userpasswords2".
Oh, OK - I entirely forgot about this :). But it's still just an easily modifiable default for beginners that a power user doesn't have to use (or indeed remember, in my case :)).
In Windows 10 you have two different kinds of UI and settings that can be changed in one but can't be changed in the other and vice versa.
Win7 is nice too, but I admit I didn't use it much (as I was a Linux user almost exclusively when 7 was released). I got Windows 10 on a new laptop and gave it an honest try (along with the new Linux subsystem), but the little annoyances and inconsistencies piled up, and back to Linux I went.
I spent a lot of time with Windows 2000 back in the day.
Windows 2000 was the first OS I bought retail -- my new PC came with Windows ME and it was so terrible I ran out and actually paid for an OS ;)
Windows 7 adds some nice things like search and the Explorer Shift+right-click menu, but it also hides some settings panels that haven't changed since 2k behind more clicks.
Nitpick: Xfce most definitely is not a variant of any Gnome. It's a lightweight (and in my personal opinion far superior) alternative to the heavyweights Gnome and KDE, which just happens to share the underlying Gtk toolkit with Gnome.
It's impressive how long the "Desktop" paradigm of Windows 95 has stuck around, particularly if you're in the Linux Desktop world. Most of the popular desktop environments--Xfce, Mate, and Cinnamon come to mind--still follow that pattern. The last major one to go in a different direction was GNOME 3, and the backlash against it was so fierce that several other major DE's forked an earlier version in order to keep consistency.
It's difficult to explain succinctly. I will say that it's unique, insofar as it's not trying to copy past paradigms or macOS. But it can be a bit confusing. Minimize/Maximize buttons, for instance, aren't even included by default.
In the early UI we can see a Wastebasket, which ended up being changed to the Recycle Bin.
As a user I would say that Recycle Bin is a misleading name because it has nothing to do with recycling a file / folder. However, it sounds more positive than the weird Wastebasket.
Meanwhile classic Mac OS already had a Trash. Simple, clear and short.
I wonder why Windows could not simply name it Trash? Could it be that they tried to stay away from copying as much as possible?
> As a user I would say that Recycle Bin is a misleading name because it has nothing to do with recycling a file / folder.
On the contrary, I'd say recycling is a much more fitting name for what happens when a file is deleted: The existing disk space is reclaimed to be reused for something else.
That was part of Apple’s GUI lawsuit in 1994 against Microsoft. Apple lost all parts of the suit except for the exclusive rights to the trash icon. Microsoft was forced to replace it with the Recycle Bin.
I have to think that the common anti-pattern of naive users storing files they mean to keep in the Recycle Bin wouldn't happen so often if it were still called Trash.
This is a common user behavior on Macs as well. It's an easy place to put things and find them later (as long as nobody helpfully cleans the bin in the interim)
I doubt that Microsoft has really departed from the UI design methodology they used. The dubious decisions are most likely pushed from either the marketing department or from above. But I think Windows itself, even the newer versions, is quite well designed.
Apple probably also has a pretty solid design process.
Have we actually reached the point where we idolize something that was equally mainstream to bash when it came out? Remember "Winblows" &c?
Suddenly, faced with hyper-spy mega-corps, the dumb simplicity of the evil-yet-cute Windows 95 is desirable. Like the lesser of two evils, or the evil you know.
Any day now, a post will come up extolling the illumined joys of mainframe COBOL programming.
There were many, many things wrong with Windows 95 (and its successors). But the design of the Shell was solid.
(Also, considering how hardware requirements have skyrocketed, it might seem remarkable to some that it could run on a computer with 8 MiB of RAM and a CPU that makes today's low-end mobile phones look like supercomputers and still feel snappy.)
I know people used to joke about using the "Start" button to shut down the computer, but I never understood what is so funny about that.
I get that, I just never thought it was very funny. Start might not have been the best name for that button, but the idea of having a single menu to access all OS functionality and applications was good. Still is good, in fact, which is why so many people got upset when they removed it in Windows 8.
I'm no luddite but several areas of computing have had serious regressions. I still have Windows 95 machines kicking around and I know how fickle they are, but you will never hear me saying a bad word about the UI.
There was even a parody song about how much Windows 95 sucked, but the criticism seemed more popularly aimed at its resource requirements than the UI design.
The requirements were practically negligible for its era, and for the leap it represented. I have installed and used it at length on a 386 with 8MB and a 210MB disk. It wasn't pretty, but it wasn't pretty bad either. Perhaps (on appropriate hardware) it wasn't as solid as NT 4, but before XP (which is 2000, which is NT, even if that's simplifying it) there weren't many "polished enough" _and_ "affordable enough" windowing systems for the masses. Classic Mac OS was very polished but not very affordable, and it didn't even have pre-emptive multitasking. UNIX had either a high cost of entry if you're talking workstations (Suns and SGIs were polished but expensive) or man-hours to acclimatize (try running X11 in 1995, then compare to Windows 95).
Eh.
In any case, to bring it back. It was good enough. Nowadays, I often hope for a minimal Windows 10 that will be out of the way enough to approach Windows 95.
> The requirements were practically negligible for its era
Your 8 MB was the "recommended" requirement for 95, with 4 MB being the minimum. That wasn't pretty. If you ever used 95 with only four megabytes of RAM and a small, slow hard drive, you'll know that the minimum requirements were quite a bit lower than comfortable, probably in order to more closely match what was actually a typical home computer at the time: some 386/486 with 4 MB RAM and a small hard drive. Where I'm from, Windows 95 pretty much meant getting a new PC for the average consumer.
So you go home with your newly bought copy of Windows 95 to the 386 with 4 MB that you bought 1-2 years ago, perform the minimum installation onto your 100 MB hard drive, and find out that it's super slow and constantly hitting virtual memory, making the loud disk sound like a Geiger counter throughout the session. You compare it to Windows 3.11, DOS, whatever you had before, and have a pretty solid basis for complaining about its resource usage.
Or worse, you read about Windows 95 and decide to finally sell your increasingly irrelevant Amiga 1200/3000/whatever now that you can also have preemptive multitasking on a PC, buy a cheap PC matching the Win 95 requirements with the money, and install. Only to learn that it's 100x slower than Workbench, BSODs nearly as often as the Amiga threw Guru Meditations, and uses megabytes of RAM instead of kilobytes.
Or you have a Macintosh, couldn't care less about how exactly multitasking is achieved, and wow, PC seems like a nice option now that that too has a nice, user friendly GUI. And it works on cheap, affordable hardware! So you buy the cheapest, most affordable hardware that'll support Windows 95...
> Our testing data told us that the main problem was windows not being visible at all times, so users couldn’t see what they had open or access tasks quickly. This realization led us fairly quickly to the task bar design
Hmm, I wonder if that should be reworded to “and then we saw RISC OS and it had a task bar design that we really liked”. I can’t believe that they wouldn’t have known about it.
Yup - I remember that Windows PCs at work became far easier to use after Windows 95 (actually 98 I think) reached them as I was using an Archimedes at home.
From my reading, twelve designers and twelve programmers without a lot of overlap, and a significant investment in iterative usability studies. Interestingly, they dropped the design documentation as too much trouble to update versus letting the design live in the latest implementation.
Amazing to see all of this design work happening starting in 1992.
I always thought of iterative design and development as becoming popular starting around 2001, and usability studies only becoming popular around that time too.
> the design documented in the spec was suddenly out of date. The team faced a major decision: spend weeks changing the spec to reflect the new ideas and lose valuable time for iterating or stop updating the spec and let the prototypes and code serve as a “living” spec.
> After some debate, the team decided to take the latter approach. While this change made it somewhat more difficult for outside groups to keep track of what we were doing, it allowed us to iterate at top speed. The change also had an unexpected effect: it brought the whole team closer together because much of the spec existed in conversations and on white boards in people’s offices. Many “hallway” conversations ensued and continued for the duration of the project.
As the Win95 team formed in 1992, I’d argue the Amiga OS was an excellent alternative at the time, albeit one oft forgotten due to its subsequent market share.
Extremely easy to use, extend and navigate; at the time its only missing piece was a built-in "file explorer", but there were plenty of 3rd-party options by then, too (Directory Opus being my personal favorite).
I still miss the dual-window "Explorer" from Win 3.1 every time I move or copy files. The old File Manager had this by default. Installing Midnight Commander etc. on client computers is not possible, so I have to open two Explorer windows and get them to a convenient size.
I've often thought Windows 95 borrowed ideas from Nextstep.
Looking at this retrospective, I can see how it could be possible, especially if they started design in 1992. NeXT had been winning praise for their UI for years by that point, and Microsoft were consulting with Susan Kare (an Apple alum and NeXT employee).
If you compare Nextstep 3.3 and Windows 95 or NT, you can see startling similarities in title bar size and format (to the pixel), window borders, 'rectangularity', tabbed elements and more. "Great artists steal" and all that..
Soon after the Windows 95 release I heard that one of the members of the Windows 95 team had a copy of NeXTstep on a computer they used, and that it was a source of ‘inspiration’. I can’t find any reference to this now, but the 3D look and grey shades definitely are similar. Windows 95 looks much more like NeXTstep than Windows 3.11.
> Beginning users and many intermediates relied almost exclusively on visible cues for finding commands. They relied on (and found intuitive) menu bars and tool bars, but did not use pop-up (or “context”) menus, even after training.
This is a lesson the Android team re-discovered decades later, resulting in Android dropping the "Menu" button. Apple still hasn't gotten the memo yet (3D Touch). The biggest usability negatives with context-menus are poor discoverability and inconsistency in different contexts.
This is a perfect use case for using microformats to make the document more semantic, especially since archival for future reference is the author's explicit goal.
An easy win would be to use h-entry categories[0] for the tags in the document.
If you're writing stuff on the web, please consider adding some microformats to make your stuff more accessible and archivable.
It was great to read about the early iterations that didn't pan out, and it reminds me of how far the average person has come in terms of computer knowledge (even though I know several people that _still_ have trouble double-clicking).
I would love to see a similar Windows 8 document, because I have no idea how they convinced themselves that non-tablet users would like it. Things like charms and even the freaking start button were hard to discover.
Windows 95 (and later) always seemed to just miss the mark: they added new things but were just so close to having something that would work much better. It was as if they saw useful features to add from other systems but only copied what they could see without really understanding the behaviors that needed to be copied too.
Examples...
The original Start menu was in a corner of the screen but without the crucial zero pixels of separation from the physical corner, turning what could have been a massive target into a tiny one. (Fixed in XP though.)
Menus were sluggish as hell to open, and they lacked hysteresis to make diagonal traversal a lot easier (a rough sketch of that idea follows after these examples). There was also no consideration for how to handle a task that might take a while, such as locating the names and icons of dozens of items; the menu wouldn't show anything, you'd just wait. Sadly, they were experts at efficiently making menus go away, so one accidental mouse movement and you'd start all over.
The ordering of frame buttons, “minimize, maximize, close”, on Windows does not clearly separate the most-dangerous action from the least-dangerous action, nor are the actions ordered by similarity. Instead, a very common action on Windows (“Maximize”) is right next to the most dangerous and polar opposite action (“Blow This Away Forever”), with zero pixels of separation. On the Mac, the order is “close, minimize, maximize”: if you mis-hit Minimize on a Mac while moving toward Close, the window will still go away (more or less what you wanted) instead of becoming gigantic and still visible (polar opposite). Also, on a Mac there is significant pixel space between the distinct options so it is harder to mis-click.
The ordering of dialog buttons, such as "OK, Cancel", was inconsistent, so muscle memory could never take over: in some dialogs "OK" sat where Cancel would be in others. Windows also tended to use very generic button names (Yes, No, Cancel), probably because the message-box APIs only offered those fixed options (see the second sketch after these examples); that forced you to read every word of a long-winded message to understand the choices, rather than just clicking something obvious like "Save".
Windows 95+ tend to add hierarchy in lots of places that don’t benefit at all from hierarchy. I hate having to remember some obscure vendor’s name so I can find “Unnecessary Company Name, Inc. >> Unnecessary Product Suite Name >> App Name” in a menu for example, when “App Name” in a flat list is the only sensible option. (Fortunately, Search was a reasonable way to avoid this. Until it became slow and couldn’t actually find things that clearly exist.)
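On the menu-hysteresis point above, here is a minimal sketch of one common approach: keep the submenu open while the pointer stays inside the triangle between where it started and the submenu's near edge. This is purely illustrative and not how Windows (or any particular toolkit) actually implements it; the function name and the assumption that the submenu opens to the right are mine.

```c
#include <windows.h>
#include <stdbool.h>

/* Illustrative sketch only (not the actual Windows menu code): keep a
 * submenu open while the pointer moves inside the triangle spanned by
 * where it started and the near (left) edge of the submenu, assuming the
 * submenu opens to the right of its parent item. */

static long cross(POINT a, POINT b, POINT c)
{
    /* z-component of (b - a) x (c - a) */
    return (long)(b.x - a.x) * (c.y - a.y) - (long)(b.y - a.y) * (c.x - a.x);
}

bool keep_submenu_open(POINT start, POINT cur, RECT submenu)
{
    POINT top    = { submenu.left, submenu.top };
    POINT bottom = { submenu.left, submenu.bottom };

    long d1 = cross(start, top, cur);
    long d2 = cross(top, bottom, cur);
    long d3 = cross(bottom, start, cur);

    bool has_neg = d1 < 0 || d2 < 0 || d3 < 0;
    bool has_pos = d1 > 0 || d2 > 0 || d3 > 0;

    /* Inside (or on the edge of) the triangle: the user is plausibly
     * heading for the submenu, so don't dismiss it yet. */
    return !(has_neg && has_pos);
}
```

A real implementation would also reset the starting point after a short timeout, so the submenu still closes if the pointer simply parks somewhere else.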
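And on the generic button labels: the classic message-box API really does offer only fixed button sets, which is likely why so many dialogs said nothing more specific than Yes/No/Cancel. A small sketch (the message strings are invented):

```c
#include <windows.h>

int main(void)
{
    /* The classic message-box API only offers fixed button sets such as
     * MB_YESNOCANCEL, so the labels are always the generic Yes / No /
     * Cancel; the prose has to explain what "Yes" will actually do.
     * (The strings here are invented for illustration.) */
    int choice = MessageBoxW(NULL,
                             L"The document has unsaved changes.\n"
                             L"Do you want to save them before closing?",
                             L"My Editor",
                             MB_YESNOCANCEL | MB_ICONWARNING);

    switch (choice) {
    case IDYES:    /* save, then close */     break;
    case IDNO:     /* close without saving */ break;
    case IDCANCEL: /* abort the close */      break;
    }
    return 0;
}
```

If I recall correctly, the newer task-dialog APIs (Vista and later) finally allow verb-labelled buttons like "Save" and "Don't Save", which is exactly the fix being asked for here.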
Sorry to wish against you but I hope Explorer never gets them, not even as an option, because I think they are a garbage idea for browsing files or anything else since they're too busy and constantly make you change your focus.
They're bad for browsing, I agree. I do a lot of file management and reorganization across network drives, and it would benefit me by immediately exposing one or two directory levels above a file location for drag and drop simplicity. Multiple windows or tiling tabs in Windows explorer could accomplish this as well, of course.
I have a feeling that Windows 95's window minimisation was at heart a workaround for the fact that the architecture couldn't support workspaces and therefore they had to find a way to 'hack' Windows to hide windows at will.
For the user experience level they were designing towards, workspaces would have been a recipe for "I broke my computer, all my work is gone" support calls on a constant basis.
For users who can't grasp that minimized windows become icons on their desktop, or that directories can themselves contain directories, having everything switch to a new workspace looks identical to either "I just broke it" or "It just deleted/erased/destroyed everything I was working on" (which of the two depends on whether they blame themselves or the machine).
Sadly, the result is that we advanced users, who do understand workspaces and can make productive use of them, were left without them until Windows 10.
Even better, on Windows 3.1 a minimized program could draw to its "icon" (it's actually a tiny window). For instance, a minimized clock could still show the current time. That was one of the things lost with the Windows 95 UI changes.
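For the curious, roughly how that worked: if the window class was registered without an icon, the system kept sending WM_PAINT while the window was minimized, so the app could draw live content into the tiny icon-sized window. A Win32-flavored sketch of the idea, from memory and with illustrative names, not code from any real Win 3.1 clock:

```c
#include <windows.h>

/* Sketch of the old Windows 3.x trick: with the window class registered
 * with hIcon = NULL, the minimized "icon" is really just a tiny window,
 * and the app still gets WM_PAINT, so it can render live content
 * (here: the current time) instead of a static icon. */
LRESULT CALLBACK ClockWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg) {
    case WM_PAINT: {
        PAINTSTRUCT ps;
        HDC hdc = BeginPaint(hwnd, &ps);
        if (IsIconic(hwnd)) {
            /* Minimized: paint the current time into the icon-sized window. */
            SYSTEMTIME st;
            char buf[16];
            GetLocalTime(&st);
            wsprintfA(buf, "%02d:%02d", st.wHour, st.wMinute);
            TextOutA(hdc, 2, 2, buf, lstrlenA(buf));
        } else {
            TextOutA(hdc, 10, 10, "Running...", 10);
        }
        EndPaint(hwnd, &ps);
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```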
And don't forget MDI, best shown by the Windows 3.1 Program Manager: a MDI program would have several inner windows within it, and each of these inner windows could be resized, moved, minimized or maximized, all within the boundaries of the main program window.
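The machinery behind that, for anyone who never wrote one: the frame window hosts the built-in "MDICLIENT" child, and each document ("inner window") is created inside it with WM_MDICREATE. A rough sketch; names like FrameWndProc and "MdiChildClass" are placeholders, and the child class is assumed to be registered elsewhere with a procedure that defers to DefMDIChildProc:

```c
#include <windows.h>

/* Sketch of the MDI pattern: frame window -> "MDICLIENT" child -> MDI
 * children that can be moved/minimized/maximized within the frame. */
LRESULT CALLBACK FrameWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    static HWND hwndClient;

    switch (msg) {
    case WM_CREATE: {
        HINSTANCE hInst = ((LPCREATESTRUCT)lParam)->hInstance;

        CLIENTCREATESTRUCT ccs = {0};
        ccs.idFirstChild = 100;          /* IDs assigned to MDI children */
        hwndClient = CreateWindowA("MDICLIENT", NULL,
                                   WS_CHILD | WS_CLIPCHILDREN | WS_VISIBLE,
                                   0, 0, 0, 0,   /* resized on WM_SIZE in a real app */
                                   hwnd, (HMENU)1, hInst, &ccs);

        /* Each "inner window" is created with WM_MDICREATE. */
        MDICREATESTRUCTA mcs = {0};
        mcs.szClass = "MdiChildClass";   /* placeholder; registered elsewhere */
        mcs.szTitle = "Document 1";
        mcs.hOwner  = hInst;
        mcs.x = mcs.y = mcs.cx = mcs.cy = CW_USEDEFAULT;
        SendMessageA(hwndClient, WM_MDICREATE, 0, (LPARAM)&mcs);
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    /* Unhandled messages go to DefFrameProc (not DefWindowProc) so the
     * MDI client can manage its children. */
    return DefFrameProcA(hwnd, hwndClient, msg, wParam, lParam);
}
```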
I remember someone who had an all-in-one Compaq PC with a version of Win 3.1 that had a shell with tabs on the top, not too unlike the "ribbon" of modern MS Office.
Some SVGA cards provided Windows 3.1 drivers that added a "virtual screen": you could use a desktop larger than the actual video resolution, and scrolling around it was just a matter of moving the mouse to the edge of the visible screen.
X11 also allowed this.
I think of it as a poor man's alternative to workspaces.
Ah yes, the introduction of the Start menu. A widget so entrenched that even 20 years later, when Microsoft dared to change it even a bit, people absolutely lost their minds.
That tabbed beginners UI gets me thinking of the netbook UIs that popped up for a short while before Intel and Microsoft buried the segment in red tape.
Except, apart from maybe the "recycle bin", it was nothing like Mac OS. The central element of the Windows 95 UI was the taskbar and Start menu, which have no counterparts in Mac OS. The central element of the Mac OS UI was the shared menu bar, which has no counterpart in Win95.
The "Chicago Reviewers Guide" PDF: http://tech-insider.org/windows/research/acrobat/940601.pdf