Designing Windows 95's User Interface (2018) (socket3.wordpress.com)
274 points by awesomekling on Nov 22, 2019 | 180 comments



I remember booting Windows 95 for the first time when it came out and being completely gobsmacked at how good it looked and felt.

I was still a heavy Amiga fan back then, even though I was painfully aware that my favorite computer of all time was slowly falling behind. But it could still do preemptive multitasking, something that was still widely unavailable across OSes at the time (except for Windows NT). AmigaOS was definitely better than Windows 3, of that I was very convinced (and quite distraught that despite this technical superiority, Windows 3 reigned supreme).

All of my convictions got shattered the fateful day I booted Windows 95. The UI was beautiful, and preemptive multitasking worked fine despite the memory and CPU constraints of the time. I just couldn't get enough of launching various apps on Windows 95 just to see how they looked.

On that very day, I thought "This is it, Amiga is truly dead".

I sold my Amiga and bought a Windows box in the weeks that followed, with a heart that was both heavy and excited.


My dad was a huge Amiga fan. I always felt that the Amiga was a bit ahead of its time. It was doing SNES graphics and sound in the NES era.

It's a shame it never caught on beyond the Video Toaster being used in TV/movie studios.


You're right, but to be a nitpicker, a more appropriate comparison would be the Sega Mega Drive/Genesis: it had very similar hardware and capabilities to the original Amiga (1000/500/early 2000s).


Then I burned my first CD on Windows 95 and ended up with a coaster because I moved the mouse or Windows played a sound, and my "obsolete" Amiga ended up sticking around for another half decade... The Amiga simply crashed far less and could do pre-emptive multitasking properly. A screen saver or music app didn't ruin CD burning. I finally migrated to Windows with Windows 2000, as Windows was consistently worse until then -- at everything.

The Win 95 box, bought at insane cost as I specced top-end everything including SCSI and 10k drives, became my first FreeBSD server - v2.1 IIRC - within months of (finally) buying into Windows. I kept the subscription to the BSD CD releases, to support them, until they shuttered it...

Granted, I was on 040 A3000 and A4000 machines with 24-bit retargetable graphics by the time of Windows 95, and was no longer earning my living off the Amiga by then.


An Amiga / Windows flame war in 2019, yay!


I'm fascinated by how many of the things they identified as helpful to new users were later reverted. Looking at my Windows 10 desktop, all of the work they did to find a solution where all users identified a program as "running" has been undone: it's back to icons, with only a subtle indicator to distinguish those that represent shortcuts from those that represent a running program.

The Start Menu is now assumed knowledge: that the leftmost icon on the taskbar is "special" and does something different from all the other shortcuts. I guess the modern equivalent of "Start" is "Type here to search" in the Cortana bar, but that's not a great experience for new users. We all know its propensity to search Bing at the slightest provocation, but if you try some natural "never used a computer before" things like searching for "power off" or "shut down", you get some quite unhelpful results. (The former wants me to set up a power plan, the latter directs me to Add or Remove Programs.)

It feels like computer use is assumed knowledge in 2019 - that everyone who buys a computer already knows what the Windows logo represents, how minimising and overlapping work, what the difference between an app icon and a notification-tray icon is, and so on. Microsoft no longer feel the need to design so much for people buying their first ever family computer, having never even used one before. Probably true in the first world, but I wonder if this holds up globally?


I think one key factor is that, whereas this article talks about making Windows easy for home users being a priority because of the huge untapped market there at the time, home users are no longer a real concern of MS: they make their real money from large corporate deployments of tens or hundreds of thousands of copies of Windows, not from individuals. MS basically assumes that, once everyone is using a Windows computer at work, they will have a Windows computer at home as well, and that they will learn how to use it at work because they will have to, and their employer will train them as needed to be able to accomplish work tasks, which will then give them enough basic knowledge to use their home computer. In other words, MS has outsourced the job of training people how to use computers to their corporate customers, so their design process no longer worries about it.


In my experience it is definitely a problem that comes up a lot introducing Windows 10 to people who have never used a computer before. They don't know what is running and what isn't, they don't know which window is focused if there are several on screen at once, they are afraid to poke around in the start menu in a way they weren't when it was simpler and clearer, and in general don't understand the meanings of icons and menus that used to be properly labeled. Windows 95 was a lot more clear and discoverable for someone who had never used a graphical computer.


> ... people who have never used a computer before. They don't know which window is focused if there are several on screen at once

This isn't just a problem for beginners: I'm an old hand and I sometimes can't tell what Windows 10 has given focus to when I have several things on the go, especially over two or more screens where window overlap isn't the obvious go-to clue. The distinction between focused and not is sometimes so close to non-existent it might as well be completely non-existent (like the titlebar text+icons being a slightly different shade of grey), and it varies from app to app (even amongst Microsoft's own output), so there is not one set visual cue to follow.

It definitely used to be better than this, including in Windows land.

When I get around to it (so probably never!) I intend to write a little tool that scans for the top-most window and draws a bright border around/over it somehow[†]. I know this is possible (and probably not difficult) as I did some similar hacky window decorating back in the Win2K days[‡], but I've been almost entirely a database+infrastructure fellow for more than a decade and my desktop dev knowledge has rotted terribly.

[†] An always-on-top window positioned so it is a line across the top of the focused window would do; four such objects, one for each side, would be the easy hacky way to achieve a border (sketched below). A single drawing surface with transparency and mouse click-through would be cleaner, but with my current skillset it's more faff working out the relevant API jiggery-pokery, or finding a library that already wraps that nicely.

[‡] Using Delphi. Anyone else remember that? Does it still exist in a similar form?
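
In case anyone wants to run with the [†] idea before I do, here's a minimal Win32/C sketch of the four-strip approach. Hedged: the class name, colour, thickness, and the lazy 100ms polling loop are all my arbitrary choices; a WinEvent hook on EVENT_SYSTEM_FOREGROUND would be the cleaner trigger.

    #include <windows.h>

    /* Hacky focus-border sketch: four thin topmost strips repositioned
       around whatever window currently has the foreground. */

    static HWND bars[4];

    int WINAPI WinMain(HINSTANCE inst, HINSTANCE prev, LPSTR cmd, int show)
    {
        WNDCLASSA wc = {0};
        wc.lpfnWndProc = DefWindowProcA;
        wc.hInstance = inst;
        wc.hbrBackground = CreateSolidBrush(RGB(255, 64, 64));
        wc.lpszClassName = "FocusBorderBar";
        RegisterClassA(&wc);

        for (int i = 0; i < 4; i++) {
            /* TOPMOST keeps the strips above everything, NOACTIVATE stops
               them stealing focus, LAYERED+TRANSPARENT makes them
               click-through. */
            bars[i] = CreateWindowExA(
                WS_EX_TOPMOST | WS_EX_TOOLWINDOW | WS_EX_NOACTIVATE |
                WS_EX_LAYERED | WS_EX_TRANSPARENT,
                "FocusBorderBar", "", WS_POPUP,
                0, 0, 0, 0, NULL, NULL, inst, NULL);
            SetLayeredWindowAttributes(bars[i], 0, 255, LWA_ALPHA);
        }

        const int T = 4; /* border thickness, px */
        for (;;) {
            MSG msg;
            while (PeekMessageA(&msg, NULL, 0, 0, PM_REMOVE))
                DispatchMessageA(&msg);  /* keep the strips painting */

            HWND fg = GetForegroundWindow();
            RECT r;
            if (fg && GetWindowRect(fg, &r)) {
                /* top, bottom, left, right strips */
                MoveWindow(bars[0], r.left, r.top, r.right - r.left, T, TRUE);
                MoveWindow(bars[1], r.left, r.bottom - T, r.right - r.left, T, TRUE);
                MoveWindow(bars[2], r.left, r.top, T, r.bottom - r.top, TRUE);
                MoveWindow(bars[3], r.right - T, r.top, T, r.bottom - r.top, TRUE);
                for (int i = 0; i < 4; i++)
                    ShowWindow(bars[i], SW_SHOWNOACTIVATE);
            }
            Sleep(100); /* cheap polling */
        }
    }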


In Settings > Colors you can choose an accent color and turn on the "Title bars and window borders" option - I think that would give you exactly what you want.


Not everything does anything useful with that setting, or even respects it at all - not even everything from MS, not even everything common from MS.

It does cover a fair few things, but not enough to calm my irritation!


I have found this option to be surprisingly inconsistent, both with built-in Windows apps and with third-party apps. Many UWP apps don't honor it at all, in addition to many Electron apps.


Delphi is still around. A former coworker at my previous job (left this Jan) used it from time to time to make some quick little utilities. Not sure how well it's aged as I've never used it myself.


Delphi is still around in spirit through an IDE named Lazarus.

https://www.lazarus-ide.org/


> people who have never used a computer before

To be fair, back in 1995 the number of people who had never used a computer before was huge, so it made sense for companies like Microsoft to cater to their needs.

Today though, the fraction of people who truly have no computer experience must be utterly minuscule, so I can see that MS does not see the beginner experience as a priority any more.


Counterpoint: There was a headline recently in Japan about the fact that (IIRC) 40% of "kids" between 10 and 20 who do have a computer at home have never used it. All they know is smartphones, Instagram, and TikTok. Maybe Twitter. Some may be old enough to have used Facebook.


> They don't know what is running and what isn't

And they shouldn't. That's the idea Windows 10 tries to convey. Windows Store apps already stay in the background even after you close them. You shouldn't care which app is running or not. Clicking its icon brings it up, and that's the only thing the user needs to know.

iOS and Android already work that way.


> You shouldn't care which app is running or not.

I know that there is a strong culture that believes this, but this sort of thing drives me nuts. There are very good reasons why you would want both to actually close programs and to know which programs are running and which are not.

> Windows Store apps already stay in the background even after you close them.

This is one of the many reasons why I don't use Windows Store apps.


> There are very good reasons why you would both want to actually close programs

Other way around: modern OSes don't want to make it look like programs aren't running when they really are (which has been possible ever since installable background services were created); modern OSes want to be able to close programs behind your back, while they seem to still be running, and then restore them to their previous state when you come back to them.


And it's insanely idiotic. In the mobile world we have this due to a vastly different UX, and traditionally very limited resources. Remember the first iPhone didn't allow more than one app running. Today phones have fast CPUs and plenty of RAM, it's mostly about not draining the battery. None of these constraints apply to the desktop. Even on laptops power consumption is much less of a concern, unless you have some program going berserk spinning on 100% CPU.

So trying to bring this same concept to the desktop is insanely idiotic. Microsoft failed miserably by making Windows 8 a phone OS and then bringing it to the desktop with minimal changes. Windows 10 improved so much by actually focusing on the desktop again, including such no-brainers as allowing Store apps to run windowed. I'm not a Windows user anymore myself, but none of my friends who still have it as their daily driver like or use Store apps.


> modern OSes want to be able to close programs behind your back, while they seem to still be running, and then restore them to their previous state when you come back to them.

I thought they were suspending them, not closing them. Those are very, very different things.

In any case, pretending that applications aren't closed when they are is no better.


What is “suspending”? Like what https://github.com/maaziz/cryopid does? No OS uses such a technique for production process management. (Well, I guess you could say that Genode does, if you consider VMs to be processes.) Processes are too non-self-sufficient for this to work with full generality. (For example, what would a process do if it “freezes” in the middle of executing within a shared library’s code section, and then an OS update is applied, replacing that library on disk? The “defrost” process would presumably re-run the dynamic loader, putting the new version of the library into memory; but now the defrosted thread’s PC points into the middle of an instruction. Oops!) CryoPID-like techniques only work in controlled environments, like if you’ve got a machine serving solely as a Docker container host and you want to “pause” containers.

On the other hand, if by “suspending” you mean “the OS asks the program to persist its running state to disk, which it can easily do because it’s using GUI-control and scene-graph frameworks that know how to serialize out their current state”—and then, when it has done this, the OS considers the process “clean” (like a clean page of disk cache) and therefore discardable—then yes, the OS is suspending processes.

Keep in mind that you need to add a flag to your application’s OS-readable metadata (the application manifest in Windows; the Info.plist in macOS/iOS) opting into this behaviour, signalling that your process is okay with being shut down at any time when it’s in the OS-visible “clean” state, and isn’t, say, a host for an RPC server that other applications are expecting to have a stable connection to.


>iOS and Android already work that way.

And honestly, it's annoying as hell sometimes. You never know when the system is going to kill an app. Sometimes you think something might stay in the background for a bit and then it doesn't; other times you think you've killed something and it's still running, or it has started itself again. Manual memory management options keep getting stripped back in subsequent Android versions. I get the idea behind why Android works like this, but the lack of control bothers me.


The difference is that iOS and Android will kill apps as it sees fit. People come to expect that and know that reopening their reddit reader after a few hours, it might not be at the same state they left it.

On a desktop, this is different. You have to explicitly close programs. Because of this, it becomes important to know what's running and what isn't.


> The difference is that iOS and Android will kill apps as it sees fit.

Only because iOS and Android are wasting RAM (the former to a lesser extent, but iOS devices also have less RAM to begin with!) and running with zero swap space. (And the latter point in turn is due to the abysmal, bottom-of-the-barrel quality of phone eMMC storage.) This is not progress, it's just the OOM reaper being overactive for lack of a better option.


> and running with zero swap space. (And the latter point in turn is due to the abysmal, bottom-of-the-barrel quality of phone eMMC storage.) This is not progress, it's just the OOM reaper being overactive for lack of a better option.

And all this grief could be avoided by Linux actually allocating memory when it's supposed to, as opposed to just saying "Sure!" and then sometime later killing completely random programs when someone actually attempts to use the memory they requested.


I wish I had more control over the mobile experience though. For example, my run-tracker apps will get closed and stop tracking me if I open too many apps during my run while leaving Spotify, Youtube, and my podcast apps open.

I'd like to be able to somehow pin the run-tracker as top priority. Sucks to realize after a run or bike ride that it stopped recording 20min into it.


Your run tracker should have a proper background service to make that not happen, or a persistent notification widget. If it is still happening, look around your power saving options to exempt it from being killed due to "excessive power usage" when in the background.


Windows Store apps follow the same semantics. When you close them, they are not unloaded from memory, and they restore their state at startup even if you kill them. The "green leaf" next to the process name in Task Manager means that the app is suspended, but not killed, even though it isn't open anymore.


As much as I dislike the general Windows UX, which is largely because I can't run a custom window-manager, I feel like I'm the only person that thinks Windows 10 is a big improvement. Maybe we don't have to look at it as a subjective like/dislike type of thing and just say that for me it's an improvement.

I use the classic non-contracted-to-an-icon taskbar, in the small variant. That's just to say that my taskbar looks and feels the same as it would on Windows 95.

The big improvement for me is the start screen. I use the fullscreen start-menu (I call it screen since it's fullscreen), where I put the shortcuts I want easy access to. It was introduced in Windows 8, but later the defaults were reverted to a 95-style mini-menu.

I have desktop disabled (desktop icons, more specifically), because I find that a desktop has a negative effect on organization, so all my icons are in the fullscreen start.

When I need a shortcut that's not pinned to my start, Win-S pops up the search dialog.

Apart from bloat, I'm pretty content with this UI.


I was a big fan of Windows 95 (I was at the Launch Event!). I was working at Adobe then, porting Mac apps over to Windows.

But I also like Windows 10. In fact, I switched "back" from Mac OSX to Windows 10 several years ago when there was no decent "Pro" desktop option from Apple anymore and am completely happy. I then switched laptops over to Windows 10, too, from Macbooks because there are some great Windows laptops out now.

The UI is very good. Sure, every now and then you delve deep into control panel and see an old wonky UI that's a holdover from a prior OS, but 99.5% of the time, everything is consistent, stable, and rational.


The Windows 10 computer I'm using right now has 4 different audio settings screens, all stock, all equally accessible, and it's not at all clear from the names and menus which screen will set volume and which will switch audio devices (those are different settings screens).


It's not just that... There are people here who claim Win10 is "nice" but have obviously never really _looked_ at it. It takes about a minute to start 4-5 different applications which are the equivalent of a 20-year history lesson in Windows UIs. There's the Win 3.1-era interface, which is basically a menu bar; the 95-era ones, which have a menu bar and button bar, and where maybe even right-click works; then there are the ribbon applications, and the modern/Metro ones. All shipped with Windows, so we aren't even talking about third-party apps at this point. Even then we haven't even talked about MDI, or the mishmash of shortcuts for doing the same operation depending on the application.


Yeah, I was only giving the example that came fastest to mind, given that it bites me the most often.


That's a good point, naming of settings sections is quite bad in Windows.


And some of the groupings are so weird! (Looking at you, "Update & Security".) I mostly just give up and text-search for the setting I want.


Maybe it's my particular setup, but I've had nothing but problems using a Windows 10 workstation which I routinely remote into. Graphical scaling is unstable and screwy, I constantly have to select "reset view" in outlook, programs routinely decide to start off screen, and my two identical monitors behave differently for reasons I've been unable to discover in settings.

I'd just use Linux, but at my organization everything is Outlook-centric.


You could certainly run a custom window manager in Win95. That's what I did during the best years of my life. No idea if it's still possible with the Windowses of today.


Sure, people still do it:

https://www.reddit.com/r/desktops/comments/52402v/windows_10...

It's just less common.


I tried to google around, but info was very thin. The bbZero GitHub repo was last updated 4-5 years ago. Is this something that's bolted on top of the current WM, or is it a veritable replacement, in that the vanilla WM is unloaded and doesn't use resources?

The consensus, whether correct or not, is that the Windows WM is tightly coupled with the rest of the system.


That's how it's always been. Litestep/bbZero/etc have always just replaced explorer.exe, but ultimately the Windows system is much more coupled than the Linux one. So you're still going to be using functionality beyond the "window manager".


It seems Litestep is still alive... just.


HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon

Go to the Shell key and replace its value with whatever you want.

This has been supported since at least Windows 2000!
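
If you'd rather do it programmatically, a hedged C sketch of the same change (the litestep path is illustrative; this is the machine-wide value and needs elevation - there's also a per-user variant under HKEY_CURRENT_USER at the same subkey):

    #include <windows.h>
    #include <string.h>

    /* Sketch: point the Winlogon Shell value at a replacement window
       manager. Path below is illustrative. */
    int main(void)
    {
        HKEY key;
        const char *shell = "C:\\litestep\\litestep.exe";

        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                "SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\Winlogon",
                0, KEY_SET_VALUE, &key) != ERROR_SUCCESS)
            return 1; /* probably not elevated */

        RegSetValueExA(key, "Shell", 0, REG_SZ,
                       (const BYTE *)shell, (DWORD)strlen(shell) + 1);
        RegCloseKey(key);
        return 0;
    }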


It actually has been supported since about forever. You could even do that in Windows 3.0, albeit in an .ini file.


Win10 is particularly bad. I have recently started to maintain some Windows infrastructure and have been simulating a new deployment with Server 2019, Win 10 clients, Win 7 clients, etc... and the start menu on modern Windows is absolutely unbearable. That, combined with the constant full-screen takeover modals for basic permissions and the default icon-collapse taskbar... it's definitely a high bar for a new user.


you can change the way permission modals display to make them a pop-up window w/o the fullscreen! makes them much less jarring


How?


when a permissions popup appears, click "More details" -> "Change when these notifications appear", then take the slider down one step, to "Notify me [...] (do not dim my desktop)".

that's on Windows 10 though, not sure if it's possible on older versions


Don't forget that the start menu has regressed into a CLI because it is too ponderous to find anything by graphical navigation. Regressed, because typed queries take seconds to complete.


I loved this feature in Windows 7, because all it did there was an exact substring match over the items in the Start menu. As a result, it was instant.

The Windows 10 one, where it's trying to do a Bing search, look through the Microsoft Store for apps to buy, and who knows what else, is... I can see what they're trying to do, but if you have 24 years of ingrained habit of only using Start to launch programs, it's annoying having the feature made so CPU- and disk-intensive to add options you don't use. At least Microsoft have put a lot of work into improving it. Using some of the early iterations on a slow spinning-rust laptop would frequently result in the process searching your installed programs timing out, leaving only the web search option.
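
As far as I could tell, the Win7 behaviour really was nothing fancier than the sketch below - a linear substring scan over a few hundred entries, which even a 1995 CPU could do between keystrokes. (Entries are illustrative; the real search was also case-insensitive.)

    #include <stdio.h>
    #include <string.h>

    /* Why Win7-style search felt instant: an exact substring scan over
       a few hundred Start menu entries is trivially cheap. */
    int main(void)
    {
        const char *items[] = { "Calculator", "Command Prompt", "Notepad",
                                "Paint", "Windows Media Player" };
        const char *query = "pad";

        for (size_t i = 0; i < sizeof items / sizeof items[0]; i++)
            if (strstr(items[i], query))
                printf("%s\n", items[i]);
        return 0;
    }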


Nondeterministic keyboard search is infuriating.


On both Mac and windows, I start programs by typing the first few letters in the search box and clicking.


About 25% of the time on Windows, this takes me to an internet search.


So pretty much the only thing I boot into Windows for nowadays is to play FF14.

My routine is always the same: log in, open Chrome if it didn't open already (happens 50% of the time), and then open the start menu, type fi, and click the first icon that appears.


A similar thing happened at Apple with iOS 7. Before that, they used skeuomorphic design, with realistic images representing UI elements, which was supposed to help people using computing devices for the first time. E.g. a leather notepad to represent the Reminders application, 3D buttons, etc. They abandoned this design paradigm and switched to a minimalist one, probably because most of their users had become computer-literate and did not need those analogues to understand the UI.


I think going with a simple icon is far better. Consider the design of symbols such as "corrosive", "icy road", or "s-curve ahead". They do not look like real-world images, but they are wonderfully intuitive. We do not need to stare at the detail to understand what they mean.

Likewise, the new icons are brilliant. The reminders app has basically a list. Music is a musical note, etc. The only one that really sucks is "photos". It looks more like a colour wheel.


I still get a kick out of the clock icon.


They reverted some of the things because the paradigms around them have changed, such as "running applications". That paradigm is dead on iOS and Android and is slowly dying on desktop operating systems too. We're getting over the times when we have to carefully consider which apps we should be running to conserve memory, CPU etc. That's not something users should be thinking about.

Modern equivalent of the Start Menu is still the start menu. When you click on it, it shows a list of applications available on the system. Shut down is also the nearest available option there.

So, I disagree with your sentiment that Windows 10 has rolled back good ideas. Windows 10 is simply the best Windows to date ever in all aspects.


The problem is that it is worse. The system begins guessing what users want, and that means the system will be wrong. It often is.

For example, on my Pixel 3, Spotify is terminated around 80% of the time if I open the camera app. Is it a resource issue? Who knows! The system gives me no insight.

Is this what users really want? I'm very doubtful.


> Windows 10 is simply the best Windows to date ever in all aspects.

Wow.

My opinion is very, very different. The best Windows to date, in my opinion, is Win 7. I even consider Win 95 to be better than Win 10.


But have there been long and deep usability studies with users of all ages and experience levels to determine if this is the most comfortable and easiest-to-use system? Or did Steve Jobs go on stage, show the iPhone tailored to his preferences and usage style, and then everyone copied that? Because I kind of feel like the systems you describe are doing little more than cargo-cult programming mixed with envy for whatever is "cool" at the time.


The main issue is organizational incentives. Microsoft has thousands of employees in the windows division who need to justify their jobs by constantly changing things. They call this innovation but it’s really just busywork to minimize the chance that they get fired or their boss’s boss’s boss loses clout in the company. The major innovations in windows over the past 20 years have been increased stability, an App Store, better security, easier control in corporate deployments, and some cloud and local backup features, the rest were mostly unnecessary.


Man, if you assume knowledge and expect people to type what they want why not just use a decent CLI?


press the "start" button to stop your computer? i really doubt this was this helpful to new users. it does make sense for most other things though, like launching programs and maybe even the restart option, but for the shutting down option its fairly illogical so its strange that it took them so long to switch to an icon instead.


I have so far only seen computer nerds complain that "shutdown" is under "start". Regular users have no problem starting the shutdown process.


I love the fact that one of the usability issues identified 25 years ago in Windows is still the case today.

* If you "cut" a bit of text, it disappears from your document, and unless you later paste that text, it's gone forever.

* If you "cut" a file, but never paste it, the file stays in its original location, contrary to the users expectations.

This kind of thing is really the implementation details showing through into the UI (the clipboard is not a place on disk, and files and directories cannot be moved into it).
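
You can actually watch this from outside. Cut a file in Explorer and run something like this hedged Win32/C sketch (link against shell32): the clipboard holds only the path plus a move/copy hint, never the file itself.

    #include <windows.h>
    #include <shellapi.h>
    #include <stdio.h>

    /* What Explorer's "cut" really puts on the clipboard: a list of
       paths (CF_HDROP) plus a "Preferred DropEffect" DWORD hinting that
       a later paste should move rather than copy. Nothing moves yet. */
    int main(void)
    {
        if (!OpenClipboard(NULL))
            return 1;

        HDROP drop = (HDROP)GetClipboardData(CF_HDROP);
        if (drop) {
            UINT n = DragQueryFileA(drop, 0xFFFFFFFF, NULL, 0); /* count */
            for (UINT i = 0; i < n; i++) {
                char path[MAX_PATH];
                if (DragQueryFileA(drop, i, path, MAX_PATH))
                    printf("on clipboard: %s\n", path);
            }
        }

        /* The move-vs-copy hint lives in a separately registered format. */
        HGLOBAL h = GetClipboardData(
            RegisterClipboardFormatA("Preferred DropEffect"));
        if (h) {
            DWORD *fx = (DWORD *)GlobalLock(h);
            if (fx)
                printf("paste will %s\n",
                       (*fx & 2 /* DROPEFFECT_MOVE */) ? "move" : "copy");
            GlobalUnlock(h);
        }

        CloseClipboard();
        return 0;
    }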


This is likely why in macOS, you can’t cut files but instead have to copy them and then move them with Command-Shift-V (holding down Shift changes the title of the Paste menu item) — the intention to move the file is indicated on the pasting half of the operation, dodging the Cut issue entirely.

Also, cut never really made metaphorical sense for files… you don’t break out the scissors to transfer documents from one folder to another.


I get the menu item to change its name with Option, not Shift.

Incidentally I've been using a Mac for decades, but I never remember to check what modifiers do. I'd always just copy and then go back and delete.

I think the right way to deal with this is to have a fancy clipboard manager / shelf, frankly. Cut things and they're put on the shelf; paste pastes the top thing on the shelf. I suppose lots of people might just end up with cluttered shelves and lose track of stuff. But the shelf concept is a good GUI power-user thing.


Yes, it's Cmd+Alt+V


What happens if you "cut" an important file, and then forget to paste it before copying something else? You could easily loose many hours of work!

That can happen with text too, of course, but because text takes up more screen real-estate, selecting lots of text "feels" dangerous, so you're less likely to accidentally screw yourself.

--

When you "cut" a file in early Windows, did it disappear from the original location, or did the icon fade out as on modern platforms? IMO, the fade makes it clear what is happening, and is a great example of how minor visual tweaks can be imbued with lots of meaning.


Windows 95 behaved like Windows does today: "cutting" a file faded out the icon until it was pasted someplace else.

Prior versions of Windows did not have an equivalent to the "cut" operation; for example, Windows 3.1's File Manager moved files via a "File > Move..." menu option, which opened a dialog with two text fields to specify the source and destination. (Copying behaved similarly; the document-editing-inspired Ctrl-C, Ctrl-V, and Ctrl-X bindings used today were all introduced for file management in the Windows world in Windows 95.)


> "cutting" a file faded out the icon

Cutting a file sets its hidden attribute. If you have turned on display of hidden files then they do indeed appear faded, but you have to be a fairly advanced user for that – for most people they will just seem to disappear.

If you cut or copy another file, then the first file will be unhidden, and I suppose also on a clean shutdown. Overall it's still confusing for a normal user who doesn't see hidden files, especially if they intentionally used cut without paste to delete a sensitive file.


I remember Shell Scrap Object Files (https://en.wikipedia.org/wiki/Shell_Scrap_Object_File). (Most people didn't realize this was a thing. I'm not at a Windows PC at the moment: is it still a thing?)

I've always thought it'd make a lot of sense, for any modern OS, if every time you cut something, you'd be creating a .scrap on the desktop; and then, if you later pasted it, the .scrap would go away. Desktop-as-spool-directory. Same as the way macOS tried to use the desktop as a spool for screenshots until recently (where now the screenshot spool has become a mysterious locationless storage displayed in an ephemeral modal window.)


This doesn't work well with multiple drives. If you cut an object from a usb stick, should it get transferred entirely to your desktop (maybe taking many hours for big files), for you to then paste it somewhere else on the usb stick? (When a direct move would be near instant).

If instead you want a .scrap directory on every volume, you have a new inconsistency - if something is cut, and the device removed before the paste happens, the paste will fail.


Ah, I wasn’t really suggesting that the (contents of the) clipboard be immediately serialized to disk, but rather that an OS built-in clipboard manager be created that exposes its contents as shell objects living on the shell Desktop (rather than the filesystem Desktop.) So every cut would put a shell object onto the shell desktop, but this would just be a handle to a memory object living in the clipboard manager. Pasting would move that object from the shell Desktop, to the shell Trash, such that emptying the Trash would get rid of it; but it wouldn’t really be “in” the filesystem Trash either.

If you select and copied the shell object on the shell Desktop, you’d just be making the “selected” clipboard handle the one referenced by the shell object.

If you moved the shell object, you’d be instantiating a physical filesystem .scrap object at the destination.

And, of course, since all this is simulating something persistent, you’d expect the clipboard manager to make each clipboard object refcounted, and not get rid of objects as long as something (e.g. a shell object) holds a reference to them; even meaning that the clipboard objects with nonzero reference-counts would need to be persisted to an OS “clipboard database” on shutdown, such that the shell objects referencing them would still be valid on next startup.


I'm honestly impressed that they thought so in depth about these things. That said, I'm of the opinion that a clipboard manager should be a standard part of an OS and would arguably solve this problem.


This wouldn't have been considered a problem in the 90s, but my concern with clipboard managers is that they would end up containing a lot of sensitive information, particularly passwords from my password manager.

(Not that this isn't a concern otherwise, but there's a difference between a pasteboard that holds one piece of information at a time and immediately purges old information, and one that logs history for an extended period.)


Clipboard managers, such as the one that ships with Windows 10, allow applications such as password managers to mark something to not be stored in the clipboard history. KeePass supports this functionality as an example.

Really, though, autotype is the safer option.


Yeah, my concern is that the user won't necessarily be copying from a proper password manager. Passwords are not the only information that people copy and paste: there are credit card numbers, SSN's, drivers license info, etc etc.

You could argue the security trade-off is worthwhile, and I might even agree if the clipboard manager isn't enabled by default. A clipboard manager requires you to treat your clipboard in a certain way that I don't think users do in practice.


It isn't enabled by default (currently), but there isn't really anything you can do to protect someone who copies important information out of a text document and/or leaves open access to their desktop anyway, without annoying them so much they work around it in an even worse way. Same thing with browser autofill and password managers.


Windows 10 includes one with clipboard history now.


Windows 10 has a clipboard manager with clipboard history now. It's largely just an awareness issue at this point.


I've used Windows for decades and didn't know about this until today.

Start -> Clipboard Settings -> Clipboard History

Once on, Win+V shows clipboard history. Super cool.


I had no idea! How do you access it?


Worth remembering: the original windowed interface on Xerox machines was a view into underlying system objects. It was designed around a unified vocabulary of interactions that allowed users to message those objects and also to direct inter-object communication:

https://www.youtube.com/watch?v=Cn4vC80Pv6Q

The Xerox -> Apple -> Microsoft interface transfer preserved nothing of those core concepts. The UI became a crutch developers grudgingly added to the system for "those stupid users". Thus, most software engineers today are still convinced that a teletype emulation is the best possible interface to the underlying OS that could possibly exist. Also, normal users are treated as second-class citizens in their own systems.


> Also, normal users are treated as second-class citizens in their own systems.

I would argue that’s an artifact of corporate software development culture and not GUI design.


Settings > System > Clipboard

Make sure "Clipboard history" is turned on. It will also tell you how to access the manager: "Press the Windows logo key + V".


A recent example of this for me is using Abaqus (an FE solver). When using it to set up models, I usually set all of the boundary conditions and contact interactions in a script. It wasn't until I was helping a colleague (who was using the GUI) with a problem that I realised some of the options just aren't available through the GUI.


I really like the Windows 95 UI. That's why Windows 2000 was my favorite version of Windows, ever. It was very consistent and functional throughout.


Yeah, despite 20 years of "progress", I don't see today's desktops being any more usable than Win 95's, for the typical mouse-and-keyboard setup. There's a couple niceties that have been introduced since then, mostly keyboard shortcuts for window management tasks, but otherwise it's all change for the sake of change, and change for the sake of advertisements.


Launch-programs-by-search becoming common is the only significant advance I can think of since the 90s. I've added some keyboard window management to my personal workflow since then but I don't know any non-geeks who do that—hell, most of them don't use launch-by-search either.


> Launch-programs-by-search becoming common is the only significant advance I can think of since the 90s.

Windows has had this since at least Win 7, so it's not really a recent advance. I dislike that the Win 10 start menu is essentially unusable enough that it requires actually doing a search, though, which is why I use a replacement start menu.


Well, nine or ten years after the 90s, anyway, haha.


Ah, yes, my mistake!


> Launch-programs-by-search

Which ironically doesn't work well anymore with Windows 10, because the start menu is often too laggy to open, or because it performs some other kind of unwanted search, like web search.


Compositing was huge. I agree most of the usability stuff is around the window management improvements, though. Overall I wouldn't say there has been a lot of change; you could teleport a Windows 95 user into the present and they'd immediately be up and running with the current Windows 10 desktop.


I do not like compositing, though, at least not the way it is implemented in pretty much every current desktop environment (including Windows), where composition is synced to the monitor refresh rate. That means it will always be at least ~8ms late on average, and that is assuming everything else is synchronized - which it is not: applications draw their output offscreen to a surface that is picked up by the compositor (the mechanism doesn't matter much here - the application might render into a compositor-provided resource, or the compositor might copy a VRAM surface or transfer bytes from system RAM), which happens asynchronously; the application then notifies the compositor that the window contents have changed, which is again picked up asynchronously. This introduces several frames of delay between the application drawing something and it appearing on your monitor. That is not a big problem when you are passively watching something, like a movie or an animation, but when the thing being drawn is an immediate reaction to an action you just took (like resizing a window with your mouse), you can "feel" that delay.

With non-composited window systems (which nowadays means pretty much only Win7 with DWM disabled, or pure X11/Xorg), all that overhead goes away, since applications draw directly to the video memory that is presented to the monitor (the GPU might do some minor composition itself, but without any delays involved).

Compositing would work if all composition were done on the GPU using dedicated hardware (similar to how the GPU already composites the mouse cursor, and the overlay often used for HUD-like controls such as the volume popup - but those are not general-purpose features usable for desktop composition), and if applications could draw directly to the video surfaces being composited, much the same way a non-composited desktop draws directly to video memory. This would introduce artifacts like tearing, but that could be handled separately, and to some - like me - tearing is perfectly acceptable for pretty much all interactive actions.

I am not aware of any GPU or window system that does compositing like this though.


Compositing was a hardware change, though. Modern GPU's (including GPU chipsets that are integrated as part of a CPU chip) basically accelerate 2D rendering by managing arbitrary surfaces as glorified sprites, which the hardware can "project" or "render" into other surfaces or on the screen in all sorts of ways, including occlusion, alpha blending, arbitrary 3D transforms etc. Hardware of the Win95 era didn't have anything like this; even moving and resizing windows only "animated" a wireframe rendition of the window, as with twm today.


Windows 95 was the first release to support dragging of full window contents. I think that hack was even supported in the Windows 95 tech preview for Windows NT 3.51.


But that was via a BitBlt copy, IIRC. Which is why there could be a slight delay filling the part of the screen being unobscured as the overlapped program repainted itself.


Compositing required a hardware change to work efficiently, but it's not like you just slap a GPU into a win9x box and your desktop is now composited.


Personally I think VS Code is the epitome of the evolution of the ~~Windows~~ desktop UI, and it's one of my favorite UIs ever.


I really miss that design. It's a lot more relaxing to look at than modern "flat" UI, IMO. Depth and consistent use of UI elements make it so easy to tell what's what. Modern UI seems to be designed to look like a static glossy brochure, even when it's full of interactive elements.


I agree. Even if 95/98/2000 look outdated by today's standards, these are still the best-designed Windows versions, IMO.

Windows 10 is really a Frankenstein.


The fact that the Settings app and the Control Panel are allowed to coexist baffles me


Control Panel has to stick around for compatibility with legacy apps, including those that register functionality into it and call its paths directly.

It's pretty well hidden from users at this point and basically just a collection of links to compatibility components. I've never understood the concern.


How do I edit environment variables with the new UI, for example? It's a pretty common task and I always have to use the old UI. Thankfully it was actually slightly improved in Windows 10. Another thing is hotkeys to switch keyboard layouts; it's still the same UI from Windows 98.


I'd recommend Eveditor (http://eveditor.com). Of course it would be better if Windows had something like that built-in.


The existing editor does everything you need. The only benefit for that tool would be editing %PATH% more easily, but Windows 10 includes a pretty good editor for that.


Win-key "edit" brings "Edit the system environment variables" up as first option.


That command launches old settings UI.


    rundll32 sysdm.cpl,EditEnvironmentVariables
is the most natural choice /s
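
(And if you end up scripting it anyway: that dialog is just a front-end over the registry. A hedged C sketch, with an illustrative variable name - the WM_SETTINGCHANGE broadcast is what makes Explorer pick the change up, so programs launched afterwards inherit the new value:)

    #include <windows.h>
    #include <string.h>

    /* Per-user environment variables live under HKCU\Environment.
       "MY_VAR" and its value are illustrative. */
    int main(void)
    {
        HKEY key;
        const char *value = "hello";

        if (RegOpenKeyExA(HKEY_CURRENT_USER, "Environment", 0,
                          KEY_SET_VALUE, &key) != ERROR_SUCCESS)
            return 1;

        RegSetValueExA(key, "MY_VAR", 0, REG_SZ,
                       (const BYTE *)value, (DWORD)strlen(value) + 1);
        RegCloseKey(key);

        /* Tell running shells/Explorer the environment changed. */
        SendMessageTimeoutA(HWND_BROADCAST, WM_SETTINGCHANGE, 0,
                            (LPARAM)"Environment", SMTO_ABORTIFHUNG,
                            5000, NULL);
        return 0;
    }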


There is a ton of stuff you can’t do with Settings but only with Control Panel. I don’t think you can admin a Windows machine without ever using Control Panel.


I use the control panel about 99% of the time. I honestly can't remember the last time that I even looked at the settings app.


I have encountered a few things you can only do in Settings :).


With the past few releases they’ve actually been removing things from the legacy control panel, forcing me to use the new Settings. Sometimes the old options I want just aren’t there, since the new settings are always so oversimplified.


I wonder if anyone has written a third party replacement for both settings applications? If not, it seems like this would be a very desirable utility, particularly as things shift more toward the new Settings app. I smell an opportunity for someone to make a few bucks...


Yes, they certainly exist. But if I needed to change any of them, I've done so long enough ago that I've forgotten doing it.


There are plenty of perfectly mundane, common settings that are only accessible through the old control panels. And, of course, there are a couple of different styles of old control panel to begin with - the more recent 'modular'-ish ones and then ur-panels with lots of tabs and 'Advanced' buttons. As a UI thing, it's easily Windows at its ugliest and most baffling.


But there are so many settings where you need to drill into the legacy control panel to change things properly. Want to do anything advanced with your network connections or adapters? Got to go into the old-style network settings in the control panel. Want to change your audio output device (for example, to output to Bluetooth headphones)? Got to go into the legacy control panel audio settings. Want to change your power settings to not go into standby when your laptop screen is closed? Legacy power settings. Want to enable Windows features like IIS, Hyper-V, etc.? You can add features in either one, but the old-fashioned one has a nice tree list with additional tooltip information, whereas the new Settings interface just has one long list that is clunky to interact with.


> Want to change your audio output device

This you can do right from the volume control widget on the taskbar. One of the few things that's arguably easier and more obvious in Win 10 than in OS X (where you either have to hit the control panel or know to option-click the menubar doodad).


Is it possible to adjust the keyboard key-repeat delay without having to go to the Keyboard item in Control Panel yet?

Last I looked I couldn't find that option anywhere in the new Settings app.


The investment is too great in custom Control Panel extensions from hardware vendors, including for hardware supplied to basically all major OEMs. If they toss that, a lot of hardware will become unconfigurable.


Windows 2000's UI was a slightly evolved 95 UI, but I think Win7 is my favorite. I used the classic theme so it looked like 2000.

And since most home users had no experience with 2000, they saw the theme and thought I was still running 98 or ME in 2010.


>That's why Windows 2000 was my favorite version of Windows, ever

You could still get the same look (with the "classic" theme) up until Windows 7.


You get the (IMO ugly) look, but none of the feel. The old start menu, the old explorer, etc. are not available with the classic theme.


I installed Lubuntu on an older ultra-portable machine, and while not extremely pretty, the UI is extremely snappy -- and mostly resembles Win 95. I do miss the ability to search for a program by name, but that is about it.


Lubuntu is using LXQt these days, right? I'm running it on Arch, but by default that DE should support menu searching out of the box, and if it doesn't I expect it's been configured out for some reason. Definitely worth playing around to bring it back.


One of the trends that W95 started that I still find objectionable is putting the 'X' close button gadget next to the other window controls, so sloppy mousing can lead to accidental closures when the intent was minimization or maximization.

This is one thing that the original Lisa and MacOS got right (and NeXTstep, and GEM and I think AmigaOS) but W95 and its successors did not. The close button was on the far left, and the other actions on the right.

(EDIT: my recollection was wrong about the Lisa, I think. Its window controls were not as clear as Mac OS)

Unfortunately OS X inexplicably adopted the W95 conventions. And in the first Aqua releases made it even worse by hiding the functionality icons until mouse-over.


It was annoying. The problem they had was that the upper left already had a control there: the system menu. In some programs it is still 'there' but hidden. You can see it if you left-click on the upper-left corner. They could not get rid of it, as some old Win3.x programs went trolling around in that menu and changed it.


Double-clicking the upper-left corner closed the window, a carryover from earlier versions. I think it still does this, to support programs that auto-click (such as quick-and-dirty corporate IT apps). They couldn't put a single-click control in the same spot.


Also, every app, whether the menu is hidden or not, still responds to Alt+Space, which does what the single-click used to do and brings up that old window-controls menu, and it still drops down from that same Win 3.x corner. It's a fascinating commitment to a strange backwards compatibility.

(Up until Aero Snap in Windows 7 and its keyboard shortcuts of Win+Arrow Key, I used Alt+Space,M a bunch, because it was always the easiest way to move any window by keyboard in the event it got stuck somewhere out of mouse range, or you just didn't feel like switching to the mouse.)
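
And the compatibility is quite literal: the menu's commands are still plain WM_SYSCOMMAND messages that anything can post. A hedged C sketch (the window title is illustrative):

    #include <windows.h>

    /* The Win 3.x system menu survives as WM_SYSCOMMAND. SC_CLOSE is the
       old double-click-the-corner close; SC_MOVE is Alt+Space, M. */
    int main(void)
    {
        HWND hwnd = FindWindowA(NULL, "Untitled - Notepad");
        if (!hwnd)
            return 1;

        PostMessageA(hwnd, WM_SYSCOMMAND, SC_CLOSE, 0); /* polite close */

        /* For keyboard-move mode the window should be active first:
           SetForegroundWindow(hwnd);
           PostMessageA(hwnd, WM_SYSCOMMAND, SC_MOVE, 0); */
        return 0;
    }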


It's also a handy way to easily get back a window that got moved off screen, which can happen in setups with multiple monitors that get connected and disconnected.

Also, most applications will actually honor Ctrl+C/V as well as the Ctrl+Insert/Shift+Insert shortcuts, from the 1987 IBM CUA guidelines[0].

[0] https://en.m.wikipedia.org/wiki/IBM_Common_User_Access


I guess? Couldn't they just have moved that control to the right? Or put minimize & maximize next to it and had close on the right, like in NeXTstep?

Somehow everyone thought it was a great idea to copy them.

I change this in most of my window managers on Linux, but Chrome stubbornly insists on doing its own window controls.


Remember, one of the goals was 'just works' too, as they wanted to move people off Win3.x. The Win3.x apps had to work/look the same but pick up some of the nice styling from Win9x. Some programs would also roll their own system menu and put it in that spot; that would cover up the close button. For a run-of-the-mill application it was just a style, and they could have put it anywhere. For programs that controlled the whole window, including the title bar area, it was a tough choice. Even then it did not always come out looking right. The Win95 style guide basically told everyone to knock it off and let Windows control it.

Was it all a good choice? For some it was annoying. There was a shortcut that did the same thing: if you double-clicked the upper left, it would close the window too, which is how they taught people to use Win3.x. Most certainly it was an interesting compromise given the constraints they had.

Copying the style was all on everyone else, though. They wanted it to be familiar to Windows users. All to 'gain market share', I guess.


> putting the 'X' close button gadget next to the other window controls

My keyboard has volume up (Fn-F11) right beside shut down (Fn-F12). I've accidentally turned my computer off a few times.


If we’re going to talk about WMs, CWM is definitely the best: the mouse is only used for sloppy positioning and focus changing (not even for raising/level changing). Everything else is done via the comparatively precise keyboard.


I feel like Microsoft... well, basically everyone in tech really, hasn't actually given a damn about the user experience of their products for quite a while now. Personal computing used to be about enabling people to use technology to make their lives better and OSs like Win95 were focused on allowing the user to leverage the power of computing for themselves. Nowadays, computing is apparently about herding users like cattle so you can get them to look at more ads and harvest their sweet sweet data. Developers stopped caring about user experience because thinking of users as people would make their job of treating them like cattle harder.


Every once in a while I'll catch myself trying to read web content in the tiny window between the sidebar ads, the floating video, the cookie control panel and the "other content you might like" block... and I'll stop, pull back, and ask myself, "how did computing get this BAD?"

Especially in the web context, I think it's as you say: the real product has little to do with the task you came to the site for, so everything is trying to distract you away from that task. The end experience is as if the people who used to run warez pages with 9 giant buttons to download dodgy IE toolbars and one 20px link to get the actual file grew up and got jobs running mainstream news sites.


> Every once in a while I'll catch myself trying to read web content in the tiny window between the sidebar ads, the floating video, the cookie control panel and the "other content you might like" block... and I'll stop, pull back, and ask myself, "how did computing get this BAD?"

Seems pretty nice to me: I have control over my browser. I can simply elect to use any "Reader Mode" extension (or feature of the browser). Or use an anti-annoyances list on uBlock to avoid all sorts of cosmetic warts.

I can even configure my adblocker to remove entire parts of the layout like sidebars and navbars. It's amazing.

I don't have much choice when I'm using a poorly designed native app. And I'm thankful I am able to rely less and less on native apps. The computing experience is only getting better and better in my eyes.


At least the cookie panels are not the fault of IT, they're the fault of greedy marketers and clueless politicians trying to rein them in.

For the inventor of floating video ads, let's just say I hope that hell exists.


They're a great example of the unintended consequences of regulation. I'm sure the original intent was not to have a huge banner where the options for managing what the site tracks were a one click "sure, whatever" or a multi-stage process of "no -> manage preferences -> categories -> reject all -> find the actual 'save' button, not the 'enable and save' one -> confirmation page where 'cancel' is lined up in the same place as 'save' was previously".


The regulation states that consent should be given freely, so not only should tracking be opt-in, but the prompt shouldn't be obnoxious nor pressure you into opting in.

The regulation is sane, you should blame the lack of enforcement that allows assholes to get away with being non-compliant.


I can't even read the article for more than a few seconds because I get redirected to one of those full page "Your computer has a virus" ads that won't let you hit the back button.


> the real product has little to do with the task you came to the site for, so everything is trying to distract you away from that task

This is really the biggest thing I had against the reddit redesign (and why I love the simple UI of Hacker News). It cleared away space for ads on either side, made the platform as a whole better suited to image- or video-content (dissuading users from using text posts on the site for discussion), and ultimately felt - exactly as you described - "distracting". It made it blatantly clear that the site was for upvoting pictures, getting inundated with political propaganda from whichever side is currently paying more, and ultimately wasting the user's time.

Facebook is feeling the same way.

I've literally gotten to the point where I separate "good content" from "time-wasting content" by browser - anything that I can consider educational, time-sensitive, or otherwise important (... I have a list of pinned tabs; Schwab, ThinkOrSwim, Financial Times, Bloomberg, The Economist, the Wall Street Journal) goes in Opera.

Anything that I save solely for my spare time goes in Firefox. All of my social media accounts go in Firefox (Google is the only thing I knowingly allow to track me between sites in Opera). Reddit and Hacker News go in Firefox (although sometimes I browse HN without logging in in Opera, so that I don't have to switch browsers to log into news sites with paywalls).

It makes it really easy to keep myself focused when I need to do something important, and it also makes it easier to live with some of the more annoying (but security-bolstering) browser settings I currently use.


The peak of systematic thinking about UX in tech was arguably the 1984 Macintosh Human Interface Guidelines (https://www.amazon.com/exec/obidos/ASIN/0201622165/guidebook...).

Unfortunately it only took a few years for Apple to start sliding away from their own guidelines, and Windows was always less rigorous to begin with. And then the Web came along and blew the state of UX back to the Stone Age. Sigh.


The Mac is great and all, but I've used a 1984 Mac and I don't think I want to adhere to a guideline centered around running a single fullscreen application at a time.


That was more of a technical limitation than a design guideline, though guidelines may have been constructed around that. Many of the things those guidelines point out are still applicable today by virtue of the fact that Mac has stuck with its "Finder" system of interaction for 3+ decades.


That's an odd thing to say at the tail end of a decade that's seen very rapid and broad adoption of new interaction technologies - touch and voice, to pick a couple. The former in particular is a radical departure from the desktop metaphor. These things take substantial research and development efforts. The result is, in the industrialized world, not merely a computer in everyone's hand but a far more 'personal' and capable device than any Win 95 PC ever was.


> not merely a computer in everyone's hand but a far more 'personal' and capable device than any Win 95 PC ever was.

I don't know if it's just me, but my feeling is that a Win 95 PC is far more "personal" than any modern smartphone. With a smartphone, you are limited by whatever the mass-produced OS and apps allow (and it's getting worse - more recent OS releases allow much less access to the filesystem, for instance), while a Win 95 PC was wide open and could be customized the way you wanted - in the extreme, the whole operating system could be easily replaced.

That is: when I use my smartphone, I feel like I'm a guest at a Google-owned hotel. When I use my PC, I feel like I'm at my own home.


Well, in the extreme you can gut your phone to make it pretend to run Win 95. But that isn't really what I'm talking about, my point is the idea that there's been no work done on UI recently seems plainly inaccurate to me. The last 10 or so years of UI have seen more drastic and widely adopted changes in modes of interaction than the decade between the original Mac and Win 95.


This is so accurate. Thank you. I’ve lost all passion for technology as an industry, unless I find a suitable opportunity to fix the current issues.


You can thank Google and Facebook for that. Tech changed permanently with the 2.0 batch.


It's the appliance vs. tool mentality. Computers were a tool; now they're just an appliance.


And they figured this all out without so-called "telemetry" in Win 3.1. Isn't that amazing. Almost as if today's "telemetry is absolutely necessary for improving UX" mantra isn't actually true...


Telemetry could be an indicator of when they've pushed the user too far in swallowing the horrible design decisions.


Unstated first step in the design of Windows 95: Copy the window title bar buttons more or less pixel-for-pixel from NeXTSTEP, but change the NeXT “iconify” button to do “maximize” instead. But don’t copy the useful mechanism of having the middle of the “close” button “×” be incomplete and look more like “⸬” if the window can’t be closed, like if the document in that window hasn’t been saved.

I can see the reason W95 changed the iconify button; the NeXT iconify button looks like a NeXTSTEP iconified window, so on NeXTSTEP the button is self-explanatory. But Windows 95 does not make windows into icons; W95 minimizes windows to the bottom of the screen into a little line of text. W95 therefore, intuitively enough, made the “minimize” button resemble a minimized window line: the “_” button. The NeXT iconify button, on the other hand, more resembles a W95 window. (W95 was made to run on PCs with low graphical resolutions and little memory, both of which made it reasonable to only run and show one program at a time. This made “maximizing” a window a reasonable operation, unlike on a NeXT, where high resolution and multitasking were the norm.) Therefore, the change is understandable.

It’s just odd that I don’t think I’ve ever seen anybody mention this.


I hope chunky bevelled edges come back into fashion soon. The UI in these screenshots looks reassuringly solid somehow, like how you'd expect a tool to feel. Not to mention proper scrollbars.


Or, at least, I wish Win 10 had some option to increase the size of the window frame. Those almost nonexistent ones are a serious pain in my butt.


@Johnfen - I came across your comments at https://news.ycombinator.com/item?id=19623785 and unfortunately I am unable to post a follow-up question/comment. What's the best way to get in touch?


try johnfenderson23 at gmail


It's very interesting how the hierarchical file system was something that users couldn't understand, and today with mobile and web we see an almost complete rejection of the file system or hierarchies as a principle of user interaction.

I always found the hierarchical Start menu a pain to use. Every company insisted on creating a submenu for their company, then another submenu for their program. And it's quite unfortunate that Microsoft didn't take charge until Windows 10. They didn't set a much better example with their own software, and they let everyone do whatever they wanted in the Start menu, resulting in a poor user experience.

What I have been doing since Windows 2000 is moving the shortcuts to the actual programs into the top-level Start menu folder, and all those useless folders into a top-level folder named Crap - so, conceptually, the home screen of today's phones. That's what the Start menu should have been about, exclusively: starting programs. Not a help.chm. Not the uninstall.exe. Not a link to the company home page that nobody ever uses. Just the program. And Microsoft should have enforced that from the beginning.
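
A rough sketch of that cleanup as a script (Python; the Start Menu path and the junk-word heuristic here are assumptions for a modern per-user install, not an exact recipe):

    # Hoist program shortcuts to the top-level Start Menu folder and move
    # the leftover vendor folders into "Crap". The path assumes a modern
    # per-user install, not the Windows 2000 layout.
    from pathlib import Path
    import shutil

    programs = Path.home() / "AppData/Roaming/Microsoft/Windows/Start Menu/Programs"
    crap = programs / "Crap"
    crap.mkdir(exist_ok=True)

    for entry in list(programs.iterdir()):
        if entry == crap or not entry.is_dir():
            continue
        for lnk in entry.rglob("*.lnk"):
            # Crude heuristic to leave the help/uninstall clutter behind.
            if any(w in lnk.stem.lower() for w in ("uninstall", "help", "readme")):
                continue
            dest = programs / lnk.name
            if not dest.exists():
                shutil.move(str(lnk), str(dest))
        shutil.move(str(entry), str(crap / entry.name))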

Hierarchies are not needed most of the time. To locate an item, a hierarchy of folders where the items are mostly hidden is probably the worst solution. It's easier for me to locate something in a list sorted alphabetically where every item is visible. And the best solution is spatial consistency: Put the items in a fixed place on the screen (or keyboard). That's how it works in the real world, and that's what the brain is optimized for.


Oh, and if you're willing to dust off a Flash player, check out Windows RG (Really Good Edition). It really is really good.

https://www.albinoblacksheep.com/flash/winrg


> We realized that a truly usable system would scale to the needs of different users: it would be easy to discover and learn yet would provide efficiency (through shortcuts and alternate methods) for more-experienced users.

I feel like the UI designers of today need to memorize this principle. Good, versatile software allows users of all skill levels to accomplish their goals through multiple pathways supporting a variety of interaction paradigms. Much of the software I'm forced to use today seems to adhere to the there's-only-one-way-to-do-it school of thought, which is a big part of why software sucks so much now.


85 major versions ahead of Windows 10... and rightly so.


Worth reading the comments - as the original author of the paper Kent Sullivan chimes in: https://socket3.wordpress.com/2018/02/03/designing-windows-9...


It struck me earlier today that while Windows 95 took many big steps forward, iOS was actually inspired a lot more by Windows 3.1.

Springboard is very much like Program Manager: every icon is an app, no documents, plus you can build up single-depth 'groups' of other apps. iOS 'Files' is basically the same as 3.1's File Manager, without the desktop/Explorer approach to file organization offered by 95 onward.


I'm gonna go out on a limb and say that iOS was in no way, shape, or form inspired by Windows 3.1.


It would certainly be going out on a limb to say that the people involved in designing iOS had not used Windows 3.1 and held no memory of it.

Of course, I am happy to use the word "inspired" in a loose way, like how many modern musicians were inspired by The Beatles or Kraftwerk even if they are not deliberately indulging in pastiche.


Sometimes I hold down the Start and R keys and quickly type wordpad or calculator. It works but then I realise I'm basically back to the MS-DOS prompt where I started 20 years ago. But minus the 4dos autocomplete enhancement.


Just press the Windows key and search for the program you want.


Thank you, but that involves the mouse and more time. Using just the keyboard and Win-R only takes half a second.


It doesn't involve the mouse at all. Press the Windows key; the Start menu opens with the search field focused automatically. Start typing and Windows does an incremental search, heavily biased towards your most-used programs in the list ordering. Press Enter if your choice is first on the list, or navigate to the program with the arrow keys.

Your method only works if the program you want is on the PATH (or registered under an App Paths key), and only if you type the full name of the executable.
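
For what it's worth, the Run box does a bit more than a bare PATH search: Windows also consults the "App Paths" registry keys, which is why some names resolve without being on PATH. A small sketch of both lookups (Python, Windows-only; an approximation of the behavior, not the exact shell logic):

    # Approximate the two lookups behind Win-R: an ordinary PATH search,
    # then the App Paths keys that installers use to register executables.
    import shutil
    import winreg

    def resolve_like_run(name):
        hit = shutil.which(name)  # 1. PATH search, as at a DOS prompt
        if hit:
            return hit
        key = (r"Software\Microsoft\Windows\CurrentVersion"
               r"\App Paths\{}.exe".format(name))
        for root in (winreg.HKEY_CURRENT_USER, winreg.HKEY_LOCAL_MACHINE):
            try:
                with winreg.OpenKey(root, key) as k:
                    # The key's default value holds the full executable path.
                    return winreg.QueryValue(k, None)
            except OSError:
                pass
        return None

    print(resolve_like_run("wordpad"))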


I find this incredibly humbling and inspiring. 18 months is not much time to lay down the foundations of quite a lot of today’s work, 20 years later.

Especially the really simple relational database presented at the end: simple yet super powerful.

Each step seemed relatively simple, yet had a tremendous impact on the product's development, the user experience, and everyone who discovered computers through Win 95 and the versions that followed.

It gives me a lot of hope that I could one day work on an impactful and fulfilling project!

Thanks for sharing!



Thanks, very interesting! OT: Great to see FoxPro as an example application. I used to work with it and liked it quite a bit. As far as I know it isn't really supported anymore. Does anyone still use it, or know if there are similar environments available today?


How would these engineers have approached the same design knowing what we know now (I mean, with the Internet)?

Every upgrade is a step taken on the same ladder. Maybe sometimes designing an elevator (read: innovating a new UI) might be the real solution.


Can we talk about the ethics of just reposting, verbatim, a paper written by others, but with an advertisement inserted between every paragraph? How did that become OK?


So Microsoft was agile in the mid 90s....


Win95 is still probably the biggest software launch of all time. For PC users coming from MS-DOS it was a huge leap, and there has never been that kind of jump forward since.



