Yes. I recall one in particular that I got on the family's computer back in the day, one that earned my sister and me a right proper screaming-at from my mother. We had just gotten a brand new multimedia package, including a Voyetra audio tool -- the one that tried to look like a component stereo system. Well, we recorded a sample of some song and proceeded to hit the double-speed button, because sped-up recordings sound funny. We waited, and waited, as the 386 SX/25 worked on this task, creating a massive 30+ megabyte file, at which point the computer bluescreened, with Windows gasping and flailing with all kinds of errors about missing files, corruption, etc.
Now, the hard drive itself was about 130 megs. Not huge, but not terribly tiny either for a computer bought the day the LA Riots started. However, speeding up the song created a new file that filled up that hard drive, and once it hit the end of the disk, Windows did the most logical thing it could do. It started writing from the beginning of the disk, paying no heed to what was already on it. Over the partition table, over system files, over c:\windows, everything. I'll spare you the details of the righteous fury we were subjected to afterward, but suffice it to say, it was non-fun.
This strikes me as nit-picking and not a meaningful distinction. If something was going wrong on a win3.1 machine to the point that it no longer worked properly, you'd hit ctrl-alt-del and get a blue screen.
It's not like hitting ctrl-alt-del made the computer unstable; you were just poking the cooperative multitasking system (by generating a magic, meaningful system interrupt) into realizing it was already unstable.
If you got to the point of seeing that blue screen, it was really, really unlikely you'd ever recover system stability without a restart, so it was still a blue screen of death. Of course, you were free to try if you wanted to. But most likely, whatever crashed or froze that process had spewed a bunch of crap all over areas of memory it had no knowledge of and fucked a bunch of other things up.
The Win32-based versions of Windows didn't need that, because they were preemptively multitasked almost all the way down and could detect their own failures better.
You're thinking of the Windows 3.0 "Unrecoverable Application Error". The marketing hype for 3.1 was that there would be no more "Unrecoverable Application Errors", which was a flat-out lie; they had just rebranded the UAE as the Blue Screen of Death.
Every time a post by Alex St John gets shared I start geeking out. He's one of the principals involved in the creation of DirectX and has written tons about it.
If this piques your interest, I would recommend diving in. He's got great stories.
> The Windows 3.1 version relied on the WinG graphics API, but a series of Compaq Presarios were not tested with WinG, which caused the game to crash while loading.
> The crashes caused game developers to be suspicious of Windows as a viable platform and instead many stuck with MS-DOS. To prevent further hardware/software compatibility issues, DirectX was created.