I really love articles about programming from the 90s. Everything was written in C/C++ and maybe some Perl. Working with file formats that may or may not have had specifications, building and using algorithms that you learned in comp sci classes. I miss those days. :(
It's so rare that web pages from the 90s still even exist on the internet. Sadly I have to do a lot of web surfing with the wayback machine as a proxy to the days of the past.
> Sadly I have to do a lot of web surfing with the wayback machine as a proxy to the days of the past.
I'd argue this is not sad, but incredible that the Wayback Machine has been so successful in preserving webpages as they've been revised, expanded, and deprecated. And if anything, it shows just how important the project and projects like it are for the preservation of the Internet's history.
Yes, you're correct. The Wayback Machine is truly incredible. I meant that it is sad that the pages disappear and it is often impossible to contact the original author.
I had a Windows XP machine with very long file names on some pictures, and when the drive ran into trouble the data-recovery specialists I took it to wanted to charge $2k because of the trouble those files were going to give them. Ahh, my youth.
<raises hand> I don't remember any file-system problems, though I believe I got into the betas mid-cycle. I do, however, vividly remember the "What the heck is a Start menu, and where did Program Manager go?" phase.
Edit: I'm also starting to feel old when threads like this come up :)
The irony of this is that half of Windows is STILL utterly BROKEN with respect to path lengths, ever since LFN support was bolted onto the Win32 API, and it bites you almost every day if you have to work with deep directory hierarchies. There are so many rules it's unfunny.
Indeed! Try writing a Windows file system driver sometime; I hear it's truly a lovely experience.
The other irony is that the HP48 graphing calculator, originally introduced in 1990, had a user-facing filesystem with full support for long file and directory names. I'd be curious to learn what possessed the people responsible for DOS to have ever shipped a system with the insane 8.3 restriction. Even ancient Unix filesystems (early 1970s vintage) supported far more reasonable 14-character filenames.
> I'd be curious to learn what possessed the people responsible for DOS to have ever shipped a system with the insane 8.3 restriction
CP/M inspired them. PC-DOS (QDOS, really) was more or less a rip-off of DR's CP/M-80. "Ease of porting CP/M-80 programs to PC-DOS" is a plausible excuse for that.
Short file names were not a big issue at the time: CP/M filesystems were rather small (a 5-megabyte hard disk was very expensive then), and hierarchical filesystems were outside the realm of microcomputers. PC-DOS didn't introduce directories until version 2.0.
BTW, the Apple II DOS (all versions) had filenames of up to 30 chars, case-sensitive, with one byte to indicate the file type and a flat namespace. ProDOS has 15-char filenames with a hierarchical structure.
Been there, done that. People think it's awesome to create a bunch of long, nested directory names, until the fully qualified file name exceeds MAX_PATH and automation starts breaking.
Spot on. The problem for me is that some software (Visual Studio) breaks while other software (MSBuild) works, so you get inconsistent failures. In my case, builds die when building VS solutions on disk (vs.net is Win32) but not when using MSBuild (a 64-bit process).
Basically, a dev breaks something by creating a long file name (which, if you ask me, should be fine), yet the integration tests still succeed.
They broke it on day one; basically we're stuck with a bad decision. It's a Win32-subsystem-specific problem: neither NT itself nor NTFS has this limit internally.
It's not backwards compatibility - it's just a turd.
It is backwards compatibility: there are a fantastic number of programs out there that simply allocate a MAX_PATH-sized buffer and assume any file path will fit into it. If MS simply changed things so that 500-character paths could be returned, all of those programs would break.
Or, you could have a new set of API calls without this limitation. This new API could also receive a version argument, so that calls to its present version would return present-style data, even if the then-current implementation were much smarter. It's so obvious, in fact, I wouldn't be surprised if they already did it.