>> it’s centered around COM/OLE2 (with Unicode strings!)
> While NT was Unicode internally, Windows 9x barely had any WinAPI functionality implemented with wide strings, for the most part only ANSI versions of the APIs were available (unless you were using the much later released UNICOWS compatibility layer).
True, but does not contradict what I said: OLECHAR was 16 bits on all Win32 implementations—witness the OLE2ANSI shim on old VC++ versions that allowed you to pretend it wasn’t (it was 8 bits on Win16 and IIRC on the Macintosh port). Consequently, Unicode strings were used throughout the new COM-based NT4/95+ shell APIs, even on 9x/Me.
I remember reading that the TrueType implementation used Unicode everywhere as well, even on non-NT where you could only get to it via MessageBoxW and ExtTextOutW.
>> Did they not have the time to write an eleven-glyph cache?
> Come on, the answer is right there in the article you linked:
>> saving even 4K of memory had a perceptible impact on benchmarks
> Granted, you won't need a full 4K of memory for those digits, but the point still stands.
I don’t think it’s all that obvious. On a typical Win95 box the taskbar clock is what, probably around 16 pixels or so high? With non-antialiased fonts, that’s eleven (ten digits and a colon) bitmaps of (generously) 16x16 one-bit pixels, so 32 bytes each and under 400 bytes total, a full order of magnitude below 4K. There are probably other places in the Windows shell you can shave half a resident K off of; even separating hot and cold data in the linker is liable to get you that much.
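The back-of-the-envelope arithmetic, spelled out (the 16x16 monochrome glyph size is of course my guess at the dimensions, not a measured figure):

```python
# Rough size estimate for a cached set of taskbar-clock glyphs,
# assuming 16x16 non-antialiased (1 bit per pixel) bitmaps.
GLYPHS = 11          # ten digits plus a colon
WIDTH = HEIGHT = 16  # pixels, generous for a Win95 taskbar clock
BITS_PER_PIXEL = 1   # monochrome, no antialiasing

bytes_per_row = WIDTH * BITS_PER_PIXEL // 8       # 2 bytes per scanline
bytes_per_glyph = bytes_per_row * HEIGHT          # 32 bytes per glyph
total = GLYPHS * bytes_per_glyph                  # 352 bytes for the set
print(bytes_per_glyph, total)                     # 32 352
```

At 352 bytes, the whole cache is indeed roughly a tenth of the 4K figure from the article.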
(Keeping the window procedure of the taskbar paged in—also mentioned in the article—is probably a bigger issue, although doesn’t that mean that, in the intended state, the system will page when the user wants to interact with the taskbar? That doesn’t sound pleasant. The whole thing might also have come very late in the development cycle—I seem to remember there was a registry setting that brought the seconds back, so the code wasn’t even removed.)