Can we get one of these you don't wear on your wrist?
I'm all for the exercise and health tracking, but I hate wearing a watch when I'm typing or at a computer a good chunk of the day, where it would be resting on something.
Both Macs and PCs of the time had a very limited ability to run multiple programs at once - Macs had Desk Accessories (Calculator and Alarm Clock, with others later), and PCs had TSR programs that you could invoke in various ways (Borland's Sidekick PIM being one example).
Later it became possible to run multiple programs at once - the Mac got Switcher in 1985, then MultiFinder in 1987, and there were various methods on the PC for doing the same.
Even though these were not preemptive multitasking, these solutions were viewed as "good enough" at the time, so there wasn't really that big of a gap with the Amiga.
Also, popularity - as a primarily Mac user, it was the early '90s before I saw an Amiga hands-on, and the reaction then was more "why would you want this low-resolution thing" when SVGA and higher resolutions were somewhat common on both Macs and PCs.
For the PC, there was TopView, MS-DOS 4.00 (multitasking), DESQView and others. However, most people just adopted workflows that weren't reliant on multitasking (not unlike how most people currently use their phones).
Mostly no. People didn't have access to the Internet so most of their computer knowledge came from books or magazines but those tended to be siloed by platform. Even a "power user" who subscribed to PC World or MacUser wouldn't know much about Amiga or Unix.
Byte was probably the most cross-platform magazine at the time, and even then there were only so many pages to cover things - it ended up being mostly PC, Mac, and Unix workstations, IIRC.
There was a 3rd party software tool (the name of which I forget) that used the same graphics memory as scratch space trick when copying floppies on early 128k (and probably 512k) Macs. This reduced the number of swaps required to copy a 400k floppy.
The older term "Systems Programmer" (which still exists primarily in academia, I had this title a few years ago at a university) has always felt more accurate to me as the "high level operations person who has programming as a primary skillset" job that the original SRE job description seemed to be aimed at.
Doesn't a logging system need a storage system that can keep up with it, if the goal is to persist logs for later analysis? What storage could keep up with this?
I think the idea here is to separate the log call site in application code from the log processing/persistence?
So, the nanosecond values quoted are the impact exposed to your application code, but some other process takes over the bulk of the work of the logging.
So as long as the offloaded process can keep up with the average log rate, it'll be fine - but also as a bonus the application does not see increased latency due to logging.
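As a minimal sketch of that decoupling (an assumed design, not whatever implementation the article actually uses), the call site only enqueues a record, and a separate thread does the formatting and persistence off the application's critical path - Python's stdlib even ships this pattern as `logging.handlers.QueueHandler`/`QueueListener`:

```python
import queue
import threading
import time

log_queue = queue.Queue()
persisted = []  # stand-in for the real sink (file, socket, etc.)

def drain():
    """Background consumer: does the expensive formatting and I/O."""
    while True:
        record = log_queue.get()
        if record is None:  # sentinel to shut down
            break
        persisted.append(f"{record[0]:.6f} {record[1]}")

worker = threading.Thread(target=drain, daemon=True)
worker.start()

def log(msg):
    # Call-site cost is roughly a timestamp plus a queue put;
    # the application never waits on the sink.
    log_queue.put((time.time(), msg))

log("request started")
log("request finished")

log_queue.put(None)  # flush and stop the worker
worker.join()
print(len(persisted))
```

The queue absorbs bursts, so only the *average* log rate has to stay below what the sink can sustain - if it doesn't, the queue grows without bound (or, with a bounded queue, you choose between blocking and dropping).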
Back when copyright extension was being debated, the maximalist camp (which included Sonny Bono) floated a term of "forever minus one day," so there are hacks around "indefinitely".
I actually met Jake in person, more than a decade ago, when I was doing freelance tech support and his parents needed some networking help.
Extremely driven guy, and also super interested in the why of everything I was doing and the debugging process.
Also, he had the first Kinesis keyboard I ever saw in person, which kind of pushed me down the build-your-own-keyboard route - that really helped later when I was having RSI issues.