Last month I paid $1 for route53 and 12 cents for S3, and am hosting my https site, which I create with Publii. Works fine. Took a little time to set up, but now it is just a button push to update the site. Great for my personal blog and photo gallery.
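For anyone curious what that button push boils down to, here's a minimal sketch of pushing a Publii static export up to S3 with boto3. The bucket name and output folder are placeholders rather than details from my actual setup, and the bucket is assumed to already be configured for static website hosting behind route53:

    # Hypothetical sketch: upload a static site export to an S3 bucket.
    import mimetypes
    from pathlib import Path

    import boto3

    BUCKET = "example-blog-bucket"   # placeholder bucket name
    SITE_DIR = Path("output")        # placeholder export folder

    s3 = boto3.client("s3")

    for path in SITE_DIR.rglob("*"):
        if not path.is_file():
            continue
        key = path.relative_to(SITE_DIR).as_posix()
        # Guess a content type so browsers render HTML/CSS/images correctly.
        content_type = mimetypes.guess_type(path.name)[0] or "application/octet-stream"
        s3.upload_file(str(path), BUCKET, key, ExtraArgs={"ContentType": content_type})
        print("uploaded", key)

In practice the AWS CLI's "aws s3 sync" does roughly the same thing in one command.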
I pay 80 cents for the same as you. I only have 1 domain on route53, but I do also have a Lambda and an API Gateway mixed in there that get infrequent to zero use.
I was one of the first sailors to go through DS A school at Mare Island. Previously the DS went through ET school first. We learned the purpose of every gate in the UDT, a 15-bit computer with 512 words of memory, before stepping up to the 642A and 642B computers. This was in 1967, when it was still possible to know how every bit of hardware and software worked.
'67... wow. I'd bet I understand the reason for that kind of training. Folks who remembered (for instance) the Second Naval Battle of Guadalcanal (14–15 November 1942), and the vast differences in the performance of the battleships USS Washington and USS South Dakota in that battle, would still be serving. Admiral Lee, aboard the Washington, had an incredibly detailed understanding of the ship and its systems. He made "sink enemy battleship while taking no damage" look easy. By contrast, the South Dakota's massive screw-ups in her electrical switchboard room, just before the battle got interesting, converted her into a helpless, easy target for enemy fire.
_Battleship at War_ by Ivan Musicant is the full story of the USS Washington. It does a good job of detailing that battle from the Washington's perspective, and it also covers the early development of the Combat Information Center: one of the first implementations of the manual processes described at the beginning of the article (grease pencil on plastic, reverse writing, etc.) came aboard the USS Washington so that Admiral Lee could have improved situational awareness inside the citadel, a tiny armored room with basically no windows.
Read it 30 years ago and still remember it vividly today.
Shoot, I forget... but wasn't there a ton of material written re the Naval Battle of Guadalcanal? I vaguely remember the name Samuel Eliot Morison. Also, as a kid, I remember a Natl Geo-style writeup by Robert Ballard, since he led an expedition to use ROVs to look at the wrecks in Iron Bottom Sound.
I think Lee got a bit of luck too, since all of the rest of his fleet got blown up around him and the Washington was able to take advantage of the fact that all the attention was on the burning destroyers and the South Dakota, while he could take shots at will.
Samuel Eliot Morison wrote a definitive history of the naval war in the Pacific, so your memory is good! Ian Toll's new series is shorter, but also top notch and benefits from more access to Japanese sources.
Along those lines, a biography video for Admiral Lee by a naval history enthusiast really manages to capture what an impressive and capable person he was: https://www.youtube.com/watch?v=58lfaMFUQc0
Thanks. I have no complaint, but I want to point out that there is a difference between an enthusiast and a professional. Enthusiasts don't know what they don't know.
Now that is cool. I have to ask: is there a moment in your memory where, looking back, you kind of realized, "Wow, I can no longer keep track of everything going on with these computers"? (In regards to it being possible to know how every bit of hardware/software worked.)
It's still possible to do so. Just get an EE degree and you will understand hardware down to the gate level. Take a solid CS course and you will understand software down to the basic levels. Understanding everything from the hardware up through the OS is something that a lot of people still manage; what is difficult to know these days is the layers of third-party software running on top of the OS.
You can analyze any single part of a modern large piece of software, but I think the point is that you can no longer remember the entirety of the software or hardware. Even a single function is going to get compiled through multiple layers of obfuscation before it hits the hardware, and at that point modern CPUs are also extremely convoluted. Nobody is going to know with absolute certainty how a function will execute on your OS.
From what I've read from others, a PDP-8 or maybe a PDP-11 is about the limit. With TSS-8 they got a 12-bit computer with 32 KiW of memory to work as a time-sharing system for 17 users. So they were still quite capable.
Contrary to some sibling posts, datapath width isn't really the limiting factor for comprehension IMO; everything is "wider" but not more difficult to understand. For example, undergraduate computer engineering students could and did design pipelined in-order 32-bit processors as part of their studies.
At least IMO, superscalar and out-of-order execution were when things really became too complex to hold the entire processor in one's head.
> datapath width isn't really the limiting factor for comprehension IMO
People reckon that changing the head gaskets on an old Rover V8 engine is a complicated and scary job, but it's exactly the same as doing it on an old Mini A-series engine; you've just got to do two of them and they're twice as big.
I can go online and get all of my test results, including graphs that show change over time, as well as X-ray reports. The only thing not online is surgical notes, but I easily got copies of them. This is from the UCLA healthcare system. Also, Quest labs has online results for patients. This is in CA. My brother-in-law could not get results from tests in Virginia to get a second opinion. Perhaps it varies by state.
DEC wanted to sell AlphaServers (and VMS). AltaVista was a great demonstration of scale, speed, and distributed computing (versus a giant mainframe): crawling and indexing speed, tons of open IP connections, security, etc. were all selling points. Bringing large customers to see the physical AltaVista, and also the IP exchange, helped sell lots of systems. DEC was a hardware company; Google wasn't. Made a lot of difference in priorities.