Quite big tiny particles in this application: xenon is fairly hefty, with an atomic number of 54 - more than double iron's 26.
And you need quite a bit of it: even fairly small spacecraft like probes can carry nearly a tonne of the stuff. Which, considering only 30-40ish tonnes are extracted per year, at a cost of about 1.5ish dollars per gram, is quite a bit!
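Taking those figures at face value, a tonne of xenon works out to roughly 1,000,000 g × $1.5/g ≈ $1.5M, and a single probe's load is on the order of 3% of a year's worldwide extraction.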
Ions are small enough that you can bring enough for a whole trip pretty easily. Yes they're still consumable, but you need only a tiny fraction of the reaction mass a conventional rocket does.
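The reason the numbers work out is just the rocket equation; as a rough sketch (the standard Tsiolkovsky form, not anything specific to a particular thruster):

    m_propellant = m_dry * (exp(delta_v / v_exhaust) - 1)

Chemical engines exhaust at roughly 3-4.5 km/s, while ion thrusters manage something like 20-50 km/s, so for the same delta-v the exponent is an order of magnitude smaller and the propellant fraction collapses - which is why a tonne of very expensive xenon can still beat many tonnes of cheap chemical propellant.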
Showing a completely legal electronics market in HQB in Shenzhen and claiming it's selling stolen phones is rather unfair. There is a building not far from that market that sells and recycles phones, mostly by stripping them for parts and rebuilding them from scratch, but it's not that perfectly legal market that is so much fun to shop at.
I think the "metastability can't destroy a chip" thing is not true: you can get a flop into a state where it's oscillating at whatever frequency the internal feedback path supports (maybe up to GHz) rather than resolving to a stable 1 or 0. That can propagate to adjacent flops, resulting in a bunch of flops pulling too much current.
Like anything to do with metastability this is a statistical thing - it can do this, but it's highly unlikely.
I worked on a chip in the mid 90s where we were very careful about our clock crossings: dropped in special high-gain anti-metastability flops, designed logic to reduce synchronised signal frequencies, etc. etc. - all the good stuff. We calculated that we'd see a failure (and mostly that would be a pixel burble on the screen) every year or so - at the time Win95 couldn't stay up a week, so management decided to ship it.
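For anyone wondering how you even calculate a failure rate like that: the usual back-of-the-envelope synchroniser model (the generic textbook form, not the actual figures from that chip) is

    MTBF ≈ exp(t_r / tau) / (T_w * f_clk * f_data)

where t_r is the settling time the flop gets before its output is sampled, tau is its regeneration time constant, T_w its metastability window, and f_clk / f_data the clock and data-transition rates. The exponential is why adding a synchroniser stage (more settling time) buys orders of magnitude, and why lowering the frequency of the synchronised signals helps.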
In the late 90s, I administered the software test lab for a nuclear engineering consultancy, including UNIX workstations and Windows beige boxes I had built, in addition to some servers, the LAN, and the WAN. There were a few Windows 95/98 boxes that were imaged for testing and contained 4 handy-dandy clones that could be copied (prior to widespread usage of Ghost). A box would regularly stay running for a month at a time when disused, but I believe it was set to simply reboot to the first primary partition on BSOD. I bet at least one of the event logs contained entries like these, because centralized syslog/log shipping and monitoring weren't set up.
CS/EE here. Malfunction still leads to dragons overall. The problems of sequential logic looping to itself and talking to the outside world under various conditions include (but aren't limited to): getting predictable initialization, predictable durability, matching input impedance, and creating chips with characterizable and reliable setup/hold/delay/etc. times.
These days, I leave chip design to chip designers and barely do silly things like building seven (7) 4-pin-to-3-pin Arduino PWM fan controllers with MOSFETs, MOSFET protection, and noise-reduction circuitry. See, I have to keep the fans fed with over 4 volts so the tach signal continues and the storage array's BMC doesn't freak out. (Fans with the characteristics I needed aren't/weren't available in 4-pin PWM.) I try not to shock myself like ElectroBOOM or release too much magic smoke from gear, or vintage gear, that might not be replaceable.
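In case it helps anyone doing something similar, the duty-floor part is trivial in the sketch itself. A minimal illustration, assuming a low-side MOSFET on pin 9 driving a 12 V 3-pin fan and a floor of about a third duty - the pin number and exact floor are made up, and the real thing would track temperature or the BMC's PWM request instead of a constant:

    // Minimal Arduino sketch: PWM a 3-pin fan through a MOSFET on pin 9,
    // but never let the duty drop so low that the tach stops pulsing.
    // Pin choice and the ~33% floor are illustrative values only.
    const int FAN_PIN  = 9;    // PWM output to the MOSFET gate
    const int MIN_DUTY = 85;   // ~33% of 255, roughly 4 V average on a 12 V fan

    void setFanSpeed(int requested) {                    // requested: 0..255
      int duty = constrain(requested, MIN_DUTY, 255);    // clamp so the tach keeps running
      analogWrite(FAN_PIN, duty);
    }

    void setup() {
      pinMode(FAN_PIN, OUTPUT);
      setFanSpeed(255);                                  // start at full speed, fail safe
    }

    void loop() {
      setFanSpeed(128);    // placeholder: replace with a temperature- or BMC-driven value
      delay(1000);
    }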
Not really - I worked on a DSP with 9-bit bytes in the 90s (it was focused on MPEG decode for DVDs, new at the time), largely because memory was still very expensive and MPEG-2 needed 9-bit frame-difference calculations (most people do this as 16 bits these days, but back then memory was expensive and you could buy 9-bit parity RAM chips).
It had 512 72-bit registers and was very SIMD/VLIW - it was probably the only machine ever with 81-bit instructions.
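For the curious, the 9 bits fall straight out of the arithmetic: the difference between two 8-bit pixel values spans -255 to +255, which needs 9 bits as a signed quantity. And presumably the other odd widths follow from that byte size - 72-bit registers would be eight 9-bit bytes, and the 81-bit instructions nine.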
Impressive, glad the alarm chain works. And from what you say, the warning message is also clear and understandable - not tech or geology jargon that people don't understand and then take no action, or the wrong one.
I doubt that he understood how unions must work, because the "variant record" he put in Pascal in 1970 was really bad - worse even than the initial proposal of John McCarthy - while the "union" of ALGOL 68 was pretty decent.
Implementing disjoint unions correctly, i.e. allowing them to be used only exactly like an enumeration in the variable tested by a select/case/switch statement, and then only as the correct type in each of the alternatives, introduces a little overhead in a compiler, but it is certainly not a serious source of bloat in comparison with most other things a compiler must do.
If the programming language designer had clearly in mind how disjoint union types must work, they would have been easy to implement even for the minicomputers and microcomputers of the seventies.
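To make it concrete, here is roughly what a checked disjoint union buys you, sketched with C++'s std::variant (obviously not seventies technology, just the nearest widely available analogue, and the Result type is made up for illustration):

    #include <iostream>
    #include <string>
    #include <type_traits>
    #include <variant>

    // A disjoint union: the value is either an int error code or a string payload,
    // and the tag is maintained by the language rather than by hand.
    using Result = std::variant<int, std::string>;

    void report(const Result& r) {
      // std::visit plays the role of the case/select statement: each alternative
      // is only ever seen at its own type, so the int can never be read as a string.
      std::visit([](const auto& v) {
        if constexpr (std::is_same_v<std::decay_t<decltype(v)>, int>)
          std::cout << "error code " << v << '\n';
        else
          std::cout << "payload \"" << v << "\"\n";
      }, r);
    }

    int main() {
      report(Result{42});
      report(Result{std::string{"ok"}});
    }

Pascal's variant records, by contrast, let you read whichever alternative you liked regardless of the tag, which is exactly the complaint above.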
Niklaus Wirth eventually started designing for minimalism; while I greatly enjoy some of the languages he designed, the ultimate minimalism he pursued in Oberon-07 was clearly not to my taste.
Even the Go 1.0 type system is more advanced than Oberon-07 in its final form.
In my opinion, the most important contributions of Niklaus Wirth to programming languages have been in his early work on Euler, PL360 and ALGOL W, which introduced various features that were novel at that time.
Starting with Pascal in 1970, his programming languages remained reasonably good for teaching programming and how to implement a compiler, due to their simplicity, but all of them were seriously behind contemporaneous languages.
While Xerox's Mesa was a nice and innovative language, the same cannot be said about Modula, Wirth's attempt to reimplement similar features after his sabbatical at Xerox, which was only mediocre.
On the other hand, Wirth's early languages were very innovative; e.g. Euler was one of the first two languages with pointers, the other being CPL. In contrast with CPL, which had implicit pointer dereferencing, Euler had explicit address-of and indirection operators, and it got their syntax right - unlike C, where the indirection operator has been mistakenly defined as prefix instead of postfix.
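A small illustration of the prefix-versus-postfix point (the Node type is made up, and the Pascal/Modula-style spelling is shown in comments for comparison):

    struct Node { int value; Node* next; };

    int second_value(Node* p) {
      // C's prefix '*' needs either parentheses or the special-case '->' operator:
      //   (*(*p).next).value     or     p->next->value
      // A postfix dereference reads strictly left to right with no extra operator:
      //   p^.next^.value         (Pascal/Modula spelling)
      return p->next->value;
    }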
However, I still think the world would have been better off with Modula-2 than C; unfortunately marketing was never Niklaus Wirth's strong point, and no mainstream OS made it unavoidable.
Zig is basically the Modula-2 type system in C clothing, plus comptime - if only people had been equally hyped back in the 1980s.
The usual "Why Pascal..." critique falls flat in the presence of Modula-2, which was actually designed as a systems language, not as a language for learning programming.
Oberon (the 1992 original), with its simplicity, introduced a wider audience to the idea that systems programming with automatic resource management isn't something out of this world, even though Cedar is more interesting feature-wise.
I was more interested in Component Pascal and Active Oberon, even though those were the work of other researchers at ETHZ.
Nonetheless it was his work that inspired me to dig into everything Xerox PARC was doing and to discover there was more happening there than just Smalltalk.
I was amazed at the work done across Interlisp-D, Mesa, and Cedar - how advanced their ideas were of what an IDE is supposed to look like, ideas that many mainstream languages still can't offer.
So in a sense that was also a contribution from Niklaus Wirth to everyone who got interested in his work and decided to go down the rabbit hole.
I made a living porting Unix to new hardware in the mid-to-late 80s, so all the busses of the day came across my desk, and I designed graphics card silicon in the 90s.
S-100 was very much history by the time the boxes in the article were designed (5-10 years before). VME and Multibus were the first-generation workstation busses, PCs had ISA->EISA, Macs had NuBus/NuBus90 - all of them converged on PCI once chips with enough pins could be packaged cheaply enough (plastic rather than ceramic, 200+ pins).
Algol-68 gets a bum rap here - it brought us 'struct', 'union' (and tagged unions), a universal type system, operator declarations, a standard library, and a bunch more. Wirth worked on the original design committee; Pascal reads like someone implementing the easy parts of Algol-68.
The things it got wrong were mostly its rigorous mathematical definition (syntax and semantics) that was almost unreadable by humans ... and its use of 2 character sets (this was in the days of cards) rather than reserved words.
It doesn’t help that this is a Vitesse Semiconductor part that became a MicroSemi part that became a Microchip part through a bunch of mergers and acquisitions…
I'm a sometimes CPU architect and came here to argue just this - modern CPUs have far, far slower memory access (in clocks) than Z80 memory access. To be fair, you can probably fit any Z80 table you're using into a modern L1 cache, but even so you're looking at multiple clocks rather than 1.
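Some very rough, microarchitecture-dependent numbers for context (illustrative figures, not measurements from any particular part):

    Z80 memory access:   ~3-4 T-states, every time, no cache
    Modern L1 hit:       ~4-5 cycles
    Modern L2 hit:       ~12-15 cycles
    DRAM miss:           ~200-400 cycles

So a table lookup that cost a fixed handful of T-states on a Z80 only stays comparably cheap today if the table lives in L1; miss the caches and it's hundreds of cycles.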