David F. Noble's "The Religion of Technology" is quite a good read for seeing how we arrived at this from monastic origins.
Some points in the book became extremely obvious after reading it, like the belief that technology can (and will) overcome human limitations, restoring us to a perfect/divine state, and that technological progress is the path to transcending physical and moral limits (all of which we see in the techno-optimism of the tech industry).
It's been around forever; we're just seeing a new dress-up of it in the information age. The way tech "leaders" speak has been mocked to no end in Silicon Valley because of how insufferably close to religion all of this is...
Yes, and it's worth pointing out these examples precisely because they don't work as quantum memories. Two more: magnetic memory, based on magnets which are magnetic because they are built from many tiny (atomic) magnets, all (mostly) in agreement. Optical storage is similar, much like the parent's example of a signal being slowly sent over a wire.
So the next question is: why doesn't this work for quantum information? And this is a really great question which gets at the heart of quantum versus classical. Classical information is just so fantastically easy to duplicate that normally we don't even notice it; it's just too obvious a fact... until we get to quantum.
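To make "until we get to quantum" concrete, here is the textbook no-cloning argument as a short math sketch (my addition; standard linearity reasoning, nothing specific to this thread):

    % Suppose a single unitary U could copy *any* state onto a blank register:
    \[
      U\,(\lvert\psi\rangle \otimes \lvert 0\rangle)
        = \lvert\psi\rangle \otimes \lvert\psi\rangle
      \quad \text{for all } \lvert\psi\rangle .
    \]
    % Unitaries preserve inner products, so for any two states psi and phi:
    \[
      \langle\psi\vert\phi\rangle
        = \bigl(\langle\psi\rvert \otimes \langle 0\rvert\bigr)\,
          U^{\dagger} U \,
          \bigl(\lvert\phi\rangle \otimes \lvert 0\rangle\bigr)
        = \langle\psi\vert\phi\rangle^{2},
    \]
    % which forces <psi|phi> to be 0 or 1: the machine can only copy states
    % that are identical or orthogonal, i.e. effectively classical ones.

So the effortless copying we rely on classically is exactly what quantum states refuse to give you.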
Why can't it be 8? The fact that it's a trit doesn't put any constraint on the byte (tryte?) size. You could actually make it 5 or 6 trits (~7.9 or ~9.5 bits) for similar information density. The Setun used 6-trit addressable units.
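For reference, the raw information content is just n·log2(3) bits per n trits (my arithmetic):

    % Information content of an n-trit word, in bits:
    \[
      n \cdot \log_2 3 \approx 1.585\,n :
      \qquad
      5\ \text{trits} \approx 7.92\ \text{bits}, \quad
      6\ \text{trits} \approx 9.51\ \text{bits}, \quad
      8\ \text{trits} \approx 12.68\ \text{bits}.
    \]

So 5 trits is the closest match to an 8-bit byte, and the Setun's 6-trit unit carries about a bit and a half more.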
Yes: the requirement that the "addition" be commutative for distributivity to work with a noncommutative "multiplication" is exactly why my current model for code* is based on an (anathema to practicing programmers) unordered choice.
* and also data; with suitable choices of definitions there are very few differences between the left- and right-hand arguments to application.
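(For anyone wondering why distributivity forces the commutativity, here's the classic two-way expansion, assuming a multiplicative unit and additive cancellation:)

    % Expand (1+1)(a+b) two ways:
    \[
      (1+1)(a+b) = 1\,(a+b) + 1\,(a+b) = a + b + a + b ,
    \]
    \[
      (1+1)(a+b) = (1+1)\,a + (1+1)\,b = a + a + b + b .
    \]
    % Cancel the leading a and the trailing b:  b + a = a + b.
    % So with a unit and cancellation, distributivity leaves no room
    % for a noncommutative "addition".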
Can we add caffeine as a cause? It was introduced in England around 1650, and then a few decades of people getting high on it and voilà, steam power!
For me, objects capture some notion of invariants in data that would otherwise just be a bunch of variables. This is usually indicated by functions that act on the data while maintaining the invariants. Those functions are the methods.
I get that many people either don't see, or don't place much importance on, the invariants of their data, and so they are happy writing functions and even get annoyed by other people's overuse of objects. Maybe these people are just way smarter than me; I definitely need objects/classes to help me organize my code.
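A tiny Rust sketch of what I mean (the type and names are made up for illustration): the invariant is "the vector stays sorted", the field is private, and the methods are exactly the operations that keep the invariant true.

    // Invariant: `items` is always sorted ascending.
    pub struct SortedVec {
        items: Vec<i32>,
    }

    impl SortedVec {
        pub fn new() -> Self {
            SortedVec { items: Vec::new() }
        }

        // Insert at the position that keeps the vector sorted.
        pub fn insert(&mut self, x: i32) {
            let pos = self.items.partition_point(|&y| y < x);
            self.items.insert(pos, x);
        }

        // Because the invariant always holds, binary search is valid.
        pub fn contains(&self, x: i32) -> bool {
            self.items.binary_search(&x).is_ok()
        }
    }

    fn main() {
        let mut v = SortedVec::new();
        for x in [3, 1, 2] {
            v.insert(x);
        }
        assert!(v.contains(2));
        assert!(!v.contains(4));
    }

With a bare Vec<i32> and free functions, nothing stops some other code from pushing an element onto the end and silently breaking every caller of contains.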
I'm not convinced invariants are crucial. Common, certainly, but not crucial; I think the opacity is more important.
Take Rust's core::net::Ipv4Addr: that's obviously just a 32-bit value inside. It has no invariants to be maintained, but it's opaque. Even though I know it's a 32-bit value, I am actually not able to just directly treat it as the value or prod that value; I have to use certain procedures, which include predicates (to ask if this is, for example, the broadcast address, or a loopback address) and some useful bit operations.
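Concretely (this is just today's stable API, nothing invented):

    use std::net::Ipv4Addr; // the same type also lives at core::net::Ipv4Addr

    fn main() {
        let addr = Ipv4Addr::new(127, 0, 0, 1);

        // No public field to poke at; you go through the procedures:
        assert!(addr.is_loopback());
        assert!(!addr.is_broadcast());

        // Even getting at "the" 32-bit value is an explicit conversion,
        // not direct access to the representation.
        let bits: u32 = addr.into();
        assert_eq!(bits, 0x7f00_0001);
    }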
I'm not convinced you need objects to enforce invariants on your data. Why wouldn't you just use a good type system and type-checker to enforce those invariants? It could check that you're using data in the places where it's meant to be used, without having to bundle data with procedures.
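A Rust sketch of that style (names are mine, purely illustrative): one checked constructor produces the type, and from then on plain free functions just demand the type; no method bundle required.

    // Newtype whose only invariant is "value is in 0..=100".
    pub struct Percent(u8);

    // The single place the invariant is checked.
    pub fn percent(n: u8) -> Option<Percent> {
        if n <= 100 { Some(Percent(n)) } else { None }
    }

    // A free function: the type-checker guarantees the argument
    // was validated, so data and procedures stay unbundled.
    pub fn apply_discount(price_cents: u64, off: &Percent) -> u64 {
        price_cents - price_cents * off.0 as u64 / 100
    }

    fn main() {
        let off = percent(25).expect("valid percentage");
        assert_eq!(apply_discount(1000, &off), 750);
        assert!(percent(150).is_none()); // rejected at construction
    }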
Yeah, I'm guessing these poles just get out in front of the wing aerodynamics enough that they can sense what's going on with the air, so that air pockets/turbulence can be predicted. Either that, or we are missing a decimal place; maybe they mean 10 milliseconds?
I don't think it's possible for a packer to become a mapper. We're talking about a strategy (being a packer) with deep-seated emotional roots; they have probably been at it since age 3. Rather than trying to turn them into a mapper, it's better to find what they are actually good at.