The weird way that our memory works, and how it differs from the way computer memory works, is also a fundamental reason why it's so difficult to get privacy and data retention right in the digital age.
Ever had someone with a "photographic memory" pay attention to your daily life? It can get creepy very quickly, because she notices and vividly remembers all sorts of things you never imagined anyone would notice. But nowadays, with Facebook and Instagram and sophisticated tracking everywhere, all sorts of strangers have a literally photographic memory of what you said and did last summer. I've seen lots of people obsessively delete stuff from their timelines after only a short while, not because they're paranoid, but because feeling uneasy is simply the normal response to a violation of our intuitions about how memory is supposed to work. (Unfortunately, our normal response no longer has its normal consequences; the data still exists somewhere.)
We're meant to forget. We're made to rewrite our memories as time goes on, just as any living thing constantly rebuilds itself over time. Homo sapiens never evolved to be reliable preservers of pixel-perfect information. No matter how much our technology and legal frameworks try to promote 99.999999999% durability as the gold standard of memory (Amazon S3 actually promises eleven 9's), we need to remember that this concept of memory is a very modern invention that most people still have trouble coping with. Maybe we never will. Maybe we never should. Maybe computers will learn to forget like us, rather than us learning to remember like them.
You're right, but so far information overload has been mitigating these issues. All that information may be out there but we can only look at it very selectively, which limits our ability to draw conclusions from it.
For instance, imagine an embarrassing picture that shows you doing something stupid in an obviously intoxicated state. OK, so you were very drunk one night in 2005. So what? Millions of others were too. But what if everything publicly known about you could be mined for clues as to whether or not you have an alcohol problem?
I think that's what we should keep in mind every time some company tells us "we don't ever look at your data, only our algorithms do" while the same company is busy working on AI that draws conclusions from data.
And yet in a way, computers are ALREADY learning to forget like us. Show me one search engine that doesn't promote recency. All the big internet-wide search engines do, and all the per-site social media search engines do as well.
And it seems that creeping deep through someone's social timeline is fast becoming a faux pas. You already avoid doing it, and if you do end up doing it, leaving a trace is mortifying.
To the point that people will never mention in person that they saw something on your profile from a long time ago. As someone with a strong-ish internet presence, I find it funny to see how flustered people get when they accidentally reveal that they know more than they feel it's polite for them to know.
It's the same way we've always had the norm that you let people tell their stories even when you've heard the same story ten times already. It's just polite.
> Show me one search engine that doesn't promote recency. All the big internet-wide search engines do, and all the per-site social media search engines do as well.
And this seems like an annoying bug, not a feature. The Internet is becoming more and more of an ephemeral place. That interesting news article you read a few years ago? Its URL has probably changed twice, then the article was deleted, and even if it wasn't, unless you remember the exact wording of the headline you probably won't find it on Google, because it will be buried under three pages of recent tweets that match your search query - tweets no one will care about or remember next week.
People do want to remember some things perfectly - that's why we invented writing, textbooks, diaries and libraries.
It's annoying as hell, because Google promotes "freshness" over quality. You can get a very high position in Google by knocking off someone else's tech news story while removing half the facts and adding nothing of any consequence, as long as you do it quickly. This kind of blogspam also vanishes quickly, but only after the damage has been done. (A rough sketch of how a freshness boost can drown out quality is below.)
Meanwhile anything that might actually be worth reading is somewhere around page 10...
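Here's that rough sketch - a toy scoring function where a freshness boost swamps a quality signal. All the numbers, weights and half-lives are made up for illustration; nobody outside Google knows the real ranking function.

```python
import math

def toy_score(quality, age_days, half_life_days=2.0):
    """Hypothetical ranking: editorial quality times an exponential freshness boost."""
    freshness = math.exp(-age_days / half_life_days)
    return quality * (1.0 + 5.0 * freshness)  # the boost can dominate quality

original_reporting = toy_score(quality=0.9, age_days=4)    # thorough, but "old"
quick_knockoff     = toy_score(quality=0.3, age_days=0.1)  # thin, but fresh

print(original_reporting, quick_knockoff)
# ~1.51 vs ~1.73: the knockoff outranks the original for a day or two,
# then its freshness boost decays and it sinks out of sight.
```

Turn the boost down and quality wins again, which is basically the complaint: the knob feels set too far toward freshness.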
> Meanwhile anything that might actually be worth reading is somewhere around page 10...
...much like the way things were in 1997 with Google's predecessors. Funny how despite exponentially more layers of code running (with eons of iterative advances), it all still somehow comes full circle.
I feel like I keep seeing this all around. It seems that there's a point in a product's life after which any further optimization to "extract more value" actually degrades things. Companies need to learn when to stop messing with their product and focus on something else.
What you're describing seems to be a natural process through which people try to cope with the new reality. New social norms are created and enforced to make people avoid one another's past timelines, and market forces emerge to make search engines rank more recent pages higher. It's all part of our natural response to what makes us uncomfortable. We respond in such a way as to restore comfort, because being creepy is neither polite nor profitable.
What that doesn't fix, however, is the fact that old information is still out there, on somebody's server, every bit unchanged and ready to resurface at a moment's notice. Sometimes the old information is just a statistic in the grand scheme of Big Data. Other times, it can be very specific, such as the one-liner that allowed the Feds to attribute to Ross Ulbricht the very first mention of Silk Road on the open Internet.
Computers aren't learning to forget; they still remember everything. We're just training ourselves to pretend that computers are just as forgetful as we are, because it's a comfortable thought. The problem, of course, is that collective games of make-believe have a tendency to break down at some point.
We tend to forget easily because that's how we evolved, not because we were meant or made to. And just because we evolved this way doesn't mean it's optimal.
I'm having a bit of a hard time assigning meaning to your first sentence. If we evolved that way, there's no "meant" in evidence, and vice-versa.
And as for optimality, be careful with statements like that (and what you wish for). I think it ceases to be an optimality problem when it's followed by "if everything were completely different."
In gaming, the concept is called a "replay": instead of recording the pixels on the screen every frame, the game records all the inputs processed on every frame and just replays them through the same engine. The action is technically idempotent in the game world.
Where this breaks down is when features get updated between revisions. If your game patched the "jump" function to increase upward velocity from 1.1 m/s to 1.13 m/s, the replay would be incorrect. You would be jumping onto platforms you couldn't get up to before, moving faster, maybe even dodging enemy attacks that hit you when you played that match.
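A minimal sketch of the replay idea, with made-up numbers and a made-up toy engine (not any particular game):

```python
# Record only the per-frame inputs, then re-run them through the engine to
# reconstruct the match. If the engine's constants change, the "same" replay diverges.

GRAVITY = -9.8   # m/s^2
DT = 1.0 / 60.0  # one 60 Hz frame

def simulate(jump_inputs, jump_velocity):
    """Replay a recorded list of per-frame jump inputs through the toy engine."""
    y, vy = 0.0, 0.0
    heights = []
    for pressed in jump_inputs:
        if pressed and y == 0.0:       # can only jump from the ground
            vy = jump_velocity
        vy += GRAVITY * DT
        y = max(0.0, y + vy * DT)
        heights.append(y)
    return heights

recorded = [True] + [False] * 59       # jump on frame 1, then coast for a second

original    = simulate(recorded, jump_velocity=1.10)  # engine at record time
after_patch = simulate(recorded, jump_velocity=1.13)  # engine after the patch

# Same recorded inputs, different world: the replayed jump now peaks higher.
print(max(original), max(after_patch))
```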
The human neuroprocessor is always changing and growing, always revising itself. Thus memories replay incorrectly. You apply old feelings to new mental patterns, and sometimes they lead to weird places. Or sometimes you mistake something easy for being difficult, because your memory data is out-of-date for your current processes.
I think they're using "idempotent" to mean "If you do it again, you'll get the exact same result as the first time".
(The usage isn't quite the same as in the jargon sense of "f is idempotent in case f(f(x)) = f(x)", and I thus found it jarring as well, but it's possible to see the connection in that above gloss.)
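For what it's worth, a tiny sketch of the two senses, with hypothetical functions just to show the contrast:

```python
# Idempotent in the jargon sense: applying it twice is the same as applying it once.
def clamp_height(y, ceiling=10.0):
    return min(y, ceiling)

assert clamp_height(clamp_height(12.0)) == clamp_height(12.0)

# The replay property is really determinism: the same inputs always yield the
# same result, however many separate times you run it.
def step(position, move_right):
    return position + (1 if move_right else 0)

assert step(0, True) == step(0, True)  # deterministic, yet step(step(0, True), True) != step(0, True)
```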
The amygdala doesn't have a direct connection to the visual cortex the way this article suggests. The amygdala receives stimuli from the thalamus, just as the neocortex does, and (iirc) the amygdala's outputs can affect the neocortex, but that's not how the neocortex gets its signals. The signals go to both the amygdala and the neocortex independently, like a fiber optic splitter.
Because thinking takes so long, and probably because the amygdala is so small and simple compared to the neocortex, visual stimuli hit the amygdala well before they hit the visual cortex. We experience a pain-or-pleasure memory response to a stimulus before we even know what that stimulus is.
You may have two memories of a horse: one of just a plain old horse, and one of a horse from when you were getting shocked. Since the amygdala senses the horse first, the emotional memory is returned first, which is why any attempt to bring up "horse" will dig up the emotional horse record. You then have to wait for the neocortex to catch up and say, hold on a second, this is just a regular horse. When your neocortex is completely swamped by the amygdala, it's called an "amygdala hijack".
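A toy illustration of that timing difference - purely a metaphor in code, not a neuroscience model, and the delays are invented:

```python
import threading
import time

responses = []

def amygdala(stimulus):
    time.sleep(0.01)   # fast, coarse route via the thalamus
    responses.append("flinch: resembles the horse from the shock")

def visual_cortex(stimulus):
    time.sleep(0.10)   # slower, detailed identification
    responses.append("identified: just a regular horse, no danger")

# The same stimulus fans out to both pathways in parallel, like a splitter.
threads = [threading.Thread(target=f, args=("horse",)) for f in (amygdala, visual_cortex)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(responses)  # the emotional response lands first, the correction second
```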
The quoted paper is basically saying "the hippocampus and amygdala can affect each other", which is basically true: usually one does not rule the other, and they work in concert to give you a full picture of things. And just like the amygdala can provide an emotional veneer over logical memories, the hippocampus can temper the emotional response, for example with mindfulness practice.
Hyperthymesiacs generally don't have this problem/solution (however you look at it). Personally, I'd rather not remember things wrongly. The only problem with hyperthymesia is you need to have the right personality for it not to be a burden to you (very high emotional stability, for instance), so it wouldn't be great for everyone.
Since human beings are still evolving, I don't think there's any reason to believe we're meant to do things one way rather than another.