"Some experts expressed concerns that trends are leading to a future in which most people become shallow consumers of information, endangering society."
A key part of this argument is that people are becoming (or are going to become) more shallow than those in the past. But how shallow were people in the past? I have a sneaking suspicion that many of these dissenters are just being nostalgic. Anything beyond their own lifespan is also questionable--how do we know how well people were able to focus in previous generations? Are people _really_ more shallow than they used to be, or does it just feel like that?
This whole topic is incredibly nebulous, so I question the value of jumping to these conclusions when we don't even have a clear understanding of what we're talking about in the first place.
I recall reading somewhere that when books were invented, people thought they'd degrade people's ability to accumulate and hold "wisdom". Did that happen? And even before that, when writing was first invented, it was treated as some kind of devil magic. Now, where might I begin finding out where this might've been said?
...but when they came to the letters, “This invention, O king,” said Theuth, “will make the Egyptians wiser and will improve their memories; for it is an elixir of memory and wisdom that I have discovered.”
But Thamus replied, “Most ingenious Theuth, one man has the ability to beget arts, but the ability to judge of their usefulness or harmfulness to their users belongs to another; and now you, who are the father of letters, have been led by your affection to ascribe to them a power the opposite of that which they really possess.
“For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise.” (Plato, Phaedrus 274c-275b)
A more modern critique could talk about personalized caching/flushing strategies and the role of latency in: brain memory, motor memory, paper, offline device, online device, public search, private search & classified search.
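To make that analogy a bit more concrete, here's a minimal sketch of just the latency side of it. The tiers and every number are my own illustrative guesses, not measurements, and the flushing/eviction side of the analogy is left out entirely:

```python
# Toy model of the memory-hierarchy analogy above. Tiers and latencies are
# illustrative guesses, not measurements; eviction/flushing is not modeled.
TIERS = [
    ("brain memory",   0.5),    # seconds to recall something you know cold
    ("paper notes",    30.0),   # flipping through a notebook
    ("offline device", 60.0),   # searching local files
    ("public search",  90.0),   # a quick web lookup
]

def recall(fact, locations):
    """Return (tier, latency) for the fastest tier that holds the fact."""
    for tier, latency in TIERS:
        if tier in locations.get(fact, set()):
            return tier, latency
    return None, float("inf")   # not stored anywhere we can reach

# Example: a date we never memorized but know how to find online.
locations = {"date of the Alhambra Decree": {"public search"}}
print(recall("date of the Alhambra Decree", locations))  # ('public search', 90.0)
```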
It's the same nostalgic lamentation that everyone has been telling for thousands of years. Ancient Greeks used to complain to one another about "kids these days" being ruined by the latest trends. And yet, here we are, thousands of years later, casually toying with the kinds of knowledge that Aristotle would have killed to have a glimpse of.
20 years later: "OMG kids these days are so accustomed to 3D holographic interfaces, they can't seem to comprehend any information presented on a flat surface!"
The only solution to shallow thinking is a good education that encourages critical thinking about things that matter. Electronic gadgets are little more than convenient scapegoats that people can safely blame in order to divert attention from their failure to come up with a proper solution.
> A key part of this argument is that people are becoming (or are going to become) more shallow than those in the past.
I don't think I agree with this. What I do agree with is that modern society, and especially internet usage, seems to degrade the ability to concentrate.
I know that if I have to spend too much "computer time", my ability to stay focused and think hard about a single task seems to degrade. For example, it takes me 2 or 3 books before concentrated reading flows again rather than requiring actual willful effort.
It's definitely a hypothesis I'd like to see tested, because I do believe the increasing shallowness and leaning on technology as a crutch is a real phenomenon in Western society.
A good test might be how people fare when stripped of their technology and conveniences, for some definitions of faring and technology. Testing this ethically might be tough, though.
I'm at a loss for how your proposed test actually tests anything of relevance to modern society, which is built explicitly on the notion that groups of interdependent specialists greatly outperform similarly sized groups of independent generalists (see: Adam Smith and his pin factory). Sure, you could concoct a survivalist rebuild-the-world scenario in which the tables would turn, but you could play the same game for any other specialization in existence ("generalist" becomes just another type of specialist when time constraints are taken into account).
I would choose a society where everyone knew how to use Google, Wikipedia, and Google Scholar over one where every individual could list off the pressure, temperature, and catalyst of the Haber process any day of the week.
Yeah, kids these days with their "clothing" and "language" and "fire" have it so soft. Clearly we should worry about how well they would fare without those things.
Same reason we have search engines. Centralized, low-latency access to a larger dataset than a single person can afford to buy or license. Try searching at http://books.google.com and see the difference in content quality between the "web" and "books".
What will also be interesting is how it will affect education.
We were all told we have to memorize things because we can't carry the textbook around with us; that's simply not true anymore.
Filling our brains with the dates of Civil War battles and the birth/death dates of random authors was completely pointless.
I think that more critical thinking style education will certainly be a good thing.
Educational material should focus much more on how to keep kids interested in the material rather than on making sure they know every detail of every event.
I think the educational system needs a major overhaul. Let kids learn the subjects that interest them at their own pace rather than constant exposure to random bits and pieces. Or at least try it, now that technology is so cheap and powerful we can track and allow access to everything imaginable for education (other than instant learning, but it's probably not too far off).
If I had had something like Khan Academy and a tablet in school, it would have been amazing. Now would be a great time to be starting school; you still need teachers and parents who are willing to inspire and know how to appeal to the interests of children, of course.
> We were all told we have to memorize things because we can't carry the textbook around with us
I was told to memorize things because it's much faster to recall items than it is to look them up.
I also don't consider memorizing certain historical facts "pointless"; I think they are vital for context. The birth/death dates of a certain author may have little value in and of themselves, but they may have a lot to say about the era they lived in and how/why their works were/are relevant.
Agreed that "critical thinking style education" is worlds better than mere memorization, but let's not forget that the devil is in the details, and it's those details that you benefit most from memorizing.
I think it's pointless to have to memorize it. You need to know the events and the general timelines, but does it matter if it was 1492 vs 1500? It's okay for your brain to generalize and compress a lot of that knowledge; the context will still be there for general discussion. If suddenly the exact date does become important, it'll take our kids 90 seconds to look it up.
As technology gets more integrated into our lives (think Siri), that could drop as low as 30 or even 15 seconds.
What's important is that you are able to have an informed and intelligent discussion about the topics you learn in school. The exact year of the Emancipation Proclamation is less important as long as you understand the events around it. Same thing with the melting point of gallium: there's no need to memorize that, as long as you understand the general concept that gallium melts at a very low temperature, and what this means in terms of its uses.
> does it matter if it was 1492 vs 1500? ... If suddenly the exact date does become important, it'll take our kids 90 seconds to look it up
The issue is more subtle than that. In lots of cases the value of knowing the information is when it lets you make connections that you wouldn't otherwise have _realised_ were important.
Let's say you're reading something that happens to mention The Alhambra Decree. You've never heard of this before, so you do a quick Wikipedia lookup, where the first paragraph says:
"The Alhambra Decree (also known as the Edict of Expulsion) was an edict issued on 31 March 1492 by the joint Catholic Monarchs of Spain (Isabella I of Castile and Ferdinand II of Aragon) ordering the expulsion of Jews from the Kingdoms of Castile and Aragon and its territories and possessions by 31 July of that year."
If that "1492" is a well primed trigger point for you — especially in a context involving Spain — you're much more likely to pause for a second and wonder if this might be related to anything else you know about that year. And, although the remainder of this page never mentions Columbus at all, it doesn't take you much searching to discover that there are indeed some very interesting connections between the two, some of which lead to further questions about Columbus' own religious background and leanings.
But if that 1492 doesn't have that trigger effect, you're much less likely to draw any connection here, and simply nod and return to your original reading.
We do that all the time. There are so many things that pass us by constantly because we simply don't even realise there's anything worth paying attention to. The more hooks we create in our brains for those to get snagged on in passing, the deeper we start to understand things, and the more interesting life becomes.
If you are a history major then what you said might be true, but expecting every student in the US to memorize those dates and everything similar (melting points, number of protons/electrons, dates of birth of authors) is insane and very counterproductive. Especially in basic required classes in high-school.
Students are not willing to learn those details and it turns them off to school and wastes valuable time that could be utilized to better prepare them for the real world.
How many professions could use ANY of the information you mentioned in your post?
With ADD medication becoming more and more prevalent in American colleges, is that proof that kids' brains are already rewired, and that society is forcing them to snap back to the old way of thinking?
I think that's more a sign that the current pressures to succeed in college (get good grades, double major, etc.) are extremely high, and the perceived penalties for 'failing' are also quite high. A combination of a high-pressure situation, more work than most people can comfortably manage, and the opportunity to take a pill that allows you to do more sustained, focused work than you would be able to otherwise? I think many people would go for the pills. They're extremely effective at allowing you to maintain an intense focus on an activity for much longer than most unmedicated people would be able to.
There isn't any evidence that I'm aware of that there has been any noticeable change in people's abilities to do actual, focused work: studies today show that most people can at most manage around 4-5 hours of focused work, and as far as I know that hasn't changed much in the last 200+ years. Ritalin can allow a healthy person to put in bursts of 10+ hours of focused work. Of course students are going to take advantage of that. My understanding is that it's also fairly common in other high pressure situations that require prolonged focus (stock trading, etc.).
Things like the web/etc. do allow people to do more multitasking-esque behaviors (with the obvious impacts on focus), but that's less about people's brains being rewired at some fundamental level and more that there are now more opportunities to goof off. Pre-internet, you had a much higher activation energy of chatting with friends (you had to go call them and talk with them, versus clicking three times to get to Facebook). I don't think that's limited to people in the younger generation, though: one of my family members (an English professor, one of the top in her field in the world) will sometimes turn off the internet/block distracting sites while writing and does all of her editing away from the computer in order to avoid Facebook/reading the news/playing scrabble against the computer.
Learning date ranges is important: what years the wars started and ended (approximately), what period an author was born in (1700s, 1800s, etc.). Knowing a specific date and being tested on that exact number is completely pointless.
If you fill your head with dates that you do not recall often, they can easily get confused: you think you remember what date something happened and spit out the wrong one.
No, history is all about writing. That's pretty much its definition. The invention of history happened precisely when people decided to start writing things down for the future.
We prevent the rewriting of history by distributing the writing as widely as possible. And that's precisely what the internet is doing.
> Physical distribution of acid-free paper books are better at preserving history.
I don't doubt that those books have better survivability, but they also represent only the tiny fraction of written information that was worth publishing in book form.
archive.org is a great example of how a relatively small nonprofit organization can now afford to archive historical information at volumes that were previously impossible.
They were referencing unique, long-tail content such as blogs or personal sites, not bit.ly redirectors to mainstream content that is widely replicated on CDNs.
Edit: try using https://addons.mozilla.org/en-US/firefox/addon/resurrect-pag... or similar extension to find caches/archives of dead links that are no longer online at an alternate address. Agreed that link rot is not the same as offline content, but there is digital content which can no longer be found in any archive or search engine.
Even given that, it's not at all comparable to the idea that history is written in books. Books themselves are printed, published, and duplicated if people care to do so. How much content scribbled in journals and other disposables--of which blogs and the like are the digital equivalent--is lost daily?
Content that is initially valued is duplicated. Content that isn't dies away. It isn't that things being digital makes them more transient; instead it makes rare copies far more accessible.
The original comment was about "rewriting history".
History is written by winners, who change over time and have been known to rewrite history. Digital formats require constant transformation from legacy formats to modern formats (per http://fileformat.info). This has an economic cost and only a subset of content is prioritized for forward migration, depending on which parties wield economic power at that time.
A book on acid-free paper that has a reasonable print run will be stored in geographical locations worldwide and will preserve information for decades if not hundreds of years, without requiring battery power, OS upgrades or file format migration.
History is also about economics. Digital storage economics is increasingly driven by cloud vendors' bulk manufacture of custom servers and data centers, e.g. OpenCompute. Centralized copies can be rewritten. If local communities start building their own https://archive.org/web/petabox.php to record local/global digital history in archival file formats, maybe digital can record long-term history, EMPs notwithstanding.
Do you know about Wikipedia's version control and discussion features?
Wikipedia is about 10 GB compressed, meaning that anybody can download a copy and save it forever, for free.
If you include the discussion pages, images, and revisions of Wikipedia, it amounts to several terabytes worth of data (a couple of hundred dollars for a hard drive that size).
There are many backup mediums that are meant for long-term storage. EMP protection is relatively easy.
How many "acid-free paper books" would it take to store that information?
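As a rough back-of-envelope comparison (every figure below is an assumption I'm making for illustration, and images obviously don't translate neatly into printed pages):

```python
# Back-of-envelope only; all figures are rough assumptions for illustration.
wikipedia_bytes = 5e12                               # "several terabytes", per above
chars_per_page = 2_000                               # a dense printed page of plain text
pages_per_book = 500
bytes_per_book = chars_per_page * pages_per_book     # ~1 MB of text per volume

books_needed = wikipedia_bytes / bytes_per_book
print(f"roughly {books_needed:,.0f} printed volumes")  # on the order of millions
```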
Wikipedia is a great example of data in an open format that is intended to be migrated forward as technology changes.
It does not protect against rewriting history; one could argue that Wikipedia exists to rewrite history based on the latest consensus and the reduction of controversy. While anyone could review the change history, in practice most readers use the latest version. There is at least one research paper showing a steady decline in Wikipedia contributors due to tough editorial policies created to thwart spammers.
Books are decentralized from both a physical and editorial point of view. Look back at any major event in history and there will be multiple records and perspectives of the event. Decentralized locations record the subset of events that are locally most important.
At any rate, it's not an either/or situation. Digital will coexist with print and paintings and sculpture and architecture. Each has a different information density, diversity and preservation profile.
For quite some time now (the last 5+ years, which as a 23-year-old seems like a long time) I've thought of the internet as an extension of my brain. I've joked to friends that I simply keep a small "cache" of "pointers" in my brain to the internet, where I know I can find the information if I need it, but I have little need to memorize the details.
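If you wanted to caricature that metaphor in code, it might look something like this; a toy sketch where the topics and URLs are just arbitrary examples I picked, not anything I actually rely on:

```python
# Toy sketch of the "cache of pointers" idea: remember where to look,
# not the content itself. Topics and URLs are arbitrary examples.
POINTERS = {
    "haber process": "https://en.wikipedia.org/wiki/Haber_process",
    "alhambra decree": "https://en.wikipedia.org/wiki/Alhambra_Decree",
}

def look_up(topic):
    """Return a cached pointer if we have one, else fall back to a web search."""
    url = POINTERS.get(topic.lower())
    return url or "https://www.google.com/search?q=" + topic.replace(" ", "+")

print(look_up("Haber process"))               # cached pointer
print(look_up("melting point of gallium"))    # falls back to search
```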
I fully understand that if the internet were to disappear tomorrow I would be very lost, but it's not something worth preparing for IMHO. While I agree with preparing for the future, I do not think preparing for the fall of humanity/technology is a good use of my time. Call me naive or stupid (or lazy), you may be right, but I find little advantage in remembering large swathes of information when it can be called up (from either a local copy or the internet) at will. It's one reason I vehemently opposed the C++ exams I took in college that required me to know the boilerplate of a C++ program, as there would never be a time in my life that I could imagine I would be without an IDE or the internet to generate/fetch it for me.
I'm not saying that I think this is a "more advanced" way of living/existing, just that it is the way I live/exist. I have friends who have expressed similar feelings, but I'd be very interested in what people both older and younger than me think/feel about this issue.
The human brain is incredible at making connections and finding patterns. When you feed more and more information into the brain and think about it the potential for better, more accurate and more profound connections increase. You see an apple fall from a tree and understand how the planets move around the sun.
What worries me is that your "pointer latency" makes it increasingly difficult to get those aha moments. Vast amounts of knowledge at our fingertips but very little understanding and wisdom.
While some say "shallow knowledge", I say "abstracted knowledge"; just like the jump from straight-up experience to discussion to writing to books to public libraries, we can "fill in the blanks" with reasonable accuracy while upgrading to an exponentially larger breadth of information.
What's the techno-environmental equivalent to random mutations? Remix culture, open-source software, fan fiction, fair use, cross-corporate IP licensing, emergent outcomes of search engine SEO and Google's top ten revenue categories?
If we are going to expose children to technology, should we expose them to open-ended technology with many degrees of freedom and possible innovation, or walled gardens/devices with a limited menu of pre-approved operations?
Imho, while it is not important to blindly memorize data, it is difficult to build mental models if you don't hold some information in your brain. For starters, it is not quite feasible to have all the information stored in a secondary location (the internet/cloud) while still having your brain process that information to draw conclusions, to the best of its ability. Even if it were possible, in such a situation a person would be completely dependent on the source of information, and could be easily misled by a wrong resource--because they might have little primary data, developed from their experience of the world, on which to base their decision making. In a sense, how does one develop a sense of whether or not to believe something one reads on the internet, without regard to the reputation of the source? This is an example of what I would call critical thinking.
If one is always used to having the information at one's fingertips, then how does one acquire the ability to "figure things out"?
Great quote from the Asimov story: "It won’t do to say to a man, ‘You can create. Do so.’ It is much safer to wait for a man to say, ‘I can create, and I will do so whether you wish it or not.’"
I don't understand how this article can claim that our brains are malleable, but then go on to claim that our children's brains will be wired differently from ours.
Young brains are extremely malleable. Our brains grow (and we learn fastest) during infancy and toddler-hood with a second, less impressive, burst during adolescence. So, our children's brains are malleable in the sense that they readily adjust to this new reality while their brains are growing fastest, but since the reality during their most formative years is different from the reality during our formative years, their brains will be wired differently.
One could argue that this happens to a certain extent with every generation, however, and it is not necessarily a new thing. One might also argue that as technology changes faster, the inter-generational gap grows larger. One might also argue that while our adult brains change more slowly than a toddler brain, they do keep changing which might mean the difference is not so large after all. Interesting questions to be sure.
Most IQ tests use standard questions in different combinations. So maybe not so hard? Find the image, then find the page it's featured on, and maybe find the answer.
"Some experts expressed concerns that trends are leading to a future in which most people become shallow consumers of information, endangering society."