I’m not sure kids, especially recently born ones, will ‘see’ computers, in the same way that most people don’t ‘see’ the abundant resources around us: air, water taps, power, cars, the internet, etc. These things will always have been there. Most children have a rectangle with multiple circles on it pointing at them from their first hour on earth.
I can answer for my kids. Everything they do is in the cloud and accessible on any device. I'm not sure they'd understand why anyone would want a thumb drive.
As for attachments on emails, they don't use them (even for a 100% online high school). To turn in a paper they can just hit "share" and select their teacher.
Files and filesystems are almost dead to them. There is a little hope... My son googled to learn why a game was going slow and answers indicated he should delete the temp files. That was his first experience needing the file explorer.
As someone who grew up with DOS and turned in CS homework on a floppy disk, it is almost hard to watch.
I can't speak for iPhones, but Android has a file system and the ability to create directories. I think it would be weird if Apple products didn't have that as well, even if it's a bit hidden. I don't have any form of evidence to back this up, but I think that this has more to do with apps, not the device that kids use. Having everything be an app makes it seem like you have multiple independent things on your device, and the device itself is nothing more than a way to connect to those apps.
It does, but using it is really optional. It works equally well if you leave everything in the default Drive directory and use search to find whatever you need.
If you're not used to thinking in terms of files, it just looks like a list of your most recent documents, with a search bar to find anything else.
I just read through the circa 2010 retrospective on how the Drive team tried to come up with their brand. 99% of the concepts were heavily based on the traditional physical idioms of files, mechanical hard disks, etc. iCloud was released midway through the project, so they then started making “cloud file” and “cloud disk” concepts. It was nearly an accident that they ended up with an abstract shape like Chrome’s. But ultimately the Google Drive icon has stood the test of time, while files and disks have faded into history.
Kids don't really use email these days, or usb sticks. They share more than we ever did, but directly via apps or mobile which doesn't work with "files". It seems reasonable in this case the thing missing is the abstraction of a "file". Think of how your phone works, you share pictures, video, text, posts, etc. but never files. Sharing to email is my phone's 5th or 6th priority option. It comes after a boatload of social media options, messaging, online storage, even some sort of AWS application I've never even launched.
I haven't used a USB stick in years as a tech person and virtually all my day to day document processing is in the cloud. It wouldn't surprise me if the person in question does know what a file is, but doesn't work with them enough as files to really get it.
It’s probably my paranoia but I assume something will go wrong with the network at the conference where I’m presenting. I usually have multiple types of backups.
Plenty of bright people don't understand the first thing about computers, even in the generations that came before Gen Z. It just means their calling is potentially in something else. It is possible that the majority of techie youngsters are so used to the abstractions around dealing with filesystems on modern technology that files and filesystems elude them. It's also possible your nephew is just sharp in another area that only requires understanding how to open and use a web browser, and that plenty of other youths exist with cultures around computers similar to the ones we had growing up.
Then again, a Gen Z software developer at the company I work at was confused about how to download a torrent (not that it matters, but the contents of the torrent were legal). That was at least a bit of a shock for me, as it's difficult for me to imagine someone getting into tech without having gone through the usual piracy phase (that phase lasting shorter or longer depending on the person).
The concept of a file is pretty interesting in itself, though: a buffer that lives on disk and is referenced indirectly. In Linux, everything is a file. In Emacs, everything is a buffer, and then maybe a file.
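A minimal sketch of what that indirection buys you, assuming a Linux box and Python (the paths are only illustrative): a regular file on disk, a kernel pseudo-file and a device all go through the same open/read/close calls.

    # Same interface whether the bytes come from disk, the kernel or a device.
    for path in ["/etc/hostname", "/proc/version", "/dev/urandom"]:
        with open(path, "rb") as f:
            data = f.read(32)
        print(path, "->", data)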
Sure, the concept of file itself is an abstraction.
Sure, but using an abstraction to make something understandable is a whole different thing from just handwaving away everything technical about it. They're not variations on the same thing.
What is he studying? I feel like a lot of older people in non technical fields, even if they work with files, could not reason through exactly how to define a file.
Can it exist in more than one place? When a file is opened, is the file what is in the processor and cache/RAM, or what gets serialized to some persistent media?
Metaphorically, is a file a sheet of paper or folded tan cardstock that holds many sheets of paper?
Is a file unitary or can it contain other files? Is a directory a file? Is a link? Is a device a file?
I mean, a file is an abstraction over several of the choices you're asking about in these questions, and different people will answer differently. P.S. I've worked on filesystem drivers...
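For what it's worth, here's a tiny sketch (assuming a Unix-like system and Python; the paths are only illustrative) of how the kernel itself answers the "is a directory/link/device a file?" questions: the same lstat call works on all of them, and only the type bits differ.

    import os, stat

    # Directories, devices and symlinks all answer to the same stat interface;
    # only the mode bits tell them apart.
    for path in ["/tmp", "/dev/null", "/etc/hostname"]:
        mode = os.lstat(path).st_mode
        if stat.S_ISDIR(mode):
            kind = "directory"
        elif stat.S_ISLNK(mode):
            kind = "symlink"
        elif stat.S_ISCHR(mode):
            kind = "character device"
        else:
            kind = "regular file"
        print(path, "->", kind)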
I read you as needlessly combative here, getting far from the points made. Commenter said they told a kid that websites are just a collection of files, kid replied "what's a file?", and I chimed in that that's not so strange, because even people who work with them might have a hard time nailing down a definition. None of this is incompatible with your soap box, but you're writing as if we disagree.
This reminds me of when I took an Intro to Programming with C++ course at a community college almost 10 years ago. There weren’t any basic computer usage prerequisites for this course. So I guess I shouldn’t have been so surprised that at least one classmate did not know how to locate the source code file they created for their Hello World task in the first lecture.
I keep hearing about how the #1 profession every kid wants to be is YouTuber or influencer or something like that. But don't these professionals rely on high-end graphics/audio/video editing software to produce polished videos? Doesn't all of that require a high degree of computer skills, including using files?
Non-IT parents think their kids "are good with computers" because they can tap the next video on YT.
IT parents show their kids what an OS is, what a file explorer is, etc. Perhaps it's useless, but it sheds light on what's behind those curtains, and kids start understanding the mechanics of 'stuff'.
I totally relate to this, but from the exact opposite perspective. I'm old-school, everything is a "file" to me. It's still weird to me that modern tools have abstracted away from "files" to just data. I'm not saying it's bad, it's just so different than how I came up.
A workmate (I do labouring and was doing traffic control that day) bought her teenage son a PS5. I don't know why parents would purchase consoles for their children, as you don't learn any transferable skills (I guess you could be a streamer).
I'm not saying to be a sadist and give the kid a second-hand laptop with *BSD, because then the kid will want nothing to do with computers, especially if the laptop is cheap and doesn't have working driver support (this is where Linux is better, in my personal experience).
This is also why I like HTML. I got a free internet CD with Netscape Navigator in 1996, when I was 12, and with the three working examples I had (two from Netscape Navigator and one an HTML email in the Netscape Mail and News client), I was able to play around, create things, and learn basic HTML until I got a modem in 1998.
Actually, playing video games is an activity that lets you practice real-time thinking. Other activities involving real-time thinking include musical instruments, sports and social interactions. By real-time thinking, I mean an activity that requires you to think and act within short windows of time. It also appears that activities involving real-time thinking can be quite fun, and few courses in school involve real-time thinking.
Now, learning new languages like HTML, or how to fine-tune an LLM, can be quite fun. Learning how to draw nice pictures, or how to write nice books, can be quite fun too. It's nice to let children explore different activities (or free internet CDs, if we're talking about 1996) so that they can find the ones that resonate with their inner motivations.
> I don't know why parents would purchase consoles for their children as you don't learn any transferable skills
It's plausible that parents may just want their children to enjoy some leisure? To not feel like every experience in childhood should be preparation for labour?
As a child, I was disappointed when I got a BBC B instead of a Megadrive. But I definitely learned more, and some of my leisure time was programming in BBC Basic.
In my own anecdotal experience, if you are a computer gamer, you miss out on all the "console exclusive" stuff. Stuff like Gran Turismo or the latest Kingdom Hearts or Final Fantasy or whatever, I can't really remember. Then when it is a game everyone plays, it tends to be something where the console gamers can't play with the PC gamers, so it's like
"Hey I just got the new counterstrike!"
"Oh that's awesome! I have that too! We should play sometime!"
"Yea what's your PS5 handle?"
"My what?"
"You don't have a PS5?"
"Nah I have a PC"
and yea.....
That is my anecdotal experience though, and it was always bolstered by the benefits of PC gaming - cheaper games, ample mods, lots of sales, and the biggest game library of any platform. Plus better performance. And I could text chat on my keyboard while playing Quake 3 back then haha. Sadly text chat in games has died off as an art, in favor of voice chat and all the problems voice chat comes with. lol.
Edit: And of course, the PC is a tool, the game console is just a dumb appliance.
Well it plays all ways. PC misses out on the console exclusives, consoles miss out on the PC exclusives, consoles themselves have different exclusives and don't interop. For people getting into gaming when they're young, a big influence is probably what their friend group is playing on, which in turn is probably inspired by whatever their parents decide to let them on. That or whatever games is latest hotness.
When I was young, Minecraft forcefully evangelized my friend group, which had previously been more Nintendo/Xbox, into PC gamers. The origins of a lot of my current interests and career go back to the initial questions of: Why is Minecraft running at low FPS? How do I set up a Minecraft server? It's a little different nowadays, when some of the biggest multiplayer titles are cross-platform.
I never had much time for consoles and I don’t feel like I missed out on much. Popular games come and go. The joy of doing whatever I can imagine with my PC is more than enough consolation.
I also learned HTML in 1996 (when I was also 12), in part from a cover story in I wanna say PC Magazine. I had a modem and had access to an unlimited AOL account thanks to my school’s computer lab teacher not doing a great job of hiding the password for the account (she also trusted me and my best friend quite a bit…trust we only abused insofar as we stole AOL from the county, back when it was still hourly and not unlimited for everyone), but it’s a similar story.
But you know what got me interested in computers to begin with? My Super Nintendo (and I guess my NES before that). So I reject the idea that video game consoles don't teach transferable skills (whatever the fuck that is supposed to mean) and that they don't unlock wonder and excitement about technology in lots and lots of people, especially kids.
There are many people in the tech industry who began learning to code precisely because of video games. Age of Empires II was essentially the first gateway to my career.
Perhaps writing the -apps, and -data, for the site might have more significant meaning [despite not being entirely correct] in a mobile context.
In the same way, we should be worried that people don't know how to grow their own food, procure their own heat, or build and thrive in a community. Not being snarky; I am genuinely worried about that.
If only tech was at the point where it would be as reliable as your local supermarket or your house.
When I leave for the family home later, I'll take my laptop with me. My family is big, so the chances are high that I'll need to run some ADB or photorec on an SD card. I hate it[1]. There is some knowledge/reliability deficit with computers that houses/heating/food acquisition don't have.
From my experience, heating has plenty of reliability deficits, and it's probably getting way worse in recent years. Fancy boilers with computers in them break way more often than they have any right to.
Agreed. Likely in the search for efficiency, which oftentimes brings complexity, which then involves more precision or more pieces to the puzzle, or both.
There is something to be said for the dead-simple way of doing things, for sure. It depends on the object or item, but there's a "sweet spot" where something is pretty darn efficient yet not overly complex.
Our endless march towards "the most efficient" everything always carries consequences. People and society included.
Great comparison; that's even more worrying. If we suddenly lose fossil fuels and fertilizers, the lack of agricultural skills will probably kill billions.
Unless you’re in full apocalypse mode, every man for themselves, I’d expect that in most cases people will learn from their neighbors quickly enough, as long as a survivable subset of the population knows what they’re doing.
What’s going to kill billions is the lack of industrialized farming productivity, but you don’t expect every individual to know how to resolve that
I doubt even the lack of industrialized farming productivity will "kill billions".
It will certainly switch diets and most probably cause hunger and malnutrition for some time. But we can support billions of people, given the right diets. We cannot support billions of people's McNuggets, beef burgers or cornflakes with milk. About a third of my food comes from stuff I make, grow and harvest myself. I can up that, if I need to, but preparing, growing and harvesting are a hobby, not a full-time endeavor. I'm certain I can't up it to 100%. Maybe not even 50%. But some neighbors, friends and family can help out (and I them), and I'm certain that with some ~100 people we could be self-sufficient. Sober, sure. But not dead.
(I would die from lack of insulin within months, though. TII. So I am aware of how dependent on industry and high tech I am)
No, without industrial scale fertilizer production, there is no way to grow enough food for our current population. Natural processes simply can't fix nitrogen fast enough. The main reason populations started exploding in the 19-20th century was because we found a way to produce nitrogen fertilizer in absurd quantities very cheaply.
If that link in our global industrial chain breaks, a lot of people will die. We can, of course, still grow food, but not anywhere near as much and it will take a lot more care and attention to not deplete all available soil in a generation or two.
In a hypothetical apocalyptic scenario, we might lose a lot of institutional knowledge about agriculture. It's not hard to imagine some average joes trying to farm the wrong way and accidentally starting another Dust Bowl. After all, that's how it happened the first time.
Feeding 8 billion people is so, so much more complicated than just putting some plants in the ground. That we can do it at all is an absolute miracle of technology. We simply could not do it with only naturally available resources. Not by a long shot.
According to most sources I can get to with my amateur knowledge, a giant portion of proteins is "wasted" by "producing meat". Another large portion is wasted because of poor logistics and wasteful behavior.
If everyone stops eating meat, and starts eating the proteins themselves, we win some 1000% to 2500% (sources range from 10x to 25x protein input vs. output when producing meat). Meat is just a terribly inefficient way to produce protein. (Edit: to be clear, "meat" differs a lot: chicken is apparently rather efficient, more so even than cheese, while beef is amongst the worst.)
And if the resulting food sources are then grown efficiently, and all or a lot of the waste can be eliminated by shortening supply chains and sourcing JIT (no avocados in winter in Europe, no tomatoes in snow), I'm pretty sure we'll get a long way.
But, sure. If some rich countries keep demanding giant portions of global protein in e.g. soybeans and corn to feed cattle to make burgers for 7 meals a week, then certainly: people in countries who cannot afford to compete with the soy-prices will starve.
The market is about supply and demand. People demand meat, regardless of how inefficiently it is produced.
And, no surprise, this isn't a free market at all. At least in Europe, meat, dairy (and to a lesser extent crops) are heavily subsidised.
https://ourworldindata.org/grapher/feed-required-to-produce-... has a quick overview. My numbers aren't wrong, but, as always, they need context (i.e. not all input is currently human-consumable; also, much of the input is grass, which is rather ineffective land use, i.e. the same km² could produce a multitude of human-edible foods).
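As a rough, purely illustrative back-of-the-envelope on what those feed conversion figures mean (using the 10x and 25x bounds quoted above as example numbers only, in Python):

    # If 1 kg of meat protein takes N kg of plant protein as feed,
    # eating the plants directly yields N times the protein from the same input.
    for feed_ratio in (10, 25):
        print(f"{feed_ratio}x feed ratio -> {feed_ratio * 100}% of the meat baseline, "
              f"i.e. a {(feed_ratio - 1) * 100}% increase")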
> About a third of my food comes from stuff I make, grow and harvest myself.
1/3rd of value or flavour is relatively easy. 1/3rd of protein or calories is a lot harder.
Need those mechanized 2000ac farms growing corn, canola or wheat to supplement the homegrown tomatoes/potatoes/peppers/herbs/apples. But it’s impressive what can be done with a somewhat small backyard in terms of value/flavour and not a lot of effort.
My diet is vegetarian, so a lot of flavor is that staple.
My tiny lot brought me all the potatoes, corn, beans and pumpkin I need and some more. Our five walnut trees, shared with 12+ households, produce some more protein.
I obviously still buy rice and a lot of wheat and rye meal to bake my bread and pizza and to make pasta.
My point was that I don't need the large industrialized food industry to feed myself. My point was not that I can do without a society with specialized and efficient food production.
More people at least knowing and having some practice with these things makes communities more antifragile.
At our current rate of consolidation and specialization we're going to get to the point where so few people know how to do particular high tech things we may be at risk of someone taking those people out and leaving a massive capability hole across multiple industries.
Or just burnout, changing jobs, or them leaving tech altogether. I'd argue that many companies are already incapable of keeping their tech stack running as intended due to knowledge deficits.
we've done it to ourselves, by embracing MASSIVE complexity at every turn to get the fancy new features.
i keep thinking about how we don't actually need 95% of the code running at my job to actually accomplish my company's goals. it's insane how we just kept going with the flow while features and maintenance effort exploded.
that said, anyone know how well the BSDs work as a desktop operating system? mostly worried about hardware i guess, since it's already an issue on linux from time to time. but man the linux ecosystem, while wonderful, suffers from massive complexity too, or at least it looks more and more like it the longer i use it.
Or how a car works. I am always surprised that a lot of young people have no idea how many cylinders their car has or how to do an oil change, never mind any real repair.
Growing a few tomatoes or something is easy. But growing enough food to sustain yourself and your family is much more involved and requires a lot of training and experience.
Most people could likely do it. But it would take them 3-5 years of failures before they could reliably grow enough food for themselves. Probably more on the scale of a whole family.
Why do you think it takes that much land to feed an entire (average[1]) family?
Besides plants, you could have animals, and get milk and meat from them. Chickens for eggs and meat too.
It doesn't take too much land for it in my opinion. And if you live in a community that does that sort of thing, you could trade meat for tomatoes, and vice versa, or whatever, without relying on megacorps to give you your calories
My gut feeling (before searching) was that it would take around 2 acres to feed a person. Searching for data, it seems like estimates ranged from 1-3 acres per person, meaning that feeding an entire family would quite literally take acres of land.
There’s a widely quoted myth that the UK’s ‘dig for victory’ campaign during WWII deemed a standard allotment plot (~250m2) sufficient to feed a family of 4.
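A quick, purely illustrative comparison of those two figures (the 1-3 acres/person estimate above vs. a ~250 m² allotment), sketched in Python:

    SQ_M_PER_ACRE = 4046.86                      # conversion factor

    low, high = 1, 3                             # acres per person, range quoted above
    family = 4
    allotment_acres = 250 / SQ_M_PER_ACRE        # ~0.06 acres

    print(f"family of {family}: {low * family}-{high * family} acres")
    print(f"WWII allotment: {allotment_acres:.2f} acres, "
          f"~{100 * allotment_acres / (low * family):.1f}% of even the low estimate")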
Most teenagers simply don't understand that a phone and a tablet and a desktop and a laptop and a cloud server are all computers. They don't understand that an app and a web page and that thing you install on a desktop are all programs.
As soon as you educate them such that those abstractions become visible, the questions start to flow naturally.
Tomorrow I'll strike a blow for this. I got my (12-year-old) grandson a Pi 5 for Christmas. I plan to spend as much time as he'll allow onboarding him, and hopefully he'll learn the fundamentals. At the very least he'll need to learn how to open a terminal and launch the Docker container that runs a Minecraft server. :) There are other things included with the full RPi OS install that he can explore as well, like Scratch, Mathematica and Wolfram. And a few games that I tried and that are absolutely horrible.
His father, my son, earns a pretty good living writing code (and now managing a group that does this) and should be highly motivated to assist with motivation.
I gave my granddaughter (strictly speaking, my wife's grand daughter, but she calls me granddad and always has. She also now has a sister) a laptop running Ubuntu at the start of the first covid lock down in the UK.
She still uses it. Mostly to watch crap on Youtube, I gather, but who am I to judge? It is tethered to my home router via OpenVPN and I patch it from time to time. I have an ISO27000 registration to keep on top of 8)
That laptop was a customer cast off, destined for the skip.
What if files just aren’t relevant as an abstraction any more, 40+ years later? As most data lives somewhere in a database in the cloud anyway, the concept of a file as a somewhat tangible, autonomous chunk of information may just be outdated.
On the other hand, single-board computers are now both more capable and more affordable than ever, and arguably more open than most of the Wintel PCs I cut my teeth on (at least until I got my own and installed Linux on it).
Except the ease and frictionless experience of iPads and most phones, combined with curated ecosystems of apps, means you never have to see how the sausage is made, or learn how to make your own.
Abstractions are comfortable, but they might reduce the number of techie teenagers like OP in the future, and that worries me a lot