Further, if a student can get a diploma without doing the work, the diploma no longer has value. And if diplomas are no longer valuable, the signal they provide in the labor market turns into noise.
If employers no longer look for the diploma as a signal, on what basis will they decide whom to hire?
I think this will come true, and society will radically shift into one where critical thinking is the only skill employers look for, since the grunt work can be automated.
What becomes the signal then? Will we shift back into apprenticeship-based employment? How do potential laborers demonstrate their critical thinking skills other than in person?
In a medieval guild, to be admitted as a master, an apprentice had to create a chef d'œuvre, or masterpiece, which is where the term comes from.
In the computer engineering industry, you increasingly have to demonstrate the same: either as part of your prior work for hire, as a side project, or as a contribution to something open-source.
A diploma is still a useful signal, but not a sufficient one, except maybe for very junior positions straight out of college. And those are exactly the positions most under pressure from automation.
I think software developers might be somewhat of an outlier. Industry wants good programmers, but universities teach computer science, which really should be called "computation science". Much of what we learn in university will hardly ever be used while many practical skills are at best learned as a side effect. Dijkstra famously said that computer science is as much about computers as astronomy is about telescopes.
So degrees have been a weak signal for a long time. Several of the best developers I've worked with had no CS degree at all. As a result we have interview processes that are baffling to people from other industries. Imagine a surgeon having to do interview surgery, or an accountant having to solve accounting puzzles. AFAIK we are very unusual in this regard and I think it's because degrees are such a weak indicator and the same is true for certificates in our industry.
> while many practical skills are at best learned as a side effect.
I strongly disagree; that's the intent, not a side effect.
It's IMO a common misconception that early algorithm classes are designed just around learning algorithms. Instead, basic algorithms are the simplest vehicle for turning abstract requirements into working code. The overwhelming majority of what students learn are the actual tools of programming, debugging, etc., while using the training wheels of a problem that has already been broken up into bite-sized steps.
Ramping up the complexity is then more about learning tradeoffs and refining those skills than writing an ever more efficient sorting algorithm or whatnot.
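To make that concrete, here's a rough sketch of the kind of exercise I have in mind (a hypothetical intro assignment, in Python): the stated goal is "implement binary search from the lecture pseudocode", but most of what the student actually practices is turning a spec into running code and debugging the edge cases.

    # Hypothetical intro-course exercise: the assignment says "implement binary
    # search from the pseudocode", but the real practice is translating a spec
    # into code and debugging edge cases (off-by-one, empty input).
    def binary_search(items, target):
        """Return the index of target in the sorted list items, or -1 if absent."""
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2          # midpoint of the current search window
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                lo = mid + 1              # discard the left half
            else:
                hi = mid - 1              # discard the right half
        return -1

    # Quick checks of the sort a student would write while debugging.
    assert binary_search([1, 3, 5, 7], 5) == 2
    assert binary_search([], 5) == -1

Swap in any classic algorithm; the point is the same.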
That is true in the sense that the 100/200-level classes cover programming basics in addition to whatever algorithmic theory is being presented. But beyond that, programs really seem to differ pretty strongly in how much they emphasize applied projects and software engineering practices (basic stuff like source control) versus more theoretical/mathematical concepts. One capstone-style class commonly seen is compiler design. To a certain extent, a good school will teach you how to learn and give you enough of a background, via class projects, internships, and electives with applied options, that you get a well-rounded education and can quickly ramp up in a more typical software organization after graduation. But as someone who has hired many new grads over the years, it always surprises me what sort of gaps exist. It's rarely about programming basics, and almost always about "software engineering" as a discipline.
My experience is that graduates of schools focused on the more practical aspects tend to make better junior developers on day one but then stagnate. Meanwhile, graduates of the more theoretical programs pick up those same practical skills on the job, leaving them better prepared for more demanding assignments.
This then feeds into the common preference for CS degrees even if they may not actually be the best fit for the specific role.
Interesting. I did my undergraduate degree in Germany and my graduate degree in the US, so my experience might be unusual here and different from what you get in the US. My undergraduate algorithms classes in Germany and my advanced algorithms classes in the US involved zero actual coding. It was all pseudocode, as you'd find in the Knuth books or in Cormen, Leiserson and Rivest.
And they were supported because they were useful labor. Even an unskilled, brand-new apprentice could pump the bellows, sweep the forge, haul wood and water, deliver messages. If it frees up the master to produce more valuable output, that’s a win-win. Then they can grow into increasingly valuable tasks as they gain awareness and skill.
IMO one of the big problems is that we’ve gone too far with the assumption that learners can’t be valuable until after they’re done learning. Partly a cultural shift around the role of children and partly the reality that knowledge work doesn’t require much unskilled labor compared to physical industries.
I was somewhat aware that in the medieval period most people started out as apprentices in their mid-teens, essentially work slaves in the house of a master. Then, after a decade or so of toiling and gaining the skill, they would go on to become individual business owners.
But I wasn't aware of the masterpiece requirement. Thank you for sharing that!
By the time they were in their early-to-mid 20s they would be nearing master level in skill. They would have faced the real world for 7-8 years and would know how it works in terms of money, dealing with customers, and so on.
Compare that with today: by their early 20s, people are only just getting out of an undergraduate program, about to start real-world job training.
Yeah, they are different domains. I don't mind options for those who want to pursue an academic approach rather than a practical one. But for most fields we just don't have that choice anymore. Getting hands-on experience? You have to be recruited out of academia first.
More reason to vie for labor protections. If companies realize they can't just rotate people out every 6-20 months, they may actually go back to fostering talent instead of treating academia like a cattle farm.
> What becomes the signal then? Will we shift back into apprenticeship-based employment? How do potential laborers demonstrate their critical thinking skills other than in person?
This is already true to some extent. Not apprenticeship taking the place of college, but at the last couple of places I worked, hiring generally happened based on: I already know this person from open-source projects, from working with them at a company, etc.
In certain companies, degrees were already unimportant even before LLMs because they generally do not provide a very good signal.
It might be a good thing. Colleges have become complacent and too expensive. The cost of an education has been increasing while employment opportunities have been decreasing for some degree categories. People have been sounding alarms for a while and colleges have not been listening. The student loan market is booming.
Now if students can shortcut the education process, they can spend less time in it and this may force colleges to reinvent themselves and actually rethink what education looks like in the new era.
It's around 5 years out, but schools are gonna have a rude awakening as the population decrease finally catches up to them. Standards won't rise, because many schools will simply shut down for lack of students.
The Harvards will be fine, though. I guess that will raise the standards naturally.
Honestly, a future where diplomas are just noise, employers stop caring about them, and young people therefore stop wasting years of their lives "learning" something they don't care about sounds like a huge improvement.
LLM cheaters might incidentally be doing society a service.
They will only learn what's needed to "get the job done" for whatever it means at that moment, and we could potentially see more erosion in technical abilities and work quality. You don't know what you don't know, and without learning things that you don't care about, you lose the chance to expand your knowledge outside of your comfort zone.
> They will only learn what's needed to "get the job done" for whatever it means at that moment
I graduated from university around the turn of the century, long before the current AI boom started, and the majority of my classmates were like that. Learning the bare minimum to escape a class isn't new, especially if you're only taking that class because every adult in your life drilled into you that you'll be a homeless failure if you don't go to college and get a degree. LLMs make that easier, but universities could offset it with more rigorous testing or oral exams at the end of the class, if the goal weren't just to take your tuition dollars to enrich a vast administrator class instead of covering the cost of employing the professors who teach you.
The real lesson I learned during my time in university is that the two real edges an elite university gives you (as a student) are 1) social connections to the children of the rich and to leaders in the field, which you can mine for recommendations, and 2) a "wow" factor on your resume. You can't really get the first at a state school or community college, and you definitely can't get the second, despite learning similar if not the same material in a given field of study.
It hasn't been about (just) the learning for a long time.
I don't think diplomas have mattered for decades, at least in tech. Let's not pretend anything improved with the introduction of chatbots.
Anyway, any advantage is entirely offset by having to live in a world with LLMs. I'd prefer the tradition of having to educate clueless college graduates. At least they grow into adults you can still reason with. What are we gonna do about chatbots? You can't even educate them, let alone pinocchio them.
I was somewhat downvoted for saying something similar recently here [1].
A four-year degree is a very expensive investment in the current environment. We should push younger people to face the real world as soon as possible, and apprenticeship is indeed a great way to achieve that IMO. As a great side effect, young people won't have to start their careers saddled with huge debt.