>Aafia Ahmad, a sophomore computer science major at U.T. Austin, had hoped to take an elective course in computer security this semester. But when she tried to sign up during early registration in November, the course was already full.
I know this doesn't refute the point the article is trying to make, but it doesn't seem that unreasonable to me that an underclassman might not get into a highly sought-after elective. A lot of the "fun" electives when I was in school were little passion projects taught by one professor for one or two small sections a semester (or maybe only one semester per year), and while I agree it sucks that not everyone can take the classes, I wouldn't want them to die out because of complaints along the lines of "if you can't accommodate everyone, don't offer the course". Not that the article is saying that should be the case; I just worry that too many complaints about this kind of thing put pressure on the good professors who put enough effort into a class that a bunch of people want to take it.
A couple of my friends in the CSE program (not at Texas, though) had to take on a 5th year due to the poor scheduling. It doesn't help that you have to apply to the program after you are already a freshman, so if you bomb a tough math or physics class your first semester you are set back pretty far and have to build up your GPA to even get into the major.
You can work around this with advance planning, though.
I wasn't affected by bombing any classes and got direct admit to CS [^1], but there were classes of interest offered only once a year which filled up quickly. I just took them, e.g., a year earlier than expected to make up for that, or disregarded the prereqs and learned just enough on the fly. Another technique that helps is emailing the professor in advance of registration, or after it's full, and letting them know how interested you are. Even better if you can stop by their office in person.
Another trick is to get into your school's Honors program which typically allows early registration vs non-Honors.
There were also 1-2 courses I audited for knowledge instead of officially taking which let me invest in basic learning without adding the pressure of a grade to a heavy course load.
Ideally everyone would have an academic advisor that lets them know this kind of thing in advance.
If I were graduating high school today, I would give serious consideration to Lambda School instead of a CS degree; though I also believe that we haven't seen enough time pass for the long-term implications of a decision like this to play out yet (e.g., future career discrimination based on lack of degree).
[^1]: I feel for people that don't get direct admit into their engineering major. This part feels unfair IMO. I think that everyone should be able to start in their desired major by default and be disqualified out, rather than defaulting out and having to qualify in.
It definitely depends on the school. While you can direct admit to engineering, you are still not in the CSE major yet. If a prof let someone into a class because they 'expressed interest' over the 50 people in purgatory on the waitlist, there'd be protests at my campus. Classes fill to fire-code room capacity. Honors scheduling is big, though, but there are a lot of people in Honors as well, so it isn't always guaranteed.
Not when one bad grade is due to a notoriously bad teacher. One poor professor should not be able to threaten a student's future by forcing them to change majors to be able to graduate without an additional $10K or more of debt.
As a consequence of people entering the program at different times, scheduling required classes and electives is a blood sport. Some courses are only offered in fall or spring, with a prerequisite class that must be completed first, so you could literally have to sit with a very light schedule for a semester because you were half a semester off the 4-year trajectory, or because the class was full and you had to wait until it was offered again. This ultimately wastes a lot of your time at college. I bet if scheduling were completely painless, not constrained around classes with a waitlist double the size of the class, and you could take your required classes any semester you'd like, people could get the CSE degree in 3 years and save quite a lot on tuition.
I attend UT Austin and it's pretty well known the CS school is particularly overcrowded, especially wrt lack of professors.
Its ranking went up a lot somewhat recently, so I think it's a growing-pains thing. ECE is a bigger major, but we don't have the same problems because the department has been around longer and hasn't grown as fast.
I had to write a script to constantly monitor my school's class scheduling system to get into a couple of classes. It was an interesting script because the system was an IBM 3270 style mainframe.
I always thought about doing this, but since class signups only opened twice a year (once per semester), and that was the same time I needed to actually sign up for the classes, I concluded there would never be a time to debug and refine such a script.
At my school the class signup system stayed open from after the initial set had been loaded in until, IIRC, 2 weeks after classes started. This allowed students to shuffle classes around as needed, and it also gave me the opportunity to write a script against its interface.
In short, you gave the school a list of the classes you wanted to take. The school loaded everybody's information into the computer and had it make the first set of assignments. Then you got back the schedule and discovered that the computer had put you in classes at 7AM, 11AM, 4PM, and 6PM MWF and only one class on TTh so you had to go in and shuffle your classes around until you got a reasonable schedule. Or you would discover that all of your electives filled up before you had a chance so now you're in a fight to get into the classes.
It sounds all Mad Max, but the schedules were in such flux that it was usually possible to get the classes you wanted. I only remember getting screwed once, on a mandatory class with only two or three timeslots, one of which was highly undesirable (7AM MWF) while the other two never had an opening. Worse, the professor for that class had a heavy accent and tended to drone slowly in a monotone.
I've written a couple of scripts over the years to email me when certain things happen on sites I don't own, and as long as you're cool with a few false positives, it's not too difficult.
Say you're waiting for something on a website to become available. I'd look for some language on the page about availability that's likely to change when it becomes available. I don't know what they're going to change it to, but I know that it'll (probably) change. I set up a script to run every few minutes to check that particular spot on the page. If it changes, I get an email.
Maybe I got an email when it changed from 'unavailable' to 'available soon', but that's okay, that means it'll definitely work when it's available for real.
There's always a chance that it won't work, but if you can reasonably assert what is likely to change, it's very easy to monitor it.
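To make that concrete, here's a minimal sketch of the pattern in Python. Everything specific is a placeholder: the URL, the CSS selector, and the email address are made up, and it assumes the requests and beautifulsoup4 packages plus a local SMTP relay.

```python
# Poll one spot on a page every few minutes; email when it changes.
# URL, SELECTOR, and the address below are placeholders.
import smtplib
import time
from email.message import EmailMessage

import requests
from bs4 import BeautifulSoup

URL = "https://example.edu/course/12345"  # hypothetical page to watch
SELECTOR = "span.availability"            # hypothetical spot likely to change

def fetch_status():
    """Return the text of the watched element, or None if it's gone."""
    html = requests.get(URL, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").select_one(SELECTOR)
    return tag.get_text(strip=True) if tag else None

def send_alert(old, new):
    msg = EmailMessage()
    msg["Subject"] = f"Page changed: {old!r} -> {new!r}"
    msg["From"] = msg["To"] = "me@example.com"  # placeholder address
    msg.set_content(URL)
    with smtplib.SMTP("localhost") as smtp:     # assumes a local mail relay
        smtp.send_message(msg)

last = fetch_status()
while True:
    time.sleep(300)  # check every five minutes
    current = fetch_status()
    if current != last:  # alert on ANY change; false positives are fine
        send_alert(last, current)
        last = current
```

Alerting on any change rather than a specific new value is what makes the false-positive tradeoff work: you might get pinged for 'available soon', but you'll never sleep through the real transition.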
I've written this type of script before - I didn't need too much in the way of debugging. Mine (similar to GP, it seems) didn't actually do the registration, just save the current state of known divs or even the whole page. Then it would send me an update if that changed.
So, I didn't need to debug what happens when the class shows "open" - I just saved the div that said "closed" and sent myself an email/text any time it didn't say exactly that.
You could test it by having it sign up for undesirable classes like underwater basket weaving that had plenty of openings. The terminal interface was kind of gross to script, but the underlying data was pretty easy.
You basically had to just send the correct number of arrow key presses to get the cursor to the correct field, send the digits, and then send the enter key. Parse the data that comes back, and add the routine to cursor over to the "add course" prompt when it says there is an availability. The script was totally gross looking but it worked.
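If you were doing this today, one way to avoid hand-rolling the arrow-key choreography would be the py3270 wrapper around x3270/s3270. The sketch below is only illustrative: the host, screen coordinates, field layout, and "OPEN" status text are all invented, and a real screen would have to be mapped out by hand first.

```python
# Hedged sketch of scripting a 3270 registration screen via py3270.
# Host, coordinates, and field layout are made up for illustration.
from py3270 import Emulator

em = Emulator(visible=False)             # drives a headless s3270 session
em.connect("registration.example.edu")   # hypothetical mainframe host
em.wait_for_field()                      # wait until the screen is ready

# fill_field moves the cursor to (row, col) and types into the
# unprotected field there -- no raw arrow-key replay needed.
em.fill_field(5, 20, "12345", 5)         # hypothetical course-number field
em.send_enter()
em.wait_for_field()

# Read back the screen region where availability would be shown.
status = em.string_get(10, 30, 10)       # hypothetical (row, col, length)
if "OPEN" in status.upper():
    em.fill_field(22, 15, "A", 1)        # hypothetical "add course" prompt
    em.send_enter()

em.terminate()
```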
When I was in school, I just had a python script that diffed the relevant part of the web page, and sent an email/SMS to my cell phone on any changes in the output. It didn't automatically do the registration though, which would have been slick.
Ah, sounds like your school's setup was different than mine. Our class signups opened once, at a set time and day, and it was just a matter of clicking the signup button fast enough. If people dropped, the spots would go to people on a waitlist, also determined at the original signup time.
I got into the AI class at my school (it was not overbooked) only to discover that I didn't have the necessary math background. Also, the class was going to be all math and wouldn't involve computers. It was not what I was expecting at all. Ended up dropping it after a week.
At UCLA, the Film major is notorious for being difficult to get into. Some colleges have majors that they are 'known' for. Supply-and-demand being what it is, this makes it difficult to get into these programs. That's not to say that you can't take a few classes in those programs every now and again. If anything, having a bit of that in-demand major (and having a broader knowledge base with another major/minor) is better than having none of it. When it comes to learning, perfect is the enemy of good.
I learned a "trick" to getting into these types of courses when I was in college. I was always polite to the department secretaries and treated them like human beings instead of people who exist only to serve me. The number of other students that failed to treat the department staff like people always shocked me. In any case, they were almost always able to just add me to classes, even when the registrar indicated they were full.
There's often the ability for professors to accept more than the limit on a case-by-case basis, at their discretion. If you approach a professor and come across as a motivated and serious student, there's a good chance you'll be able to get in. Contrary to popular belief, most professors care about undergraduate education and want to help motivated students.
I worked in the CS Dept in college and this is a big one. Everyone in the department is there to support you and is happy to help, and the chair too. It's surprising how few CS students actually take advantage of these resources.
You can also use this to get into grad classes as an undergrad even if the course scheduling system blocks you.
Signing up on a course waitlist also helps give the department more info to be able to potentially move it to a bigger room or add additional sections.
At my uni the class sizes were huge, usually one professor to 300-400 students, and they only offer 1-2 possible class times. It feels like staff are dropping left and right, so the profs that stay on are forced to do this. Many CS/engineering kids can't take the intro classes for a couple of semesters, and many upperclassmen can't take courses which are only offered in spring or only in fall.
"The university is looking to hire several tenure-track faculty members in computing this year, he said, but competition for top candidates is fierce. I know of major departments that interviewed 40 candidates, and I don’t think they hired anybody."
That doesn't sound like fierce competition if they were able to interview 40 candidates... Did they give them all high-paying offers and they all rejected?
One of the problems is that a candidate might sit on an offer for a long time. The best candidates want to wait until the interview season is over to evaluate all their offers. On the other hand, some schools give one week deadlines on offers but any good candidate will turn those down.
It is hard on both sides. The entire process was about 6 months when I went through it, and I had offers expiring before I even finished all the onsites.
Disclaimer: I’m a CS prof at an R1 university, in my first year.
I'm not CS faculty, but this is similar to the situation in economics. Interview 40. Of those, 20 may end up being acceptable. The eight best of those 20 get offers at better places. Four others take nonacademic jobs. You're left fighting over a pool of eight candidates with a lot more than eight other schools. It was really tough after the recession, when the Fed was hiring a ginormous number of new PhD's.
this is one of the pathologies in academic institutions: they need teachers, but they evaluate researchers. the people who make the cut for researcher also have a lot of other options, and the institution ends up with an undersupply of teachers.
Some universities hire non-tenure track instructors, with no research requirements, sometimes called "Lecturers".
Of course, they are compensated horribly, so nobody wants to do that unless they have no other choice, and once you do that, good luck ever getting a tenure-track position.
The whole academic job market is pretty dysfunctional. It's more interesting to look at the flawed ways universities have of measuring research contributions, but sometimes it just comes down to money. Universities are trying to save money by paying as many people as possible peanuts, while paying "stars" hugely.
Ironically, it is often the non-tenure track positions that pay horribly that attract the best natural teachers, at least initially. These are the people who are doing it for the love of teaching their subject rather than the pay or research opportunities. However, in my experience, this only lasts for a few years, and those passionate people become jaded about their situation and leave the profession.
>Non-tenure track academic teaching positions that pay "horribly"... I can already imagine the kind of talent they attract.
Honestly, as a former CC student, I found that non-TT professors/lecturers at the junior-college level were some amazing teachers. While they may only have a master's in their field, most of them are taking these jobs out of enjoyment, as secondary income, or after retirement.
Examples: One psych prof is a clinical child psychologist, teaches PT, and has the highest evals in the dept.
Another psych prof, a retired psychologist and head of CHARGE research, drove 45 minutes to teach PT.
A social work prof, a 20+ year professional who teaches PT, was a complete savior to her students.
An Eng prof, trans, former Navy, journalist, taught PT.
A chem prof, retired, works at a local grocery store stocking fruit PT and lectures PT.
Many schools rely on adjuncts and lecturers to teach. They get paid very little, have no job security, no benefits, etc. But because the academic career path is so completely broken, lots of very smart, talented instructors take on these jobs to make ends meet while they look for something else that allows them to pursue their passions with their extensive educations.
Don't look down on these people. They're smart, they work hard, and they deserve better.
> lots of very smart, talented instructors take on these jobs to make ends meet while they look for something else that allows them to pursue their passions with their extensive educations.
My point is that you can't attract and retain top talent by offering low pay and no job security.
You are confirming said point by showing that any person of talent takes this job only for a short, temporary period while looking for something else.
Academic career paths are completely broken, which is why universities can and do get away with it. The problem is not that these instructors aren't "people of talent."
Just a warning for those who don't know: this only applies to lecturers in certain countries. In the UK, a lecturer can do teaching and research and hold a permanent, tenure-track position.
Yes, although I think we might be better off with some dedicated teaching positions, especially for some first- and second-year undergraduate courses where the material is pretty standardised.
I went to a relatively large public school with a big endowment. The "just teachers" were called Adjunct Professors and they really had a shit deal. They basically didn't make any more money teaching at my school than they would teaching at a community college and I don't even think they were considered full-time with benefits. Meanwhile, my research PI was a tenured professor who made a very decent living (salaries of teachers at the school were publicly available) but almost all of his salary was paid out by research grants, not the school.
Really among most of the department, I was surprised how much of an afterthought teaching seemed to be. They were by no means bad teachers, and I guess when you've been teaching the same class for many years it becomes pretty effortless, but there really is no good incentive for them to sign up and teach a bunch of extra classes when research is what they are interested in and pays for their salary (along with the salaries of all of their lab members).
It does, but it's an interesting question why it always does. Knowing of universities that lost their best lecturers because they insisted on a high-impact research record before even considering giving them a permanent position, more flexibility in that regard looks like it'd be a good idea.
Not really. It’s a good question to ask but the answer is rather simple: R1 institutions stay afloat because of research grants; they need researchers to secure that grant $$.
And it's surely helping the grant-getting that their researchers have to spend time teaching classes they don't really want to teach. Teaching-focused people, even if you make them full professors, are still comparatively cheap (since they don't need big teams or labs), and I don't think it's obvious that it doesn't make sense, especially if you consider decrease in teaching quality and other factors. And this also happens at institutions that aren't in dire need of cash. (To be clear, I'm not suggesting to do this for all teaching, but if you get good teachers as e.g. postdocs to try and keep them)
>Teaching-focused people, even if you make them full professors, are still comparatively cheap
This is where liberal arts universities thrive. While their endowments and grants may be lower, their retention and graduation rates may be higher (unsourced opinion). But unfortunately we're living in a time where we need both types of institutions, and having that choice is good. While I still don't agree with the rising costs of education, research-driven programs are good for future researchers and STEM. Liberal arts programs are great for the humanities.
An institution that is looking for a tenure-track person is not looking for someone specialized in teaching. People who specialize in teaching are treated as rather lower status by universities, so you don't want to be one anyway, tenure or not. Generally speaking, even people who might like teaching and could be good at it often don't want to specialize in it, preferring research positions or work outside the university.
It's not pathological. Most universities have flirted with hiring non-PhDs to teach. It can work, but there's a reason they prefer traditional candidates: the failure rate is way lower.
the competition is often between people with phds. i am not 100% sure that a university teacher needs to have a phd, but it seems that even keeping that requirement, the present system is pathological. most people with phds do not make the cut for researcher, which leads to the university being short on teachers.
This article is so poor I don't know where to start.
The Computer Science Stampede
>> "While the number of undergraduates majoring in computer science at certain American universities more than doubled from 2013 to 2017, the number of Ph.D. candidates — the potential pool of future professors — remained relatively flat."
They seem to ignore the three decades of legacy PhD graduates who gave up looking for jobs in academia because so few spots opened up. The article makes it seem almost like all that legacy graduate supply just goes poof and disappears!
>> “I had a faculty member who came in with an offer from a bank, and they were told that, with their expertise, the starting salary would be $1 million to $4 million,” said Greg Morrisett, dean of computing and information science at Cornell University. “There’s no way a university, no matter how well off, could compete with that.”
Nice anecdote, but the number of graduates in that spot is tiny. Rarely do PhDs get that type of salary on Wall St, and often it does not last. No, they can't compete with the bank for this one candidate, but luckily there are a thousand other candidates, and tens of thousands of candidates once you consider the pool of grads from the last generation.
No, but most recent PhDs can get a job at a FAANG company, where their starting salary will be much higher than a faculty salary (and that's before bonus, stock grant/options/etc). Even with the cost of living in the Bay Area, they'll be able to pay off student loans faster and will finally have time to enjoy life.
How many CS PhDs are getting $1m-$4m starting salaries? If any, are they just from Stanford/Berkeley/MIT/CMU, or are PhDs across all US universities getting these salaries? What about the tens of thousands of CS/Physics/Math PhDs who have graduated over the past two decades, are they also getting these salaries? I find it hard to imagine that a highly reputable school like Cornell would have a difficult time finding faculty. Most of my PhD friends gave up looking for tenure-track positions and begrudgingly accepted finance/FAANG jobs, distraught by the drought in academia.
It is poor reporting to choose the most extreme possible anecdote (one person with a $1m-$4m salary) and then use it as an example of what is going on. The real story is that universities messed up badly in structuring decent pathways for teaching positions and are trying to blame a hot market. There is also the ongoing story of poor measurement schemes (measuring how good a teacher is by how good their research is) and the story of uncertainty (letting highly accomplished academics rot in limbo-hell for a decade and using a tenure carrot to control them).
BTW, the quote was from Cornell University's dean. I did my CS undergrad at Cornell. 90% of CS lectures were in a large hall where ~60% of the seats were empty, and this was in an up year (1998-2001). There was no shortage of space in the lecture halls. The teaching was mostly done by MS and PhD students, of which there were plenty -- the supply of recitation sections is quite elastic since there is no long-term commitment on either side.
Fairly recent PhD graduate here (graduated last year). My observations are that the skills required to do a PhD are orthogonal to the skills required to pass FAANG interviews and, more importantly, to being a good software engineer.
I don't dispute you can make a higher salary than a faculty member, even without working at a FAANG, but having a PhD won't automatically make you eligible to get such a job.
Interesting observation. I was under the impression that getting hired into research groups at a FAANG was a different process than getting hired as a SWE, and that you could bypass some of the whiteboard "find all sub matrices with matching determinants in an NxM matrix of arbitrary size" style questions.
Those jobs, the SWE jobs, there's really no degree that will allow you to skip that gauntlet. But I think there might be a different hiring process for PhDs or faculty leaving academia to work in research labs.
I don't really know, though, just something I heard/read somewhere.
Just one POV, but it does confirm that PhD graduates do have to go through the same kind of coding exercises (at google) as anyone else. It is worth noting that this is the case if you're applying for SWE positions, where a PhD might not really confer that much of an advantage. Again, I'm not sure if this would be the case if you were, say, getting hired as an AI researcher for a lab.
So, this link sheds some light, though I still don't know about research positions specifically.
I do know someone who ended up at IBM Research from my lab. He was collaborating with them for his research before he graduated so I'm not sure if he even had to interview after finishing.
For what it's worth, though I've never particularly sought out a research position in industry, I haven't really come across too many listings, which leads me to believe the positions are few and far between.
There is some difference in research scientist hiring, both in the method & the criteria. But as I said, these roles are very competitive, and most PhD holders in FAANG companies will be in SWE roles.
Often a research university will interview many candidates, give offers to/recruit many candidates, and yet find that for each individual it recruited, there was a competing university which worked better for that candidate.
Usually (from what I have seen) candidates on the academic job market don't end up in industry (at least not immediately) - they are in the job market because they are passionate about academia.
My experience is not from the US, but sometimes companies/universities/etc self-sabotage themselves in weird ways.
I teach at http://www.sena.edu.co/ (the biggest training institution in Colombia, run by the government) because they didn't have anyone for an advanced class about databases, and one of the faculty asked me the favor of covering it (I was there as part of a program for startups).
It worked well, and I did it because the time commitment was not bad and the extra money helped. So I was ready to continue teaching, this time about mobile development. I was more or less the only one who applied for it.
I was not selected, because I don't have certain diplomas related to how to operate the (very) arcane SENA "education platform" and other side requirements that would demand ONE YEAR of extra courses from me.
In the end, only the people who accept that get the job, but how many of them have actual skills in the field?
I would wager the pay would be at or near minimum wage levels. I had this exact predicament. A decade ago, it was an $8/hour offer or do an internship/co-op and make triple. Passion for your area of study is great, but going to a home with minimal to no luxuries gets old pretty quickly as years go by.
I went to a highly ranked liberal arts college and in 4 years, I didn't have a single instructor that wasn't a PhD. I don't know what the salaries were like then, but even freshly minted PhDs can be over 30, meaning that from ages 22-30, they were making graduate/doctoral student stipend money at best.
$95k at age 32-35 honestly doesn't seem like a lot if your savings at that age doesn't even hit 5 figures.
I don't understand how you can infer from the number of candidates they interviewed how fierce the competition is. Don't you also need to know the number of positions available at other institutions that those candidates are also considering in order to estimate competition fierceness?
The best programmers I know are self-taught, because they have a passion for it. I know people working at minimum wage, never enrolled in college, who got decent programming jobs after a year or two learning CS on their own.
These people will always (I hope) form the core of the profession.
It's only elitist to the extent you believe people who teach themselves are somehow superior. Actual elitism is prioritizing the candidates who can afford to spend tens of thousands of dollars at top schools. That's... I mean, it's classic elite. Access to education is a core part of an aristocracy or upper class.
Can't agree enough.
I am an autodidact, and am still learning. And really, every other week I wish I had a mentor/professor. Good books somehow make up for the lack of a teacher, but you can't read a thousand books just to solve a petty problem, and that's where a professor comes in. They provide you with a distillation of years of work and experience, not to mention the resources available to you at a university.
But your mentor can just be your coworker! That's how many professions have worked for a long time. I don't get where this idea that academia has a monopoly on difficult skills and knowledge comes from.
No CS professor I had ever helped me with a problem I had in CS. Now, as a Teaching Assistant I helped tens of other students diagnose bugs in their code, but that's a different game.
Yup. Honestly, if I didn't have the structure of university I don't think I would have made it as a self-taught dev. Don't get me wrong, I enjoy programming; however, I very much needed the formal education process.
It's not elitism, because only learning how things work "under the hood" at the age of 20, when you've been using tech your whole life, shows a lack of curiosity. Tech is inherently obfuscated: it's nail-biting debugging and a crap ton of reverse engineering. CS is a subset of mathematics. Software development is a lifestyle and an ethos. No degree will provide that in full capacity like being a curious kid who wrote game mods or hacks and looked into how the things he loves actually work. With something as accessible as software, you don't need privilege or thousand-dollar lab equipment. If you don't know how things work under the hood because you waited for college... you're gonna have a bad time...
If you think that being a plumber or an electrician or a mechanic never requires creativity and deep knowledge, then I think you are looking down on those professions a bit too much. Much like programming, most of what those fields involve is routine grunt work. But from time to time, the usual ready-made solutions don't work and you'll have some tricky problem to solve.
I don't think that's true. If your job is checking off Jira tickets maybe it's the case. But there are plenty of areas where you need a lot of creativity and deep knowledge to make progress.
Exactly. There are plumbers who just execute exactly what they learned in school but nothing else and there are others who can solve tricky problems. Same for programmers.
You really do need privilege. Not everyone has regular access to a computer or internet, even these days. And not everyone is in an environment where it's considered OK to sit and tinker with software all day.
More importantly, we need people in our industry who have interests beyond just tinkering with things all day. We need balanced individuals who have skills beyond just code, who can look at problems from a wide array of viewpoints. If we only hire hackers who have been coding since they were young, we're hiring an incredibly narrow set of people while trying to solve problems for billions of people around the world. That's not efficient! It's good for our ability to solve problems that we expand the tech community to include diverse individuals, including folk who didn't code as kids.
If you don't have regular access to a computer & the internet, then American CS college is miles away by cost comparison.
Like seriously, a $200 Chromebook that you earn by working a local minimum-wage job, a household that feeds and clothes you, and your local public library's wifi is mostly what you would need. The requirements for college exceed that by a mile.
The poster was talking broadly about the industry as a whole, not just college. And there are lots of scholarship students who had to work to help put food on the table, who couldn't afford a Chromebook and the time to go tinker.
In many colleges, the cost of one or two textbooks exceeds the cost of a Chromebook. If you have to work to put food on the table in college, you can work to put a Chromebook on the table too, and spend your college time on self-study vs. college itself.
I didn't finish my degree, but I can tell you that "a year or two" is nowhere near enough to "learn CS on (one's) own". On a daily basis, I use the foundations that I did learn in a formal environment much more than any of my self-taught skills. It's a very complex subject that cannot be effectively navigated without some guidance. Otherwise, you may think you know it, but you're just skimming the surface (and likely doing so in a very inefficient way). Now, if you've been self-taught for a decade or more, that's a different story.
I don't agree. Most people simply don't work very hard. For a motivated person, 1 year on their own time can easily be enough time to replicate a bachelor's in CS. When I did my bachelor's I was a young dumb kid who partied a lot and wasted a lot of time.
Now, 14 years later and with tons of industry experience, I realize how trivial the material in my CS bachelors was and really the big problem is that most students don't really work that hard. Now my work ethic is much, much better, and I've been learning far more advanced material, far faster than I ever did before, because I come home in the evenings and actually study (using online options) and practice.
When I look back at college-level CS, it's a joke. Also, the online resources are built to actually teach well; when I was in school, there was no Khan Academy, Pluralsight, Coursera, Udemy, etc. Most college professors are research-oriented at heart and are uniformly terrible teachers. The online MOOCs and so on are focused on actually making things easy to learn, which gives modern learners a massive advantage.
> I realize how trivial the material in my CS bachelors was and really the big problem is that most students don't really work that hard.
As someone who dropped out, worked in industry for a few years, and is now back as an undergrad (though in math, not CS), one thing that I feel is neglected in these discussions is the time-sucking effect of homework.
When I was self-teaching as a dev, I could learn something, play around with it until I felt I had a good grasp, and move on. A college course ties a lot of work to each concept. I have solved way too many matrices by hand in the last few weeks, for instance.
This, so much this. Homework is an insane waste of time. Some people might need that level of tedious repetition to learn a concept, but generally it just detracts from actual learning.
Funny to see someone else write almost exactly what I feel. I fucked around in college. I almost failed my first CS class, and failed a few others. A few years later I went back with the help of free online classes and found it a lot easier.
There's not even that much CS in a CS degree. Half the classes are gen ed. There's a lot of irrelevant math.
A year or two is enough time to get your foot in the door with web dev. For more advanced stuff, sure, it's an ongoing process.
On the other hand, I question the utility of most university CS curricula. There are usually only one or two classes covering the core "algorithms" knowledge that autodidacts might not have, and given how many people squeak through those classes by copying HW, I'm not convinced that anything really sticks.
I agree on your first point, but I can't speak to the modern curriculum or how it's tackled, since it's been over a decade since I was in uni. I do know that web dev is only a very minuscule subset of CS. A typical web dev may be able to tell me which algos are inefficient for a given task, but I can tell them why.
The algos you're talking about (BSc in CS) are trivial. I went to a research-focused college for CS, and at the Bachelors level the algorithms you learn are not very complicated; they're only made hard by the students themselves not actually doing any work OR by terrible teaching. In fact, I'm fairly sure I could teach my 70-year-old dad the various tree-based algorithms, Dijkstra's, A* search, etc., and he doesn't even know how to program. The intuition behind these things isn't hard to grasp.
In fact, I think I could take any random person off the street, and provided they're motivated, teach them algorithms and data structures to BSc level within 3 months. It's not rocket science and any engineer who makes it out to be like "way hard" is just being a bit insecure over their knowledge.
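For what it's worth, here is roughly all the code the BSc-level version of Dijkstra's needs, in Python; the graph is a made-up toy example. The whole intuition (always settle the closest unvisited node next) lives in the few lines around the heap.

```python
# Minimal Dijkstra's shortest-path sketch over an adjacency dict:
# node -> {neighbor: edge weight}.
import heapq

def dijkstra(graph, source):
    dist = {source: 0}
    heap = [(0, source)]  # (best known distance, node)
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for neighbor, weight in graph[node].items():
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

# Toy graph, invented for illustration.
graph = {
    "A": {"B": 1, "C": 4},
    "B": {"C": 2, "D": 5},
    "C": {"D": 1},
    "D": {},
}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```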
I had some prior experience as a hobbyist, mostly messing with BASIC as a kid, but with no formal structure. I majored in music and, after a bunch of wasted years in dead-end jobs, became a professional programmer after only 3 months of self-study while on unemployment. So depending on your background and passion and luck, a year or two can be plenty. There are several other stories floating around out there, like the blacksmith who became a Rails web dev after, I think, one year of study.
My experience with a BS in CS degree was that 60% of the classes provided no career value at all. There's all the general ed classes meant to make you well rounded, and then there's classes in the core that are either useless (Calculus, very few CS grads will ever use it) or provide history/background about the field that is nice to know, but not generally useful or necessary.
Out of the remaining 40%, the level of overlap and repetition was just absurd. It seemed like each class spent the first half of the semester re-covering things that were covered in prerequisite classes. So maybe 30% of your actual class time is spent on new concepts, skills, etc.
Of those, probably 1/3 involve very basic things, like bubble sorts and the like, which help you build basic problem solving skills and language familiarity, but not much else. Then you get another 1/3 of CS related stuff, like architectures, patterns, SDLC, etc, which is probably the stuff that separates the degree holders from the self-taught in most cases. Then, if you're lucky, you get a couple of classes that are elective and teach something moderately interesting or useful. As I recall, the only classes I had in that category were a class on Computer Graphics (which ended up just being a bit of Java Swing and a bit of math to explain some of it), Web App Engineering (neat class using ASP.NET MVC), and an AI class that really just covered some solvers and search algos, topping out with Genetic Algorithms and Simulated Annealing.
Point being, I could definitely see someone being able to learn all the same CS stuff in a year, if they were self-motivated enough. A few good programming books, a few good CS concepts books, and you'd be covered. Honestly, you could probably cover CS with both more depth and breadth in a year than I got in my degree, if you put a full-time effort into it.
And all that is ignoring the fact that half my class graduated with virtually no ability to actually sit and write code of their own beyond just copy-pasting snippets from online examples until they had something that worked well enough to pass. The only real value of a CS degree, as far as I can tell, is to signal to employers that you can show up and put in at least a minimal effort at something for years without giving up. I guess that makes for a useful enough filter to keep employers using it and keep colleges in business.
Sorry, but what they are teaching themselves is not computer science; it is programming, which is a tiny subset of computer science, or even slightly outside it but intersecting. To get a well-rounded knowledge of fundamental things such as program and data structures, memory management, analysis and design, architectures, how OSs work, cryptography, machine learning, networking, functional programming, etc., takes a degree course and possibly a masters or even a PhD.
It might make some of the people here who are worrying about competition for their jobs feel better to know that a good portion of CS graduates couldn't code their way out of a paper sack[1]. I have a CS bachelor's from what is considered a very good school and let me tell you, I frequently worked on group projects with complete sloppy bozos who had no idea what they were doing and still graduated. I can't imagine the standards have gone up with the recent flood of students and pressure to improve graduation rates.
[1]I know, I know, CS is not about "coding". But trust me, these people were not making up for it with advanced theoretical work or even like, HCI.
i would hope not. in fact, two of the best programmers i know have doctorates in computer science and maths respectively.
these people generally do not learn computer science on their own, they learn programming instead. This is useful, but it is not computer science and there is no way that (most, modulo some exceptional edge cases) non college graduates will have the same skills or abilities as someone who has studied maths and computer science at university, as well as software engineering. there is some truth that the core (80%) of the software engineering profession will always be formed of programmers/developers/software engineers who do not need computer science skills, and simply need to be able to write code according to a specification i suppose. but the profession will be advanced by the (20%) rest who have the skills and knowledge of the underlying science...
> The best programmers I know are self-taught, because they have a passion for it.
This doesn't preclude getting CS degrees--that is often a part of that passion as well. But I agree that the best programmers would get there with or without it. And people who start programming in college are extremely unlikely to ever really get very good at it (in my experience).
I'm self-taught. I still learn stuff that will probably never help me make money (like how to make programming languages).
I went to university and was severely disappointed, and after 3 semesters all the passion had been sucked out, so I left.
However, I WISH that college/university were way better. Being in a class with others helps a lot, and even though I'm not very social at all, being alongside others with the same interests? That is so rare that it is a prize in itself.
How to make things better and why I was disappointed is another matter that has been discussed on HN a lot, so instead I want to point out that having a good experience in an educational setting (which I have had in a tech course!) is good not only for the people who like to walk the common path, but also for the self-taught!
How lovely it would be if I could get into some very rewarding classes, no matter whether in a university or in a backyard....
You don't think it's possible to be passionate about CS and have a CS degree? That seems counter intuitive. My own anecdotal evidence disagrees with yours: the best and most passionate programmers I know are the ones who cared enough to get a real, formal education.
I really can't imagine a scenario where getting a formal education would make you a worse programmer. Having a degree does not necessarily mean you are a great programmer, but your self taught friends would still most likely be able to benefit from experienced and knowledgeable professors.
> I know people working at minimum wage, never enrolled in college,
If your starting point is "didn't go to college", then why comment on an article specifically about college students trying to enroll in a class?
Would you suggest that in addition to a full course load + tuition, students should set time aside to learn on their own? When they already go to a school with a highly regarded CS program?
It doesn't work this way for everyone. It was priceless for me to have my CS101 professor walking around the classroom when we were doing our assignments. When my C code wouldn't compile, he could spot the error with just a glance. I can do that easily now but I never forgot that.
Doesn't line up with my experience. Many self-taught people I've worked with lacked knowledge in fundamentals and the best people I've worked with all had a degree.
I would argue the two CS courses I took in school gave me my passion for programming. I ended up switching majors, but because of those classes I had the fundamentals I needed to go out on my own, learn new CS principles, and get a good job out of college. I don't think I would have been able to learn what I know without those two courses.
For all the people here who are downplaying the difficulty of getting into CS courses or degree programs, please make sure to do some research. The article only wades shallowly into the details. It can be very difficult depending on the institution. Please see the comment below from a CS professor about the difficulty of transferring into the Computer Science major at the University of Illinois at Urbana-Champaign. As a former student who was getting into computer science concepts but was not in the CS/engineering program, it was disappointing to learn that transferring into the program was basically impossible by the time I hit junior year without a _stellar_ GPA.
I transferred into CS just before junior year at UIUC. The next year they raised the minimum GPA to something like 3.6, and I've heard it's even higher now. This was 4+ years ago, and we had overflow classes where they showed the lectures on a projector because we couldn't fit into the 400-person lecture hall.
Why do US colleges over-admit like this? In the UK you apply to do a computer science course, and they have 50 places or whatever for that course, and so they admit 50 people. You can't get into the college and then be told that actually it's full. If they do accidentally over-admit it's their problem to solve, not yours.
The US university system is quite unusual (I believe Canada also has a similar system): at the undergraduate level you typically don't apply to a specific program; all students are expected to take a variety of classes (confusingly called "courses"), only some of which apply to a specific course of study.
Domain-specialized classes become an increasing proportion of your overall classes after the first or second year (when you choose a major) though never 100%. So while the school can get some idea of demand based on what people put on their application form, typically what you put on that form has little to no impact on whether you are admitted and doesn't actually bind you to a particular program.
I actually think this is a pretty good system, and was glad my own kid chose a US university (which luckily we could afford) rather than the free university education he could have chosen in either of his mother's or my countries. The theory is that you can get a broad foundational education to prepare you for a variety of possible futures, and also that there is more to education than simply work skills. Of course the reality isn't quite as utopian. It also means professions like law and medicine require whole additional degrees.
>Of course the reality isn't quite as utopian. It also means professions like law and medicine require whole additional degrees.
I kind of like the fact that in the US, doctors usually have some exposure to science outside of memorizing facts for their next test at med school. It makes them more likely to be able to understand studies, for one thing.
I might hazard to guess that if you study philosophy before going to law school (a very common path) you will be a better lawyer, but I can't attest to that with any first-hand knowledge.
Given that the depth of knowledge generally imparted in 3 or 6 ECTS (the average course length in Europe, 75-150 hours of combined self/guided study) is fairly shallow, I'm not sure how much this would help the learner.
Why is this downvoted? Credits are based on duration, with an assumed correlation to workload (where workload actually varies pretty heavily with professor and field).
The LSAT is an interesting test. One big section is "analytical reasoning", which has you solving logic games. These are actually kind of fun.
After I took the LSAT, and scored pretty well (99.6 percentile), including a perfect score on analytical reasoning, I was in the supermarket and saw at the magazine rack, down in the section with the crossword puzzle magazines, a magazine of logic puzzles just like those from the LSAT. Based on the ads in it, the main audience for this was old ladies.
I bought it, and decided to start with a puzzle marked as hard, figuring it would be easy for me--I had just aced these things on the LSAT, after all, a test designed to make distinctions among the brightest students in the country. Obviously, anything old ladies could handle I could handle almost in my sleep.
It completely kicked my ass. So did all the medium puzzles. I think I was able to do a couple easy ones, with a lot of effort.
Relatedly: Bletchley Park deliberately sought out the very same (at the time, much younger) ladies to work on codebreaking tasks:
> The heads of Bletchley Park next looked for women who were linguists, mathematicians, and even crossword experts. In 1942 the Daily Telegraph hosted a competition where a cryptic crossword was to be solved within 12 minutes. Winners were approached by the military and some were recruited to work at Bletchley Park, as these individuals were thought to have strong lateral thinking skills, important for codebreaking.
Yes, price aside I really like the US approach of basically giving everyone a chance and slowly siphoning people into specialization (or out of university). In comparison to some of the "free" university systems where you end up with huge competition for a tiny amount of slots, the US seems to do better at finding those students that just didn't care about high school but are exceptionally talented in the one domain they are interested in.
I'm not sure how universal this is; I suspect that's the case at more liberal arts schools, but not the case at engineering schools. My undergrad was at Virginia Tech, and I applied to the CS program as part of my university application. I was in the CS program from day one. But I did part of my graduate school at William and Mary, and they are as you described.
While I was in undergrad, the chemical engineering major required over 200 quarter units, and was only barely manageable if you started right away and carried 4-5 classes each quarter for four years. There wasn't any time to chill out your freshman year and take fun classes, not if you wanted to finish in four years.
The system in Scotland is somewhere in between, at least in the ancient universities. I was accepted as a BSc Math student and took Math, CompSci and Psych first semester. Second semester I switched to BSc CompSci and took two CompSci modules and a generic 'great ideas' module. It was generally up to three subjects in first year, up to two in second year, then your degree subject in third and fourth year. But you could only do subjects in your faculty, with most subjects falling into Arts or Sciences, and a few such as Math, Psych and Philosophy being part of both.
That said, they did limit some popular classes to only people accepted into those subjects, but CompSci wasn't one, last I checked - though first-year CompSci in my fourth year had tripled in size from my first year.
It's important to point out the underlying reason for this: a US high school diploma is absolutely worthless garbage, so you need to continue generalist studies in university to bring people up to the standard of other developed countries.
E.g. someone with a French baccalauréat or enough British A levels can usually skip the first year or two of university in the US.
A US high school graduate with enough Advanced Placement classes can skip the first year of university (at least for public universities which accept those credits). But only well funded high schools can afford to offer that many AP classes.
Yeah, exactly. In France, getting a "baccalauréat général" (the high-school diploma leading to university studies) is optional, and not everyone does it. Just like AP tests are optional in the US.
There are two main differences.
Firstly, in France, this choice is available to everyone. Every student is able to go to a high school that offers it. It is not some unusual special thing that exists in rich neighborhoods.
Secondly, for people not enrolled in it, there are meaningful vocational options (bac technologique, bac professionnel...), so they are still doing something useful. Contrast this with the US system, where your choice is between semi-serious academic work that kinda sorta approximates a European standard, and non-serious, waste-of-time babysitting.
This is because there's, for the most part, only one type of high school in the US. It's not split into academic vs. vocational. The US is one of the only countries in the world like this. I believe it expresses the idea that everyone has the ability to succeed if they try hard enough, so we shouldn't be sorting people into more and less prestigious tracks. This is a false myth, but it's so deeply implanted in the bedrock of American culture that saying common-sense things like "it's possible to figure out who the good students are well before age 18, and allocate resources appropriately" shocks a lot of people.
> But only well funded high schools can afford to offer that many AP classes.
No need. Study on your own, then specially request the AP exam; that's what I did for the CS one, since my high school didn't have a class for it. I ended up being the 3rd or 4th to succeed at my school, out of around 600 graduating students per year for over a decade.
When I was an undergrad (finished in 2007) it was different. You were definitely assigned to a program which had required courses, with 1-2 half courses a year you could use for other courses. They didn't generally limit what classes you could take, and I filled out my schedule with computer science courses. This changed in my last year, when they switched to a specialization / major / minor system where you could combine courses to more or less create your own degree. I ended up leaving with a specialization in biochemistry and microbiology and a major in computer science.
That's the case in some liberal arts colleges, but it is not typical, or at least wasn't in the early 00's when I went to school. Back then most students did have to declare a major pretty early; for competitive majors like CS, usually when you applied. There was a "University Studies" major which you could be in for the first year if you were undeclared, but it was pretty important to declare a real major quickly.
I don't know about other countries, but for non-professional degrees in Sweden a program is essentially only a guarantee of being able to take certain classes. Otherwise you can take whatever courses you want, at whatever school you want, as long as you fulfill the degree requirements in the end. That usually means half your bachelor's credits are in the same subject.
The rules state that you shouldn't comment about voting because it makes boring reading and doesn't make a difference. But it is both informative and does make a difference if you add new information, which I did. If the moderation team wants no comments about voting at all, they need to fix the issue of undue voting or change their reasoning. Until then I am going to act in the way that I think is best long term for a productive discussion. I don't have a problem with people voting down an opinion, but when something is factual, downvoting it should be called out. Because the people who do that are destroying the site for everyone. I don't see what your comment adds, however, which should be the first rule of posting.
My (UK) uni admits about 350 people to the Bachelors programme (I think about 250 survive the first two years) and about 100 more into the Masters programme. This year we had over 300 people take the AI module and it has absolutely resulted in a decrease in teaching quality, and I'm not sure if some of the students are qualified to be there (namely, Chinese students who can't even read the coursework specification correctly).
International students are moneymakers (they got £75k off me) so I'm pretty sure they also over-admit.
But everyone who was admitted can take the class, right? This article is talking about people being admitted, but then not being able to take the class at all.
I think this is a problem across the board in UK universities these days. When I did my CS degree (1993), I was one of two people in the year who got a 1st class degree; now I think something like 25% are awarded 1sts and the rest get 2:1s.
I'm not convinced people aren't just legitimately working a lot harder.
When I went to university graduating in 2007, so just before the crash but not really very long ago, everyone was pretty chilled out about their courses. Nobody did internships. As long as people weren't failing they were relaxed and enjoyed themselves. Getting a 2:2 was a bit of banter rather than a serious threat to your ability to ever get a good job.
Now when I see students they are laser focused on absolutely doing the best they can in order to survive in a much more competitive world. I'd be working harder now as well.
> When I went to university graduating in 2007, so just before the crash but not really very long ago, everyone was pretty chilled out about their courses. Nobody did internships
Perhaps these things go in cycles. When I was in university in the late 90s, almost everyone tried to get some sort of internship over the summer, and some were very competitive.
The telecoms were booming back then and I was turned down by one of the big companies (Nortel) before landing a co-op at a smaller network equipment maker.
I was definitely influenced by the on campus CS culture (big "top 5" state school), which encouraged summer internships. However, I could also see how people who entered college after seeing and experiencing the economic pain of 2008 would have a more focused approach to how they spent their time in university.
British final undergraduate degree grades are first class honours (1st), upper second class honours (2:1), lower second class honours (2:2), third class honours (3rd), or ordinary degree.
That's what you put on job application forms. We don't use a GPA.
A 1st is good. A 2:1 is fine. A 2:2 is a problem. A 3rd is a failure.
Some people use cockney rhyming slang - a 2:2 is a 'Desmond' (Desmond Tutu, two-two).
In some cases you can get a double 1st, or triple 1st, but these are very specific to your university and are about what courses you took.
And, critically, the distribution varies by discipline: science subjects hand out more firsts and thirds, while humanities subjects might give two-thirds of their students 2:1s. So as a single grade it's not particularly meaningful even within a given university. There's some cross-university moderation (there are external examiners for each course from somewhere else), but that only goes so far.
I'm not from the UK but I remember hearing that they were making school much easier over time there, almost to the point of "impossible to fail". Is this true for ages 5-18?
The US, especially liberal arts colleges, is pretty wary of premature specialization. A common thing here is that you apply to a university, take classes in a range of departments, and then by the end of sophomore year (two of four) you decide on a major.
I went to Swarthmore (referenced in the article) before CS enrollment surged (www.swarthmore.edu/computer-science/alumni) and there were some people in my CS classes who decided to start on a CS major their junior (year 3 of 4) fall.
I went to UT Austin and worked there until 2004 or so...
In addition to the department (Computer Science, say) requirements, there are required classes for the college (Natural Sciences) and the university. A typical bachelors degree would be 120-130 semester hours, divided up into approximately 1/3 departmental, 1/3 college and electives, and 1/3 university and electives.
Someone like me (and I did exactly this) would go into CS and take 1-2 CS classes per semester for the first couple of years, then more once the general requirements were out of the way and they had the prerequisites for the higher classes.
Someone who didn't choose a major initially would take all the general requirements and electives until they did decide on a major.
A semester hour is one hour of class time per week for a 14-15 week semester; most classes were three semester hours.
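For anyone unfamiliar with the system, the back-of-the-envelope arithmetic works out like this (the 120 hours and 3-hour classes are from above; the even eight-semester split is just my assumption):

    total_hours = 120        # low end of a typical bachelor's (120-130 above)
    hours_per_class = 3      # most classes are three semester hours
    semesters = 8            # assumption: four years, two semesters each

    classes_total = total_hours / hours_per_class      # 40 classes
    per_semester = classes_total / semesters           # 5 per semester
    print(f"~{classes_total:.0f} classes, ~{per_semester:.0f} per semester")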
How many computer science classes are needed for your degree? I believe when I was in college it was 9 classes (each one being one semester, about half a year). I took more than that, because I liked it, but many CS students just took the minimum.
(This doesn't count requirements outside the CS department, like math, or other graduation requirements like "some total number of classes, at least three humanities classes, at least three social science classes" etc.)
> How many computer science classes are needed for your degree?
Most non-US universities don't work like this. You get a programme that you follow, sometimes with alternatives that you can pick from. You don't have a target number of classes or credits or hours to complete. Everyone takes the same classes (except for the alternatives) at the same time in one big cohort all the way through the degree. There's not much flexibility for a minimum or maximum - there's just the one set path.
It's definitely different for US vs non-US schools, but I don't think counting in years is a good measure as it disregards depth, skipped courses, total # of courses, # of concurrent courses, etc. Even course stats like that are pretty handwavy.
It's more useful to refer to the CS curriculum standards across ABET accredited schools.
That said, we don't have anything like a comprehensive undergrad final exam for CS schools that would validate this.
The article is using an elective course as an example, for some reason. Usually what you do (based on my experience at UT Austin) is try to register for all your classes in the system, with priority based on major and year. Then if there's something you really need to get into that's already full, you go to an adviser and tell them why you have to get into that class that semester and they can override the system and get you in.
Sibling comments mostly cover it. Outside of truly specialized professional programs (engineering, nursing, architecture), US colleges delay specialization until year 2 or 3.
The other notable difference is an undergraduate degree in the US is typically a 4 year program. Many in the UK are 3 year programs.
The system is completely different in Scotland from the rest of the UK - undergraduate degrees are usually 4 years here rather than 3 years in rest of the UK.
This is largely because admittance is based mainly on what people do in the 5th year of high school rather than on 6th year A-levels.
The classes aren't usually limited to people majoring in CS. Lots of non-majors take these classes, and the popularity of a class among all students is not controlled by limits on how many can major in CS. In some schools they give CS majors first priority, but there are still all sorts of issues such as scheduling conflicts causing a major to need to sign up for a class late, after it has been opened to non-majors. American universities are not trade schools, and most require their students to take several classes--which students choose for themselves--that are entirely outside their majors. Waves of popularity then rise and fall.
Where I did my CS degree, any student in the major was always going to be able to get their core courses on a proper schedule that allowed for on-time graduation. Sometimes that meant the college would add entire extra courses if there were more people needing something than expected.
Did I get every elective (not all electives were taught every semester, some were only taught once every year or two, there was a large rotating selection - You had to take a few from different areas) that I was interested in? No, but I got most of them.
-------
However, if you were a non-major looking to take some CS classes (beyond an intro one specifically intended for non-majors), you would have a tough time getting in. You were only allowed to register for those after all the students majoring in CS registered.
US colleges' approach to admissions is more like selling aeroplane tickets without asking people where they want to go, then being surprised when they all get to the airport and try to cram onto the one plane going to the same place.
In other places you buy a ticket for a destination. Yeah, some flights might be a bit overcrowded, but it's not as bad as having no idea how many people want to go to each destination at all.
There is certainly an aspect of overbooking present in university admittance/enrollment:
Universities regularly "admit" more students than they can actually allow to "enroll". If a university can physically support a freshman class of, say, 1000 students, they will "admit" a number higher than that, say 1300, knowing that only a percentage of the students that they admit will actually end up enrolling (because very simply some of them choose to enroll at a another university that they were also accepted into).
Here at the university I work at they refer to this "admittance/enrollment" ratio as "yield" or "yield rate". The yield is tracked from year to year and the number of acceptance letters sent out is based on this historical data.
Sometimes, though, the University is surprised (like this past year here) and gets a much higher yield rate and ends up with more students than dorm rooms...
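To make the arithmetic concrete, here's a toy version of the sizing calculation (the 1000/1300 figures are the hypothetical ones above; the yield history and the simple averaging are invented for illustration):

    # Back-of-the-envelope admit-pool sizing from historical yield.
    target_class_size = 1000                    # beds/seats available
    past_yields = [0.74, 0.78, 0.76, 0.79]      # enrolled/admitted, by year

    expected_yield = sum(past_yields) / len(past_yields)
    letters_to_send = round(target_class_size / expected_yield)
    print(f"expected yield {expected_yield:.0%} -> send {letters_to_send} letters")

    # If the actual yield comes in at, say, 0.85, you enroll ~1100 students
    # and run out of dorm rooms -- the surprise described above.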
At my university, in this situation, there _were_ more students than dorm rooms, and we have a freshman live-on requirement, so the university actually had to pay to rent out several entire floors of commercial apartments near campus.
They aren’t surprised. Rather, they simply can’t make all planes go to the same place. That happens over time, but then you have recessions where that plane becomes extremely unpopular, which messes up that growth.
Overbooking increases the likelihood of full flights which increases the efficiency of the system and reduces costs. It's hard to see a problem with that.
I think it's a value judgement. Is a lowered price of the majority of instances of a thing (flying or taking a class, in these examples) a worthwhile trade for the reward being a probability rather than a guarantee? Is the severity of the minority case (getting bumped) worth the trade? What actual probabilities for happy vs. sad path are acceptable to you?
Those are subjective questions with no inherently correct answers.
Also there's some worth in having spare capacity. When you're constantly overbooking and suddenly there's a spike in demand, you're SOL and there's no good way out of this.
In the UK they oversubscribe most 101 courses, including Psych and Comp Sci. The real watershed is Junior Honours classes which are fixed places. Even 201 is very large.
Oversubscribe in what sense? In that they're a bit crowded maybe - but it's not the case that they admit you to the university and then won't let you take the course, which is what this article is about.
Did you go to Edinburgh? They have a US system unlike the rest of the UK.
Oversubscribe in that they fill the auditorium and the overflow rooms with high numbers (Psych 101 was 500+ and CompSci was 250+). Many will drop out or won't make the level 2 cut.
Follow the money. The cost per credit hour is the same regardless of what classes you take. That 500-person freshman CS (or engineering, or bio, or whatever) class is a big money maker for the school compared to that 40-person capstone class. Schools want to make money, so they pull out all the stops to boost freshman enrollment.
The school doesn't care if students drop out because they already got their $$. The school doesn't care if students who can't hack CS switch majors because then they take more money making 100 level classes in their new major.
And you would have had to have done related A levels for the previous two years.
You can't just rock up at a university and say you'd like to study X without any previous education - this is what the Raspberry Pi was originally made for.
To be fair, you can certainly give it a go. There are no requirements to be able to submit an application, and university grade requirements are rarely absolute, only recommended.
You might not be able to get into a top-flight Oxbridge university, but show a history of interesting projects, and maybe attend hackathons, competitions, etc., and many universities will fight to have you.
I mean, they will want good grades in maths (and decision maths covers some parts of the CS curriculum) but little else subject-wise, perhaps physics; other than that, just good grades generally.
The thing is that the Computing A Level wasn't widely available and the quality differed between qualifications and exam boards as well - whereas Maths is pretty consistent and is run at every school.
I was fortunate enough to be able to study Further Maths as well, which honestly I would consider a better preparation.
I sadly concur, as someone who did fine at Computing A-Level with no work, found Further Maths difficult and was lucky to get a B with extra tuition and some juggling of module results to maximise my two final grades, and dropped out of a CS degree.
I liked coding for fun but really had almost no idea what CS was when I applied. So maybe the US system would have been better for me in that I could have dipped into a bunch of 101 courses and changed my major.
In the UK system, most people don't go to other subjects' lectures and if you don't like what you're doing, you have to apply from scratch. There's usually no way to change course at the same university or even get any kind of help or advice about finding an alternative elsewhere.
One of these years I'll work out what I should have studied and do that instead... actually, I probably won't, now it costs £30k+. I'm glad there are MOOCs so I can satisfy my urge to sign up for random things, watch one week's worth of lessons, and then never go back again, all for free.
I had some idea of what CS was as my dad was a programmer.
I chose Physics though in the end - I think the subject that really gets let down is Engineering, I had no idea what that meant.
The UK (and the rest of the world) really needs much better access to education - there is no reason it should cost 30k when huge parts can be delivered MOOC-style and the remaining exams and labs done in person.
Even the Open University costs a fortune these days! I hope Corbyn's National Education Service idea might fix it - if he gets elected and Brexit doesn't wreck everything...
Ah, so it's been taken over by the maths department :-)
Back in the day I tried to switch from an HNC-stream mech eng course to a Maths, Stats and Computing one, but it was 99% pure maths (no CS maths) and the computing element was nugatory and ancient.
I made an account just to comment on this link as I have personal experience in the matter.
I graduated from a relatively good school with a degree in Computer Science, yet wasn't admitted into the major until 2 weeks before graduation. I spent my entire college career in Computer Engineering while taking Computer Science courses and (thankfully) was able to complete the curriculum without actually being a part of it. I lucked out as at my school Computer Science isn't locked down like many of the other engineering majors, so anyone can take a CS course as long as they meet the pre-reqs. It would be an understatement to say the experience was harrowing.
That's so strange. If there's room in the classes (as appears to be the case), why limit the number of majors?
My school (UVA, in the late 90s) had already converted Computer Science to a limited-enrollment major to get around this problem. Declared CS majors had preference for CS courses. It wasn't quite so bad that others were completely locked out, but if you weren't declared, you had to be quick to register and might not always get into fun electives (the core courses were usually larger and easier to find a seat in).
Since that time, they've actually added a second CS major. They now offer the original BS in CS through the engineering school. And a BA in CS through the college of arts and science. The two primary differences being a foreign language requirement for the BA students, and a heavier emphasis on math and general engineering in the BS program.
Since when is being a software engineer a "high-status" job, as mentioned in the first paragraph? Most people would consider a non-washed out doctor or lawyer (which is expensive and difficult to achieve) as having higher status than pretty much anyone working with a CS degree, unless they are a successful founder or went into investments. Besides medicine/law, young people with some degree of privilege and ambition seem to go into banking, private equity or management consulting still. If they have a CS degree, they're doing something closer to the above with it, not being ads CRUD engineer #3445. Everyone working for a for-profit company as a software engineer and even founders and VCs are basically dancing to the tune of one of these people holding the purse-strings. Is this wrong or do I just have a chip on my shoulder about this?
> Besides medicine/law, young people with some degree of privilege and ambition seem to go into banking, private equity or management consulting still.
The number of MIT/Stanford/Harvard kids running around the valley should disprove that point. As someone who attended a Goldman/McKinsey farm school, many of the prestige chasers who would in another era have gone to a bulge bracket are going to work for a FAANG or a unicorn instead because they do see tech as an appropriate substitute for a more traditional high prestige career.
If by lawyer you mean partner and by doctor you mean a practicing surgeon, both of those are pretty far into their careers. It's not very fair to compare them with an entry level engineering job.
By lawyer I meant either hired at big law or running a profitable practice, by doctor I meant made it into residency. This is only one, narrow dimension of "high-status", but still. Newspapers get things wrong all the time.
Interesting. Manual laborer or "blue-collar" in the US is almost like a separate "tree" than professional or "white-collar" jobs. If you actually have a skilled or semi-skilled blue-collar job that hasn't been displaced by immigration or automation you are doing pretty well compared to those in the bottom part of the "white-collar" tree. Also the majority of your friends and family would probably be blue-collar and you wouldn't even be comparing yourself to doctors and lawyers.
I've heard that "actual" engineers, like people who design plants, vehicles, bridges, buildings, etc. are considered higher status in Europe than in the US. Here I think probably the top 20-30% of software developers are better regarded (and make way more money) than most of these engineers, unless these engineers are on their way to upper management or determining investments. I also think whether you're one of those people -- your social class, basically -- is pretty much determined by the day you turn 18, even in the US.
While the article bemoans these programs becoming restrictive, it is ignoring a major reality. Let's take a person. This is the person that might end up getting 'weeded out' before they've even gotten their foot in the door. They probably have 0 or near 0 background in anything related to computing. They similarly likely have negligible background in anything related to science/engineering/mathematics. And keep in mind we now live in the era where there are a million free resources online (as well as compilers/etc) meaning the cost of doing any of this is practically 0, so they've generally chosen to never pursue any of this.
Let's take this person, John Doe, and imagine he gets accepted to a computer science program at a well regarded university. The majority of people in his class will fail/drop out of the program before graduation, regardless of background. How likely is John Doe in particular to make it through the program? I don't know the exact number, but it is going to be extremely low. What these 'barriers' do is not only save people from wasting their time, but also make room for people who stand a better chance of making it through the program.
And furthermore rejection is hardly some death sentence. If somebody genuinely wants into the program then they could take independent classes at a community college, take remedial non-major classes and demonstrate excellence, or any of a wide array of other options that could then be segued into acceptance next year/semester. And this doesn't even necessarily have to slow them down. There are so many core non-major classes required that you can get those out of the way and ultimately end up graduating in about the same time as if you were accepted to begin with. Imagine getting your calculus, linear algebra, physics/chemistry/... pick, etc stuff all done in your freshman year. That would've actually been AWESOME to have been able to have your schedule packed with nothing but CS classes and maybe a few softball liberal arts requirements for your later years.
It's not a failure of the system; students are, increasingly often, mistaking universities for diploma mills. Some universities gradually acquiesce, others hold their ground.
The universities that maintain their standards for graduation will typically have droves of students dropping out, because there are droves of students who do not put effort into learning the material.
Attrition rate is high in engineering - I assume it's similar for CS.
They don't all drop out of university - just the program. They usually change majors. I assume most still get a degree. People who can't handle CS usually go for BIS or something IT related.
Georgia Tech's online Masters in CS appealed to me for this very reason. Classes generally seat 300-700 and are limited only by the number of TAs they can hire. There's a fair chance of getting into popular classes since they're not limited by the dimensions of a physical room.
My school, Purdue, has handled the surge pretty well, primarily by being very accommodating towards increasing capacity (when CS majors are involved) and severely restricting non-major access to CS-major classes.
For the lower level core classes required of all CS majors, space is basically guaranteed for CS majors. If there's not enough seats, capacity will get increased to accommodate the students who need to take the class.
In the upper level classes, capacity will generally be increased if needed. Sometimes, there's a handful of seats (1-10) open in the less popular class at the beginning of the semester.
During registration (which occurs two-thirds of the way into the prior semester), only CS majors are allowed to sign up for CS classes. All other students (including CS minors) are required to submit a request which will be decided on a space-available basis the week before classes start. Those requests are only granted for students who have taken <=5 CS-major classes. There's a very specific sequence that CS majors usually take the core classes in. The semester a CS major would take a core class if they're on schedule is called a "peak semester". Non-majors are completely barred from taking those core classes during their peak semester.
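Spelled out, that request review is just a few checks. A toy sketch (the rules are as described above; the function shape and field names are mine, and majors never hit this path since they register directly):

    def review_request(student, course, seats_open):
        """Toy version of the space-available review for non-majors."""
        if student["cs_classes_taken"] > 5:
            return "denied: already taken more than 5 CS-major classes"
        if course["is_core"] and course["is_peak_semester"]:
            return "denied: core class in its peak semester, majors only"
        return "granted" if seats_open > 0 else "denied: no seats left"

    print(review_request(
        {"cs_classes_taken": 3},            # a CS minor, say
        {"is_core": False, "is_peak_semester": False},
        seats_open=2,
    ))  # -> granted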
Isn't CS teaching the kind of thing that ought to be able to scale?
There's no other subject that's as well covered on the internet; nothing in an undergraduate course isn't public knowledge. There's loads of video lectures these days.
And the coursework is incredibly scalable too, in addition to being similar to what people actually use in industry. My brother did CS at an Ivy, and they were just pushing repos to a server, and the UT software would check you'd done it properly.
Couldn't you just let everyone enroll, then make sure they do their practicals?
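The grading side of "make sure they do their practicals" really is just a loop. A toy sketch, assuming one checked-out repo per student with the assignment's test suite shipped in it (none of these paths or conventions are from the article; they're made up for illustration):

    # Toy autograder loop: the marginal cost of grading one more student
    # is one more cheap test run, which is why this part scales.
    import pathlib
    import subprocess

    SUBMISSIONS = pathlib.Path("submissions")  # one repo per student

    def passes_tests(repo: pathlib.Path) -> bool:
        """Run the shared test suite inside a student's repo."""
        try:
            result = subprocess.run(
                ["python", "-m", "pytest", "tests/", "-q"],
                cwd=repo,
                capture_output=True,
                timeout=60,  # don't let an infinite loop hang the grader
            )
        except subprocess.TimeoutExpired:
            return False
        return result.returncode == 0

    for repo in sorted(SUBMISSIONS.iterdir()):
        print(f"{repo.name}: {'PASS' if passes_tests(repo) else 'FAIL'}")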
Every topic has plenty of literature published. Scantron was founded in 1972. Self-directed learning has always existed. CS isn't special.
A bachelor's degree once meant more than passing a sufficient number of tests. Now that everyone treats it like a ticket to employment, it barely even means that.
I would agree that CS isn't special here; it's something common to at least STEM. And personally I don't know why the first year isn't optimized for that. I don't think, for example, that there is any reason to teach Math 1 for CS every semester with a prof in a room limited by however many students might fit in it. It's simply basic knowledge that just doesn't change, but that you need as a prerequisite for understanding later material. There is absolutely no reason to waste the time of professors, or students for that matter, on that. Instead of making admission to college dependent on prior work, why not simply provide the necessary books and film the lectures once? You learn the fundamentals yourself. Admission would be a lot fairer, we wouldn't waste an incredible amount of professors' time, and they could in turn teach actually interesting material.
The whole thing became quite apparent when a former roommate of mine studying mechanical engineering asked his father to help him study for a first-year course, and they figured out that his father had used the exact same literature in his college days as my roommate was using now. The content hasn't changed, yet we keep throwing time at manually repeating it, or even go as far as reinventing it, with everyone writing their own script on the topic.
>Every topic has plenty of literature published. ... CS isn't special.
Not as nicely made video lectures. I doubt I can find video classes of some of the upper electives I took in engineering. Sure - textbooks have always been available. But they are dense, and rarely does a class cover the whole book. It helps to have a professor point out "You should know this" and "Don't bother with this - it's just academic detail."
Yes. And is being scaled. There is unlimited access to freeCodeCamp and the WGU C.S. degree. I'm pretty sure that if you go through those, you'll get a start as a software dev. The C.S. degree is probably unnecessary for getting employed. It's still a multi-year process, but available to anyone who wants it.
Automated testing scales easily, but properly reviewing code/proofs/technical prose really doesn't. Lecturing scales even more easily, though maybe not if you want students to be asking questions in class.
I had a professor mention that during the last bubble (circa 2001), they had dedicated an entire building to the EECS department. Shortly thereafter, the bubble popped, enrollment collapsed and the EECS department went back to just the hacker type. It seems we are in another one of those cycles.
Anecdotally, when I was in university around 2012, I noticed that the same people who would have gone to medical or law school abruptly changed their paths to CS. When I started hiring people, I noticed the same trend in resumes.
If CS is so popular right now, how can it possibly command high pay by the time these people hit the job market? Do they all expect to compete against each other for high paying positions? How is that a good bet? Do they expect that there will be a ton of jobs that will all pay well?
There's an information delay that causes these trends. Remember that the people making decisions on their college majors are 16-18 year olds. They're basing these decisions on $$$. The university encourages these decisions because of $$$. The new students aren't looking at the job market and recognizing that they're pushing it towards a glut (as a cohort), as individuals they're making a sound decision in that moment with the incomplete information and lack of forethought typical of someone that age.
Of course, that lack of forethought isn't exclusive to them. You see the same behavior in other markets. Today I see that someone has developed X and is making money, but the market isn't captured (lots of growth potential). So I decide to make X (sound decision on what I know). However, 98 others also make the same decision at about the same time. By the time we all start shipping X we've flooded the market and now it's not profitable (getting too small a slice of the whole, or competition drives prices too low). I'm trying to recall the precise term in systems theory, "bounded knowledge" maybe?
EDIT: "Bounded rationality" was what I was trying to recall. It's useful in a number of fields, I came across the term (proper, the concept wasn't new to me) in studying systems dynamics.
I've talked about this with friends – is CS a gold rush right now? I'm sure with the younger generation being fed that it's a cheat code to make money the market will flood and wages will be driven down. Or, web dev jobs will be flooded and proper system design jobs will stay the same since they have such a high barrier of entry? We'll just have to wait and see.
I'd be curious to know whether there's a good way to avoid this. As someone who's currently in high school and looking to major in CS, this really worries me - am I actually going to be able to find a job a few years from now? Should I be planning to study something else instead?
Personal opinion, the general guidance I give to teenaged kids of friends:
The best way is to choose to do it because it actually interests you, not just because you want the money. You can get a job with the degree despite the glut of graduates. Just accept that you may not always be getting the $150k/year starting salaries you sometimes hear about.
But if CS is only of passing interest and you have other majors you're considering:
I wish more people would minor in CS, rather than major in it (NB: Not all schools have a good CS minor program even if they have a good CS major program). I have a lot of engineer friends (from my professional career or my time in school) who have little programming skill, but find themselves increasingly needing to program. Even the first 5 CS courses in most programs (versus the first 1-2, at best) would make them 10-100x more effective in their jobs.
A former colleague (EE) wrote test analysis software (we were both doing verification and validation work) that took minutes to analyze the data. My slightly improved version got execution time down to seconds in the worst case (a 500MB file being processed). The improvement didn't need more understanding of programming than a 2nd year CS major would have, but he didn't have it because he'd taken literally one programming course (and never programmed again, except for small matlab things). I've seen similar things from aerospace and other engineers. Their code is usually correct, but often inefficient. Or they lack an understanding of the underlying memory model of a language like C and try to do impossible things (that, again, a 2nd year or so CS major ought to know are impossible).
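I don't have his original code, but a hypothetical example of the kind of fix I mean, a list scan versus a hash lookup:

    def slow_filter(records, interesting_ids):
        # `in` on a list rescans it for every record:
        # O(len(records) * len(interesting_ids))
        return [r for r in records if r["id"] in interesting_ids]

    def fast_filter(records, interesting_ids):
        wanted = set(interesting_ids)   # build once; lookups are ~O(1)
        return [r for r in records if r["id"] in wanted]

Over a 500MB file's worth of records, that alone can be the difference between minutes and seconds.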
So: Stay the course with CS if it is of real interest to you. Otherwise, look to double major or minor and use your CS skills to stay ahead of others in your discipline.
Thanks, that's a good point! I definitely am interested in the subject itself, and I think I would want to do it even if the money won't be as good as it is now, but it's hard to deny that's a factor as well. Mostly, I'm worried that there will be enough oversupply that there will barely be any jobs available at all, but it's reassuring to hear that may not necessarily be true.
I almost wonder if it would make more sense to major in CS and minor in something else, which would hopefully make it easier to specialize in a particular field rather than competing against a lot of generalists. I'm not entirely sure what else I'd want to do though, which goes back to the lack-of-interest problem that you mentioned. It's definitely something to think about!
Yes. My school (UC Berkeley) was the same (it felt like everyone wanted to study CS, so they set a 3.3 GPA cutoff: <3.3, no CS for you, and classes were STILL overcrowded clusterfucks), but my graduating class had a $105k avg salary.
It used to be that way, but nowadays Data Science, Statistics and Applied Math are a little more popular (in that order) since they give larger flexibility. Especially Data Science, a new major that had its first graduates this Fall! CogSci is still a pretty popular major too; the reality is there are people studying "CS" across at least 6 different majors at Berkeley (EECS, CS, Data Science, Stats, Applied Math, CogSci) who will be future software engineers/data scientists. Note that Berkeley is a ginormous school, so this is a lot of people... When I was at Berkeley it felt like almost every student was somehow affiliated with CS one way or another, even social science students (of course there is some sampling bias here).
(Of course, one should also note that many people studying Stats, Applied Math and CogSci are studying not for CS but for other applied fields (physics, bio, etc.). These majors offer huge flexibility in their applied cluster.)
Having a degree is not the same as being able to work as a coder.
Sure it overlaps, but it's marmite-like work: you either love it or hate it.
You also have a fairly harsh selection process, at least at the top-end companies. Not everyone is going to pass the whiteboarding quiz, and although you can study for it to some extent, it feels much harsher than hiring in most other fields.
More CS workers means more output, which means even more CS workers can be hired. Since knowledge work has such a high productivity multiplier we seem a long way from saturating the market.
Well, it won't, just as it doesn't in most of the world. In Europe, CS pays a middle income, about as much as a better-earning manual laborer (and with the same status), with incomes ranging from 35000-70000 in my country. Companies in the US would probably like that as well.
It'll also drag the quality of the major way down. I TA'd a few classes at my university and the quality of the students is not that high already.
Entry level/junior positions already have a plethora of candidates. Where companies are sorely lacking is in senior-level talent. It's going to be a gut check when all of these CS grads come out and they are competing with everyone else plus all the boot-camp coders.
There's not a ton of overlap with boot camp grads. At least here, only the bottom barrel of CS grads even apply to positions that are good fits for boot camp grads.
> There's not a ton of overlap with boot camp grads. At least here, only the bottom barrel of CS grads even apply to positions that are good fits for boot camp grads.
That is not true at all. Maybe if you are in the upper echelons at the Stanfords, MITs, and CMUs of the world doing advanced research in AI/ML/Robotics/Security, but for the most part CS grads are going to school to get a job -- I was one of them.
Some universities rank those majoring in CS higher up the registration chain than those taking it as an elective, those minoring in CS, and those returning to school. This means multitudes of students face difficulty getting into their 100- and 200-level courses. At my school it isn't uncommon to have three sections of 300 students each in the 101 class and 300+ people vying for limited seats in the 200-level class. One of the sophomore-level classes expanded to nearly 500 seats, but that doesn't solve the problem of getting into the upper-division courses that are capped at 150 people. Schools are shutting out a large population of students registering for these courses, which may include first-generation students, women and minorities. Even the 100-level courses can be nearly impossible to register for at this point.
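Mechanically, that registration chain is just a priority sort. A toy sketch (the tiers follow the ordering above; everything else is invented):

    PRIORITY = {"cs_major": 0, "elective": 1, "cs_minor": 2, "returning": 3}

    def fill_section(requests, capacity):
        """Seat by tier first, then by when the request came in."""
        ordered = sorted(requests,
                         key=lambda r: (PRIORITY[r["tier"]], r["time"]))
        return ordered[:capacity], ordered[capacity:]  # (enrolled, shut out)

    enrolled, shut_out = fill_section(
        [{"tier": "cs_minor", "time": 1}, {"tier": "cs_major", "time": 9}],
        capacity=1,
    )  # the major gets the seat even though the minor asked first

With 900+ students chasing those 100-level seats, everyone below the top tier ends up in the shut-out list.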
There is also a factor of lag in budgeting. When I first started computer science at my school it was a smaller department (maybe 2-3 intro courses of 50 people). When I was a junior, my department didn't have the budget to put on an off-schedule OS course. The budget was consumed by 13 intro courses.
As I was graduating, my program not only had 10-13 intro courses but had 4x as many electives as it had when I started.
Budgeting is an issue at my school as well. Our school is understaffed and has been opening up positions for new faculty, including teaching-only positions (PhDs hired on renewable contracts).
I've also heard of the idea of charging a departmental fee to become a CS major, similar to the fees certain engineering departments have. The idea here is that this fee would go to the department but give the student access to things like hackerspaces and other resources that everyone else in the school would presumably be excluded from. A lot of approaches are being tried to address this issue.
Isn't this a problem of too much coupling between credentialing and education? People aren't seeking to sit in a room of 400 through a lecture given by a certain professor at a certain time. They could probably find the lecture video online and watch it whenever and however frequently they like. They are only seeking the stamp of approval from a well-respected institution. Assessment is not the same problem as education. Even though online education is getting some traction, it seems as if employers (and parents!) can't stop indexing so heavily on traditional multi-year lecture-driven university education.
It's ironic that university engineering (especially software) departments teach the modularization, decoupling, distribution, reuse, etc. that make it possible to scale like Google, Amazon, etc., but fail to apply these lessons to themselves.
The hard part of computer science is the staggering complexity, the abstraction, the language/framework/library learning fatigue, the stress of trying to deliver on overoptimistic deadlines. Getting into a class is nothing in comparison to these actual challenges.
Another idea to retain or hire faculty: let them hold dual appointments at tech companies and universities. Not sure if this is a good idea, or if it works, but I'd certainly be interested in studying the outcomes. It could potentially create a conflict of interest between academic research and industry-driven research that profits the company, or even be used as a way to fast-track students to said company. Not that these things are inherently bad, but such arrangements could have undesired outcomes for the universities that try them. Still, it could be worth exploring. Or, universities could become more competitive with salaries.
I had several classes taught by adjunct faculty who had "day jobs" in my years at UVA. Although they were all in economics, not CS.
So, it can work. But, you need a good supply of local professionals with the aptitude to teach. Probably not too hard for entry level courses, but 3rd or 4th year courses are likely harder to find suitable adjuncts. I'm sure I could teach CS101, but no way could I teach a senior level OS class.
My school had industry practicing adjuncts like this. It mostly works though adjunct comp is radically different than tenure track. Adjuncts did it because they enjoyed teaching or were seeking a more permanent professor position, not for the adjunct comp. There are definitely some perverse incentives around exam difficulty and grading.
We also had full-time instructors aka teaching professors (no research).
I know I’m one data point, but I’ve taught classes as an adjunct before and my (tech co) employer was fine with it... it’s totally workable if you can say get Friday mornings off.
This presents a more interesting problem than the article realizes. Computer Science has long suffered from overstuffed first year courses.... but only first year. By the second year, the vast majority of students have changed majors or otherwise left the computer science field. With even greater overcrowding of first-year courses, it probably makes it even harder for a person earnestly interested in computing to get in. At least when I attended school, every single person who gave their reason for choosing CS as 'I heard there was good money in it' was gone by the 3rd semester. I'm not entirely sure, but I've always suspected this might be due to the fact that while you can BS your way through an essay, you can't really do that with a compiler. Either that or a fundamental psychosocial resistance to hard-line logical thinking.
In any case, perhaps they should factor this in somehow and do a bit of preparatory work to give potential students a real sense of what the work and subject are actually going to be like, before they fill up the first-year courses and waste everyone's time by dropping out?
Why the emphasis on recruiting new faculty to teach undergraduate-level computer science courses? Does the material genuinely require the input of a professor? It seems that universities/colleges are experiencing short-term shortages of teaching personnel that could readily be filled from other sources without hiring new faculty.
At my university a professor isn't usually newly hired to teach undergraduate-level CS courses, but the university doesn't want to hire a non-PhD to teach undergrads. It's not a good look for a university to hire non-PhD-holders for its teaching positions. However, CS PhDs can (relatively) easily go into industry or onto the tenure track, depending on what their research was about, so there aren't many PhDs willing to take a teaching position.
I'm very curious to see the number of people who start in CS and drop versus the number who end up in CS. I know at my school there are a lot of people who come in wanting to do CS but end up dropping, and a decent number who end up in the CS major either via other STEM majors or just randomly.
Because of this churn, the intro classes need to be a lot bigger than the rest, simply to accommodate both sets of people. And really, intro classes are the most important part of the CS degree, for motivating people, for teaching the practice of programming and for the skills that you'd actually use on the job.
One potential solution is to just not hire research faculty for intro classes. While I do kind of believe in having actual professors teach undergrad classes, if schools are struggling to hire PhDs, why not look at non-PhDs? It's not like having a PhD helps with teaching intro classes. If anything, it probably hurts.
At my average small-to-mid-sized state school in the midwest, a typical CS drop rate from day 1 to graduation was ~3/4 either switched majors (mostly to business majors) or dropped out of school entirely. Technically it's probably even a bit higher than that because this is net of people switching into CS. Not sure how students who transfer schools are tracked.
This is what I would like to know as well. These cycles tend to follow the job market pretty well. I remember when I was in school after the dot com bust our CS department really cut back on entry level classes because the interest was not there. In the end I don't think it made a large impact in the number of CS graduates.
Meanwhile, at community college, a place with NO admission standards except a high school diploma, enrollment numbers are down, down, down. How do I know? I teach at one. And I used to teach at a different one, WITH TENURE, but we didn't have enough students so they let me go. A Merry Christmas that was.
> Some universities now require incoming students to get accepted into computer science majors before they arrive on campus — and make it nearly impossible for other undergraduates to transfer into the major.
I was slotted into an adjacent major (applied for CS, got slotted into CSE) when I went to undergrad at UCLA, and had to demonstrate grades etc. and apply to move into my desired major at a strategic time (after enough of the cohort in my desired major had switched away to other majors). This has definitely happened before.
I got so lucky when I did an internal transfer to CS at UT Austin. I was an economics major and wanted to give CS a try. At the time (~2013) all I had to have was a 2.5 gpa and sign a transfer document.
The next year the department exploded and kids with 4.0s and previous internships were getting denied. This was also the first year with the new Dell-Gates Computer Science building. I didn't have a class in that building until my Senior year!
I think this is a little different. American schools now court more foreign students than ever to populate these programs. Domestic students that would otherwise go into EE or CE have fewer job prospects compared to 20 years ago and are better off in CS too. Previous booms were more faddish. Now it's basic economics.
Interestingly, I went to an okay state school. I remember when I first transferred in to complete my computer science degree: I wanted to take 4-5 CS classes to get ahead, but only two were available; everything else was already filled.
I literally had to sign up for classes the moment registration opened in order to graduate, otherwise I would not get into the required courses.
I wonder to what extent the barriers of entry for CS programs filter out some of the best future software engineers. Being laser-focused on getting all the homework done and maintaining a 4.0 GPA is generally not the same kind of personality that results in creative thinking and problem-solving skills.
I'm reading that part of the problem is getting good teachers, because the tech industry wants them as well. Would the tech industry be willing to cooperate with universities and help pay for teachers / make them available to give classes? It's their future employees, after all.
At my school almost all of the electives were taught by people in industry.
They either taught a two-course elective sequence and offered internships at their company during the second course (hey, you've spent the last ~5 months training these people and now you know who the good workers are), or they taught one elective and offered internships to the students they liked at the end.
Their companies paid them to do this for... obvious reasons.
I've always really enjoyed teaching people about programming topics at work, and I've often wondered about trying to volunteer at the local high school or something. But having a wife who has worked as a teacher and a mother who is a college professor, I have zero illusions about how much work it takes to do this well. I'd need a pretty significant amount of support from an employer to even consider it, and I've never worked for an employer that I think would have supported it to that degree.
It's probably something I'll never do despite being interested in it because it would be too much effort.
Whoa. Don Fussell was my thesis advisor at UT Comp Sci a lot of years ago. That was back before the Gates building, when Michael Dell was still selling PC clones from his dorm room.
very stressful to try to teach under these conditions! the logistical problems of teaching and consistent grading grow worse than linearly with the number of students, unfortunately. if you want to have 500 people graded on the same curve, it's a lot more work to make sure they're all treated consistently and fairly.
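to be fair, the curve itself is the cheap part; a toy z-score normalization like the one below (target numbers invented) runs identically for 50 or 500 students. what grows worse than linearly is checking that every grader fed it consistent raw scores.

    import statistics

    def curve(raw_scores, target_mean=75.0, target_sd=10.0):
        # rescale raw scores to a chosen mean and spread
        mu = statistics.mean(raw_scores)
        sd = statistics.stdev(raw_scores)
        return [target_mean + target_sd * (s - mu) / sd for s in raw_scores]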
I've spent a couple of weeks looking at resumes from people applying for a web development position we have open and half of them mention machine learning or AI.
I was interviewed for five mins on the phone back in 1990. They asked if you programmed your own computer at home. Not sure what they would say if you said "No."
This wasn't particularly true by 1990, but early in computing history most engineers didn't have any computer in their house, computers being expensive and all. Maybe they thought of it from that angle. There'd still be a lot of older engineers who may not have gotten on board with the PC revolution that began in the late seventies.