
Universities are always several years behind the curve. At college in the 90s they were still teaching token ring networking despite Ethernet already being commonplace. The same college told me that programmers didn't design any of the code they wrote; they only transcribed code from flow charts.

Just yesterday I was talking to a grad about DevOps. He said the field sounded boring from what he was taught at uni. Then when we discussed it more it turned out his “DevOps” course was actually just teaching them how to be a scrum master and didn’t include a single thing about automation, infrastructure as code, etc.

I also remember just how garbage general publications were with regards to IT. And to be fair they still are now. But there was always a wealth of better information in specialist publications as well as online (particularly by the late 90s).




That may well be true of some universities today. In 1970, they were pretty much the only place you could get hands-on experience with a computer unless you somehow slid into a programming job in the financial industry, or one of the few other areas that actually used them. And they were not behind the curve on the technology, although they tended to have lower-end hardware than industry, because any compute was very expensive. The invoice on a 64K-byte HP3000 in 1972, which on a good day could support half a dozen users actually doing any work, was over $100K. Memory upgrades to 128K ran you about $1/byte installed - maybe $8/byte in today's money. It was a big deal to be allowed hands-on use of them.


I was talking about the 90s through to the modern era, not just the modern era.

And having computers doesn't mean any of the lecturers understand the modern (for that era) trends in computing. More often than not, it's computer clubs rather than course material that hold the really interesting content.

I don’t doubt there will be exceptions to this rule. But for most people I’ve spoken to or read interviews from, this seems to have been the trend.


It definitely is true of local universities. I've met people from the local university who have a master's in machine learning, yet have never heard of Docker.


This is a good thing. Opportunity costs are incredibly important with university educations because students have a limited time to learn.

Why spend the time futzing with a tool like docker? It's not foundational to machine learning, so learning that tool takes away from time that could be spent learning something more relevant. And the student may or may not use it when they get a job.


"Getting shit to work" is more foundational to machine learning than you would think, and containers helps a lot with that. If you want to train models on someone else's machine - and you probably will, for anything big - you need to know a little about how that sort of thing is done today.

And if you want to try two different deep learning frameworks, dependent on different versions of CUDA, and want them to not break each other, God help you if you try that without containers.
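For what it's worth, a minimal sketch of what that looks like in practice. The image tags here are illustrative (check Docker Hub for current ones), and --gpus assumes the NVIDIA container toolkit is installed on the host:

    # Each framework gets its own container with its own CUDA userspace,
    # so the two stacks never clobber each other's libraries.
    docker run --rm --gpus all pytorch/pytorch:2.1.0-cuda11.8-cudnn8-runtime \
        python -c "import torch; print(torch.cuda.is_available())"
    docker run --rm --gpus all tensorflow/tensorflow:2.13.0-gpu \
        python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"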

It's not that they don't have a "course in docker". I understand that. It's that they haven't even heard of it, so they don't even know where to start to look for solutions to problems like that. I have been through that pain myself.

Containers are just one of so many easy things that make your job so much easier, which I've learned the painful way in 20 years as a developer in (mostly) non-elite companies, where no one else knew them either, because they hadn't been taught at the local universities, where no one knew them either.


Docker I can forgive, but I’ve worked with a lot of grads who haven’t even been taught the basics of using the command line.


It's highly dependent on school. The Ivies, including the "public Ivies", will teach you proper comp sci. A lot of other big schools will do you well also. When it comes to smaller regional universities or junior colleges and community colleges, it's hit or miss. Your intro CS course may be great if you manage to get an instructor who knows the material well and wants their students to know it, or you may get someone who teaches students how to use Microsoft Office without a shred of programming.


I went to RIT in the early 2000s. I remember the CS and CE departments were quite good (although the prevalent Sun workstations were already getting outdated). Somehow I ended up taking one elective from the "Management Information Systems" department, and the instructor kept mixing up search engines and web browsers. I think I dropped the class shortly thereafter.


I dumpster dove at RIT to pull out a discarded VAX (I think an 11/70) and serial terminals. Probably about 1989 or 1990.


I was having to deal with token ring in '96-'97, and have not touched it since. Seems like it went away quite quickly. Cue up someone replying that they're still maintaining a token ring system in 2022... :)


I had to deal with token ring way up until 2001, when even the most die-hard nuts had to admit that you could buy a dozen Ethernet cards for the cost of a single TR card. IIRC the TR people tried to convince us that ATM was the future.


Not quite 2022, but yeah, I was maintaining a token-ring-based network for a subway at my last gig in 2019. As far as I know, no work is done on it now, but the subway cars using the system are scheduled to run for at least another decade, so another bugfix release of the networking firmware is not entirely out of the question.


Hah, not quite nowadays, but I, too, was dealing with one from around '97 to 2000-ish. What a pain in the ass. That was just one network in the building; I also had to deal with 10BASE-T, which was also a nightmare. *shudder*


I remember taking a graduate level networking course at NYU in the early 1990s. The instructor was an IBM consultant. We studied token ring, FDDI, SNA, HDLC/SDLC and several other commercial products.

One evening, I raised my hand and asked when we were going to study TCP/IP.

He simply quipped, "TCP/IP is not a real networking protocol."

So I wouldn't say that universities are always behind the curve :)


In 2015 or 2016 I was taking the computer architecture class at my local university… the processor they based the whole course upon was the Motorola 68000.


As far as introductory courses go, the older/simpler the processor, the better it is for everyone. My class groused at being taught "old tech" because we were taught the 68k, but very few of us had done any assembly before. I think most of the class would have failed if we'd started off on amd64.


And why wouldn't they base it on that CPU? If you're trying to learn the basics of shipbuilding, you don't start by going on a deep dive into the construction of an aircraft carrier.

It's a simple chip, with a simple instruction set, that can actually be taught to you in the time allotted over a three-credit class.


The class was worth 10 credits though.


The bit on "DevOps" is pretty egregious. There are two key things at stake here.

1. "DevOps" is an absolutely critical part of automation. It's the reason why we can start tech companies with such small engineering staff compared to 20 years ago. It's as important as all the high-level languages we use. This stuff is the logistics of how software gets deployed. It's the same in business as it is in war. Coding chops is like tactical strategy, and being able to ambush a tank column. It matters, and you won't have an engineering org without it, but the whole chain of how stuff gets deployed and iterated is what keeps the ammo flowing and the fuel pumping.

2. Universities want to teach stuff that'll still be relevant in 50 years. Given their proclivities, that means stuff like algorithms.

On one hand, I think that universities and academics can be somewhat forgiven for their ignorance on this matter. In fact I think we ourselves don't know what's going to be needed in our field in ten, twenty, thirty years. If the folks in industry didn't predict infrastructure-as-code 20 years ago, then the universities couldn't have taught it.

But what I know now is that:

- After all these years, no one is getting rid of shell scripting (a quick sketch of why follows this list).

- Old school (i.e. 2nd generation) config management still has its place in many companies. Ansible is great for provisioning an AMI, if you need one, but if you need static infrastructure, Puppet and Chef are actually better because they track state, which lets you better manage config drift.

- k8s may be hot and all, but a lot of the underlying "ops" stuff still translates. You average resource usage over pods instead of hosts, for example (see the sketch after this list).

- Put together, there is an "instinct" for ops that is not unlike the "instinct" people learn for math, algorithms, and code. They are completely separate and an engineering org needs both. I think that universities don't "get" ops because computer science is more like math, whereas ops is more like history.

- On one hand, being stuck in an older ops paradigm is pretty awful – if you missed the transition to infrastructure-as-code, then it may be really, really hard to get out of that rut. But the field itself can be pretty bad with being stuck – it took us forever to give up our own datacenter racks.

- But otherwise, the old knowledge about old tools didn't necessarily just go away; in fact, it's oftentimes still quite relevant. Linux internals (e.g. iptables) are still useful.

- When I was at CMU, a lot of folks learned some of that ops instinct in the dorm room, and in the computer clusters. But the universities pretty much made it optional. Looking back, I think this was a mistake. Ops is pretty much entirely transmitted through osmosis, whereas we at least try to teach people to code in official uni classes.
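On the shell scripting point above, a sketch of the kind of ten-line glue that never goes away. Everything named here (paths, the service name) is hypothetical; the category of script is the point:

    #!/bin/sh
    # Housekeeping glue: prune month-old release directories,
    # then restart the service.
    set -eu
    ARTIFACT_DIR=/srv/myapp/releases    # hypothetical path
    find "$ARTIFACT_DIR" -mindepth 1 -maxdepth 1 -type d -mtime +30 \
        -exec rm -rf {} +
    systemctl restart myapp             # hypothetical unit name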
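And on the pods-versus-hosts point, the same instinct in concrete terms. "myapp" is a hypothetical namespace, and kubectl top needs metrics-server running in the cluster:

    # Old per-host habit:
    ssh web01 'uptime && free -m'
    # Same habit translated to k8s: usage is reported per pod
    # and per node rather than per physical host.
    kubectl top pods -n myapp
    kubectl top nodes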



