I've had the pleasure of working with 60+ year old programmers. It is the opposite of what you'd expect from the stereotypical mainstream opinions. I constantly go back to them, call them for lunch, go for walks, just so I can ask them questions about the old days and how things used to work. There is so much to learn from hindsight, and this knowledge is vanishing. Take advantage of learning from older people even though it doesn't fit the current trends. There is wisdom and experience under those opinions, which can sometimes be a little harsh. Similarly, read old computer books. Byte magazine has a full archive online. Read the user manuals for the IBM System/360. For entrepreneurs, read old corporate press releases. Pull up a copy of Westinghouse's 1978 annual report as you fall asleep. Fantastic stuff; I am enamored of history.
I'm a 62, almost 63, year old programmer. Thanks for being open-minded. It's been a great career for me. My first computer was a PDP-8/E and an ASR-33 teletype. I've had the good fortune to see a lot of computer history over the years. I still do embedded systems programming for work and hope to continue to do so for another five years. I've used UNIX most of my career.
Thinking back, maybe the first useful program I ever wrote was on a printing calculator that could remember a sequence of operations -- maybe up to 64? -- but had no branch instructions. You would push "go" each time it stopped, kachunk kachunk kachunk..., until it finished. The program figured loan interest payments. (Printing calculators used to be loud.)
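I couldn't tell you the exact key sequence anymore, but the arithmetic it marched through is more or less the standard amortization formula. For anyone curious, here's a rough sketch of that calculation in C; the principal, rate, and term below are made-up numbers, not whatever we actually keyed in back then:

    #include <math.h>
    #include <stdio.h>

    /* Straight-line loan payment arithmetic, no branching required,
     * much like the fixed sequence of steps on that calculator.
     * All of these figures are hypothetical. */
    int main(void) {
        double principal   = 5000.0;  /* amount borrowed */
        double annual_rate = 0.08;    /* 8% per year */
        int    months      = 36;      /* loan term */

        double r       = annual_rate / 12.0;  /* monthly interest rate */
        double payment = principal * r / (1.0 - pow(1.0 + r, -months));
        double total_interest = payment * months - principal;

        printf("monthly payment: %.2f\n", payment);
        printf("total interest:  %.2f\n", total_interest);
        return 0;
    }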
I don't know if I have favorite resources to recommend. I know there are a ton of YouTube videos and podcasts that offer a lot of good ideas. I like reading the Linux Kernel Mailing List and tech notes.
Simply buying any one of the cheap boards you can find online (if they have supply) and trying to get something like a JTAG debugger working will be beneficial to you. Just try to get something simple going on bare metal. It's been a while since I worked with PIC microcontrollers, but as I remember there is a lot you can do with some of the cheap PIC devices, a breadboard, and an investment of $100 - $200 in electronic parts. It was important for me to build up a modest lab of electronic test equipment, power supplies, breadboards, etc.
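To give a concrete idea of what "something simple on bare metal" looks like, here is a minimal blinky sketch in C. The register addresses and pin number are hypothetical placeholders; the real ones come from the datasheet of whatever part you buy, so treat this as an illustration rather than something you can flash as-is:

    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO registers. The real addresses and
     * bit layout come from your chip's datasheet, not from here. */
    #define GPIO_DIR (*(volatile uint32_t *)0x40020000u)
    #define GPIO_OUT (*(volatile uint32_t *)0x40020004u)
    #define LED_PIN  (1u << 5)

    /* Burn cycles; a real program would use a hardware timer instead. */
    static void crude_delay(volatile uint32_t n) {
        while (n--) { }
    }

    int main(void) {
        GPIO_DIR |= LED_PIN;      /* make the LED pin an output */
        for (;;) {
            GPIO_OUT ^= LED_PIN;  /* toggle the LED */
            crude_delay(100000u);
        }
    }

If you can single-step through that loop with the JTAG debugger and watch the pin change, you've already exercised most of the bring-up chain: toolchain, flashing, and peripheral registers.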
I find it interesting how stuff that was relevant for PCs in the 80s became relevant for early smartphones. I believe computing history is often cyclic in nature, so there is value in the old ways.
I always thought of any human progress as being like a gradient descent algorithm. Hindsight is clear, and if current tools/services/methods/processes don't work, look at how it used to be done. Was it better, objectively? Maybe we took a step in the wrong direction; go back and learn, then try heading in a different direction, because maybe there is higher ground ahead. It is also important to allow (and be tolerant of) experimentation in different directions, because without it we risk getting stuck in a local optimum forever. The counterweight to that is Chesterton's fence: there is a reason why we are on this hill, and if you go back, it's a steep cliff.
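To make the analogy concrete (a toy illustration of the metaphor, nothing rigorous): plain gradient descent settles into whichever valley it starts in, and only by trying other starting directions do you find out whether a deeper valley exists.

    #include <stdio.h>

    /* A toy 1-D landscape with two valleys: a shallow local minimum
     * near x = 1.35 and a deeper one near x = -1.47. */
    static double f(double x)  { return x * x * x * x - 4.0 * x * x + x; }
    static double df(double x) { return 4.0 * x * x * x - 8.0 * x + 1.0; }

    /* Plain gradient descent: keep stepping downhill from x0. */
    static double descend(double x0) {
        double x = x0;
        for (int i = 0; i < 10000; i++)
            x -= 0.01 * df(x);
        return x;
    }

    int main(void) {
        /* Different starting points = tolerating experiments in other directions. */
        double starts[] = { -2.0, 0.0, 2.0 };
        for (int i = 0; i < 3; i++) {
            double x = descend(starts[i]);
            printf("start %+.1f -> ends at %+.3f, height %.3f\n", starts[i], x, f(x));
        }
        return 0;
    }

The run that starts at x = 2.0 parks in the shallow valley and never learns about the deeper one; that's the "stuck on a local hill" failure mode, and Chesterton's fence is the reminder that some of those hills were climbed for good reasons.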
"The Wheel Of Reincarnation". Seriously, it's called that. Ideas from mainframes show up as "new, innovative ideas" on minicomputers, and then on micros/PS, and now on phones.
I am familiar with that as "The Law of Recapitulation" based on a discarded idea from Biology. I can't find any references now, but the idea was that the evolution of mainframe computers (memory sizes, features such as caches, and so on) was repeated years later by minicomputers and even later by microcomputers. Knowing this I was able to design very advanced micros in the early 1980s since I knew what the future would be like.
"The Wheel Of Reincarnation" described by Ivan Sutherland is closely related, but is based on the idea that computing history is more like a spiral than a straight line, so the future is more similar to the past than to the present.
It's kind of crazy to think that computing is still young enough that we could have worked with the developers of operating systems, the world wide web, foundational protocols, and file formats. I had a CS professor 13 years ago who worked on computer chips back when personal computers were just starting to take off.
In 30-40 years, I wonder how many things from that old guard will still be around.
Haha, this reminds me: I used to work at a social network startup and I kept hearing from various people there that "people over 30 don't understand tech"; buddy, who do you think built the computer you're using and all of your software tools? :p
If anything, it’s people under 20 who don’t understand tech. Smart devices have made interacting with tech so easy that uni curricula have to teach people how to use files because they didn’t learn how on their own.
This is of course a generalisation. There are plenty of people 20 and younger who are curious about tech and do understand the basics. But earlier iterations of tech depended on you knowing these concepts. You had to know what a file was to make use of a computer.
As someone approaching middle age, who has been coding professionally for a long time, I can tell you without a shred of doubt that I am a far stronger software developer than I was five years ago. I was far stronger then than I was ten years ago. The list could go on, but then you’d realize I’m really old :)
>As someone approaching middle age, who has been coding professionally for a long time
"Hi avg_dev, I am impressed by your experience and think you would be a perfect fit for our new opening in our exciting company. Please send me your resume, and here's the link to your unpaid take home assignment that should be easily done in a weekend or so. If your submission is satisfactory, we'll invite you to an on-site interview then discuss your compensation with the management. Looking forward to your submission" /s
"Be wary of old men in professions where men die young"
- old Norse proverb
However, I've seen that this rarely applies in tech. It works in fields where old knowledge and experience are highly valued and don't go obsolete very fast, like medicine, law, accounting, the military, construction, sports, or martial arts, but modern web and app driven tech moves fast and has made a lot of old knowledge and experience obsolete.
If a company is hiring a JS dev, your COBOL and Fortran experience from 30 years ago is pretty useless to them and won't get you more money than a new grad with only JS experience under their belt. But as a dentist, the experience you gained 30 years ago still benefits you now.
I've had to struggle a lot to get hired as a backend dev, despite 10 years of experience coding C for real-time systems, because no employer valued my previous knowledge for their business.
I remember back when I was studying programming at university, the prof was demonstrating something or other on the board. He wrote a program in Java and it took about 15 lines. Someone said they could do it in 5 in Perl or something. The prof replied, “well, I could do it in one line of awk, but that’s not the point…”
I’m watching the video now and Kernighan addresses exactly this point. He talks about using the right tool for the job, about reaching for awk only when you’re matching single patterns and using something else otherwise, the general purpose nature of Python, and more.
I think it is maintained, but perhaps not actively developed. LuaTeX I think is where more development is happening, but XeTeX is still faster in most cases.
I know it's self-evident, but I'm so used to being around younger folks in tech that it's really cool to see gray hairs.