
Not so much. I'm 35, started programming at about 10 on my TI-99/4A, typing games from magazines, etc., the usual stuff.

No way a 40 year old guy has more experience than me in general programming, unless he was working at Microsoft or IBM at that date, and surely there are fewer than 10,000 of those guys in the world. Of course there are 40 year olds who may have way more experience than me in a specific language or technology.

My point is that if computer tech is only 30 years old, then you won't find programmers with more than 30 years of experience, no matter their age.



Computer technology is about 60 years old, though. There were people doing some pioneering work in the 50s and 60s, and the modern hacker culture really has its roots in the 1970s, just prior to the rise of personal computing.

There was lots of stuff going on with the DEC minicomputers, LISP machines, early UNIX systems, and big-iron mainframes long before most of us were even born, let alone getting our starts on TI-99s, Apple IIs, and Commodore 64s. And let's not forget that the people who were creating those early home-computer platforms were themselves all from the previous generation.

Would you turn people like Steve Wozniak or Al Alcorn or Ken Williams away from your new startup? You'd be a fool if you did. The basic architecture of computers hasn't changed at all since they did their most important work, and the cognitive skillset that makes a great programmer is the same as it's always been.

In fact, I'd expect that people who cut their teeth on early systems, and had to work within narrow constraints, without a big stack of abstraction layers and frameworks, would be all the more adept at writing robust and efficient code than those who learned on modern tools. And I'd bet, with Moore's law slowing down, and with better-optimized code becoming more advantageous in building scalable systems, older programmers will soon be in higher demand.


The work in the late 40s was interesting, too. First, you had specialized systems, then Eckert, Mauchly, and von Neumann started working on stored program computers. They had to figure out what opcodes were needed and how they were going to be implemented. Jean Jennings Bartik's book (http://www.amazon.com/Pioneer-Programmer-Jennings-Computer-C...) is a great first-hand account of all the work (and drama) from back then.

(Not trying to short-change Bartik here either -- she did a lot of the work on the ENIAC instruction set when they converted it to a stored-program computer)


The fact that you can name from memory the people who worked in the '40s means that the size of the computer science community back then was tiny.


He didn't say that those were all of the people working on computers in the 40s.


Exactly. Just like how history doesn't celebrate the names of everybody who worked for Thomas Edison, there were hundreds if not thousands of people toiling away in anonymity underneath these giants.

Edit: And many more "giants", of course…


Assuming Ada Lovelace was the first programmer, programming as a discipline is more like 160 years old.


She was certainly the first Computer Scientist -- well, that's what I was told in uni!


She was definitely the first person to realize the potential that Babbage's Analytical Engine had, particularly beyond just calculating things. Babbage's design, if I remember correctly, was basically an improvement on a design he initially made to calculate trajectories for artillery teams, and most of his thoughts on what to use his computer for were "calculate (thing)". Lovelace wrote about how the computer could be programmed to solve more complex problems.

I think it would be a mistake to say that computer science has truly been around for 160 years, though. A few people (there was also an Italian who was interested in Babbage's work, although I'm not sure what his contributions ended up being, if any) does not a field make. Any real progress was more or less put on hold until the early 20th century, when mathematicians started working on what you could calculate or construct in a finite number of steps, and you didn't get (untyped) lambda calculus and Turing machines until 1936, which is probably the best place to mark the true start of computer science as a field. (And then you got early devices that were sort of primitive mechanical computers in the late '30s and early '40s as part of the Bletchley Park cryptography work by the Brits.)

A very large debate around the turn of the century was whether mathematics that you couldn't specifically construct in a finite (or countable) manner was meaningful at all, which became particularly heated after Cantor's set theory work (showing that the real numbers were uncountable) and then things like Russell's paradox (showing contradictions in Cantor's naive set theory once you allow the set of all sets that don't contain themselves). I'd argue (without firm, researched proof that it was definitely the intent) that early computer science (lambda calculus, Turing machines, etc.), which was concerned with what you could and could not compute with a finite algorithm, came in spirit from those sorts of debates. (See finitism, intuitionism, constructivism, etc. for parts of this debate; traces of it remain in modern-day mathematics with some people's concerns about whether the Axiom of Choice is a valid or reasonable axiom to have.)
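
To be concrete, Russell's contradiction fits in one line (this is just the standard textbook formulation, written in LaTeX, not anything specific to this thread): take R to be the set of all sets that don't contain themselves, and then asking whether R contains itself collapses:

    R = \{\, x \mid x \notin x \,\} \;\Longrightarrow\; \left( R \in R \iff R \notin R \right)

That's exactly the kind of "which constructions are even allowed" question the finitists and constructivists were arguing over.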


I think you're mixing up a couple of stories there, but it doesn't really matter that much. Just to fill in the details (and because the alternative is for me to start responding to work emails, which I just don't feel like doing this morning): Babbage was concerned about the quality of mathematical table books, which were used for calculations back before we had modern calculators and when slide rules weren't mainstream. The figures in them were computed by hand, and some were so bad there was an error on every page, which made them useless for scientific work and dangerous for industrial work. The Difference Engine was designed to print out pages of figures using a little printer attachment.

ENIAC was initially put to work calculating artillery trajectories, once the programmers figured out the bug where the shell kept going after it hit the ground (oops!).


>No way a 40 year old guy has more experience than me in general programming

I started programming toys in 1979. [1] I started programming "modern" computers in 1981. I was selling games when I was 15 in 1983. My "real" professional experience started in 1987 with a contract to write a video game from a major publisher.

So yes, it's not hard for someone to have "more experience" than you in general programming.

[1] I had one of these: https://en.wikipedia.org/wiki/Big_Trak


OMG THANK YOU! I had one of these as a kid, but couldn't remember the name! We used to have competitions to have it go from the living room, out to the kitchen, through the table legs, and back again. I was 5 years old, and wouldn't have another programming opportunity until 3 years later when we got a C64.


Ha ha, I'm 42 and always wanted a BigTrak too. I finally bought one a couple of years ago when they remade them. Sucks, though, that it doesn't have the dump wagon accessory.

http://www.bigtrakxtr.co.uk/home


>My "real" professional experience started in 1987 with a contract to write a video game from a major publisher. >So yes, it's not hard for someone to have "more experience" than you in general programming.

Do you know how many people were doing that? You are way overestimating the number of people in the world with your experience.


> overestimating the number of people

You said "No way a 40 year old guy have more experience than me". To disprove an absolute statement you only need one example.

That said, a significant fraction of my friends from those days were doing similar things. I won't deny that I'm particularly good at programming, but all of those friends could also claim more experience than you.


My grade school had a Big Trak. Loved that thing. I saw there was a new version coming out that works with a phone app and thought about getting it for my great-nephews. But there are so many interactive toys like it out there that I have to wonder if they'd think it was cool.


I'm 35 and I was doing OOP via HyperTalk and Applesoft BASIC with a dose of 65C816 assembler before you picked up your calculator. Also, TIs were toys compared to HPs.


30 years ago, computers were actually pretty common. The school I went to had two rooms full of computers and was teaching programming to every child as part of the maths lessons. Software was being talked about everywhere.

I've heard a number of these comments recently drastically underestimating how long computers have been around.


Yeah, maybe I should have said 50 instead of 30. I forget we are in 2014; my mind is still in the '90s. Miscalculation.


You think you have more experience than 40 year olds because you were typing in games when you were a kid? How about the people working when you were a kid? I don't know, this is sort of an incredible thing to say.


Maybe I'm not that good at English, because I believe I was clear that there obviously are people with more experience than me, but there are few. IT in the '70s and '80s was an extremely small field.


> I'm 35, started programming at about 10 on my TI-99/4A, typing games from magazines, etc., the usual stuff.

> No way a 40 year old guy has more experience than me in general programming, unless he was working at Microsoft or IBM at that date, and surely there are fewer than 10,000 of those guys in the world.

You're talking about 1989, right? In 1983, there were 443,000 computer programmers and 276,000 "Computer systems analysts, scientists" in the United States alone (source: http://www.census.gov/prod/2/gen/96statab/labor.pdf).


Damn, that's a lot. I reckon I had my numbers a little wrong.


"My point is that if computer tech is only 30 year old"

I'd be really interested as to how you get that figure...


From my ass, basically. It was an example. It works for any other kind of tech.


The TI-99/4A was discontinued in 1983, as I well remember, because it happened not long after I spent my savings from my paper route to buy one.

If you were using one at age 10 and are now 35, that would have been 1989/1990 or so... there were no magazines writing about it by that time. You had some old issues of the TI-99er or something?


I did the same – cut my teeth on a 4A well into '94 or so (~9 by then) using old magazines. It doesn't really matter what's "current" when you have no-one but yourself to learn with.


I beg to differ - I distinctly remember getting TI magazines well into the '90s. IIRC, there was one called Micropendium that kept publishing until 1999.


Started programming on an Apple II, which I still own, in 1983, aged 11. I'm 42 now, so that's 31 years. You'd need to have started at 4. Plus, by the mid-eighties it's not like this was that rare. My school had rooms of computers (BBC Micros, later PCs) and the kids all had ZX Spectrums, VIC-20s and Commodore 64s. There are tons of people in their 40s like that. Most CompSci and Physics undergrads who were my contemporaries at college -- so also now in their 40s -- followed similar paths.

So it makes little sense to suggest that it's somehow a crazy idea that someone 5-15 years older than you (in their 40s) might have a longer track record working with computers.


Really, I first coded at 13 in the early '70s at school and started professionally in '79 (FORTRAN), so that's 34 years.


My father was programming before you were born...

Working in a bank, not for a publisher...


Did I say that people with more experience than me do not exist?


"computer tech is only 30 year old"


Well, you are "old". Yup, over thirty, you have missed the boat. Welcome to the new ageism.

FWIW, I'm 47 and was programming assembly and microcode in my youth. So I have 10 years on you. At this scale, I doubt the difference matters much. Because you are old, too.


My point is that there are VERY VERY FEW people like you and they are expensive.

Also, you have about 30% more experience than me; I don't think that's a small amount.


TI99? You pussy. I'm 40, and when I was 10 I used to help my Dad build homebrew Z80 machines. My first programming was writing a small OS in Z80 asm. Helped him build an awesome Z280 (that was a rare chip) transputer.


I'm sorry, what? If a guy has been doing a job for longer than you then he can't possibly have more experience than you? Isn't "experience" "doing"?


Did you read my entire comment before pressing reply or only the first three words?



