
I've seen this happen so much with IT people that it became a bit of a cliché: around 35, pure IT is not enough anymore.

IT people just drop out. The simple cases become managers or architects. The more advanced cases start a bakery or go work in a call center. One of the most extreme cases was a very intelligent, very cynical, very anti-religion guy who just quit without warning and joined the Hare Krishnas. We got photos of him in red clothes doing some kind of ritual. Huh?

A big part for me is that IT just doesn't learn. Every 5 years, a new generation pops up, invents completely new tooling, and makes all the mistakes from the previous generation. Now your knowledge is obsolete, and after you relearn everything, your tooling is worse than where you started. Enter a few years of slow tooling maturation with a very predictable outcome, after which a new generation pops up, declares the existing stuff too complicated, and reinvents everything again. 35 is 4 or 5 of these cycles, bringing to the front the huge, wasteful uselessness of it all. Learning your whole life is a nice slogan, but it becomes very pointless.

The survivors that continue in IT deal with it somehow. You enter a new cycle knowing it will bring change but not much advancement, and you don't learn the stack as deeply as you used to. You get a life outside IT: kids, hobbies, social events. You let the youngsters run ahead of you, smile when they do better than your old tech would, and compare with older tools when they get stuck. And you keep some of your tech enthusiasm just for the hell of it.




> The simple cases become managers

I wonder about this one. I became a manager because the organizational problems can be so frustrating and I wanted to fix them. However, what you get is pretty much only the aspects of tech work that the author calls out as burning people out. A few days ago I wrote a simple Rails app for a young relative's hobby. Most satisfaction I've had with anything related to tech in years.


> I became a manager because the organizational problems can be so frustrating and I wanted to fix them.

The biggest "carrot" held out to idealistic people wanting to become a manager is exactly this: the (fake) opportunity to "fix things". Only after you've become a manager does it become apparent that long-lasting problems are long-lasting for a reason and that none of the choices low-level managers are allowed to make can meaningfully move the needle. Sometimes they fall for it a second time, but being a manager of managers is often even worse since now you don't even have direct reports to effect change.


I've long wondered why we insist on modelling company structure as strict trees / hierarchies. Perhaps there are other classes of graphs which would be better suited in order to avoid such situations?


In any large company there is the official HR hierarchy and then the informal collaboration graph of how work actually gets done.

The former is necessary to coordinate coarse-grained decision making, policy and vision that needs to be unified across thousands of people, most of whom will never meet each other, but who should ideally be rowing in the same direction.

The latter is necessary for the armies of individual contributors to get their respective jobs done. Trying to document and formalize this ad-hoc network holistically is impossible because it's too complex to be understood by any one person. Attempting to do so would require a non-trivial time commitment from all the workers, which would actually take away from them, you know, getting work done.

It's tempting to look at an org chart and assume this represents how things work on the ground, but Conway's Law is more ironclad than it may appear at first glance. Don't confuse legibility for operational capability. If lower level folks did not understand their goals and improvise, then large corporations would be even more rigid and brittle than they already are. They would be utterly incapable of responding to changes in the marketplace and smaller firms would dominate.


The official hierarchy still represents decision power. I'm not suggesting trying to document all work relationships but wondering whether a non-hierarchical model of decision power would be better than a classic hierarchy.


The official hierarchy does represent decision power, but it is not the only source of decisions in a company. The vast majority of decisions in a company are taken on lower levels and either not passed through any hierarchy at all or only presented through the classical technique of presenting a list of options, all but one of which are unpalatable. In practice specialists prepare most decisions in advance (cloaked in boss-pleasing terms like "advice" or "RFC"), so the non-hierarchical model you speculate about is already there.


Is your take then that the classic official hierarchy is purposeless? That it is simply an obstacle? Or that it cannot be improved upon because it is already optimal for its purpose (whatever that may be)?


I don't think I said any of those. The official hierarchy is a neat compact description of formal lines in an organization, not less but also not more. At most I think any formal org chart represents a vast oversimplification of the intricate graph of relationships that exist between people in any company. There are things about a company it can describe well (like who is in charge of performance reviews for who), and things it cannot describe well (like the nebulous role of individual popularity on company decision-making, or in distinguishing between productivity differences between individuals on the same "level"). It is also almost always limited to the company itself and excludes any factors outside it like competitors or suppliers, despite those factors sometimes being more important to company decisions than its internal organization.

In short, my take is that the "classic" hierarchy is a useful but limited tool. It is not sufficient in the slightest to describe how a company makes decisions, yet too many people treat it as if it is all you need to know.


> I don't think I said any of those.

You didn't, and you didn't leave the impression of having done so; it's just me poking to understand.

> like who is in charge of performance reviews for who

These are the kinds of things I'm trying to challenge: Are performance reviews useful? Would performance reviews actually be more useful in some other structure than a classic hierarchy?

In other words, I'm not convinced beyond all reasonable doubt that the classic hierarchy is the optimal structure for the limited, but useful purpose you describe it having. Obviously the standard thinking is that it is.

> It is not sufficient in the slightest to describe how a company makes decisions, yet too many people treat it as if it is all you need to know.

Completely agreed.


I've led an engineering department where we had something more akin to a matrix org. If a team wasn't doing well, "debugging" it was a disaster. I had to talk to several managers, PMs, and ICs and disentangle a mess of he-said/she-said feedback.


When tasked with data viz on these lines, one of the early things I ask is what the edge relationship is supposed to represent. Specifically!

Organization / chain of command? Parts information? Charge Codes? Messages/Sentiment? Business Information Systems capture way more data than mahogany row might realize, and you can get some "split the atom" visualizations when you combine the right parameters, like RnD funding + messaging. "Huh. Looks like new tech needs a LOT of communications with the field technicians. Like, a LOT a lot - totally wiping out bandwidth in remote locations"

The data is maybe there but if you model everything at once it's going to be a minimally-significant graphviz blob.

Stupid punchline? Execs hold up the blob as scientific proof that their job is hard. "Don't change it! It's so complex and pretty! I can show this to the VP-Manager of Goofball Systems Inc to validate my existence!"

No, it's not hard or pretty, you just can't tell the difference, conceptually, between a hex driver and a lathe.


"mahogany row"?


“Execs”, a reference to the old fashioned setup of a corridor filled with executive offices that all have impressive mahogany desks. Feels like a British phrase to me but I’m not sure if it actually is.


My bad, that's what we always called the guys upstairs, and then I heard it used across my industry and assumed it was general.


Organizations do implement matrix structures to greater or lesser degrees which have their own advantages and disadvantages. Typically someone formally reports to one manager but will be "dotted line" into one or more additional people.


Maybe because it's a pretty good fit for the only things that ultimately matter: allocating budget and attributing revenue and PnL


It's because there's ultimately one person responsible for the company's performance (the CEO), so everybody has to report to him/her through a tree-like structure. A company is a very centralized structure, no different from the army.


"The CEO is ultimately responsible" is just the highest level of the carrot from my post a few levels up. A CEO may seem quite powerful from the inside of a company, but from their perspective they have to deal with competitors, shareholders, suppliers, regulators, and a host of other actors, all of whom have different objectives than the CEO and all of whom can constrain the possible actions of the CEO to varying degrees. Not to mention that many of the managers at or just below CXO-level are highly ambitious people who more likely than not have aspirations to become CEO themselves, so it may be in their interest to do some tactical backstabbing to make the current CEO look bad to the shareholders. All of this adds up to conditions where a CEO definitely cannot do whatever they want, because resources are limited even at this level. Just look at all the failed projects various CEOs at (say) Apple and Google have tried that didn't work even with all the money in the world.

(As a former military officer, this is the same for generals btw. They may have a lot of "power" in the organization itself, but they're heavily constrained by outside factors. They have to make do with the budget they're given, and have very little control over hiring targets etc. Not to mention that during wartime the enemy will not be under control either)


Yeah, of course. The CEO is judged based on how the company under his management is doing in its overall environment (vs what would be the baseline expectations), not just on the absolute numbers.


Perhaps there shouldn't ultimately be one single person responsible for the company's performance. I know the standard way things work, but that's not necessarily the best way.


It's even worse. Even if you made a difference you won't know for certain, and you'll mostly hear from people who didn't like what you did. One of the clearest successes you can get is avoiding worse shit that would have come to your org. That alone can have a massive impact, many times that of an average IC, but "avoiding worse shit" isn't entirely satisfying. It's just frustrating that it had to be avoided in the first place.


> and I wanted to fix them

In the more optimistic case, you would need an independent and irreverent labor union with a lot of buy-in and pretense to try to call the shots, in order to do that. In the less optimistic case you would need a social revolution. It's generally more the structure of companies that creates these situations than the individual composition of management.


You can try to start a solo services business where you create MVPs just like that for startups. I'm considering it myself.


Assuming one got a Master's, 35 translates to 10+ years of work experience.

I kinda understand why. Programming as a craft, the excitement of mastering it, plateaued around 5 years into this profession. Not saying there aren't specific domains that require many more years, even lifelong devotion, but for the generic bunch, that would be it. The challenge goes away, the mundaneness of labor kicks in, and you start questioning yourself: what is the point of all this?

For me, I pivoted to some of the author's recipes, like reading technology history to reignite that romantic aspect, and it does work. Still, I think courage is needed to switch tracks and make yourself uncomfortable every now and then in new fields of tech. I was working in the AdTech space; now I am in a more pure ML application space, and I am happy I made the jump. It is pretty rewarding.

Regardless, for fellow engineers I would say: embedding curiosity for ever-new technology into your belief system is critical, and it takes time to realize curiosity is indeed a blessing rather than a given.


Plateauing after 5 years? That seems quite early. Maybe something >15, depending on one's capacity to go further and learn more. There is more stuff out there than one could learn in a lifetime. Many decades of learning. And if one does not know what to learn next, just learn a new language and see how concepts one already knows apply there. Or pick up a book by the masters of our craft and work through it. Like, for example, one could ask oneself: have I really worked through all of SICP (or insert other great book here)? If not, maybe there is lots of stuff in there to learn.

Maybe one plateaus after 5y of mainstream every-noun-a-class and endless-design-patterns-forced-in kind of stuff. I guess I would, if I did not look for more elsewhere. I think that kind of job is also why there is disillusionment. One suddenly realizes that at the job one might never apply all the cool things one knows. Then it is up to oneself to either find interesting side projects, or deal with it in some other way, or quit.


Yes, and then write a blog post like OP, with 3 paragraphs about how terrible and boring and evil the industry is, and the next 20 paragraphs an autobiographical history of every computer they ever owned since they were a child.

You know how to recognize a burned out case? The one who's making kombucha, not the one posting on HN claiming they're so burned out.


I think that's a really good observation and it aligns with my own experiences to some extent. I started working as a dev/designer at the age of 15 and have spent almost 20 years in the trade. My career progressed fairly quickly, with my first CTO gig at the age of 24, then moving to founding dev/tech-lead roles, which makes me feel super privileged, but it also worked too well as a distraction from issues in other parts of my life (PTSD, depression).

The last 5 years have been a struggle; there are days, _weeks_, when I just can't code the simplest thing. The irony is that I have the tools to build most of the things I want, both technical and product-related (design, UX, marketing), but now my brain takes 10x the time to apply them and it just feels almost physically painful. At first, it was weird and scary.

Therapy and moving from a big tech hub to a small town in a different country helped a lot. But, I feel like I have to re-learn so much because most of my life revolved around IT.

I grew up poor and I'm aware of how privileged this sounds, but that's also one of the reasons for my problems: it's hard to make decisions that are good for you and your loved ones if you keep constantly second guessing/judging yourself/overthinking. There are still days I'm terrified of ending up alone and homeless, although I know, rationally, how much I have and how happy I am in my relationships with people.

You see, for me "pure IT" was never enough. But it was a good way of creating a constant stream of problems I could solve, then get rewarded for solving them, rinse and repeat... This includes dopamine ("I solved the <small design or CS> problem!") and a sense of progression ("I can solve more difficult problems now!").

Another issue is that in our trade it's often very hard to see the actual results of our work. I mean actually, physically interacting with people and seeing their happy faces when we do something for them. At a very basic level, this is something that humans need to keep going. I sometimes mentor or just chat with random people in our trade via "office hours", and this is brought up very often, regardless of age and expertise.

On a positive note, I'm aware that with enough work and patience this will get easier, and it will make me a better person. Most days I feel happy, more than before. At the age of 25 it was easier to work 80-100 h per week to brute force your problems, instead of slowing down and learning how to live with the ape you are.

I just think that the nature of our work enables an unhealthy pattern of avoiding problems. It's easy to get lost in it.


Also started (paid office work) at 15. In 95. And I'm still obsessed with never forgetting anything. Long after the dopamine of solving a problem and moving on to the next, you go back and look at code and think: This was genius. How does this even work? How did I come up with it and how was my brain working? Much of my old code looks like brainfuck to me. Whenever I don't have anything better to do I pry it open and learn how it worked again. Then suddenly I grasp it and know I'm still good.

I never did it for anything other than solving puzzles for fun, and because being a musician doesn't pay the bills. But the joy of code and music are both more about keeping your brain in shape than anything else.


Yup I definitely fall into that bucket of not really feeling the impact of what I build. It seems like most companies are creating "lesser of two evils" products. Users hate most software. It causes people so much headache because UIs are confusing, bugs occur, simple tasks are painstaking and repetitive. People only use software tools because there's often nothing better or simply because a company forces them to do it. Or maybe it's like, yeah service A is janky and horrible but service B is even worse.


you've never coded anything you were passionate about?


I have but never received money for it. I suppose that's pretty normal, though!


Hah. Yeah, I think that describes most people in the industry to a T. It's rare to get paid to do the fun stuff, where the hours just disappear. A guy can still dream, though.


> At the age of 25 it was easier to work 80-100 h per week to brute force your problems, instead of slowing down and learning how to live with the ape you are

My position is slightly different - I can juggle much more stuff now and I have far more threads running in the brain at the same time. I can see much further ahead and I can do much bigger things right now. But one person cannot code such large stuff no matter how productive he or she is. The only way seems to be working as a team. Or with larger teams.


Could you give me some examples of those cycles? Genuinely curious what you mean.

I started my career just a couple years ago. In my first company, they used 10+ year old tooling and imo it was terrible. A very old legacy mess monolith that made adding features pure torture. Trunk-based development with a "who needs tests" mindset, resulting in horrendously buggy code, 50:50 chance pulling newest version would break something. Several instances of files with over 10k lines of code and deep nesting, an absolute nightmare. Absolutely no mindset for performance. They wrote quadratic-complexity code in the backend to fetch data because hash-based data structures already seemed too advanced to many of my coworkers, and then they wondered why their frontend was so terribly slow. Not that they had a lot of pressure to deliver a great product, because they are / were the market leader in their B2B niche. I left within a year.
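To make the pattern concrete, here is a minimal sketch (invented record names, not that company's actual code) of the quadratic fetch described above versus a hash-based lookup:

    // Hypothetical illustration of the quadratic fetch vs. a hash-based lookup.
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class FetchDemo {
        record Customer(long id, String name) {}
        record Order(long customerId, int cents) {}

        // O(orders * customers): scans the whole customer list for every order.
        static List<String> slowJoin(List<Order> orders, List<Customer> customers) {
            List<String> rows = new ArrayList<>();
            for (Order o : orders)
                for (Customer c : customers)
                    if (c.id() == o.customerId())
                        rows.add(c.name() + ": " + o.cents());
            return rows;
        }

        // O(orders + customers): index customers by id once, then probe the map.
        static List<String> fastJoin(List<Order> orders, List<Customer> customers) {
            Map<Long, Customer> byId = new HashMap<>();
            for (Customer c : customers)
                byId.put(c.id(), c);
            List<String> rows = new ArrayList<>();
            for (Order o : orders) {
                Customer c = byId.get(o.customerId());
                if (c != null)
                    rows.add(c.name() + ": " + o.cents());
            }
            return rows;
        }
    }

Same result either way; the second version just stops hurting once both lists grow past a few thousand rows.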

Now I'm working in a company that uses all the latest GitLab CI/CD shenanigans, code reviews, and heavy use of unit, integration, and end-to-end tests. Everything is hosted in the cloud with a microservice architecture. We actually need it, as we scale to millions of customers and have performance and reliability requirements.

The difference is not just the tech stack, which was horribly outdated and imo extremely tedious to work with; the mentality is also completely different.

In the first company, you couldn't change anything; there was a strict hierarchy and everything stayed as is because "it works". You totally got the feeling that there were some older people up in the hierarchy who were way too lazy to learn new things and didn't want to endanger their relevance in the company. When I left, I spoke with the Head of HR and he told me that basically everyone who leaves does so for the reasons mentioned. So that company drives away motivated talent with their crap mentality. Pay wasn't very good either, but a first job is a first job after all.

Now mind you, both companies are a couple decades old, but imo one always kept up and the other didn't. Both companies have 10+ year seniors. Personally, I find the people in the current company way more competent and excited about their work. They're much more fun to work with, I learn more, and I absolutely don't get the feeling the tooling is reinvented in any way. It's improved in all the aspects I could think of and makes the development experience much better.


I think the newbie coming to the fancy-pancy Kubernetes-cluster company in 5 years, when you and the other engineers have moved on, will have a completely new level of headache-inducing mess to deal with compared to what you had at the boring company as a newbie.

I started out my career thinking best practices with agile, code review, CI and "shared ownership" and stuff were the way to go.

But in the end I like the old siloed do-your-stuff way more. It works and gives you actual ownership and freedom. It turns out that it is easier to cooperate when you can say no.


I was thinking the same. I feel like I've seen the cycle with my own eyes at this point. Projects almost always seem fresh and good at the beginning, and then they become monsters after a while, seemingly no matter what you do.


It is different this time!

I mean GP could have been at a genuinely bad place with bad practices.

It could also be that he was just an idealistic newbie trying to give advice to hardened experts who rightfully ignored it. The HR boss agreeing does not tell us anything; they are buzzword-driven.


> Could you give me some examples of those cycles? Genuinely curious what you mean.

A large part of the work surrounding the Docker ecosystem has simply been re-creating features that were already around in the JVM ecosystem 10 years ago. In the same decade, we also had the move from server-rendered webpages, to browser-rendering, back to server-rendering.


OK, just to be clear: even if the constant stack switching can be very tiring, you have to do it. The alternative is deep stagnation, which is much worse. I am also very much pro everything that raises quality like CI, but remember, some groups were doing it in the 1970s.

I started programming with BASIC and then DOS and assembly. I have very fond feelings and deep knowledge of both of them, but the UNIX generation rightly looked down on them as two Turing tarpit hellholes.

Onward to C, with Mix, Watcom and DJGPP. Better programs, but you live with some stupid inefficiencies that wouldn't fly in x86 asm.

Onward to Win3.1 and Win32. The end-user experience is much better, but as a programmer you now have to accept control by the OS over your work. You can't just e.g. write to VRAM anymore. The first serious dark clouds appeared for me when I realized Microsoft had cynically used us all to extinguish all competition. Politics had entered my IT life.

Then came the web. In one way, it was glorious, but programming it in JavaScript was a serious hellhole. jQuery brought some sanity, but the user-friendliness of Windows was almost impossible to reach.

Server-side was Java, which was dog slow until HotSpot appeared. It eats memory like there is no tomorrow. Sun dictated the very shape of your program. There was a war going on between Java EE, which was horribly verbose, and Spring, which was grassroots and looked down upon by the architects, as if it smoked weed or something. Whatever camp you chose, pain would follow. Or you could go to the PHP camp and spend more time debugging than programming.

There was some Python here in my life, good, but dog slow, even slower than Java.

Then Node.js. If you thought Java gobbled up memory, you'd just die working with that abomination. End-user usability had still not recovered from the Win95 days (it never did). You had no type safety with JavaScript. In fact, every decent tool and technique was sacrificed on the altar of using the same language on the back end and the front end. Then came frameworks like Angular, where v2 managed to commit ecosystem suicide, and React. Meanwhile transpilers, packers, etc. managed to undo much of the no-need-to-compile appeal of JavaScript.

In the mobile world, 2 massive companies appeared, and their app stores killed any liberty of publication.

There is more, but I ran out of time ;-)

All of this is quite ranty, partially deserved, but there is also quite a lot of good in here. Even so, programming for me was most fun on DOS, and the user experience was best from Win95 to XP.


> I am also very much pro everything that raises quality like CI, but remember, some groups were doing it in the 1970s.

CI is risky, because it is a great micromanagement tool, just like ticket systems for non-bug tickets. I don't think it is strategic to lay traps for ourselves.

I believe one should have a setup such that "good" management won't mess our stuff up, rather than being dependent on having "great" management.

It is like agile, which only works with great programmers and managers but messes things up for most of us. But CI is not nearly as bad or dangerous, and it has benefits if kept simple.


Let me describe to you a system I've seen myself. I think it was created around 1985, in Cobol, by one company, for only that company. Afaik, it successfully runs today.

At the start, there are screens for what we today call issues: 80x25 terminals that input, edit, prioritize and assign changes. Nightly batches provide management views of what is being done where.

Other screens let you check code files in and out, tracking history, linking to issues and people, and managing which versions are on local dev, preprod, and prod. Nightly batches run the compiler for changes.

Promotion to preprod requires quality checks, where e.g. no new deprecated calls are allowed. Promotion to prod requires a manager sign off, where the input from the test team is validated.

I had not seen this level of integration until GitHub matured. In some ways GitHub is superior; in other ways the deep integration with their procedures and tech choices is still superior.

That's more than three decades, maybe even four, that this system has paid off. It survived the departure and replacement of a whole generation. It survived all attempts at managerial reorgs, and thank god for that. It came from a time when computers were so expensive that having the manpower and long-term vision for this was a good choice, even for only one company. Unfortunately, it also makes new people relearn everything they know about version management.


Yeah. CI systems can be beautiful. And in some companies you want some sort of formal sign-off process. I am not dogmatically against CI.

> It survived all attempts at managerial reorgs, and thank god for that

The problem comes when it is cargo-culted and forced, I guess.

The temptation for some manager to rewrite the system you describe in Groovy and use Jenkins or integrate it into Jira! Imagine the possibilities of unnecessary work and complexity. A big opportunity cost.


There is zero risk to CI as long as you have a proper branching process.


> I started my career just a couple years ago. In my first company, they used 10+ year old tooling and imo it was terrible. A very old legacy mess monolith that made adding features pure torture. Trunk-based development with a "who needs tests" mindset, resulting in horrendously buggy code, 50:50 chance pulling newest version would break something.

The monkey's paw curls.

My story is the complete opposite. Three years ago I joined a startup. We use relatively new Java, branches everywhere, a microservice architecture, 85%+ branch coverage, integration tests, end-to-end, performance, you name it. CI/CD integrated and self-hosted. Heaven, right?

It was an absolute shit show. Because of the microservice architecture you had no way of running the 50+ necessary microservices on your machine.

Tests are mandated but brittle. Mocking libraries break whenever you refactor a method. Integration tests are flaky and inconsistent (they behave differently locally vs. remotely). The end-to-end tests take hours to complete. There are 20 different environments, each with a different configuration, each divided into dev/qa/prod.

In all the time I was there, we didn't have two successful deploys of the main branch. But you have to keep adding features because one customer said they might need them. Oh, security found that a library is 20ms too old. Have to replace it asap, despite the convoluted nest of dependencies between microservices.

It had good pay though. Taught me to really hate mocks, and that tests need to be applied at the right level.
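To make "the right level" concrete, here is a hedged sketch (invented classes, plain JUnit 5, not that project's code): test the observable behavior with a real in-memory collaborator instead of stubbing every internal call, so refactoring the internals doesn't break the test.

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import java.util.List;
    import java.util.Map;
    import org.junit.jupiter.api.Test;

    // Hypothetical production code.
    class PriceCatalog {
        private final Map<String, Integer> centsBySku;
        PriceCatalog(Map<String, Integer> centsBySku) { this.centsBySku = centsBySku; }
        int priceInCents(String sku) { return centsBySku.getOrDefault(sku, 0); }
    }

    class CartTotaler {
        private final PriceCatalog catalog;
        CartTotaler(PriceCatalog catalog) { this.catalog = catalog; }
        // If this is later refactored to batch-fetch prices, a test that stubbed
        // priceInCents() call-by-call would break, even though the total is unchanged.
        int totalInCents(List<String> skus) {
            int total = 0;
            for (String sku : skus) total += catalog.priceInCents(sku);
            return total;
        }
    }

    class CartTotalerTest {
        @Test
        void totalsAreComputedFromCatalogPrices() {
            // Real collaborator with in-memory data: no mocks to re-wire after a refactor.
            PriceCatalog catalog = new PriceCatalog(Map.of("apple", 100, "pear", 250));
            assertEquals(350, new CartTotaler(catalog).totalInCents(List.of("apple", "pear")));
        }
    }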


Microservices can't have dependencies between each other otherwise they aren't microservices.

I think the main issue is just the other engineers you are working with. If they are bad, they will screw anything up.


A microservice is a module. A module that got separated by a network layer, most often due to somebody's momentary lapse of judgement.

It's encouraging that you forbid the next person from falling into the identical trap (you effectively say: this kind of remote module must not use further remote modules). Alas... they can, and they will.


> Microservices can't have dependencies between each other otherwise they aren't microservices.

See Hyrum's law:

    Put succinctly, the observation is this:

    With a sufficient number of users of an API,
    it does not matter what you promise in the contract:
    all observable behaviors of your system
    will be depended on by somebody.

One example: we bumped Spring from 2.1 -> 2.4 (not the actual version numbers). Harmless, no? What's the worst that can happen?

Failure when doing some but not all operations.

Why? Because some Python/Java microservices down the operation chain expected null fields to be omitted, and the default behavior had changed (between Spring versions) to write null fields instead. This only broke the services that relied on null fields being omitted. The fix was easy, but finding the bug was difficult.
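For illustration, a hedged sketch (assuming Jackson, which Spring uses for JSON by default; the payload class is invented) of pinning the null-field behavior explicitly instead of relying on a framework default that can change between versions:

    import com.fasterxml.jackson.annotation.JsonInclude;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class NullFieldDemo {
        // NON_NULL: null fields are omitted from the JSON, regardless of what
        // the framework default happens to be in this particular version.
        @JsonInclude(JsonInclude.Include.NON_NULL)
        static class Payload {
            public String id = "abc";
            public String optionalNote = null;
        }

        public static void main(String[] args) throws Exception {
            // Prints {"id":"abc"}: a consumer that treats "key absent" differently
            // from "key null" keeps seeing the shape it expects.
            System.out.println(new ObjectMapper().writeValueAsString(new Payload()));
        }
    }

Declared per DTO (or once on the shared ObjectMapper), the contract no longer shifts under a framework upgrade.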


How would you design a microservice that does not depend on another?


Microservice 1 <-> on-call engineer copy-pasting <-> Microservice 2


Send a JSON package with some HTML and dimensions, get back a JSON package with links to that HTML rendered as JPEGs at the requested dimensions.
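Roughly this contract, as a hedged sketch (names invented): one JSON document in, one JSON document out, and the service calls nothing else.

    import java.util.List;

    public class RenderContract {
        // Request: the HTML to render plus the target dimensions.
        record RenderRequest(String html, int widthPx, int heightPx) {}
        // Response: links to the rendered JPEGs, nothing more.
        record RenderResponse(List<String> jpegUrls) {}
    }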


At some point you are going to have another service that uses this HTML->JPEG service though. That would be a dependency, at least in my view (ie, if the HTML->JPEG service goes down, something else will break).

Or are all microservices user facing?


What you are describing at the old company is not a failure of old tools, but rather a failure of management/employee self-management at that company.

Any tool can be used to do good or evil. They were using old tools to do evil things-- namely, writing bad code.

The only caveat here is that if I had to maintain bad bash scripts or bad koobieboobie cicd automated shlalala, I'd always choose bad bash scripts, as the blast radius is smaller and easier to reason about.


> In my first company, they used 10+ year old tooling and imo it was terrible. A very old legacy mess monolith that made adding features pure torture.

Everything becomes like that, legacy, torture, mess. New things come along: clean, new, solving some problem. The mess disappears from one place but starts popping up somewhere else, though still better than before, you think. Wait 10 years and you've got a completely different mess: lots of people who built it have now left, few know it all but have stopped caring. A new you joins the group. Sees a crazy, unwieldy legacy system. Sees new technology that solves these problems. Starts over.


>Everything becomes like that, legacy, torture, mess.

>few know it all but have stopped caring

If they had kept caring (and had been allowed to, by being listened to; maybe that's why they stopped), it might not have turned into a mess (I know (of) 15-year-old systems that only got better with time, thanks to lead devs playing both conductor and musician).


I went treeplanting one time when I lost it. It was awesome. A lot like this story: https://bivouac.com/ExpPg.asp?ExpId=84


I think it has to do with how boring the thing you're programming is.

If you're working on some humdrum mobile or web app, it's hard to stay excited. Making apps that clearly do not need to exist takes its toll. I saw the writing on the wall and got out early. For me the solution was to go back to graduate school. I figured that I would be able to find an application area to work in with some real depth and let that drive the programming, instead of working on something dull where I would need to find my excitement in the programming itself. Now that I've done this (I have my PhD by now), the programming I do is quite dull, but the applications are fascinating. What's more, what I work on now has so much more depth to it compared to programming (I do computational math). To attain technical mastery simply requires dramatically more time and effort than programming. I think in this way the proper balance is restored between the tool and its use---a hammer is simple and understandable, building a house is deep and complicated. My hope is that one day people will have more opportunities to use programming for the powerful computational tool that it is, rather than as a means to make a quick buck.


> The simple cases become managers or architects

Not at all. As a senior, you note that even if you can do a lot of things - almost everything - right now, you are still just one person. Even if you are a mythical creature that can put out dozens of times more code than others, you note that what you can physically produce is always far behind the possibilities you can see on the horizon. Then you realize: you need to cooperate and work in teams. For only by collaborating can you make happen all those things that you can envision. Which obviously takes you towards leading people one way or the other.


>Every 5 years, a new generation pops up, invents completely new tooling,

This is in general true for life, not just IT. The timescales may be longer, more like one generation, say between 30 and 50 years.


> A big part for me is that IT just doesn't learn. Every 5 years, a new generation pops up, invents completely new tooling, and makes all the mistakes from the previous generation.

It doesn't learn, because it's chasing a fad or just considers everything before it crap.

As I said, those that don't understand old systems are doomed to reimplement them. Badly.

As someone approaching this 35-year limit, I never once had this problem. Jobs are kinda meant to suck. If you want enjoyment, start a hobby.


An older colleague once told me you can only expect two things from your job: money and experience, everything else is extra.


This sounds pessimistic at first but I 100% agree.


Haha, are you me?


  History repeats
  itself. You see, it has to.
  Nobody listens.



