Earlier in my career I felt a lot of disgust at bad code and bad solutions.
Sometimes the badness was really my own laziness: the effort of unfamiliarity, of not instantly understanding what I was looking at.
Sometimes it was because it disagreed with whatever framework or methodology I was using to give me confidence in the face of ignorance: I feel like an imposter, but at least I know design patterns, so this guy who did MVC wrong is worse.
Sometimes it was looking at something genuinely bad.
Now, later on, maybe my emphasis is more on business outcome than perfect implementation or maybe I've been involved in making enough abominations due to time pressures and architectural compromises that I can read those forces in other people's work.
Either way, I don't feel that kind of disgust anymore. It's code. No one is going to read it. It will be replaced next year. It works or it doesn't. Having to rip stuff out when the business changes or someone wants to use a different stack for resume reasons is part of life.
I think when I was younger I generally had stronger opinions on how to do things.
They weren't really based on anything more than sounding like they were true.
I'd hop on every paradigm that sounded correct. Clean Code. Pure functions. Effective Java. Pragmatic programming. Defensive programming. Like it has that righteous vibe to it. I'd totally strap a bucket on my head and go conquer the holy land under any of those banners.
If only we could do it my way, I thought, then we wouldn't have to put up with all these chafing points that annoyed me. Never do this! Always do that! My mind was like thumbnails from fitness youtube.
Along the way I discovered that when I got to do things my way, it turned out that there were actually still a bunch of chafing points. Different, but it sure wasn't great. Maybe my 30-year-old ass didn't know everything.
Eventually, along the way, I sort of came to the insight that I've built what, 15 applications in the course of my private and professional career. I've worked with 3-4 programming languages in enough depth to be competent with them. I've tried a few architectural paradigms. If I work until I'm in my 60s, I'll maybe double that. Life isn't long enough to get much deeper than that into the craft.
Given this pitiful sample size, it's nothing but hubris to think that I or anyone else would have a clear grasp of what is the best way of doing things.
I had a very similar experience as you. But I think you're being too humble.
Whether you wrote 15 applications or just one or two, does that really matter? Designing, exploring, writing, iterating on, and maintaining these applications _for years_ has given you insights, battle scars, and tacit knowledge that can only be gained through experience and continuous learning. Not to mention the different environments, technologies, and foundational knowledge you explored and internalized.
You've accumulated a hard-earned skill set and the ability to make wide-reaching, pragmatic decisions. Do you or someone else _know_ what the _best_ way of doing things is? Probably not. But I bet you have developed opinions, taste, and a toolbox of approaches with different trade-offs.
That's maybe where the OP is coming from as well. The mindset of being opinionated is very valuable if you can back it up.
That doesn't mean you're always right and don't let others speak. That doesn't mean you can't change your mind, or that your approach excludes other people's perspectives and incentives.
It means you can strive for _better_ and that you're crazy enough to make bold decisions when necessary.
Yeah, I don't deny I'm a far better programmer now than 10 years ago, and may even be better than average in some respects, but most of it is, as you say, tacit knowledge.
I don't have any catchy slogans or rules to teach.
I also understand that there is so much I don't know. Even if I hone my skills until the day I die, I'll never be so certain I know the best approach as I was when I was younger.
There is the work you do for money for food and rent, but there is also the work you do for yourself to improve your own craft. Often on the same project.
I think you’re describing the disappointing but grounding perspective gained as one gets older and wiser.
I worked the majority of my career at an "elite" Japanese corporation. It's one that has a brand pretty much synonymous with "Quality." Many of my peers were among the finest engineers and scientists in the world.
I was often the dumbest guy in the room, and I'm smarter than the average bear.
Dealing with these folks could be infuriating. Every time I would suggest orthogonal approaches (because, like, software is different from hardware), I'd be called "lazy," or "sloppy."
It made me write good code, though.
If those folks saw the way I work now, they'd be horrified. They'd call me a "reckless cowboy," or something to that effect.
But most folks in today's software industry think I'm a stuck-up prig.
I work quickly. I leave good, highly-documented code, that lasts a long time (For example, one of my C SDKs was still in use, 25 years later), and I don't want to toss my cookies, whenever I look at my old code (sometimes, though, I shake my head, and wonder what I was thinking).
I'm my own best customer. I'm the one that usually needs to go into my old codebases, and tweak them, so I write code that I want to see, in the future.
I've come to realize that the term "over-engineered" can mean a couple of things:
1) This code is needlessly complex and byzantine, which makes it prone to bugs, inflexible, and difficult to maintain; or
2) I don't understand this code. That makes it bad.
I used to have an employee who was "on the spectrum."
Best damn programmer I've ever known. Crazy awesome. Had a high school diploma, and regularly stunned the Ph.Ds in Japan.
His code was written very quickly, was well-designed, well-structured, well-documented, bug-free, highly optimized, and an absolute bitch to understand.
I think the other thing that "over-engineered" can mean is code that's unnecessarily good for its purpose.
If you're building a quick demo of a product to get user feedback, and you write perfect code that's highly maintainable, you've wasted time - better to throw together something as quick as you can and rebuild it if it's actually going to be used by/sold to customers. That's really overengineering in my mind - doing a poor job with the quality/speed tradeoff given the purpose of the thing you're building.
> If you're building a quick demo of a product to get user feedback, and you write perfect code that's highly maintainable, you've wasted time
In my experience, this is a trap.
The demo almost always becomes the product, because upper management sees the demo, and says "Hey! It's almost done! Let's ship!"
This is how you end up with these gigantic Frankencodebases.
These days, my test harnesses and demos are generally "ship quality." It also means that I can mine them for snippets, without holding my nose.
I have spent a great deal of time, however, practicing, so that I write top-shelf code, by habit. These test harnesses are often churned out very quickly.
> If you're building a quick demo of a product to get user feedback, and you write perfect code that's highly maintainable, you've wasted time
Any time I thought I could shortcut my way to a demo by relaxing on perfect code/maintainable code it has always ended up taking just as much time and often longer.
Almost every time I've thought to myself "I will probably need to refactor this later", I did indeed need to refactor it later. But I think a lot of those cases I still made the right decision.
The initial version took M hours to develop, and the refactor took N hours. If I had done it right the first time, it would have taken L hours, where M < L < N + M. But it's not hard to construct legitimate business scenarios where the "N + M" solution is better than the "L" solution in terms of meeting the business goals.
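To put toy numbers on that inequality (every figure below is invented purely for illustration), consider what happens when the refactor is only sometimes needed:

```python
# Toy model of the quick-then-refactor trade-off (all numbers invented).
M = 40   # hours for the quick initial version
N = 30   # hours for the later refactor
L = 55   # hours to do it "right" the first time; note M < L < M + N

# Suppose shipping the quick version early validates the feature, and
# there's only a 50% chance the refactor is ever needed (the other half
# of the time the feature is cut or the code is replaced anyway).
p_refactor_needed = 0.5
expected_quick_path = M + p_refactor_needed * N   # 40 + 0.5 * 30
expected_careful_path = L

# Here the expected costs come out equal -- and the quick path still
# delivered something sooner.
print(expected_quick_path, expected_careful_path)
```

The point isn't the exact numbers; it's that deferring work whose necessity is uncertain can beat doing it up front, before you even count the value of earlier feedback.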
I expect M ≈ L, though. All you can ever gain by relaxing code quality is some keystrokes, but I'm not sure typing is a bottleneck to begin with. The bulk of the work required is the same no matter what the code looks like. I've often believed that M < L when I've tried to take shortcuts, but in hindsight I'm not sure it has ever ended up being meaningfully true.
It's not about code quality in terms of variable names and things like that, it's more about architectural decisions. What assumptions do you make, what edge cases do you cover, how much flexibility you build into the design, how extensible is it, what kind of tests do you write, etc.?
> What assumptions do you make, what edge cases do you cover, how much flexibility you build into the design, how extensible is it, what kind of tests do you write, etc.?
This is what I was referring to. I'm not sure there are shortcuts here beyond saving some keystrokes.
As you've mentioned tests, this is one place I see a lot of people thinking they can save time by not writing them out, even opting for no tests at all. I expect a good test suite for the average application easily doubles the number of keystrokes required, at the very minimum. The code doubling in size sounds like a lot more work, but is it really, in the grand scheme? You still have to put in all the same thought those tests embody in order to write the code (and again if you choose to test it manually). All you've saved is the effort of typing out the tests, which I don't find takes all that long.
Maybe there are some slow typists among us? If you spend most of your time clacking on the keyboard then, indeed, the number of keypresses required could become significant. But that's not my experience.
> it's about the actual thinking, iterating, prototyping, etc.
Yes, these are essentially constants that need to be done no matter what the code looks like. Which, again, means the only gains you might make are in reducing the number of keystrokes. And I dare say a lot of those keystrokes can happen in parallel: writing tests and thinking about the problem pair quite well together, so I'm not sure you lose even an insignificant amount of time on those added keystrokes in reality.
The latter points are also where you can quickly get into time trouble if you do sacrifice your code, which is how it often takes longer when you try to shortcut it. This would be a worthwhile gamble if poorer, less maintainable code bought you time, but in my experience it doesn't, even right off the hop.
Absolutely, but be careful about the corners you take when making a demo, or as someone else suggested, make it clear in the demo that something isn't done.
My first big assignment on my first job, I presented a demo about one month into a three-month project. It didn't have real persistence or work on more than one host; it saved state to a JSON file on the one server I'd manually copied the executable to. Everyone in the demo doesn't see that; they see it working as expected. The first question I got was: why can't we ship this today? It appears to work as expected. PMs, SDMs, etc. care a lot less about "it takes a while to set up something that scales and stores data in a safe, encrypted database" when they see something that works.
Now whenever I do a demo like that I append "DEMO DATA:" to any visible string, and ensure I demo at least one failed behavior, just to guarantee that no one thinks it can be shipped that day (if I know it can't).
I have come into serious conflict with one of my coworkers over this. He thinks everyone (except him) is either overengineering or underengineering everything all the time, but usually he just doesn't understand the problem context and is making assumptions that turn out to be incomplete or incorrect. It's a great lesson in humility to realize that no matter how smart or capable you are, you don't know what you don't know until you find out that you don't know it. I used to be a lot like him until I got slapped around pretty hard (metaphorically) as a result of my arrogance.
> It's code. No one is going to read it. It will be replaced next year.
Then it truly is awful. Deeply awful. It sounds like you've never progressed past dealing with terrible code, so you have my condolences. Good code is read. Good code is not replaced in a year. Even most bad code is not replaced in a year. Truly you live in a world of absolute shit code.
Business requirements or engineering dependencies can change quickly in some scenarios, meaning code gets replaced regardless of its quality. I've had to delete a lot of code the past few years, much of it 1 year old and very carefully written. Someone wasted his time.
Obviously I don't know the specifics of your situation, but "very carefully written" doesn't necessarily equate to "good". Part of what makes good code good is its flexibility in the face of change. Barring an early-startup-style total pivot to a completely different industry, business requirements shouldn't just completely change like that, unless they were incompletely fleshed out. I could see some kind of API-adapter code having to be completely thrown out if you move away from that API or dependency; indeed you're right that that doesn't need to be great code since there's no expectation of deeper reusability there.
I do think a lot of code is written prematurely, which may be what happened there. If code is premature, it's not worth spending the time to make it good; but most likely, it's not worth writing at all. A large majority of the total time on a task is spent fully understanding the problem being solved, with much of the remainder spent coming up with a high-level approach to the solution. Actual meaty code-writing is a pretty low percentage, so even doubling the time spent here shouldn't increase your overall time that much. Since writing the code is the crystallization of all that prior effort, you can liken writing good code to "taking good notes". It indeed seems silly to say "man, someone wasted a lot of time taking really good notes about that lesson they were in." If it was not worth spending time to take great notes on the lesson, then the lesson itself was not worth it.
It was good code. Some was using a database that got deprecated within our department. Some had business requirements that changed pretty frequently because, despite being a large company, we have some moving targets in what we deal with.
> there was this little framework someone wrote where logic was encoded in config files.
This is one of the most common and traditional shining exemplars of awful code. Absolutely nightmarish stuff. Such code can only be written by someone who has absolutely no idea what programming even is. They looked at a problem and said: "You know what this problem needs? It needs a much worse programming language that I just invented, jammed into a config file." So the code was tremendously awful to begin with, and you threw it out. Good for you.
> I said look, we're SWEs, we know how to edit code.
Absolutely right: this is why we use programming languages to program, and not config files. Again illustrating that you never threw away good code; you threw away the trash.
> I'm replacing this with something centered around an if-else tree that just does exactly what we need right now and can be modified later, with little attention paid to quality. I'm not even gonna bother splitting the codebase into multiple files cause someone is probably gonna have a strong opinion about organization; that person can have fun doing that. So far that's been modified a lot in the past 3 years with ease, it's gotten the job done, and anything more structured would've broken during that time.
A chain of if-elses is probably not paragon-level code, but definitely an improvement over what you described as coming before. My overall impression is that nobody has really taken the time to sit down and really dig deep into the problem, extract out the invariants, model things mathematically, identify the important operations you need to perform, develop useful data structures to support those operations, etc. My feeling is that if it was actually set up well -- really well -- you'd look back on your current situation and cringe.
It's statements like "anything more structured would've broken during that time" that make me think you really just haven't been fortunate enough to even experience what good code actually looks like. You can't imagine the situation being better. But it can.
Does it need to be? Maybe not. Either way, this is still a story of replacing awful code with better code.
First off, I deleted the part you're replying to because I didn't think anyone would bother reading it, but you quoted it in the meantime, so I apologize for the ninja edit.
The code I deleted was sensible for the business requirements at the time, and the author thought it through. It wasn't a whole DSL (which I agree is a classic mistake); it was just a list of regexes and a few other variables. We had to support new pieces of inventory in our system, and adding onto the list/config was easier than rewriting code. You might fault the author for not predicting that assumptions would change and building something more flexible, but I don't think that was doable. I think the author's real mistake was spending so much time on something impossible to predict.
If I'd put more thought into structuring my new code (as I normally do for other projects), it would've been thrown away regardless, because more things they told me were "invariable" changed after I did it. And if I assumed everything was variable, it'd take way too long to write the system, so the tradeoffs pointed to writing bad code and not caring. I had several other projects on my plate that were worth spending the time on.
We had the opposite problem with another system a teammate wrote for maximal flexibility. It was composed of several carefully abstracted layers. This made the system just too complicated to deal with, as nothing even used that flexibility, and some new features even started breaking his abstractions. When he left the team, I rewrote it to only do what we need today with 1/10 the LoC, and it's way more stable. My new cross-system design has that whole thing slated for removal too. That code will live only a year.
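As a rough illustration of the two styles in this subthread (every rule and name below is invented), the config-driven version and the plain if-else tree might look like:

```python
import re

# Style 1: logic encoded in a config-like list of regexes.
# Each rule maps a pattern for an inventory ID to a destination.
ROUTING_RULES = [
    (re.compile(r"^SKU-\d{4}$"), "warehouse_a"),
    (re.compile(r"^SKU-\d{6}$"), "warehouse_b"),
]

def route_config(item_id):
    """Walk the rule list; first matching pattern wins."""
    for pattern, destination in ROUTING_RULES:
        if pattern.match(item_id):
            return destination
    return "manual_review"

# Style 2: a plain if-else tree that does exactly what's needed today.
def route_code(item_id):
    if item_id.startswith("SKU-") and len(item_id) == 8:
        return "warehouse_a"
    elif item_id.startswith("SKU-") and len(item_id) == 10:
        return "warehouse_b"
    else:
        return "manual_review"
```

The config version is shorter per rule but hides the logic behind the pattern list; the if-else version is blunt, but anyone can read it, step through it, and change it when the requirements shift.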
Playing games like Satisfactory and Factorio also forces you to come to terms with the imperfection of living systems. The first time you play, you can't possibly know how big you need the factory to be, and you don't have the tech unlocked for a lategame factory anyway. You just have to admit that you'll build a temporary facility now, and build a new one after you've unlocked the tech you'll need to scale up.
I haven't played those games, but it does happen in real manufacturing facilities.
A facility I worked at started off in low volume, high mix manufacturing (a job shop) and eventually moved into more high volume, medium mix manufacturing. What makes sense for low volume doesn't make sense when you move to high volume.
"Refactoring" happens at different levels. You can look at the entire facility, specific product lines, or specific machines. Generally you care about efficiency and yield. Safety is baked into everything you're doing, so sometimes you need to sacrifice efficiency or yield for greater operator safety.
You can have a product line with initially low sales volume, so you use less efficient, less tooling- and automation-heavy processes. As your volume increases, you can start to invest in new tooling or machinery, move the production to existing machines, or change the shop-floor layout. For example, welding can progress from hand-clamping the pieces, to dedicated fixtures/jigs that clamp and hold the parts, up to robot welding machines for high production volumes.
You can have machines set up for batch processing as independent operations. If it makes sense through having higher volume, you can dedicate a specific set of machines and move them into a production cell.
At a machine level, you can realize that yield is too low and you can look at the design of the tooling or look at the sequencing of operations if it's a CNC machine.
Similar things happen for service operations. It makes sense to have someone manually process or copy and paste some stuff if it's low volume. Once it ramps up, it can make sense to automate in Excel with VBA and then eventually move to a more dedicated program.
Isn't this just cargo culting, and then realizing when you're cargo culting vs. when changes are actually necessary? I also don't like fawning over good code, or over what makes good code, but I think it's actually a good thing to anticipate certain architectural changes. Idk, but I don't think you can tell a stakeholder that you can't implement feature X because you didn't give enough of a shit about architecture and ran out of flexibility to change something.
Over-architecting a design to be flexible in one way can be a huge problem when changes come along and they are for fundamentally different kinds of changes than were expected when the design was created - this can actually be worse than having an under-architected system.
The phrase conjures up images of tight code that executes the minimum number of instructions with the least possible memory use and efficient reads and writes. In a piece of code that's executed maybe once in a blue moon. Nobody bothered to profile the program and find out where the real hotspots are, so someone optimized the wrong thing.
Writing code designed to be flexible according to how the programmer believes it will need to change is also a kind of premature optimization. The code is written in the optimal way for changes along direction X to be easy. Unfortunately, when the programmer guesses wrong about the nature of future change, that work to optimize for change along X is wasted, or worse: it makes changes along direction Y much harder.
I went through a similar evolution. Now, I don't judge the code I read. If I can understand it and it works correctly, it's fine.
I think it's a good thing. Much of what used to get me irritated (and, from my observations, what gets others irritated) are just matters of style, and what style is being used isn't really important.
Meta-disgust (“ugh disgust about code”) feels like another instance of what the article talks about.
If code doesn’t really matter cause “business,” then I think you are right when you say business is more your interest. That’s cool! I go through similar feelings at times.
This reminds me of my first experience with how executives think vs engineers. I was working for a day trading firm and was a hardcore C++ nerd at the time. We were having scalability issues and the CTO asked me if there was as anything I could do. So I tell him “Well, if you give us four or five weeks I think we can optimize the code and get about 30% better performance.” He just looks at me for a moment and then says “Or I could just buy another 25 servers, would that work? I can have them here in a few days.”
I'm a young guy working with way more tenured and experienced programmers who didn't want management positions, preferring to code. Respectfully to them, I can't stand what they do. They obsess over details that do NOT matter to the business, and our whole department is paralyzed like this. Meanwhile big-picture things like internal APIs are either neglected or too big-brained for anyone to understand. I think they're just too skilled, and we need some worse coders with smaller egos to get the job done.
For example, they swear by writing low-volume web backends in C++ "for performance" and object to any kind of framework. Ironically, having to move more slowly and carefully as a result has led to big compute inefficiencies, on top of the more important hit to dev productivity.
These might be evident in retrospect, but it's usually tricky to predict what will matter and what won't during design time. (Someone smarter than me said something about premature optimization once.)
This is also usually where experience can make a big difference.
When in doubt, it's best to start with something quick and dirty and not worry too much about what happens when it becomes inadequate. Just make sure it can be easily replaced when needed, which means treating everything you write as disposable. With hindsight, you can build something better.
There's a big back-and-forth thread about this where I explain why well-written code on our team ends up being deleted within 1-2 years, and the author's only mistake was spending so much time deliberating over the code.
Eh, if you're given a well-separated API to deal with, the code can be pretty terrible without being hard to replace. What makes it hard to replace is when someone puts a lot of thought into it and doesn't want you to rewrite it.
A good engineer creates solutions to the problem before them. That problem always includes constraints such as budget, deadlines, maintainability, etc.
Very often, the mathematically or logically "best" approach is not the correct engineering approach because it doesn't meet those practical requirements.
Engineers who insist on a sort of purity are, in my opinion, not the best engineers even if they are genius at writing code.
> No one is going to read it. It will be replaced next year
I've never seen this happen even once in my 25 years as a developer.
Quite the opposite, in fact. Codebases grow and become more entrenched every year the organization stays in business. The goal becomes to shoehorn into the product more and more features that the original designers never dreamed of. And those original devs are usually long gone. To do that shoehorning long-term in the face of developer churn requires intense discipline around communicating to the next person what you are doing and why.
Interestingly, I feel like this sort of attitude is a real issue at the precise moment the author describes it as a boon.
When you are early to mid career, it is crucial to look for ways to amplify the good you can do in your workplace and solidify your brand as an individual. To do this, you should be looking, ironically, to elevate others. Doing so is the only way to build a reputation that people are going to actively WANT to talk about (e.g. "oh, having trouble? You should call in Jim, he helped me with a related thing"). This is invaluable.
Perhaps I am speaking through a lens, but had I taken the author's advice and taken a more combative role at such a juncture, I believe I would have far fewer opportunities now.
This transition from junior to senior includes another important skillset: balancing social dynamics against engineering realities.
The key is illustrated in the book club parable: The elitism is directed outside of the group and becomes only a means of alleviating the fear of judgement for misjudging the paper. The grad student's approach clearly communicates the socially agreed upon reality: the whole paper is crap. This stance and boundary provides a clear decision space to the learning junior members: "if you think you see a mistake, those here will be happy to hear it; no sacred cows".
Bringing this practice into a situation where the target is a member of the group's work changes the dynamics such that you have to mind your Ps and Qs again -- and so, dampens learning.
My mantra related to that is "be mean to the code and nice to the programmer".
In order to learn and make things better you _have to_ be critical. But the way things are communicated is very important so everyone is on board.
"We could do better here" - no matter who exactly was responsible in the past.
"This made sense at the time but with what we learned..." - remind each other that improvement and learning is part of the whole deal.
"I like the simple and expressive core idea of this, but if we expand this further..." - elevate and develop the good stuff that's already there.
I make jokes about my mistakes, bring them up early and often. Everything is a bit lighter and easier with a bit of humor and without the fear of making mistakes.
And vice versa it is just as detrimental to be afraid to bring mistakes and inadequacies up and criticize them. It's much more fun and productive if things are continuously improving.
This works great until you get an engineer who pushes back constantly claiming it’s a time trade off or that this code is correct and you just don’t understand it. Sometimes humor doesn’t cut it.
I have worked with a few such "strong personalities" and while I agree they can be slow to receive criticism, I disagree that one should therefore resort to more forceful communication.
There are many alternative avenues in a professional setting when presented with an opportunity to argue, even if you're being blocked by a superior.
The crucial thing to remember is you cannot ensure your "social workaround" will work, but you can always ensure you act professionally. Whether that impacts the sprint is a matter of circumstance, but mild, cheery professionalism always improves the workplace culture.
I think early to mid is too soon to be focused on others' work. Early to mid, you should be focused on the quality of your own work, because you're still developing your taste and judgment, and you need the direct and vivid feedback you get from immersing yourself in the consequences of the decisions you make. A huge trap in software development is to get disconnected from feedback and be a slave to rumor, ideology, and religious "best practices." The air is full of bullshit (in no small part because everyone is trying to "solidify their brand as an individual" and "amplify the good they do" before they actually learn the job) and the best way to learn how to sort through it is by grounding yourself in the consequences of making this decision versus that one, choosing this approach versus that one.
If you start "amplifying" too early, you won't be amplifying selectively, and your coworkers would be just as well off sorting through search results themselves.
(Of course it's a progressive transition, and you're never too inexperienced to advise a coworker not to force-push master or submit a PR with failing tests.)
I get where the author is coming from, but at the same time, having spent most of my career to date as a front-end developer, I can tell you many stories of how hilariously off the rails this approach can go.
With a decade under my belt, I feel like I've only recently started really learning, and my conclusion so far is that half the effectiveness of software engineering comes from obeying ultimately simple, common-sense rules anyone can follow, like "use idiomatic expressions", "read the documentation", "prefer pure functions and immutable data structures".
I'm an average (and kind of lazy) developer, but I found early on that I have an edge over more talented and hard-working people: I gather knowledge instead of compensating for the lack of it with hard work.
Can you frame it as elitism? I hope not, because I deeply believe the worst and laziest developers can use some of those rules so that they're both effective and still bad and lazy.
> prefer pure functions and immutable data structures
This sounds like common sense until it becomes common nonsense, because you use Python or JavaScript or Ruby or whatever language where you don't have an optimizing compiler and optimized immutable data structures, so what could have been a single-pass, low-memory scan over a big dataset, done in 10 minutes on a toaster, is implemented as an inefficient clusterfuck that takes 12 hours and 4 GB of memory.
It's absolutely important to treat mutability as either a side effect or a local optimization from a design perspective. But Python is not Clojure no matter how hard you try, and at some point you're going to want to mutate a dictionary.
Anyway, the point is that what might seem like "common sense" in some situations is not always obvious or unambiguous in general.
I had a similar experience when I got into software.
As a junior, more than once I wished to just quit the field, even after years of CS studies and it being one of the best opportunities one can have in my area.
The draining nature of the "this is shit, we need clean code" and "TDD is the way" discussions, repeated day in, day out, can quickly kill any interest in working on code. It's not far from being forced to write a book using only a list of the 100 most common words: use any word outside that list and you're a shit writer, because you made it harder for others to read.
I'm quite glad that I moved away from that corporate environment into the startup space a year later, which showed me a whole different perspective: code itself is useless; what matters is whether it delivers value. If your hacky solution delivers value, then you can justify making it better. Otherwise, who cares?
I do think there's way too much attachment to code and its perceived quality in the dev community. On the other hand, if you work in a team where the majority of people are well into their careers, there's a lot more nuance around extremes such as "TDD all day all night" and "daily pair programming". They are seen as tools to use when appropriate, rather than mantras to be repeated mindlessly.
It's almost like children versus grown-ups. Children have simplistic, almost magical thinking. Adults know the world is more complicated, and the children's simplistic solutions don't actually work very well.
One of the reasons is small sample size. Children don't have the sample size to see why their simplistic solutions don't work. Neither does a programmer with two years of experience.
A colleague of mine works at a small company. A new 'CTO' was hired last year. He's very much of the "everyone's voice deserves to be heard... all input is valid!" school.
My colleague has 22 years of engineering experience across a wide range of problem spaces and industries and team sizes (large bigco to mom/pop orgs).
Another guy (X) started there a year out of high school.
X insists that 'tech XYZ' is the best. Current tech stack was partially rebuilt by my colleague, but wasn't finished (because... lots of reasons, mostly resourcing).
XYZ is not only not a great fit, but the ecosystem supporting the problem space is small, especially considering what's already in place. Terabytes and years of data need to be migrated (both physically to new data centers and code-wise - new structure handling has to be added to accommodate current and future needs).
In planning meetings, "all voices need to be heard"... so X pushes XYZ a lot. And randomly rebuilds small bits in XYZ. And when it doesn't work - blames everything else (it's the network, it's the supporting libraries, it's ...).
The CTO will not push back: "That sounds great! That sounds like it'll solve all our issues!" There's a criminal deficiency in his understanding of the current tech stack and its problems, along with no experience migrating anything. But any criticism is taken as "we need to be more inclusive and let more people speak up - some of the best ideas can come from people who've not traditionally been heard".
Up to a point, that can make sense. But when do you draw the line? 3 months? 6 months? 18 months? People insisting on promoting child-like understandings of problems and solutions - while not ever delivering anything resembling a working solution - at some point should not be listened to.
Why does my colleague stay? He's only part time right now, and was close to leaving, but there's been some shift to refocus the CTO on something else, which may - over the next month or so - leave the few competent people there alone enough to get things back on track. I think if this was a 'full time' gig for him, he'd have left already.
One of the startups I worked at had a simple rule: if you wanted to complain about something being bad, you had to come with suggestions for how to make it better. And they had to be real, tangible options that could be implemented, not "how about we use technology X".
If you had nothing, you could still flag something as an issue, but you were expected either to say how you'd solve it or to shut up; you'd done your part by flagging it, and that was it. It was a startup working with banks, and everyone knew there were issues all around, but a lot of them we couldn't solve ourselves: the banks were unwilling, the regulation wasn't clear enough about whether we were allowed to change anything, or sometimes both.
If you came with a plan for how it could be fixed while minimising the associated risks, then no matter whether you were junior or senior, at least you'd be listened to. If the ideas were good, the other devs would pitch in to make a realistic implementation plan. The beauty of law and banking: if something is unclear, no one wants to touch it, so as long as you pass that hurdle, you get the go-ahead :D
Yes, quite a few issues ended up being ignored, but at the same time, if no one knows how to fix something, what's the point of draining everyone by continuously complaining about how bad it is?
We had relatively few meta discussions that weren't product related. Ever since then, I've believed that devs start focusing on code quality and "the right way to do it" only when they lack the power to make any of the product decisions.
The saddest thing about this story is that X is missing out big time here.
I say that because I started out in a small, very cool agency. However, _technical_ leadership and seniority were lacking (they admitted that and made an effort to change it). I then stagnated on a technical level and had to accumulate that skill set on my own, often on my own time as well.
X being in the presence of a veteran like your colleague should be a golden opportunity for them to learn and grow. Of course, open communication and letting everyone voice their opinions has to be part of it. But everything should be weighed, discussed, and criticized based on its objective merits for this approach to make any sense at all.
I thought you meant the die-hard proponents of these software dev tropes are the childlike ones.
IMO that is more apt. I see a lot of resources wasted in the name of conforming to standards when the standard doesn't really apply to the particular scenario.
Either you misunderstood my post, or you are explaining what you think badly, because I think I am in agreement with what you wrote in your last paragraph.
Having spent two decades programming, I feel I was indeed surrounded by opinionated people who were narrow-minded and combative, and I often felt looked down upon regarding my tech stack or code or interests (e.g. I remember a strongly negative bias against bitcoin in the early days). Now I know that, according to research, most would have been on the autism spectrum and generally less socially adjusted than people in other careers.
I think my point is engineering may lend itself to strong opinions in people with poor social skills.
> eg I remember there being a strongly negative bias against bitcoin in the early days
So, in the one example you gave, they were right. And yet I get downvoted for suggesting: hey, guys, that senior developer who's insisting on doing something a certain way? Sorry you have to hear this, but they're often right.
I think there is some confusion here. I will clarify, but please don't get offended.
For your first point, no, they were wrong, I did very well on bitcoin - hence my example.
For your second point, I am confused: I agree that senior developers know what they are talking about; I was one for nearly two decades.
I was referring to peers not superiors.
I was saying I have encountered narrow mindedness and ignorance. I did not say that is all I encountered. I also encountered kind, intelligent, superior and inspirational people.
Edit> On second thought, I re-read your comment. Who downvoted you? When? Was it in regard to this article?
The elitism tends to show up once you've been around the block a few times but before your ego and position is secure. Years 6-8 or so; ie, post-doc age. Not really properly mid-career, at least outside of "retire at 35" FAANG world.
Which matches my observations. Folks may pretend otherwise but nearly all mid-career people intuitively understand there's lots of bigger fish in this world.
Elitism is usually meant as an insult for snobs pretending to have better taste than anyone else. This is wrong, because in questions of taste there can be no criteria other than subjective taste; however, in contexts where there are objective criteria of quality, elitism is just being a professional who is good at his job. When you open up a horrible spaghetti codebase that's inconsistent in style and full of nonsensical abstractions with 90% bloat, you should feel disgust. Of course all bad code has a reason behind it, some more justified than others, but no reason should ever make a bad codebase feel good.
> however in contexts where there are objective criteria of quality elitism is just being a professional good at his job.
Imagine I look at the Linux kernel source code and I feel it's lacking in automated integration tests, and that C is a poor choice of language for security-critical code.
Am I a competent professional, applying objective quality criteria?
Or am I an arrogant dilettante, to imagine I know better than some of the most influential living programmers?
It depends on you and the context: are you actually working on the Linux kernel, and have you worked on a lot of similar projects?
Objectivity does not imply that it's easy to discern adequate criteria or that they are easy to know or that there is a consensus about them, just that it isn't purely subjective, and code isn't.
Maybe this helps in some places, but the negativity described here will not win you any friends when the subject is your coworker’s pull request and not a random journal article. I always try to see other perspectives and understand why a choice was made and get people to think instead of judging what’s been done. Working with “the smartest person in the room” who believes there’s only one right way to accomplish a goal can be hell. People often mistake confidence for leadership.
Sure, we should all make an effort to be diplomatic and not unnecessarily negative. But you're making a huge assumption: that the "smartest person in the room" is wrong. That's a nice, convenient, simple world where you can just ignore people that you disagree with by a-priori assuming they're wrong. Sorry to burst your bubble: sometimes (often?), these people are right.
People with decades of experience are often integrating thousands of lessons they've learned, in order to satisfy hundreds of individual constraints as best as they can. Consider the "Thinking Fast and Slow" anecdote about the fire chief who orders everyone out of the building, despite nothing obvious being wrong, because they have a gut feeling. If there's a culture of "Who made you the boss? Look at this guy, acting like he's the smartest guy in the room. If you're so smart, you should be able to explain to me why you believe we should all leave the building. All voices deserve to be heard. Sometimes the best ideas can come from ..." then everyone dies.
Look, I know we hate it when that arrogant sportsman is actually good at his sport. It's a bruising to our ego when the senior developer seemingly arrogantly insists on some standard or some change that isn't immediately obvious to you, and she doesn't immediately have the time to explain to you the 600 reasons why. She's probably a hell of a lot more critical of the idea than you are! She's already thought of, and worked through, every single objection you're raising, plus 100 more, and still decided this was the way to go. Yes, mentoring is important, and she probably does spend a lot of time doing that. (Maybe you're not willing to listen?) But not every moment has to be a healthy-debate all-ideas-welcome teaching moment. Sometimes you just fucking listen to the smartest guy in the room before the building collapses.
I do not agree with the author marketing elitism here as a desirable characteristic. There is nothing wrong with raising your standards for quality or working to be excellent within your profession, but elitism generally tends to embody arrogance, social stratification, or other exclusionary attitudes.
You might be inclined to disagree: "Well, he said 'this technical elitism' and then gave his own definitions, so...". Yes, he said that, and then proceeded to suggest it's not just acceptable but desirable to view efforts below your standards with disgust.
There was no implication of mindfulness or humility in this process, only that it was an effective way to self-motivate. There is a difference between what is sufficient and what is unacceptable. If you set your standard at excellence, then anything less becomes unacceptable. Tempting as it may be as a mechanism for self-judgement, it's a slippery slope, because we often judge others by the standards we set for ourselves.
This doesn't really reflect my experience. Sure, some people (me) may not have read the paper, but the presenter sure as hell has, and by the end of the hour, I do have a good idea what the paper is about, and about its strengths and weaknesses.
If I had only one wish, it would be the end of this reference. Python has no zen, at least not anymore, and if you just want to refer to some abstract ideal or best practice, please pick another quote.
The fact that it's python is irrelevant, in the context it's about how beginners can read "wisdom" such as the Zen of Python and identify with the goals, whilst veterans can react to the opposite of such writings and consider what the goals prevent against.
I discussed this with my friend: with every release, Python becomes more like C++ in that there are a million ways to do the same thing and you're struggling to choose the "correct" one.
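String formatting is one well-worn illustration of this (my example, not the commenter's): several coexisting ways, accumulated across releases, all producing the same string:

```python
name, n = "world", 3

a = "hello %s x%d" % (name, n)       # printf-style, the original
b = "hello {} x{}".format(name, n)   # str.format, added in 2.6
c = f"hello {name} x{n}"             # f-strings, added in 3.6
d = "hello " + name + " x" + str(n)  # plain concatenation

assert a == b == c == d == "hello world x3"
```

Each newer form was meant to supersede the previous ones, yet none has been removed, so codebases routinely mix all of them.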
To add to that, it's introduced as a "simple" language. But in 2023, the cumulative effort one puts into getting things "right" is bigger than the effort you'd put into learning something described as having a high learning curve.
As an anecdote: someone described Elixir as a "high learning curve" language for anyone not already exposed to functional programming. When I asked how long they meant by that, the answer was 2-4 months. Which really says something about our attention spans.
It's not hard to sympathize with Verna's feelings. These are all based on a similar set of philosophical guideposts: maximal output for minimal input, maximal possibilities of self-expression from a minimal set of generative rules, and understanding the deep "essence" of the craft, so that when you add your own contributions, it is by finding the "essence" of the addition and harmonizing it with the essence of existing work. And these are appealing because they give the feeling of tremendous power and the sky being the limit.
But the real world runs on the philosophy of Visual Basic, punk rock, and mixed martial arts, which are all based on a different set of philosophical guideposts: a) focus on practical solutions to real world problems; b) make getting started as easy as possible for everyone; c) it doesn't matter if added components harmonize with the original; what matters is if they contribute significant value on points a) and b), i.e., it's okay to get messy.
I came back to Python after several years, and there's type-checking now, which we're required to use at work. Way to ruin the entire point of Python.
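For context on what that type-checking involves: Python's type hints are optional annotations consumed by external checkers such as mypy; the runtime itself ignores them. A minimal sketch:

```python
def greet(name: str) -> str:
    # The annotations are stored as metadata; CPython never enforces them.
    # Only an external checker like mypy would flag greet(42) as an error.
    return "hi " + name

print(greet("alice"))         # hi alice
print(greet.__annotations__)  # {'name': <class 'str'>, 'return': <class 'str'>}
```

So whether this "ruins the point of Python" is arguably a team-policy question: the hints only bite if a checker is wired into CI, as the commenter's workplace apparently requires.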
Also, Python's async story was always terrible until they introduced async/await, but I suppose that's part of the "many ways to do the same thing" you mention. They should've done it from the beginning IMO, but maybe it was hard to predict.
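For reference, the async/await form the commenter prefers looks like this minimal asyncio sketch (illustrative, not from the thread):

```python
import asyncio

async def fetch(delay: float, value: str) -> str:
    # Stand-in for an I/O call; await yields control to the event loop.
    await asyncio.sleep(delay)
    return value

async def main() -> list:
    # gather runs both coroutines concurrently on one event loop
    # and returns their results in argument order.
    return await asyncio.gather(fetch(0.01, "a"), fetch(0.01, "b"))

print(asyncio.run(main()))  # ['a', 'b']
```

Before this syntax landed (async/await arrived in 3.5), the same thing required callback chains, `yield from` generators, or third-party frameworks like Twisted, which is a large part of why the earlier async story felt so fragmented.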
I don't have many complaints about the language itself, but I've found the packaging ecosystem to be the most complicated and frankly dysfunctional that I've used in the last decade.
I never understand this complaint when Javascript/Typescript is sitting there with a mess of .lock and .json files across multiple tools that sometimes interoperate and sometimes don't.
Package management isn't a solved problem; Python employs standard patterns for it, and it's significantly better than the chained Makefiles of my C-development days.
I didn't say it was a solved problem - it isn't! - but rather that I find python's solution particularly bad. The situation in C (and C++) is indeed even worse, I just haven't been doing that for the last decade. Things I've used in the last decade that all work better, in my opinion: Bundler, NPM, Maven, Cargo, and even Go's packaging.
I always liked python, but I didn't use it at all really for a very long time until recently, and I've just honestly been surprised by the poor state its packaging seems to be in.
NodeJS has NPM, which is simple. Usually when you want to run someone else's NodeJS project, you `npm install && npm start`. Most Python projects have a Dockerfile, which shows you how bad the actual package management is.
TS is just JS but with extra things that can go wrong (Babel etc) and not being able to use simple `require` syntax anymore and some not-very-automatic type checking that you don't need; I don't use it. It's like taking a steak fresh off the grill and smothering it with mayonnaise.
Says the language whose code-block-endings are invisible! I will never understand the desire of people to make the important punctuation in their language invisible. Python, CoffeeScript, YAML. These people just hate being able to see the important flow-determining punctuation in their language!
The reason why I can agree with this post, and find it useful, is that the author purposefully limits it to the "mid-career" phase. I think that if you want to grow from the "mid-career" phase to the "end-career" phase you have to put aside your elitism more and more. When you read the Nature papers as an expert you do so in a way that is both extremely directed and, at the same time, extremely open-minded to what you find. You can't truly grasp open-mindedness if you think you are better than other people.
Um, a post-doc is not "mid-career." A post-doc is a beginning researcher. Just like having a black-belt does not mean you are an "expert," it means you have achieved satisfactory competence in the basics of the art and can now begin your serious study.
> But elitism only becomes useful at a particular stage of development. Earning the white belt is almost purely knowledge and a bit of practice. Getting to the black belt requires not just skills, but a mindset: determination, resolution, and yes, snobbiness. You must believe that having a black belt is worth the effort and that having a black belt is better than not having one.
I think it’s even simpler than this white/black belt metaphor.
Driving your career forward requires delegation, scaling yourself through others. Doing this effectively requires having strong opinions. The author is referring to these opinions as elitism, which is jarring to me. It could be elitism or simple pragmatism.
Quite a few posts here are referring to code quality. People often forget that programmers aren’t paid to write the prettiest code or have the most beautiful abstractions. VALUE is what we want to produce.
I once found myself insulting a monolithic code base, only to later realize that mess of a code base had shipped in over 10 million devices and a product rated over 4.5 stars on Best Buy, Amazon, and many other retailers. Its entire ecosystem had directly and indirectly generated billions of dollars in sales.
Meanwhile, my own team's clean code with well-thought-out abstractions hadn't generated any revenue at all. In fact, this other "piece of shit" that came before paid for all our compensation.
The guise of meritocracy is often just a flimsy pretext for competitive nepotism.
However, the ethics of posting a poorly written chatbot's nonsense points to a deeper issue: we must consider whether derivative copyrighted material, when misappropriated by users, still constitutes plagiarism due to the missing citations.
I wonder if this is an adaptation or a maladaptation.