Hacker News
Learning Without Burnout (junglecoder.com)
223 points by tate on March 16, 2021 | 94 comments



> especially cultural factors that make software development harder for those who can’t pass as YOUNG white males.

FTFY.

I’m 58, and learn faster, and more comprehensively, than I ever did, when I was younger.

I think my grasp of “the big picture,” or a high established baseline, is a big reason for the fast uptake. I don’t need to be told what iterators or allocators are. After having designed hundreds of applications, drivers, tools, APIs, SDKs, libraries, and systems, I know why we do stuff. It isn’t just rote dogma.

This gives me the flexibility to adapt my view of the world to deal with new paradigms and concepts. Also —and this is important— I can “mix and match” my approach. Most of my work, these days, is a chimera. I will use classic patterns, from thirty years ago, and combine them with new “cutting edge” stuff.

That takes flexibility, curiosity, knowledge, and motivation.

Also, teaching is important to learning. Especially teaching smart people. It challenges us to really understand what we’re trying to impart, as we are guaranteed to get challenging questions. When I took martial arts, many moons ago, the “postgraduate” stuff was to become a teacher. The Sifu (you know which art) would insist that students achieving new belts, immediately start training those in earlier belts. By the time you became a blackbelt, you were ready to teach an entire class, unassisted.

BTW: I never became any good at it, but I did learn some cool approaches to stuff.


Maybe that's just me, but in college my brain was high frequency [0] but low clarity. Some people might have wiser approaches (I've seen a few, I believe, who would stay calm and try to see things large and abstract without rushing). Ten years after college my thinking changed, and instead of sweating hard I started to understand things at a higher level without this hectic feeling. It requires less effort and at the same time lets you reach more creative, larger solutions. (Pretty counterintuitive when it started to happen.)

[0] and I've seen this on dev Discords, where 20-year-olds go in all directions, trying to apply every configuration of "lessons of the day" in random ways and under high stress. Very different brain activity.

ps: this brings up a lot of questions about the psychology of learning... how much do you learn because of youthful ego, desire for social status, peer pressure, elder validation, or anxiety about the future, versus joyful and creative ways to see things and discuss ideas?


I'm 22 and I kind of relate to high frequency, low clarity but I like this style of learning.

It lets me quickly learn the basics of things and if I want to dive deep I can revisit things later on.


I guess it's a matter of outlook; with time you want less "revisit later on" and a more balanced "enjoyable, peaceful, and deeply learned quickly".


If nothing else, as time goes on there's less and less "later on" and more and more stuff to revisit; perhaps you'd naturally see this kind of pattern if you accumulated more and more knowledge debt and spent more time paying it down to get a firm foundation.


Well I'll see what happens. Right now it feels quite enjoyable and not frantic learning, and it doesn't feel like I'm leaving gaping holes in my knowledge I'll have to fill in later.

I'm also naturally diving deeper into topics as I spend more time doing work related to them so all in all, I'm quite happy with how things are going so far.


> cultural factors that make software development harder for those who can’t pass as white males

Racist remarks like this need to stop. Being White or not has nothing to do with learning tech. Resources (internet, libraries, universities) are usually available for free or for small fees to everyone regardless of race (at least in the European case). Most other countries in the world don't even have a significant White minority, and when they do, that minority is not plotting against non-Whites to hinder their learning of technology (often quite the reverse, in fact).


I actually got in touch with the author because of this remark, because this subject wasn't even discussed in the article at all. It looks like virtue signaling to me. Such remarks really make me cringe.


Exactly.

All this pervasive whinging about "white males" is getting really tiring, like when the Smithsonian described traits like hard work, objective thinking, etc. as aspects of "whiteness"[0]. I mean yeah, if you erroneously associate all of these positive traits that drive success with "whiteness" or "white males", then I can see how you would conclude that "white males" have some sort of unfair advantage.

[0] https://www.newsweek.com/smithsonian-race-guidelines-rationa...


It seems to me that you have precisely missed the point. In particular, you give away your own worries in your frustration that something being described as part of _white culture_ must be implied to be _bad_.

Have you investigated the source material? Engaged with it by doing the hard work of reading it and thinking about it objectively, which you seem to believe are positive ways to approach knowledge? Or are you reacting emotionally to your perception of a one-pager extracted from a whole body of work? Have you considered that some of the traits you're describing may have alternate combinations of traits that could also result in positive outcomes if they existed as the dominant traits?

These are just a few questions that arise for me from your snippy post. If you engaged with the material linked from that Newsweek article, you might come up with more.


Yep. This is especially true in 2021, where diversity hiring and affirmative action on racial and gender lines are at an all time high.

I am not saying they are wrong. But the argument that white (and asian?) males have an inherent cultural advantage that's so outsized that significant differences in hiring/promotion/outreach standards cannot overcome it, is bogus.

Qualifier: arguments around lack of maternity accommodations and an inherent class divide are completely valid. I am not referring to those.


Indeed

It's not only racist, but it also doesn't encourage you to outperform.

You can either sit and complain or try hard to skew the odds that are against you.


It's not clear what you're saying here, other than concluding that the disclaimer in question is racist (despite the fact that people who can't pass as white males, at least in the US, quite frequently report a difficult experience---even those who are successful).

Do you think that having the internet, libraries, and universities is the only thing that can make it easier to learn technology? Does having time factor into that? Time to use the internet, time to go to the library, time to study for university. If having time factors into it, would you consider that having money factors into having time? If having money factors into having time, would you consider that race often factors into having money? If you would say that race may factor into having money, would you consider that this may be due to systemic effects? And if you would grant that there may be such systemic effects, do you think these effects may persist to some extent even for those who do manage to learn tech? Obviously there are a lot of links in that chain, and you may disagree with more than one of them, but I'd be interested in which link in the chain breaks---or if you're trying to say something beyond this chain.

I'm interested in the “quite the reverse” note, as well---what are some examples of countries in the world with a white minority where the country in question plots to hinder white peoples' learning of technology?


Racist remarks like the author said need to stop. It only promotes racism. This critical race theory is doing more damage than all history combined.


> Available resources (internet, libraries, universities)

(mentors, support from authority, sufficient time/financing to be able to use the available resources)

If a professor believes consciously or subconsciously, by looking at you, that you're less likely to succeed, that's a significant barrier that may well be immediately fatal to your interest in a subject.

The frequency with which this kind of thing is said out loud is astonishing. E.g.: https://abc11.com/education/nc-state-professor-suspended-for...

If you've never personally witnessed quiet sexism (against women) and quiet racism (against non-white and in particular black people) in computer science, then I question where you have been.


> If you've never personally witnessed quiet sexism (against women) and quiet racism (against non-white and in particular black people) in computer science, then I question where you have been.

In an overwhelmingly white European country. Out of the 900 people I went to uni with around a hundred were women and < 10 were not white.


Were you friends with many of the women and non-white students? Did they share their experiences with you?


That's not what "personally witnessed" means.


I am aware. I'm just asking a follow up question to understand your understanding of those folks in your program.


> I think my grasp of “the big picture,” or a high established baseline, is a big reason for the fast uptake.

This is a valid argument and true also in my experience.

The counterargument is that "software building" is to media companies (FAANG) what a printing press is to an advertising company. Problem-solving is not the relevant value-creating activity. The perceived need is for "creators" and "operators" for the "machine", rather than engineers and problem-solvers.

Screens today are spaces that need to be filled with "cultural content". The technology itself is a public good, and they want willing bodies that don't ask questions to turn the wheels of the "machine".

This is the rationale for ageism, whether valid or not.


I completely understand why FAANG-type companies (and all their imitators) want young kids, right out of school.

Young folks learn by rote (they just spent a whole bunch of years at school doing exactly that) and adapt quickly, even if they don't fully understand the context or goals, so they are more likely to follow direction and process; especially if you can gamify the process.

Me, I like to know why, and to "play the tape through to the end." I'm also apt to say stuff like "Are you sure that you want to do that? You know it's likely to give you a race condition, if you use a different device. Might be worth it to consider a different approach.", and whatnot.

That kind of pisses in the punch bowl.


I don't think this is the reason. They want people with no families and no responsibilities because those people will move to Seattle or San Francisco and work 100 hour weeks without having to worry about uprooting spouse and kids and ruining all human relationships.

I actually hope one of the outcomes of post-pandemic normalization of remote work is less tacit age discrimination from companies that used to greatly prefer people willing to relocate.


I don’t think tacit is exactly the right word there. Effective (or de facto) is probably closer. If the company is willing to hire someone and that person decides they’re not willing to relocate, I don’t think that’s tacit age discrimination.


Companies that have shitty working conditions (so, almost all of them) prefer young people, because they are enthusiastic about working and learning and being treated as an adult professional. They don't know any better yet. Whereas people that have work and life experience will immediately see through the company's bullshit and adjust accordingly - either leave for a better place or find a way to slack off as much as possible etc.


I'm not quite sure how you can make the relationship to content creators either an issue or a rationale for anything.

Software building has always been for a purpose, whether scientific, engineering, publishing, trading. The value creators need tools and software is that tool.

Whether individuals are rewarded in the same way or the same amount is not because of FAANG. While large, they are by no means the majority of the economy.

Everyone needs software now. This pandemic shows the value of that, as well as of the research it empowered, the education it enabled, the community it connected.


The look/feel/interaction in media apps today is essentially a "cultural good" in the same way the design of a TV-franchise is a cultural good.

It needs to be updated every few months to feel fresh, and the time horizon is very short. There needs to be analytics/engineering-work to enable this, but the goal itself is to adapt the user experience such that it maximizes adoption and screen time. What matters is fast iteration, creativity, and "fast plumbing".

The perception is that young people are more enthusiastic about this sort of work. More experienced people would be unnecessarily concerned with code quality, stability, technological trajectories, and so on.


It isn't uncommon to see people rely on the argument of neuroplasticity as a crutch. The claim that a person seemingly can't learn as well or as fast past their twenties is a thinly veiled ageist take, but also an easy hand-wave to dismiss personal/professional change.

It's always good to see your kind of comments.


I guess it depends how you view/define learning.

If by learning we mean learning _any_ information, then it seems likely the rate of learning would always increase over time since as you gain more knowledge it's more likely that you've done something similar in the past.

If we're strictly talking about learning _new_ information (As in, not closely related to previous knowledge) then I can understand the argument that someone learns slower as they age.

Basically, it seems like it comes down to crystallized knowledge and how you count it.


> If we're strictly talking about learning _new_ information (As in, not closely related to previous knowledge)

Is it possible to clearly separate the two, or are there universal concepts underpinning every domain?


I don't know. Probably not, but I'm just musing.


I think as you grow older you build a large web of knowledge where you can connect disparate pieces of information into a coherent framework. This should make it easier to learn new material (e.g., reading your first scientific paper can take a very long time, but by the time you have read your 1000th, you will have built models to help digest information).

However, this developed framework can hinder new learning if the new skills are fundamentally incompatible with the existing framework, where old habits can slow down the incorporation of new skills. For example, learning Japanese as an anglophone may be harder than learning Japanese as a Korean speaker. The anglophone would need to actively set aside learned habits to learn Japanese.


Perhaps that suggests the approach you sometimes hear from medics (particularly for surgery) of "see one, do one, teach one" isn't quite as alarming as it initially sounds!


I tend to "learn by doing." I don't really like pointless academic exercises. I need to be shipping stuff.

When I want to learn something new, I'll set up a project, with a definite ship goal, that uses whatever I am studying.

It may not be perfect, when it ships, but it does ship.

Shipping is also learning. There's absolutely nothing to compare with releasing finished product, and supporting it. I've learned a lot of the "soft" skills necessary to ship and support. These are different from simple algorithms or debugging exercises.


> Also, teaching is important to learning.

Hear, hear. I didn't really know how to fly until I started teaching people how to do it. I thought I did, but not really. Not deeply.


Soon to be 58 year old here. In a lot of ways I'm better at this software stuff than I was at 28 - looking back and comparing to now I really sucked at it back then. But the aforementioned cultural factors mean that it's a lot harder to find work (I think this is what you were alluding to with your edit to change to YOUNG). Part of the advantage now is knowing what things are important to learn and knowing what will become ephemera because you've seen so much ephemera come and go over the years.


When you say you learn faster and more comprehensively than when you were younger - was it gradual improvement, or did you have some significant turning points where you optimized your learning process?


It seems to have happened in big "jumps." Sort of "Ah-HA!" moments.

But really, it's an accretion. That is especially true of the stuff that I need to ship software, which is quite different from what I need to write software.


It's similar for me. I was less motivated and could grasp less as a college student than I can now, with 10 years of experience.


I've gone completely the opposite. I seem to have made more mistakes as I've gotten older (35y) and learning seems to go more slowly. Some of them are mistakes you'd expect someone with 0-1 years of experience to make, even though I've got 5 or so. It's all downhill from here, I guess.


When unable to access the minutiae of some byzantine concept with my hominid navigation apparatus I often comfort myself with the knowledge that all that I know will ultimately be lost to the chaos of the universe and that the cumulative impact of everything I’ve ever done is utterly inconsequential.


You are not the universe. You are you and this is your reality. In that reality, it matters. A lot.


Such is the illusion we all have trained into ourselves.


It only matters if you care. GP already mentioned that in their reality it actually does not matter.


THE universe? Which one? There are so many!


> [all] will ultimately be lost to the chaos of the universe and that the cumulative impact of everything I’ve ever done is utterly inconsequential.

What would be the alternative? A universe where our changes are more "permanent"? Ignoring the paradoxes that entails, would it really make them more consequential, or just more lasting? It would be completely subjective, just as it is subjective to believe something matters even if it ends a few years down the line.


It would be nice if thermodynamics weren't such a harsh mistress.


Immortality would be a possible goal if heat death or the big crunch weren't the inevitable fate of everything.


If you leave stuff as-is then you are probably right.

But what if technology can solve that issue as well?


That would assume that physics as we understand it is fundamentally wrong. You might as well hope for an eternal afterlife in the garden of Eden.


How many times has our understanding of physics been fundamentally wrong in the past?

What is the chance it will never happen again?


The chance that the laws of thermodynamics are fundamentally wrong is pretty low. Asimov has a nice essay on this topic: https://chem.tufts.edu/AnswersInScience/RelativityofWrong.ht...


Thanks, that’s a load off my shoulders.


Allow for the possibility that you have something important to contribute to the world.


Allow for it, but then realise in reality even something important is trivial.


As a person who subscribes to autodidacticism, here are my suggestions:

* Know the difference between "Knowledge for Knowledge's Sake"(KKS) and "Knowledge Needed for a Job"(KNJ). While there may be some overlap, they are separate pursuits and need to be tackled differently.

* For KNJ, use any and every means to short-circuit the learning ramp and get started on the job. This includes:

    - Identify the "Guru" in the company and ask for a one-on-one brain dump on the project/product you will be working on. I do this in the first couple of weeks, over lunch and at late-afternoon meetups. Set your ego aside and revert to the "student mode" you had when studying in college. Take notes, book/article references, and anything else that will shorten your learning ramp.
   
    - Based on the above, focus on the domain knowledge and technologies required for your job. This will be an intense 2-3 weeks, just as if you were preparing for an exam.
   
    - Get started with some simple bug fixes, minor feature implementations, etc. When you get stuck and/or lost, do not hesitate to ask for help from team members. Within a couple of months you should have enough knowledge and confidence to take on bigger responsibilities.
* KKS is far more amorphous. Actively cultivate the mentality of being a dilettante, i.e., dip your toes into anything and everything that catches your fancy. Wander aimlessly through the various technology (and other) domains, trying to grasp the "essence" of each subject. Do not look for outside validation and do not compete with others. The aim is to educate yourself and understand the subject to whatever level is acceptable to you. You will have to read a lot, do a little of what you like, and follow your curiosity/interests/instincts. There is no goal and no end to this; it is a lifelong endeavour to gain knowledge. Once you get hooked on something, you will find that you soon transition to becoming an expert without actually trying to be one. All the geniuses/scientists/greats we have read about have followed this process to become what they are.


The main thing to do - and what I told my team yesterday too:

* This is not college, you don't have to remember everything, use the internet.

* Learn on demand. Never read books from cover to cover.

* Do something real with what you're learning. For example, as you're learning a new programming language, build something real with it.

That's it!


> * Learn on demand. Never read books from cover to cover.

As blanket advice, I really don't agree. It depends on what books, and what they're trying to teach you. There's a huge difference between a reference book and... pretty much any other kind of book.

Many times you don't want to know something, you just want to know that something exists/is possible/has already been done. Skimming through books cover to cover is one way to do it.


Couldn't agree more. Reading CLRS cover to cover and reading SICP cover to cover are two different things.


CLRS? SICP?

What are these?


Classic Comp Sci books, specifically:

CLRS - Introduction to Algorithms (after the authors - Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein)

SICP - Structure and Interpretation of Computer Programs by Harold Abelson, Gerald Jay Sussman, and Julie Sussman


Which one is the more “cover to cover”-y?


SICP


"Don't feel compelled to read books cover-to-cover" would be a more useful guide.

Many people do, and feel guilt over setting down books. That serves you poorly.

Approach books as an inquiry or conversation. See Mortimer Adler's How to Read a Book (frequently mentioned on HN). Read with a purpose or goal.

(Entertainment and distraction may be goals or purposes.)

Some books reward a close comprehensive read. They are rare.


> Don't feel compelled to read books cover-to-cover

I more often find myself compelled by managers NOT to read a book cover to cover, even when the subject matter demands it (they believe it's a waste, without any real evidence beyond wanting it to be true). I've never, not once in 30 years of software development, seen a coworker reading a book. Reading for true comprehension is frowned upon as a "waste of time".


When it comes to engineering and building stuff, it's a lot like art. First you draw the lines in a rough way, build out the "stick figure" so to speak, then you draw the rest of the owl, slowly turning it into a fully finished piece. It's way better than quitting after having drawn a perfect eyeball. Capture the essence of the vision as soon as possible. You can't come back to finish an eyeball as easily as you can come back to finish a quick sketch.

Also, regarding reading books c2c, it depends on the book. The smartest people I know have hundreds of books, but have fully read and re-read like a dozen of the ones that impacted them the most. For the other books, they've picked up snippets, but haven't come close to fully reading them.


> Also, regarding reading books c2c, it depends on the book. The smartest people I know have hundreds of books, but have fully read and re-read like a dozen of the ones that impacted them the most. For the other books, they've picked up snippets, but haven't come close to fully reading them.

I don’t claim to be one of those people you describe, but this is also what I’ve found works well.

Some books are packed with amazing information, line after line. There is great value in reading the entire thing; the overarching narrative of reading it linearly provides additional value over reading it piecewise.

Other books might present one or two great ideas/concepts, but spread them out over a few hundred pages. These are the books I read piecewise, to “get the gist”.

The tricky part is knowing which category a book falls into. I imagine there is a level of required prior knowledge/experience before you can do this.

Ironically, there are books I used to think weren’t very valuable, or presented “obvious” ideas. Later, I’ve come back and found them incredibly informative. For what I thought was “basic” in my first year of university, I now respect as nuanced and important first principles, for example.


>* Learn on demand. Never read books from cover to cover.

I think you are my type of person, in that you learn a little bit on demand, go to work on it, quickly get a nearly complete view of everything, and depend on being able to look up stuff you need when you need it.

That's great. However, you seem to be different from me in that you think your way of learning is the superior one instead of just a different one.

I have known some very good programmers who are read-the-book-cover-to-cover types of learners. In my experience, cover-to-cover types have stricter personalities, and thus often we do not get along; but when we do, it's great, because we complement each other's skill sets and predilections for how problems should be solved.


I have a similar learning style. As with all things I think there is a tradeoff.

It feels superior when you aren't doing anything too complex and can iterate quickly at little cost, because it's far more time-efficient than thorough learning, with only minor drawbacks.

When I think it falls short is if the problem you're working on requires deep domain knowledge or failure is costly. In this situation, someone who has deep knowledge and who will thoroughly work through the problem will be far better off than the person who learns on demand.

The caveat to this is that I think the former situation is _far_ more common in industry than the latter, so to me a learn on demand style is far more useful in practice. I'm sure that some domains would heavily favour a thorough learning style though.


I tried learning everything until it became too much. Now I try to learn the stuff I need. Gives me an appreciation for simple tools.


I don’t think it’s superior.

It just works for me and my team.


I think of it as "index everything", but dig into the details when the mission requires it.

So read widely, see a lot of keywords, and understand what seems to be commonly accepted as important, but spend your time gathering more keywords rather than learning any one thing to perfection.


This sounds like dangerous advice. It will put your team in a position where you constantly know just enough to solve the specific task at hand, but never plan ahead.


Sorry, I really disagree with never read books cover to cover. There are definitely some programming books that are worth reading cover to cover.


Not to derail the conversation too far, but what are some you've found?


Code Complete

The Elements of Computing Systems

The Pragmatic Programmer


There's a little dynamic theory of learning around this. When I get excited I start trying, but if I don't reach some escape velocity, all the excitement and learning start to hurt my brain. Whenever I finally reach some level of concrete realization, that knowledge turns into pleasure and pride.


> make software development harder for those who can’t pass as white males.

I think the Bay Area startup scene might be different, but I've been consulting for the last 12 years in corporate America (banks, healthcare, big retail), and 90% of the teams are usually Indian and Chinese.


As an Asian male, it seems to have become really fashionable to denounce white males in the past few years. As if you can correct past injustices by punishing those in the present.


Same here in north Texas, for at least the past 20 years.


This holds at FAANG too, in my experience.


As a university student, I feel the same way, that there is an infinite amount to learn, though for me it's toward mathematics and computer science, which are themselves very broad fields. Since the software industry appears to change trends at the drop of a hat, learning a specific framework or library has a short shelf life (insert deprecated-but-popular-at-the-time technology here).

Perhaps out of necessity, one useful criterion I've had for myself is to emphasize composable and widely applicable knowledge. For myself, some examples come to mind:

- learning to use a text editor well (Emacs) allows me to edit effectively across programming languages, file formats and operating systems

- learning logic, algebra and programming language theory allows me to reason about and structure programs (written from assembly to Haskell) more precisely, and write interpreters and compilers grounded in well-defined semantics

- learning build systems (e.g. Nix) allows me to understand how to package software written in different languages, how to structure my own projects to make them work across platforms, and how to make reproducible builds

- learning category theory allows me to ask the right questions when I learn a mathematical structure, from topology to algebra (what would the morphisms/products/limits/functors be?), and gives insight into how programming languages organize their own abstractions

Of course, specific knowledge about X technology would still be needed to get things done, but when the winds of change blow, hopefully the deep concepts I've spent time studying will resurface in a new light. Only time will tell.


I think you are on the right track and seem to know way more than many CS grads do when they finish university!


"But, whether through side projects, blogging, or building a portfolio, I do think that finding a way to publish your learnings is valuable. The nice thing about publishing and shipping smaller things on the side is that you do end up building a signal to employers that you have “passion”."

Think about what this really says about our industry. Tech companies are incapable of figuring out effective ways to test people they hire, so they rely on social signaling, which involves doing work outside of work for free and then marketing it on the web. It's a crazy and inefficient system, but people who are invested in it keep propping it up.

BTW, most companies out there have employee agreements that technically make your side projects their IP.


Moving from academia to work, one of my biggest challenges was actually using tools.

If I couldn't roll out my own stuff, it felt like cheating, and I would always get fixated on being able to write my own libraries.

Whenever I'd lift perfectly fine, working code off StackExchange, or use libraries that did almost everything for me, I'd get the same anxious feeling I got in college when writing a report/paper and not citing my sources.

This kind of behavior made me waste a lot of time and energy, trying to re-invent and optimize the wheel, over and over again.


Well, I believe much of the satisfaction in programming (at least for me) lies in building something with your own hands and creativity. When you are simply writing glue code between libraries and services, that satisfaction is stripped away and leaves you without much fulfillment. Of course, there's a point of diminishing returns when it comes to writing your own libraries - the pendulum can easily swing to frustration - but I think the sweet spot is a balance between contributing your own creative energy and leaning on third-party libraries.


Well, it's pretty easy to find a balance if you at least try: you can use widely available, documented, and tested open-source libs for most of your (technically hard) problems, and not roll your own there. (Not-invented-here syndrome?)

And you should also leverage the fact that you own 100% of your codebase to add as much custom stuff and custom business logic as it warrants, but it is not wise to reinvent the wheel (or the engine, or the carburetor) in 2021, when everything from hosts to DBs is available in a few clicks...

It also depends a lot on what tools you're using and what your projects are doing. YMMV.


Oh, I concur... I'm in the "if I didn't build it, I don't understand it" camp, and have all kinds of anxieties about that.


I have this sort of "mesh" of knowledge now. Stuff from 30 years ago is still valuable now. Moving from bare metal to chroots to zones to VMs to cloud just expands that mesh. What was once physical is now virtual.

Being virtual, it has plasticity. But the underlying evolution is still there, so there is sort of a distilled fundamental nature to compute, network, storage.

Hard to explain, but the mesh is also sedimentary. Nodes in the mesh have these layers of history now that provide a more axiomatic understanding.

If you know zones, you grok VMs, you understand EC2, then pods then containers. Different names for very related concepts of "compute".

All a long way of saying that when I learn now, it's more about adding to that mesh than expanding separated islands.


When learning new material that doesn't solve any immediate problem, I like to think of that material as being stored in my "table of contents". If I run into that problem in the future, I'll know where to look it up, but if I never run into that problem, it was just a single entry in an easily forgettable table pointing to deeper knowledge that I can disregard.


> When learning new material that doesn't solve any immediate problem

I find that learning unrelated material sometimes gives you additional perspective on later problems, even if they're not immediately related. Same with taking a small crack at languages you're not using - it can give you a new perspective on the languages you are.


From my experience, what keeps a lot of people from learning, or rather frustrates them, especially in tech, is that the more you know (as you start to get the big picture), the more you realize how much stuff you actually don't understand or know about.

Do you need to know how Ethernet works for making a website? Probably not. Do you need to know how HTTP works? Most likely, at least if you start to fiddle with REST and APIs. Do you need to know how TCP works to understand HTTP? Well, some might argue. Do you need to know how IP or networking in general works? Well, if you want to configure your own web server you probably should know the basics. The rabbit hole goes on ... and most people just give up. Eventually someone will end up at epoll/select and will dive into native ASM of some high-perf Intel NICs while toying around with jQuery, you never know ... Yes, I exaggerated a bit, but you get the point.
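One way to see a couple of rungs of that ladder at once is to speak HTTP by hand over a raw TCP socket. A minimal sketch (using a throwaway local server so it needs no network access; `Hello` is a made-up handler name):

```python
import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# A throwaway local server so the demo is self-contained.
class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Hello)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# HTTP is just structured text riding on a TCP byte stream: open a raw
# socket (the TCP layer) and write the request yourself (the HTTP layer).
with socket.create_connection(("127.0.0.1", port)) as sock:
    sock.sendall(
        b"GET / HTTP/1.1\r\n"
        b"Host: 127.0.0.1\r\n"
        b"Connection: close\r\n\r\n"
    )
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

server.shutdown()
print(response.split(b"\r\n", 1)[0].decode())
```

Peeling back one layer like this is usually enough; knowing that TCP hands HTTP an ordered byte stream is the useful part, and you can defer epoll/select and the NIC internals until a problem actually drags you down there.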


Learn things regardless whether you receive short-term rewards for them.

People saying "learn by doing" or "learn just-in-time" are essentially saying: only learn easy things for which you receive immediate rewards. This results in shallow, high-supply knowledge.

The reason some knowledge is rare/valuable is precisely that it is impossible to learn from a short and immediate feedback/reward cycle.

This is the reason maths feels more difficult to learn than coding - there is no external reward. And why C is more difficult to learn than Python.


> If you’re known as “The ActiveRecord Bender”...

^^^ This made me giggle like a school kid. :)

Some very good points in the article though.

My take would be:

* You don’t have to know everything.

* You can’t learn everything.

* Many things seem deceptively simple until you really learn them, be aware of that.

* It’s useful to learn that you don’t know about something. Then you can decide how deep you want to go into it.

* Many things are deceptively simple once you dive in, be aware of that too.


Software is eating the world! So you don’t need to know everything to earn money, just be able to solve problems logically. Then the rest is a hobby so treat it like that and not the olympics. TLDR: fuck it!



