If anything, "learning to code" is just bad language for something more worthwhile: teaching people the fundamentals of algorithmic and logical thinking. We don't teach mathematics to kids so that they will all become mathematicians, and we certainly don't teach them literacy so that they will all become the next J.D. Salinger. Those are just tools for solving other problems. "Learning to code" is just another such tool, and one that will become ever more prevalent in the years to come.
The problem with not teaching these fundamental principles is that by the time it obviously becomes useful to do so, it will be too late. We cannot afford to wait yet another generation to teach these ideas and principles. I think we're far enough along by now in the development of computer science to have a basic shared vocabulary. If we at least teach people these things, how many more judges in technology patent cases will have the intuition to call out a patent troll? How many office workers will be able to find hidden information in their data, or automate the repetitive tasks they do and move on to better things? How many of them will at least know that it can be done, and that perhaps they should hire a programmer?
As it stands right now, most people don't even have an intuition for what can be done with a computer. Their world is made up of apps and silos. When they get onto an airplane, they have no idea they're sitting in a flying Solaris box with wings. They don't realize that their e-reader is a general-purpose computing device capable of more than downloading books from the one store they bought it from.
I don't think anyone advocating algorithmic literacy is suggesting that we train legions of crafts-people.
Exactly. He says to follow only what you're passionate about. Most normal people I know aren't passionate about mathematics, but they're forced to be competent at it.
If we taught mathematics the way we teach computing, people would only ever use calculators and never see equations. They wouldn't know how to solve x = 5 + y and y = 2x + 1, or whether solving it was even possible. Mathematics to them would be only what the calculator can do.
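For what it's worth, the pair of equations in the comment above does have an exact solution, and even the "calculator" view can be bridged with a few lines of code. A minimal sketch in plain Python (nothing assumed beyond the two equations themselves):

```python
# The system from the comment:
#   x = 5 + y
#   y = 2x + 1
# Substituting the first equation into the second gives
#   y = 2(5 + y) + 1  ->  y = 11 + 2y  ->  y = -11, and so x = -6.

def solve_by_substitution():
    y = -(2 * 5 + 1)  # from y - 2y = 11
    x = 5 + y
    return x, y

x, y = solve_by_substitution()
assert x == 5 + y       # first equation holds
assert y == 2 * x + 1   # second equation holds
print(x, y)             # -6 -11
```

The asserts are the point: the program checks its own answer against the original equations, which is exactly the kind of thinking a calculator-only education never shows.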
Think of all the problems you've had as a programmer.
Now think of how many of those could have been lessened if people had a better understanding of what's happening rather than trying to remember the incantation you just recited.
With this education they know when a task should be automated and they know when a task can't be (at least easily) accomplished by a computer.
The computer is no longer magic to them.
Advances in many fields have come by harnessing the power that computing provides.
Do we really want to see a future where people still think computers are magic?
"teaching people the fundamentals of algorithmic and logical thinking"
You don't need to be a programmer to think logically. There are a lot of other fields that teach that. Basic algorithmic knowledge is useful in daily life, but you won't necessarily gain it while "learning to code", and it's not necessarily more useful than anything else you might study.
"how many other judges in technology patent cases will be able to at least have an intuition to call out a patent troll?"
You going to have judges study electrical engineering too? How about plumbing? Coding isn't special here; it's just the one thing you and I know. We're insisting that other people's toolboxes aren't full enough because they lack the one tool we happen to have. Perhaps that's a bit of a hacker-centric world-view?
Agreed. For all the talk of logic it seems like the root of disagreement is about a very basic logic problem.
Problem solving skills != programming skills. In fact, most of my problem-solving skills came from learning to solve math problems, not from learning to program.
* Not all problems are solved via programming.
* Not all problems that can be solved via programming, should be solved via programming.
* Not all problem solvers are programmers.
* There are many ways to learn how to solve problems. One does not need to become a programmer to do so.
I think programming is a pretty great skill to pick up in today's world. However, I'm not ridiculous enough to suppose that the only way to succeed in solving problems is through code.
The judge who happened to know enough programming to dismiss Oracle's argument that a 9-line function should be worth millions certainly helped out. [1]
>> You don't need to be a programmer to think logically.
That's true.
On the other hand, knowing the basics of coding makes it easier to do more even as an end user. Like writing a simple macro in Excel to do something it can't otherwise do.
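Excel macros proper are VBA, but the same end-user idea can be sketched in a few lines of Python. The file layout and column name here are made up for illustration (inline data stands in for a real spreadsheet export):

```python
# The "simple macro" idea: total a numeric column from a CSV export,
# the kind of one-off task an end user might otherwise do by hand.
import csv
import io

# Inline sample data instead of a real exported file.
sample = io.StringIO("item,amount\nwidget,3\ngadget,7\n")

total = 0
for row in csv.DictReader(sample):
    total += int(row["amount"])

print(total)  # 10
```

In real use you'd pass `open("export.csv")` instead of the inline sample; the point is that ten lines of code replace a repetitive manual chore.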
I see coding as a useful life skill. I know how to do simple things like change a light switch, fix simple leaks, change a tire, hem a pair of pants, make a cheesecake, etc. but I'm not an expert in any of those things, nor do I want to be. Having said that, I also know when a problem is beyond my skill and to let a professional handle it.
There's nothing wrong with coding being one of those skills.
But did you learn those life skills upfront because you thought "it's important to know how to make a cheesecake", or did you learn them as you needed them? Because Atwood doesn't see anything "wrong" with learning to code; he just views learning it upfront as premature optimization:
> Here's a person who a) found an interesting problem, b) attempted to create a solution to the problem, which naturally c) led them to learning to code. (...) This is how it's supposed to work.
I agree. Let's try substituting some other things.
"Don't visit a foreign country"
"Don't learn a 2nd language"
"Don't listen to any music but pop music"
"Don't read anything but children's books"
You could argue that 99.99% of people don't need to visit a foreign country. But most people who do have their horizons broadened. Maybe they realize their home country is not as "special" as the local "we're AWESOME" propaganda suggests. Maybe they realize how lucky they are to live in a well-to-do country. Maybe they get a picture of how others live (better, to aspire to, or worse, to be thankful for what they have).
99.99% of people don't need to learn a second language. They can get by just fine with one. But learning a second language opens up so much of the world. Often it leads to extended time in a foreign country, which leads to exposure to new ways of looking at the world, and to learning that what you thought was "common sense" is actually only your culture and not a universal truth.
Learning a little programming opens your eyes to how this digital world works. It shows you that an iPhone or an Android phone is really not magic, just millions of simple parts all working together. Learning a little programming can arguably make you less complacent about the limitations of various OSes, apps, websites, etc., and push for a better world.
It's not about becoming a programmer. It's about having at least a basic understanding of how the modern world works. "Don't learn to program" seems to me like "Don't learn biology", "Don't learn history", "Don't learn physics". None of those are needed by most people either, but learning them gives you more tools to understand the world.
I think Jeff's problem is he's really trying to say "You don't need to MASTER programming" while the rest of us are saying "You should learn at least a little programming", and so we're talking past each other?
Wow. Please look past the title of his first post to see what he's actually saying: not everyone needs to know programming (which is distinct from “computer literacy”).
If you read him carefully, I think you will find that isn't the case. Much of a professional programmer's time is spent mindlessly debugging crapware; in other words, code plumbing. Plumbing is a separate specialty for good reason. In-depth plumbing courses such as those at toiletacademy should not be required of all students.
On the other hand, I certainly agree that everyone should be taught mathematical thinking. Nearly everybody in the first world is already taught elementary mathematics. That is certainly not something Jeff Atwood is opposing here.
> Their world is comprised of apps and silos.
Programmers' worlds are made up of apps and silos just as much as everyone else's. Indeed, everyone who is a victim of the UNIX pestilence lives in a world of incompatible competing applications.
> They don't realise that their e-reader is a general purpose computing device capable of more than just downloading books from that one store they bought it from.
That is because most modern e-readers are deliberately designed to be that way. Their manufacturers intentionally implement C.R.A.P (content restriction & punishment) to prevent users from realising the full power of their devices.
So what? Much of a professional writer's time is spent wrangling with arcane grammatical and thematic constructs which 99% of the literate populace forgot about after high school...but it's still important to spend years teaching people how to read or write.
Everyone (in middle America) gets through at least high school algebra... but why is the mathematical and logical thinking common to programmers so rare? Even in the sciences it's considered a specialty to be able to aggregate data and external information, or to delimit one's own data output in a way that's useful beyond a narrative paper.
I think it may just be the case that it's nigh impossible to gain these logical/abstract insights without the practical experience of simple coding. I've tried teaching many people the difference between "8" and (the literal value) 8... it never seems to stick until they write and run a simple piece of code to test it out.
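The "8" versus 8 distinction the comment describes can be demonstrated in a few lines; here is one way to do it in Python (the exact behavior differs by language, which is itself part of the lesson):

```python
# The text "8" and the number 8 look identical on screen but behave
# completely differently.
text = "8"
number = 8

print(text + text)      # 88  -- strings concatenate
print(number + number)  # 16  -- numbers add
print(text * 2)         # 88  -- repetition, not arithmetic
print(number * 2)       # 16

# Mixing them is an error in Python rather than a silent surprise:
try:
    text + number
except TypeError:
    print("can't add str and int directly")
```

Running it once tends to make the distinction stick far better than any verbal explanation.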
> why is mathematical and logical thinking, common to programmers, so rare?
What evidence is there that programmers have better mathematical and logical thinking?
> I've tried teaching many people the difference between "8" and (the literal value) 8...it never seems to stick until you write and run a simple piece of code to test it out.
It's not just algorithmic and logical thinking; both have been around for thousands of years (most prevalently since the rise of general writing). I'm sure both Lady Ada Lovelace and Charles Babbage, being competent mathematicians, would have had little trouble following an insertion sort on a pile of cards (humans do this naturally when sorting), nor would they have been much fooled by the 'tricky' true/false Knight questions. When it came to implementing these things on a machine (and building the machine, which nowadays is a freshman/sophomore CE project), though, they struggled immensely. And their struggles with the purely abstract side of things were not just from lacking an expressive language, for even with the most expressive languages today, concurrency and other parts of programming are still mentally challenging.
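The card-sorting example above is exactly insertion sort: take each card in turn and slide it left until it sits after a smaller one. A minimal Python version of that natural procedure:

```python
# Insertion sort, the way people naturally order a hand of cards.
def insertion_sort(cards):
    cards = list(cards)  # work on a copy
    for i in range(1, len(cards)):
        card = cards[i]
        j = i - 1
        while j >= 0 and cards[j] > card:
            cards[j + 1] = cards[j]  # shift larger cards right
            j -= 1
        cards[j + 1] = card          # drop the card into its slot
    return cards

print(insertion_sort([7, 2, 10, 4, 4, 1]))  # [1, 2, 4, 4, 7, 10]
```

The algorithm itself is as old as sorting cards; the novel part, as the comment argues, is expressing it precisely enough for a machine.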
The reason computing is different from mere algorithms and logic is because programmers and machines face incredibly deep abstract and physical hierarchies and orders of magnitude for even the fairly mundane things. It's even further different in that programmers have perfect (or near-perfect with error correction) digital precision and control at immense scale down to whatever the relevant atom is (a bit, a microsecond) and the ratio between the upper bound and lower bound keeps increasing. Physicists have a similarly large order-of-magnitude gap to cross, but carbon nanotubes are state-of-the-art whereas programmers have been perfectly manipulating gigabytes of data for years.
Dijkstra called computers a radical novelty [1], and his reasoning is why I dislike these poor metaphors and analogies about programming "being another tool in a toolbox of job skills like plumbing, or life skills like tying a knot", or "learning to program is like learning how a car works", or even the notion that "learning to program should only be for the passionate, for those who enjoy it, or for those who want to make money with it." It was ridiculous and wrong when Socrates said similar things about writing, and it's still ridiculous and wrong when people say it about programming.
Why should people care about the computer systems in a plane? Do you know all the hydraulic, mechanical, electrical, audio, video, etc. systems in the plane? Would the average person know how the plumbing in a plane works? Does he know the name of the plastic the storage bins or his seat handles are made of? Does he know how the metal that makes up the wings is made? Most people don't know anything about 99% of the things they use beyond what they need to know to use them. It is called division of labor, and without it civilization could not exist. Computers are no different. They do a lot of stuff, most people don't care about the details, and they shouldn't. There are people who take care of it instead and get paid for it.
Algorithmic literacy has very little to do with computers. People identified the need for a systematic approach to solving problems long before computers, and some modern techniques got along perfectly well without them. For example, TRIZ (https://en.wikipedia.org/wiki/TRIZ) was invented in 1946, before computers were widespread. Systematic thinking and a systematic approach to problem solving should be taught and learned, but programming is one narrow application of it and should not be confused with it.
If you look at other articles on Coding Horror, I don't think Jeff Atwood associates coding with things like "algorithmic thinking" (which is not necessarily bad; he just has a different focus). But do "learning to code" courses teach it all that much? For beginners, learning to code is more about just making the computer do simple things: learning the terminology, the syntax, the deciphering of compiler error messages, etc. It takes plenty more effort to learn some theory along the way, and this is most likely to happen in an academic setting; the popular courses are really more about craftsmanship.
Robert Sedgewick has a nice "Introduction to programming" course followed by an "Introduction to CS":
It sure could be useful to a wide audience, especially for people in science. For people in general? I think no more than hundreds of other skills. People in every discipline tend to think what they do is the pinnacle of all understanding (this is especially visible in academia and in schools), but in the end you cannot know everything that could potentially be useful. Of course this shouldn't stop people _interested_ in programming from learning it.
A) What Jeff thinks we're saying: "Everyone should be programmers so the world would be a better place"
B) What Jeff thinks is right: "Everyone should be technically literate for their own good"
C) What we're actually saying: "Everyone should be technically literate for their own good"
You can observe that B and C are the same. So we actually agree, but some haven't noticed it yet, only because the sentence "learn to code" can be very ambiguous. After having studied natural-language programming, I've come to realize that 90% of disagreements between human beings derive from semantic misunderstanding of ambiguities. This is just yet another example. Kinda funny, to be honest. Watching the blogs argue with each other is like watching Abbott and Costello argue about Who's on First [1].
I don’t think so. When does “programming” ever mean technical literacy?
What Atwood is arguing against is the idea that “everyone needs to learn programming”. Because of the title he chose for his first post (which didn’t actually represent his point correctly), now many other groups of people, with a lot of varying views, are arguing with him on points he wasn’t even making. You individually are arguing that everyone should be technically literate, but that’s not what he was arguing against anyway.
No, Jeff is actually wrong here. He's had two articles and numerous comments to point out what he really meant by the statement "Please Don't Learn to Code", and he still maintains that programming is only for programmers.
We have a major non-programmer programming community with R now. There are, and have been, many others.
I too felt a lot of readers had the wrong reaction and misinterpreted Jeff Atwood's original post.
I was an undergrad in the tech bubble a decade ago. CS was the most in demand and popular major on campus. When the bubble burst, a lot of my former classmates abandoned programming. I was not surprised. A lot of them were CS majors because that's where the money/attention/parent pressure/sexiness was during the bubble. The moment it faded they left for greener pastures. They had realized, very abruptly, they were miserable during those long lab hours and hated debugging at 2AM before a deadline.
Coding is a beautiful and difficult thing, but like any career, it can be very tough sometimes. As Jeff says, the only path to happiness is to follow your love (or as Bret Victor would say, follow a principle). The challenge now is the same as a decade ago: quieting all that noise and figuring out what you really want to do.
So you spend a couple of hours a week learning JavaScript and now you'll change careers?
Or, conversely, that learning JavaScript on Codecademy will get you a lucrative frontend programming job, even if you have no talent?
Yes, of course doing what you love and striving for excellence and doing your research on your own are all great admirable things everyone should do. But they are hard, and not at all at odds with this tiny time investment. What's all the fuss about?
Let people do what they want and get off the high horse, unless you are pulling people up with practical contributions, not just "don't do this".
During the last bubble, the hot scam was for "private vocational schools" to advertise to recent grads (i.e., fine arts and social sciences majors) to develop tech skills to help them get a job.
The company I was at was looking for cheap labour and hired several of these people who paid a small fortune for a six month "intensive" course. Needless to say, they were all duds looking to cash in on a hot job market.
At least the companies now will save the small fortune, they'll just point the duds to Codecademy/Udacity/Coursera.
People like the ones you mention will always exist, but they are not Codecademy's responsibility. It's even likely they would find a place where you get a relatively respected diploma instead of a certificate in PDF.
> I too felt a lot of readers had the wrong reaction and misinterpreted Jeff Atwood's original post.
I have a personal peeve about desperate pivots of a piece to see what can stick. It's bad enough when the original author does it, but even worse when fans of the author do it.
His original piece was asinine. Perhaps intentionally, because its easy-to-refute nature saw it linked by the thousands, everyone lining up to take an easy hit. And now they'll link to this for followup points.
I read the article that spawned this, and it was not very interesting...although at least he took a clear stance. This one feels like a watered-down ode to truisms. What did I just read?
I've taken the liberty of rewriting his essay below:
If you like coding, or dogs, press the like button. If you don't like dogs, but prefer cats, press the like button. If you enjoy life, or stuff in general, press the like button. Maybe learn to code if you like it, but if not, that's OK; you can still learn it if you want, or not.
I'm getting tired of people having opinions about what other people should learn. If you think you will like messing with computers, give it a shot. If you like it, maybe you'll work hard and get good. If not, you'll probably do something else. Just like any other profession. Everyone needs to relax and stop worrying about what other people are doing with their lives.
We're not worrying; it's just advice. I think others should learn to code for their own good, not mine. It's the same reason I think they should learn science, politics... and how to cross the road without getting hit by cars. I strongly believe you should learn these things because they would make your own life much better.
If you don't want to listen to the advice, then don't. We don't really care. If you'd rather not learn how to cross the road because it's not something you enjoy doing, it's you who is going to get run over, not me.
Any "noncoder" who has ever repeated the same menial task on a computer for a week straight could benefit from programming. It's not a career. It's a tool and a useful one.
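As a concrete example of the kind of "repeated menial task" meant here: renaming a batch of inconsistently named files. The file names below are hypothetical, and this sketch only builds the rename mapping rather than touching the filesystem:

```python
# Turn sloppy report file names into a consistent underscore form.
old_names = ["report monday.txt", "report tuesday.txt", "report wednesday.txt"]

def cleaned(name):
    # Split off the extension, fix the base, and reassemble.
    base, dot, ext = name.rpartition(".")
    return base.replace(" ", "_") + dot + ext

renames = {old: cleaned(old) for old in old_names}
for old, new in renames.items():
    print(old, "->", new)
# In real use you would call os.rename(old, new) on each pair.
```

A week of manual renaming becomes ten lines, and that trade is the whole argument.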
People in general need to love coding like they need to love hammering in nails or vacuuming the house, or manually massaging data for reports to various organizations.
>And if you're reading this and thinking, "screw this Jeff Atwood guy, who is he to tell me whether I should learn to code or not", all I can say is: good! That's the spirit!
And fuck Jeff for changing his tune from "Not everyone should learn to program" to "Well if you think you need to program, good for you! That's what I meant all along!" He still needs to walk it back. You can't possibly know when programming could be used without a basic understanding of programming.
A long post, verbose and unclear in its message. Do not follow his advice.
About coding:
- Coding is an efficient way to represent processes or business processes
- Business processes are the foundation of any business
- 99% of businesses today are digital at their core or heavily supported by systems
About passion:
- Most people don't even know what their passion is anyway, so let all people at least get some success stories and basic experience with code before they decide it's not their passion
- Creating and building is almost always a satisfying activity, be it creating a cake, a wooden table, or some piece of code, independent of your passion
- Coding can only be an instrument, a part, or a building block in creating something; code is as ubiquitous as reading, writing, or math. It doesn't have to be your passion; it's just an instrument for your passion. And not all coding is equal: an SAP coder, a web coder, a game coder, and a Photoshop scripter all write code, but at the end of the day they have totally different jobs and passions. A passionate game coder would die writing SAP code, because his passion isn't code as such but creating games, while an SAP coder's passion is to organize and represent business logic as efficiently as possible. Both love to code because 1) creating is a satisfying activity and 2) coding allows them to create a product of their passion (a game or an enterprise system).
It's about understanding that any process can and must be automated with software/systems/code nowadays and coding is just a vehicle to your passion.
I'm not a programmer, but I recently launched a WordPress site that I wanted to modify. Mostly just CSS/HTML stuff, but I'm beginning to realize I'll probably need to learn a little bit of PHP.
I had started the Codecademy lessons before I launched the WordPress site, but it got a little boring and I didn't see the point.
Once I launched the site, the Codecademy courses became a lot more fun. The process looks something like:
1. Hm, I want to change how this widget looks.
2. Hm, there is no plugin for that.
3. Hit "View Source" and try to track down what the widget looks like in HTML.
4. Find it, realize you don't really know what it means.
5. Do some googling, make some changes, doesn't work or screws it up.
6. Do a Codecademy lesson on HTML or CSS or whatever.
7. Eventually fix it.
8. Feel really, really good.
I think for most people it'll be hard to learn to code in a vacuum (myself included). Having a project where coding may help you significantly is a much better way to learn it. It may be messier, but it's much more rewarding.
This is not just a lesson for people who think they should learn to code, it's a lesson for people who want to learn anything. The most important thing you can do to learn something is to start a project that you're passionate about that will eventually require you to learn whatever it is you need to learn.
Jeff wants a world where we programmers can make computers perfectly usable without understanding how they work. I don't think this will ever be the case. The abstraction will always leak a little, and people will benefit from having at least some idea of how computers and the programs that run on them work under the hood.
Regardless, in the mean time I would prefer everyone to have as much technical competence as possible, for both our sakes.
If you've ever seen a 2 year old or, worse, a baby boomer playing with an iPad you'd realize we're basically there. At this point Jeff's car analogy is about right. Is it useful to know how your car and computer work under the hood? Yes. Is it necessary? Increasingly less so.
There's an issue with comparisons to plumbing and mechanics. The kinds of devices "ordinary users" interact with in those realms are physical, and it's possible to see how they work at a high level just by looking at them. (Yes, cars and such are getting more and more like moving computers, but the basics haven't changed.)
This is not at all the case with computers. They are magical black boxes if you're not familiar with how they work. Understanding even just the ideas of programs manipulating data is important, however abstracted the interface is. And what better way to teach that than with basic programming lessons, even just something like Karel++. Not to mention all of the related concepts of logic and problem solving.
More importantly, computers are black boxes made by people. I think a big part of "learning to code" is not just being technologically literate. It's a way to teach an understanding that these devices are doing what someone told them to, and that users have the power to make the computer do what they want. That nearly everything we interact with was designed and built by other people is an incredibly underappreciated fact. Learning how to use tools is empowering, and a fundamental part of being human.
Even cars still break down, and when they do it's very useful, if not necessary, to have an idea of what's going on. At the least, you become harder to rip off when you take it to a professional.
Computers and cars themselves, generally, are not necessary, just very useful. An understanding of math or good writing is not necessary in any really strong sense. We can strongly recommend, or even mandate, something that isn't necessary, and often do. I don't see why programming is different, or even less useful. I will continue to recommend it to anyone who's open to it.
There is a saying from Zhuangzi 2000 years ago: don't let things "thing" you. If we let our kids use iPads without telling them these things are hackable, they will become their slaves. We are already too much the slaves of our gadgets. I fear that, and will show my kids how to open a washing machine or a computer and how these things work. I will also teach them how to cook a chicken, how to fix a leaky faucet, etc.
I fear the point is still missed. He used the analogy of the auto mechanic, which I liked very much: "Isn't knowing how to change a tire, and when to take your car in for an oil change, sufficient?" Exactly.
Learning to code is like learning the basics of auto mechanics. If you own a car, it is very beneficial to learn the basics of how it works, so you know how to check the fluid levels, why they are important, and all the 101-level maintenance. In a similar fashion, learning to code is learning the high-level basics of how computers work. We don't expect a business guy to learn to develop kernel-level drivers, antialiasing algorithms, or the details of public-key encryption, just as we don't expect every car owner to know how to repair a broken starter motor or diagnose a vacuum leak.
For a business guy, truly understanding things like the level of detail required, the difficulties of collaborating on code, and how larger programs take more time to change is vital to understanding his business.
As someone who used to work on cars for a living, I can say that the analogy does not fit very well.
Learning how to check your oil level, or even how to change a tire is more akin to learning how to install a new wi-fi dongle for your laptop, or how to install a new OS.
Learning how to program the computer is akin to learning how to modify the drivetrain of the car. You modify the engine to make it perform better (versus a standard baseline) by modifying or even replacing the components of the engine.
You modify your computer with code to make it perform better (again, versus a standard baseline). All computers are equal; what differentiates them is the software, how each programmer arranges the zeroes and ones to make the computer work in a certain way.
Some people go as far as creating an engine from raw materials. In computers, we know those people as the embedded folks.
Your interpretation of his analogy is what he wants, I think. If learning how to change your oil is like learning to install that new wifi dongle, then you should know how to do that. You certainly don't need code to do that. Similarly, you don't need to be a mechanic to change oil.
You don't need to be a mechanic to change oil, but a small amount of mechanical knowledge helps quite a bit. I am talking about knowledge such as what is a drain plug, what is oil, what is an oil filter. Most people have at least a rudimentary knowledge of how a car works. Computing is still magic to many people, and it should not be.
Most people don't have a clue about cars either. These are not the '80s, when most cars shared a common oil filter, used 10W-40 oil, and were very much alike.
Nowadays the complexity of cars has turned them into "magic", too. Choosing the wrong oil weight for a modern turbocharged car will wreck the turbo and the engine. Another example is the new style of oil filters: they are cartridge-style, with very delicate torque specs. An over-torqued oil filter housing cover will create a significant drop in oil pressure due to a leak.
Heck, even changing a tire is now a mess due to recently introduced tire pressure monitoring systems.
Trust me I wish that it wasn't magic to people either. I need lots of hands to count the number of times I've explained some pretty basic (in my eyes) things about computers. Computer literacy is very important, but people don't seem to see it that way.
They are taught to trust the computer. To let it do its magic. Why then, should they go and learn how it works? For what purpose?
Imagine a world where people actually knew how to code. That means they would have enough skills to do basic math, and science. They would ask questions, rather than accept the answers given. They would challenge authority, rather than sit down and take orders.
This ongoing "Learn to Code" discussion reminds me of OWS. Ask ten different people what it stands for, and you'll get ten different answers. If you try to argue against any part of it, someone will tell you that you "just don't get it."
In the end, all I get out of it is the impression that we "hackers" sure think highly of ourselves.
Jeff is taking too narrow-minded a view. Whatever takes people off TV and Facebook to acquire a skill is a net win for society.
They could do even better things? Probably, but that's not the point. If you had a similarly structured online course to communicate better, then you might have something useful to say.
"To win, you must fight not only the creature you encounter; you must fight the most horrible thing that can be constructed from its corpse." (http://www.acceleratingfuture.com/steven/?p=155)
However, like you, I don't feel particularly inclined to take a crack at it today...
I understand Jeff's motivation for writing his piece.
Alas, I just hope people learn to code if they want, or don't if they don't want to. That's all. No one else can make the decision for them.
I also go out of my way to teach people coding concepts. I recommend Zed's books to beginners. I recommend stackoverflow or quora for asking / answering questions.
Just be helpful to beginners and help them ignore the rabble rousers until they've got their bearings.
All this blogging and tweeting on the subject is just going to confuse folks in the early days, when they should be focusing on learning instead.
> I understand Jeff's motivation for writing his piece.
You mean that he's trying to revive his blog now that he's out of the stackexchange thing, building a readership that he can monetize in the future? Hence these sorts of confrontational, pablum entries that bizarrely make the front page of HN. I don't get it.
If these were usenet posts they would never, in a million years, see any attention.
I've never understood why Coding Horror gets so much attention. Every article usually fits into one of two buckets. The first type is an inflated, long-winded, well-marketed article about a ridiculously simple topic or idea that tries to sound more insightful than it is ("Look at how well designed my new cat food dispenser is."). The second type is when a more controversial subject is broached, like performance tuning or computer literacy, and the result is always the same: shallow analysis, often peppered with completely incorrect statements, a massive blowback from people who know what they are talking about, then a series of retort posts that dig the hole deeper until the meme finally burns off.
I don't care if people learn to code or don't learn to code, but it would be nice if just for once, we could stop seeing a meta circle jerk at the top of HN every single day.
Oh, how I hate that paper: it is pure speculation (Nobody has any idea what complexity is actually essential and what accidental), and it seems to ignore the fact that the past of programming sports a couple of silver bullets already. Saying the past had silver bullets but there are no silver bullets left to be found is like saying we've already invented all things, and there are no more patents to file.
Firstly, it's pretty obvious to a skilled developer what aspects of that program's complexity are accidental vs. essential. Layers upon layers of indirection for no particular end is one example of accidental complexity. As is tuning assembler for a particular processor.
Understanding and constructing the appropriate usability and function of your software for your target audience on the other hand, is essentially hard - it's "product/market fit" for software.
What past silver bullets are you referring to? Brooks' point is that individual technical gains don't tackle the essential complexity at its core, whereas his "promising attacks" have mostly been validated: buy vs. build, requirements refinement & rapid prototyping, incremental development, and great designers. These advances have had way more impact than, say, the move from COBOL to Python.
I completely disagree that it's obvious to a skilled developer which parts are accidental/essential. Programmers (and everyone else) often don't realize they have a problem or are missing something until someone solves it for them. Similarly, programmers implement the exact same concepts over and over, and many of them don't even realize all of the things they're implementing are specializations of general patterns.
As for past silver bullets, the move from assembly language to C along with some extra tooling was easily an order of magnitude difference in development speed.
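To make that productivity gap concrete, here's a sketch of my own (not from the thread): the few lines of C below replace what, in hand-written assembly, would have been a dozen or more instructions where the programmer manages registers, pointer arithmetic, and branch labels by hand.

```c
#include <assert.h>

/* Summing an array: a few lines in C. In assembly, the loop
 * counter, accumulator register, memory addressing, and the
 * conditional branch would all be spelled out manually. */
int sum(const int *xs, int n) {
    int total = 0;
    for (int i = 0; i < n; i++)
        total += xs[i];
    return total;
}
```

The compiler handles all the bookkeeping the assembly programmer once did, which is much of where the order-of-magnitude claim comes from.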
So, what I think you're suggesting is that a developer wouldn't understand accidental/essential complexity differences until a more skilled developer figured out the essence of certain classes of problems and others learned to apply their experience.
But isn't that the case with any domain built on knowledge? My point was that a skilled developer has learned much of the state-of-the-art knowledge up to this point, and understands the pitfalls of over-complicating a solution that one can fall into with certain approaches to the problem.
Brooks' paper wasn't even getting deeply into what we are talking about, he was just suggesting that software development is inherently complicated, that it's always going to be hard, because unlike Newtonian physics, you can't wave away the complexities and distill the essence of a problem into simple equations. The complexities ARE the software. Abstractions, though helpful, are flawed and leaky, unless they're designed by a master.
If software development is a fundamentally hard activity, like playing a musical instrument well, you're not likely to achieve a "breakthrough" with a piece of technology; you're going to improve it with better programmers. A highly talented programmer will be less productive using COBOL and older tools and technologies, but will not be an order of magnitude less productive. Back in the 80s, this was bitter medicine that management types had to hear, having thought of programming as a rote activity with barely any need for higher pay or recognition of the 10:1 or more disparity in productivity among individuals.
> A highly talented programmer will be less productive using COBOL and older tools and technologies, but will not be an order of magnitude less productive.
I'm not sure about that, especially if you replace COBOL with assembly language. Definitely if you replace COBOL with machine code encodings, or punch cards.
Part of the disparity amongst individual programmers can be attributed to tooling, and we have no way of knowing how much that is.
I think we can agree the paper is saying: "Programming is inherently complicated, and we predict that our current tools already reduce complexity almost as much as possible". I just disagree with that prediction, and think our tools currently encompass massive amounts of accidental complexity which easily dwarf the essential complexity.
This whole weird saga has led me to this thought: why are people so concerned with what /other/ people should learn? Why do you care?
It seems like the subtext is that if they know the things we know, they'll think the things we think. If we can just get people to be programmers, they'll see the world the way we see it. Or so the thought goes.
There's a big difference between literacy and knowing how to code.
Literacy means access to the sum total of the world's knowledge. That's huge. That gives you access to everything if you're willing to put in the effort. That's freedom.
"Code literacy" means knowing how to tell a computer to do something reliably and the associated lessons that come with that.
That's good, but it's not the same. One is the ability to gain new skills at will, the other is just another skill to master.
I say all this as a coder who thinks it's important; I just find the hyperbole around "programming being the new literacy" overblown.
> Literacy means access to the sum total of the world's knowledge.
Funny you should use the word 'overblown' after saying something like this. Literacy, at best, gives access to the world's words, not the world's knowledge, and actually only the fraction of the words that are in the language in which one is literate. I'm not just speaking of natural languages, either. A considerable amount of the world's knowledge is expressed in the languages of mathematics and music, for which literacy in English is weak preparation. Another enormous chunk of the world's knowledge is experiential, for which any language is a weak substitute for apprenticeship (at best) or mere doing (at least).
The world's knowledge lies behind an endless chain of conceptual hurdles, and the ease with which one can jump over these hurdles while running on the legs of literacy is determined by everything else that one studies.
The study of mechanics seems like it would have a much more profound impact on average people's lives. Often, it seems, I find myself hitting walls in the physical realm where someone with the right background would just build a simple machine to overcome the problem.
I would never discourage the pursuit of knowledge, I'm just not seeing why programming gets its very own pedestal. It seems no more important than many other subjects that are not commonly learned by people.
Where I'm from, most schools already have shop classes where practical arts like mechanics are taught. So the status quo here is that mechanics is on a pedestal relative to programming.
That's interesting. The closest thing we had was auto shop, but that's like the learning MS Word equivalent of a programming class. What we did have, interestingly, was some programming lessons, though probably lighter than it should have been.
I don't know why this needs to be controversial. To keep it simple:
Logic/algorithmic thinking skills -> good for most people, should be acquired at school, programming is one quite good way to acquire them (IFF done right)
Actual coding skills -> pretty much only useful if you plan to work in software; depending on your role, essential (sw dev, but that takes more than codecademy...); desirable (product/marketing: you will better understand the tech team, how to prioritise, what's possible etc.); "nice to have" (everyone else).
So, I guess these coding schools are good, either if you want to learn a new way of thinking (and have lots of spare time and patience), or if you are a non-tech product person (as long as you realise that you're only scratching the surface, and that if you're short on time, maybe you should focus on getting things done).
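As a hedged illustration of the kind of small exercise that builds the logic/algorithmic thinking mentioned above (my example, not the commenter's): state a precise rule, then have the machine check it step by step.

```c
#include <assert.h>

/* A classic beginner exercise in precise reasoning:
 * is n divisible only by 1 and itself? Checking divisors
 * up to sqrt(n) is the small algorithmic insight. */
int is_prime(int n) {
    if (n < 2) return 0;
    for (int d = 2; d * d <= n; d++)
        if (n % d == 0) return 0;
    return 1;
}
```

The value of the exercise isn't the function itself; it's learning to turn a fuzzy idea ("prime") into an unambiguous procedure.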
I think a way to counter this would be to collect short articles by people who learned to code because someone encouraged them to, and are benefiting from it. He argues that people who are going to be great programmers will discover coding as their passion just by looking around. I disagree. Many in my generation learned to code when a teacher showed them LOGO. How is that fundamentally different from organizing study groups, or trying to make a website, book, or email list that will get people who are new to programming to discover the joy of programming?
No matter how smart their arguments are, some coders in this "debate" unintentionally display a subtle insecurity: that their special skill is coming to the masses, and that this will ultimately decrease their uniqueness, salary, and apparently... their self-esteem.
If one does feel that twinge of jealousy or defensiveness, then perhaps the most progressive internal solution is to grow an identity beyond being just smart and unique, rather than slamming less intelligent people when they try to get ahead in this world.
After re-reading the original post a few more times, I still think he meant just what the title said: don't learn to code unless you love coding and it's your passion.
I like how in the first paragraph he assures himself that the criticism came from people who didn't read the article properly. After all, if you didn't read it properly you'd surely agree.
It's a poor way to bolster support for your follow-up, really: "you didn't understand what I was saying, so I'll use different words."
It is sad that economics is listed as tier 4, apparently as important as (if not less important than) drama... Such common perceptions are probably why there are so many financially illiterate people and so much bad economic policy out there.
I appreciated his sharing the email from the attorney/CPA. Now I know there are others out there like me that enjoy computing more than counting money. We should start a recovering CPA support group.
The author completely fails to understand that there are a large number of people who would benefit from programming but have been deterred from doing so.
It's been about 10 months since I started coding, and I have to say it is a totally enlightening skill.
As a normal user, I never knew how computers work. Once, when my PC was attacked by a virus, I thought it was an actual biological virus. Yeah, I mean I was a slave back then.
Now I am trying to be master. Is there anything wrong?
PS. Cars and computers are different, IMO.
The fact remains that literacy with computer syntax is what literacy with arithmetic was a hundred years ago. There are so many ways to take advantage of it today, it is insane not to encourage just about anyone to stick with it long enough to see what possibilities might be revealed.
Learn enough to not be spied on and manipulated by sociopathic web developers and cybercriminals.
Maybe you can contribute something back to programmer/user community that demonstrates ethical principles. Maybe you found a startup. Maybe you make a career out of writing programs.
At the very least, you make it harder for the bad actors who use their coding skills to take advantage of people who are not tech savvy.
This is exactly what I am striving for. I don't want to program full-time, but I like to be able to understand enough and be able to muddle through somewhat if there's a project I feel like working on.
Maybe I end up doing more than that, but it's nice to have, like learning a foreign language. Not everyone becomes fluent, but a lot of people know enough to get something out of their studies.
I like restoring old computers, replacing capacitors etc. to get them running again, keeping my bike in order (though not my car), and messing around with HTML for simple things on a blog. I'm working on learning Ruby and slogging along iOS tutorials while designing my own board game.
I did take a few CS classes in college but was a liberal arts major. If I had known more, I might have flipped it, but at the time I thought programming would mean being cooped up in a cube all day.
"programming would mean cooped up in a cube all day"
Unfortunately, I think for most programmers, it still does.
My guess is Atwood was once one of them.
When I started to reach some sort of "enlightenment" (a series of epiphanies, ah-ha moments, or whatever you want to call it) with respect to "programming", I found that I wanted to share my knowledge with others. I think I still have that urge now and again. (And even now I'd still be happy to show you or anyone what I've learned if they were really interested. However, I must admit I've never had much interest in computer games or understanding how they work, and I prefer using Lua over Ruby.)
When I keep seeing these ridiculous blog posts like Atwood's I have a hard time seeing his sort of thinking as helpful to anyone (even himself). That's just how I see it. But then, what do I know? (A chorus of underappreciated programmers in cubicles responds: "Not much!" ;)
They're right. But so am I. In the grand scheme of things, none of us really knows much. That much, I do know. (Credit: Socrates)