What is code? (economist.com)
80 points by wyclif on Sept 9, 2015 | 53 comments



If you haven't seen the Bloomberg version of 'What is code?', take a look.

http://www.bloomberg.com/graphics/2015-paul-ford-what-is-cod...

It's much longer, but also much more detailed and in-depth.


Code: The Hidden Language of Computer Hardware and Software by Charles Petzold is an excellent book I have recently begun to read. It's even longer than the Bloomberg post, and also much more detailed and in-depth (-;

In fact, I found the book very enjoyable. It gives a nice pedagogic twist on how information can be processed with hardware.

It has been discussed on HN here: https://news.ycombinator.com/item?id=1936474


I just picked this book up as well. I've been trying to find resources that go from first principles to real-world implementation of something and this is the best example I've found so far. Enjoy!


I really liked that article and it was super fun to read, but as I read the OP, I found myself thinking, "wow, this is a much more succinct and far clearer version". I sent Paul Ford's article to my friends and family, and none of them read it because it didn't pay off with an answer to the title question for many thousands of words. This article seems like a better one to send people who really don't know what people mean when they talk about "code".



> Worried that your job is in danger of being automated away by software? Learning to code could be a useful insurance policy.

I often worry that we're setting ourselves up for major failure here. There is no fundamental principle that suggests most kinds of programming are any less susceptible to automation than any other kind of work. Especially the kinds of things that are taught in most "learn to code" programs – simple web applications – are increasingly commoditized or able to be built out of commodity pieces, and that trend will only increase. Another popular reason to learn to code is to do data analysis or procedural tasks for your line of business, but in the long run, those tasks are probably better served by more purpose-built tools that have more in common with spreadsheets than the types of things professional software developers do.

I guess my point is that there seems to be a meme that extolls the virtues of "learning to code" as an end, rather than as a means to be more effective at something else. I'd rather see articles about what those somethings-else are, and how knowing how to program can augment them, but those seem fewer and further between.


Whether coding is a means or an end can be interchanged once one can code, so the importance of coding as a general skill still stands. And just from personal experience and observation, I have seen friends generate work for themselves (granted, at the places they already work) simply by being able to code. They would be asked to fix the web site or modify a spreadsheet macro, and they'd be able to do it.

What coding has going for it:

1. Coding is higher up the abstraction stack. The higher the safer. Lower would be performing concrete instructions like picking berries or mopping floors. The more abstract the work, the harder it is to be automated. Because...

2. Computers can compile but still cannot abstract. Most abstractions embody arbitrary structures external to the computer. So no matter how smart a computer gets, people will still need to communicate these wants. There will always be a place for the translator, and we might as well call this the coder.

3. There is still so much room for innovation in the way we choose our abstractions and build our structures. There are human trends, technological trends, and design trends, making software a constantly moving, upgrading, ephemeral target. There is still plenty of potential "disruption" at every scale, and coders can be a part of it. There will always be a place for the architect, aka coder.

4. With more services exposing APIs and more programs being scriptable, it is becoming easier for coders to apply themselves in any logistical situation to hook things up and to further automate. This is the plumber, aka coder.

Granted, the above is only why coding is comparatively a more useful skill than other skills. Coding can still be extremely hard to acquire for some people. There will always be better coders than you. And, if everyone can code, the skill itself will lose value. So though coding may not be automated like everything else, whether it will save anyone is likely to be a different story entirely.


Great comment, thanks. I mentioned in a couple other comments here that I'm very much a believer in programming as a skill that augments other work. I would just like to see a boom in that sort of work, rather than lots of people training up for pure software jobs.

If a smart 18-year-old is trying to figure out what to do with their life, I would rather society tell them to, e.g., become a biologist and make sure to take some good programming classes, rather than to learn how to make mobile apps.


This reminds me of how Steve Jobs said computers are tools that extend our intelligence. If only we taught our children how to do just this! The only thing that comes to mind education-wise is the LA iPad scandal :(


>There is no fundamental principle that suggests most kinds of programming are any less susceptible to automation than any other kind of work.

There are two ways to "automate" programming. Make the language/framework easier to understand/write in, which isn't fundamentally different than it is now, or make the computer understand natural language and do interpretations on its own. The first is likely to happen, but learning to code now will only make you better at writing in whatever simplified language is developed next. The second is AI-complete and won't happen till we develop REAL AI... which we aren't really that close to doing.

Learning to code is just learning to be very precise with your instructions. That is a useful skill regardless of whether you are actually coding or not.


What I'm thinking of are a subset of your first thing: powerful task-specific interfaces that act somewhat like DSLs, but usually come with a sophisticated GUI. Some examples: Excel, Photoshop, AutoCAD, Ableton. I think (and hope) these sorts of solutions will proliferate and a bunch of good ones could make a major dent in the world's need for people who program professionally.

I think the "we won't be automated away until REAL AI happens" thing is a trap. Certainly it's true that there will be some programmers working professionally until that point, but I see no reason to expect that it will be a huge number.

I wish I could agree with you more that "learning to code is just learning to be very precise with your instructions". I'm sure there are some programs that stress that, but most of the things I see (especially those targeted at adults) are more focused on the minutiae of how we actually write professional software in 2015, because they are targeting job re-training rather than the acquisition of a skill that is broadly useful.

In summary, I'm just not convinced that "learning to code could be a useful insurance policy" is the right message to be sending. Maybe I'd be on board with something more like, "learning to code is probably better than the alternatives", but that leads me to think that we could use some more alternatives.


> I often worry that we're setting ourselves up for major failure here. There is no fundamental principle that suggests most kinds of programming are any less susceptible to automation than any other kind of work.

I think there is some kind of denial in our community about this.

A decade ago I (like everyone else) hired web designers for every iteration of my company web site. Now I use Bootstrap and give the designer some small tasks to customize it.

More than a decade ago there was SOAP to make web services, and then JSON came along and simplified the web service ecosystem.

We can say that there will be no work for simple development tasks, only work for more complex things that will eventually become part of some future product.


I don't think it's denial, because it has simply been true that even as many tasks have been automated away, there have been multiplicatively more tasks that haven't yet. Nobody really knows whether we should be optimistic or pessimistic about that trend continuing, but I think we should be more aware that it isn't fundamentally different than similar trends in other types of work in the past.


> it has simply been true that even as many tasks have been automated away, there have been multiplicatively more tasks that haven't yet.

In part this is the denial I talked about. Many of those multiplying tasks are the result of not finding pragmatic solutions to obvious problems. Let me explain:

Every time we come up with new programming environments and languages, we repeat the same cycles. It is very rare to have a real breakthrough. For example, once NodeJS became popular, we were happy when someone introduced libraries that already existed in another language, instead of automating the production of those libraries. All this work could be reduced to some software that generates the same stuff for multiple languages. Of course you can't apply this across different paradigms, but there are only a few programming paradigms being used in production.

If you follow this approach, you will only need a fraction of the people who are working now.

Another example: big corporations. I'm not talking about Google or Facebook, I'm talking about the typical big corporation. If you walk around one, you'll find that replacing a lot of IT employees would be relatively easy with good organization. In a way we are protecting our own work by not thinking very hard about how to replace people.


> There is no fundamental principle that suggests most kinds of programming are any less susceptible to automation than any other kind of work

Well, as a fundamental principle, there's the Halting Problem.

The simplest programming tasks will be simplified to some extent or be commoditised, in much the same way that we can purchase website templates, and website builders can get trivial sites up and running with no knowledge. In the past, HTML was required knowledge to publish a blog, but not anymore.

But there's a reason most professional applications are not developed with such tools. Given that, I expect the careers of programmers that aren't IDE-pilots are safe for the foreseeable future.

Increasingly complex tasks will be, and already are, being automated. But so far, all that progress has only increased the programmer's leverage.

> I guess my point is that there seems to be a meme that extolls the virtues of "learning to code" as an end, rather than as a means to be more effective at something else.

I completely agree with that. Outside of IT, there are many areas that would benefit from some programming knowledge and are currently under-served.


Your appeal to the Halting Problem is basically why I included the "most kinds of" weasel words. As much as we don't like to admit it, the vast majority of professional programmers are doing things that are properly categorized as "the simplest programming tasks". You're right that so far the automation trend has only increased programmers' leverage, but I don't think there's any more reason to think that will be an indefinite trend than there was to think automotive assembly would always be a high-skill job.

I think programming should be thought of as a useful skill (like math or writing) that for most people comes in handy in small but myriad ways, while a smaller segment of people gain more expertise and do it professionally (like mathematicians or writers). Maybe that's how most of this nascent movement is thinking of it, but it seems to me it's being sold more as a solution to the future job woes that many foresee.


>I don't think there's any more reason to think that will be an indefinite trend than there was to think automotive assembly would always be a high-skill job.

It isn't. One day even programmers will, in general, be replaced. The way I see it, programmers will be among the last to be replaced, because once they are, everything else that can be replaced but hasn't been will soon follow (replacing programmers will send the process of replacing such jobs into overdrive).

Everyone learning programming does not solve the bigger problem for society, but it is probably one of the safest careers even if it isn't safe.


Ah, see, I disagree. There are things that are less fundamentally automatable, like interpersonal relationships, art, and research. We need to figure out how to make those sorts of things make more sense economically, rather than racing to the bottom on traditional types of labor.


Art is becoming automated. Even if it never makes it to the top tier of creativity that humans possess (and it may very well make it there), very few humans will make it to a tier high enough to compete with the automation, especially on the skill per effort chart.

For research, I haven't paid much attention to automation of stereotypical research (scientists doing science in labs), but there is a lot of automation happening in the discovery process for legal research that is cutting down the number of individuals needed. I doubt it has maxed out its potential for automation any more than other areas have.

As for interpersonal relationships, they may also be possible to automate (robots, simulations, etc. that manage to cross the uncanny valley combined with increasing knowledge of how human interactions work lead me to think it is possible eventually). I expect that we will even see the world's oldest profession one day have competition from automation.

I think one key to remember with automation is that people are willing to skimp on quality to save money. So even if we cannot replicate the work of humans to the same level of quality, automation can still compete when comparing quality per cost such that the paths are not a viable career option for most people.


> I completely agree with that. Outside of IT, there are many areas that would benefit from some programming knowledge and are currently under-served.

I'd like to emphasize this sentence. Code is not just the binary code used in software. Great challenges arise in other domains, e.g. synthetic biology.


That's exactly my point: I want to hear less "learn to code", and more "learn xyz for which programming is useful". My sense is that being a professional programmer is much more of a no-brainer right now than those xyzs, but that it is very short-sighted as a society-wide meme. I want to see software create a boom in jobs outside the creation of software.


>"Writing a program and then running it is magic, in a way."

Even after 30 years in the industry it still feels this way (most days).


After 30 years of bringing things into existence via mere incantation, I still forgive them.


Once upon a time, would-be magicians believed that a mastery of obscure knowledge and arcane incantations could change reality.

We are their heirs. Unlike them, we don't have to dream or lie. We speak correctly, and earth and fire heed our call. Unliving metal stirs at our command. We summon daemons and bind them to obedience. We build constructs of pure thought-stuff, and conjure them into being through effort of will.

We are wizards and sourcerors, conjurers and magicians. We are what poets can only dream of.


We speak wrongly, and we become the sorcerer's apprentice :D

I love that little piece, I've got to save it somewhere!


"Sourceror's Apprentice" needs to be a job title for a junior engineer somewhere.


You should attend career day at local schools.


This is the part where I blush and attempt to hide in the corner.


Am I missing some sorta punchline? Or are they shamelessly ripping off the Bloomberg headline to grab more eyeballs?

I guess it's because both magazines are in the finance space, and Bloomberg made waves only a few months ago with their piece, that I'm having trouble not seeing this as an attempt to ride on its coat-tails.

Edit: mediumdeviation makes a fair point. I didn't really think about the fact that "The Economist explains" is a regular column, but I still think they coulda come up with a more creative title (How Programming Works, How Programmers Work, What is a Programmer, What Code Does, Java: More Than Just Coffee, Code: The Hitchhiker's Guide, etc.)

Re: the newspaper thing: I get a weekly, bound, glossy set of papers that can only be described as a magazine. If they wanna be called a newspaper, maybe they should look more like Barron's (please don't, I hate that setup).


How many other ways are there to title an article explaining, essentially, what computer code is?

"The Economist Explains" is a regular column where the newspaper (not magazine, mind you, the Economist is very particular about that) explains a wide variety of issue, mostly current but not always, as in this case, and makes them accessible concisely to those not in the field.

That Bloomberg published an in-depth report on the same issue a few months ago and that this short column is appearing now is merely a coincidence; suggesting otherwise is completely ridiculous.


I'd be... surprised if it was "merely a coincidence". That's almost never how the publishing industry works.

(And I do have some info that suggests otherwise, but as I can't source it, it's anecdotal... but strong.)


    $ whatis code
    code: nothing appropriate.


  $ md5 <<< code
  7844a93ad4b97169834dade975b5beff


Code is the highest level of specification, my RE teacher always told me.


Respectfully, that's incorrect. The requirements (ideally, in a document) are the specification; the code is an implementation of the specification. For example, you might want the software to take the FFT of something. Whether you choose to code it as decimation-in-time or decimation-in-frequency is up to you if the requirements do not specify.
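To make the distinction concrete, here's a rough sketch (my own toy illustration, not from the article): suppose the requirement is simply "return the DFT of x". Both implementations below satisfy it, and the requirement says nothing about which one you must pick.

    # Hypothetical example: the "spec" is only "return the DFT of x".
    import numpy as np

    def dft_naive(x):
        # Direct O(n^2) evaluation of the DFT definition.
        x = np.asarray(x, dtype=complex)
        n = len(x)
        k = np.arange(n)
        return np.exp(-2j * np.pi * np.outer(k, k) / n) @ x

    def dft_fast(x):
        # A different implementation of the same spec, delegating to NumPy's FFT.
        return np.fft.fft(x)

    signal = np.random.default_rng(0).standard_normal(8)
    assert np.allclose(dft_naive(signal), dft_fast(signal))  # same spec, two implementations

Either function meets the requirement; the choice between them is an implementation detail, just like decimation-in-time versus decimation-in-frequency.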


...And once implemented, the implementation is more specific than the "specification" (a document detailing the requirements).

That's why the OP said, correctly, that code is the highest level of specification: nothing can be more specific than code (and as you said, the requirements document is less specific than the code).

Also cf. Operational Semantics (no, not denotational), which is literally the highest level of specification, and is runnable code.


Case in point: OpenSSL. More of a TLS specification than the RFC is, nowadays.


It's more specific, yes, but those differences don't matter if they aren't requirements.


Those differences definitely matter if people come to depend on them.

This is really a debate between prescriptive and descriptive schools of specification ;)


Requirements are ambiguous, don't include all the detail etc.

Code is the most rigorous documentation there is.


It is both the documentation and the thing, almost always at 100% fidelity. That's unusual, actually.

When you make a machine part, first you make a solid model. Then you use that solid model to make manufacturing drawings (where the tolerances and details are called out). This comprises the "intent". Then you use the solid model and the manufacturing drawings to create one or more G-code programs that run on the CNC machines. Then the programs along with tool specifications and clamping setups are all used together to try and make the part. During machining many checks are performed, and once the part is done, it's usually checked again on a computerized measurement device. If you did everything just right and everyone was careful, hopefully the part you produced is within the tolerances and your part is to spec. This is the artifact.

By contrast, the code is (apart from bugs in the compiler or bit errors in the RAM) perfectly translated from the intent (the source code) to the artifact (the executable).

The differences are staggering.


No, the thing is the program, not the code. This is an important distinction: code that isn't compiled or consumed is not a thing, except perhaps as a subject of study. Programmers' job is to create running programs, not code.


If the translation from code to program is perfect then while you're technically correct, the distinction is meaningless.

The whole point of my comment is that it's incredibly rare to have such accurate translation from the intent to the artifact and once the translation becomes good enough, they're indistinguishable at least from some perspectives. This is in great contrast to the majority of other things which are made, like machine parts, sheet metal parts, 3d printed parts, injection molded parts, extruded parts, cut and welded parts, etc.

Obviously there are also categories of things which are both the intent and artifact at once like written documents and the like, but often even with those there are various translation layers that take place. For example, an author has ideas for a book in their head, but their ideas cannot be published, only their written work. Similarly a business might agree to buy something from another business over many years and they ask their counsel to draft a contract that "prevents us from getting screwed" but the contract as written may not do so perfectly.


How do you define a bug (see my reply above)?


It depends on the bug, doesn't it?

Most software is poorly defined to begin with and so the programmer is responsible for figuring out what the program should do in the absence of a spec. That's one class of bug.

Another is when the spec says to do a particular thing, but that thing is wrong relative to what is actually desired by the people writing the spec.

Another is a correct spec but wrong interpretation on the part of the programmer and so the code does what the programmer wanted it to, but not what the spec said it should do.

Another is correct spec and correct interpretation, but an error in the translation from programmer's brain to code.

Another is a correct spec, correct interpretation, and correct translation to code, but some underlying problem with the compiler, architecture, OS, or whatever, such that using a different (but supposedly equivalent) incantation makes the problem disappear.

The last class of bug I mentioned is incredibly rare, though, so in many cases what you write is what you get. That's not at all the case with many, many other things.

In most engineering you design a thing and the design is as perfect as you care to make it. And then all the quality leaks out translating the design from theory to practice. That's because the physical world is messy relative to mathematical designs. I can specify the length of a part as 1.0000000000", but it's very difficult to make a part that's accurate to more than 1.000"; trying to make one that's 1.0000" long means that you have to specify the temperature!

In software engineering or programming, there's rarely an actual design and that's why the quality isn't there. Once you have a "design" (the source code of the program) it gets translated with near absolute fidelity into the "product" (the binary executable). If you design it wrong, well, sorry.

If I make a program and I say that a number should be 1 and I specify it as an int, it's exactly 1.


> Most software is poorly defined to begin with and so the programmer is responsible for figuring out what the program should do in the absence of a spec. That's one class of bug.

This is exactly my point. The fact that the programmer knows what the program should do implies that there exists knowledge outside of the program of what it should do. This knowledge constitutes the requirements, whether they be written or not.


To all who disagree with my statement that there is an independent document (or notion) that specifies requirements and program != specification, how do you define a bug?

I say that a bug is where a program violates a requirement. If you say there is no source of requirements independent of the code, then I don't see what a bug can be, as the program is self-consistent.



1. Code used in the article seems to be from https://github.com/inueni/birdy.

2. The howlong = (today-ltdate2).days line doesn't follow PEP8.
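For reference, PEP 8 asks for spaces around binary operators, so a compliant version would look something like the snippet below (the value of ltdate2 here is made up purely for illustration):

    from datetime import date

    today = date.today()
    ltdate2 = date(2015, 9, 9)          # hypothetical value, just for illustration
    howlong = (today - ltdate2).days    # spaces around the operator, per PEP 8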


It reminds me of a post from Bloomberg...



[flagged]


no more


what is love?




