What Is Code? (bloomberg.com)
1384 points by 1wheel on June 11, 2015 | 355 comments



I hate to sound hyperbolic, but I can't overstate how impressive this work is. For me, it evokes nothing so much as Tracy Kidder's The Soul of a New Machine [0] for opening up an obscure world (the one many HN posters live in, but obscure to most people). I am amazed both by the technical fidelity and by the quality of the storytelling.

[0] http://www.amazon.com/Soul-New-Machine-Tracy-Kidder/dp/03164...


I agree. This is the best piece of writing I think I've ever read on the web. This touches on everything, and so accurately, and so concisely... this article is giving me a stroke I think.


I've never seen anything like this on a website before. The writing, the formatting, the structure, the animations; it's near perfect.


are we reading the same article??!?!

the content is quite good but the layout, colour, animations etc. are a mess.


There is a fine line between mess and art. :)


And this article just crossed :D


Look


Same, one of the best things I've read. It may rank above Programming Sucks [1], which is my go-to reference for friends when they ask me to explain what it is I do.

1. http://www.stilldrinking.org/programming-sucks


The big red semicolon picture near the bottom had me cracking up so hard. I will need to re-read this article every week for the rest of my life, in order to fully enjoy it.


> Writing this article was a nightmare because I know that no matter how many people review it, I’ll have missed something, some thread that a reader can pull and say, “He missed the essence of the subject.” I know you’re out there, ashamed to have me as an advocate. I accept that. I know that what I describe as “compilation” is but a tiny strand of that subject; I know that the ways I characterize programming languages are reductive. This was supposed to be a brief article, and it became a brief book. So my apologies for anything that absolutely should have been here, but isn’t. They gave me only one magazine.


Keep writing. The space is there for you.


Just to clarify, my earlier comment is a direct quote from the article. I am not the author, just thought it was an apt anticipation of some of the criticisms in this thread.


This is too freaking awesome!

Isn't that a seriously mind-bendy kind of article to appear on Bloomberg? Also, isn't it very cool that a whole class of people who may not know a thing about coding (but may be interested) might get to know something about the craft and culture?

And it's presented in a very fun, off-kilter sort of way. That must have been a hell of a lot of work. I actually skimmed the second half and the little robot told me I read it all in 16 minutes which was not possible and who was I kidding!

I had a thought the other day while browsing Etsy. If software really is a craft, could I fashion a bespoke software creation and sell it on Etsy? I know this might seem like a non sequitur. But, you know, what is code? Why couldn't I do something like that?

It's such a strange but vital profession (seriously, I would have thought there were a _lot_ more than 11,000,000 professional coders worldwide), and one that is still coming to terms with itself. Inspiring. Note to self: do not think outside the box, code your way out of the box.


could I fashion a bespoke software creation and sell it on Etsy

There's Tindie (Etsy for electronics), but due to the infinitely cloneable nature of code, giving it away works much better than trying to sell it for tiny amounts. In some ways the demoscene is this area of software craft for the sake of it.


Thanks for the heads-up on Tindie. Looks interesting.

And I agree with you about the demoscene. Very much one-off creations, which is more what I had in mind. I'm imagining extending this idea to software objects that people would like to own, that are personalised to them, that have a strong crafting element, and so on. The reason I'm having trouble articulating it is that I don't think the category of thing exists (yet?)


Apple's App Store is the Etsy of software.


I was thinking that. But then why is software-dev-as-craft cordoned off from jewellery-making-as-craft and print-making-as-craft and so on. What makes software so special it needs its own little commercial corner of the world? Serious question :)


The activity on the article's accompanying GitHub repo (https://github.com/BloombergMedia/whatiscode) is really interesting. Users have suggested edits not only to the code in the article but even to add citations.

This adds another dimension to the content by including the open source community, so that the article's subjects (coders) can influence (and improve!) its content.


Thanks for the link! I had no idea it had been open sourced!


This is supposed to be an introduction just to the abstract concept of code, yet it includes a section that asks the reader to take a test on whether or not they agree with the author on the effectiveness of domain-specific snippets of JavaScript (http://www.bloomberg.com/graphics/2015-paul-ford-what-is-cod...) - one that replies to your selections with obtuse references to the code's use of promises and callbacks.

As an outsider, I just love it when I read something presented as an introductory text and I'm confronted with an elaborate series of self-serving in-jokes that go "ha ha ha, ha ha ha, you don't know what I'm talking about!"


It's just a fun little quiz. I kind of like it as a "reality check" to show the reader that while they may understand the concepts, the reality is much more difficult and fraught with subtle considerations. It also serves as a subtle reminder to readers (who may be the frustrated business-type from the opening of the article) that there's a reason software projects are so hard and cost so much money. Software development isn't something you can grok from reading an article, even a book-length one.


"This is supposed to be an introduction just to the abstract concept of code"

Are you sure that's what it's supposed to be? I mean, the author didn't supply unit tests, all we have to go on is the specification, which was that the editor of BusinessWeek asked Paul Ford "Can you tell me what code is," and Paul Ford said "No", and instead wrote this.


My Dad always tells me he flat out does not understand what I do. He respects it, knows it's challenging and fun, but just doesn't get it.

I've sent this to him -- he's about 1/4 of the way through and thoroughly enjoying it.

This is a very fun read that's worth leafing through.


My father sent me a small comment chain on a topic like this as well, from one of his blog posts.

<First guy> June 3, 2015 at 10:12 am I think software developers like to impress people with how many lines of code they can write.

<Second guy> June 3, 2015 at 3:31 pm That is not true. A good day is when you leave the office with more powerful software, but fewer lines of code.

<First guy> June 4, 2015 at 4:31 am So why is software always getting bigger? Is it because the marketing people want to add new features all the time? Does this even apply to free software like browsers and email clients?

-----

Personally, I like writing less code, or reducing code to less code. Less to think about.


My personal favourite is talking The Business out of things. Solving issues without any coding at all!


Sure, it is a problem. But if you reboot the server every day, it will not become a problem.

No coding at all required.


Many years ago I managed to convince my management that, if they had to use deltaLOC as a performance metric, they should at least use abs(deltaLOC).

I then spent the next year cutting huge chunks of crap out of a C++ application that I had inherited.

Was a most satisfying experience.


Hmm, shouldn't it be Sum(abs(deltaLOC per change))? Otherwise, if you added 50 lines a day and removed 50, your abs(deltaLOC) would be 0.
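Concretely, with made-up numbers, the difference between the two metrics:

    # Why abs() must be applied per change, not to the net total.
    changes = [+50, -50, +120, -300]      # lines added/removed per commit (invented)

    net = abs(sum(changes))               # abs(deltaLOC) over the whole period
    churn = sum(abs(d) for d in changes)  # Sum(abs(deltaLOC per change))

    print(net)    # 180 -- the +50/-50 pair cancels out entirely
    print(churn)  # 520 -- every change counts, including the canceling pair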


Let's add some whitespace to the README. Let's remove some whitespace from the README. Let's add some whitespace to the README.


> Does this even apply to free software like browsers and email clients?

The more I get involved in open source the more I think most code bloat is due to people needing their egos validated by getting a commit into a project, regardless of whether the commit is all that useful or not.


I've been involved for a long time, and it sure doesn't seem that way to me. Lots of that "code bloat" is simply making things portable across compilers, interpreters, operating systems and languages.


Same. I sent this to my boyfriend yesterday: he's trying to learn programming (I'm trying to teach him Python; in response, I'm trying to learn some foreign languages) and I'm hoping that this gives him the lay of the land of how our strange world works.


Wow, I did the exact same thing, for exactly the same reason. My dad called me a few hours later and said, "I finished the 38,000-word article you sent." I checked and he wasn't far off; it's around 29,000 words.


The certificate of participation at the end claims it's 38,000 words...


Oh that's where he got the number. I read it on mobile, so I didn't notice that.


> There have been countless attempts to make software easier to write...Decades of efforts have gone into helping civilians write code...Nothing yet has done away with developers, developers, developers, developers.

I still believe. Someday, somewhere, something incredible will emerge for the right-brained bourgeoisie and literati.


There are tons of successes; we just refuse to count them. Photoshop, as hinted in the article, is a super-specialized language for doing image operations. It no longer "looks" like coding, so we don't count it as coding for the masses. Excel is a much more general-purpose language used by tons of "non-coders" (and arguably the most popular programming language on earth). Again, it doesn't (often) look like normal programming, but then again, shouldn't this be expected? If it looked like normal programming, it would be normal programming and not successful.


Excel is a perfect example. The core Excel experience is basically functional programming on a virtual machine with a matrix address space laid out visually right in front of you. It even looks like traditional programming if you dive into the VBA stuff, which plenty of non-technical specialists, including MBAs and managers, do on a regular basis in the pursuit of solving their problems.
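To make that concrete, here's a rough sketch (in Python, with invented cell names and formulas) of the evaluation model: a cell is either a value or a pure function of other cells, and reading a cell is just recursive functional evaluation:

    # Toy model of Excel's core: no statements, no mutation, just expressions
    # over a grid. (Real Excel caches and tracks dependencies; this sketch
    # re-evaluates naively.)
    sheet = {
        "A1": 10,
        "A2": 32,
        "B1": lambda get: get("A1") + get("A2"),                  # =A1+A2
        "B2": lambda get: 2 * get("B1") if get("B1") > 0 else 0,  # =IF(B1>0, 2*B1, 0)
    }

    def get(ref):
        cell = sheet[ref]
        return cell(get) if callable(cell) else cell

    print(get("B2"))  # 84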

Any specialist user willing to invest some time in learning their tools can do this. A culture develops around it.

And replying to parent: those efforts around teaching 'civilians' to code are probably misguided. The investment needs to be in adding scripting and programmability to existing line-of-business tools, not in encouraging people to sit in front of an isolated REPL disconnected from any business value or context.


+1; many companies have tons of internal processes which rely on Excel sheets. When these become painful enough, another team (internal applications) can come in, evaluate the situation, and build out a custom solution which uses an actual database, but Excel still provides a ton of value, since at the most basic level it's a database table with no validation and a free-form schema.

The downside is, when all you have are Excel sheets, everything looks like rows and columns (and not e.g. objects with behaviors). If Excel had more robust import/export mechanisms that normal users could understand (e.g. built-in REST client with JSON + XML serializers + many database adapters w/ lots of helpful wizards or tools to guide you), it'd be way more powerful. Then again, if someone is at the point where they'd be able to look at some JSON and compare it with their spreadsheet and be able to describe the mappings, they're possibly better off going to some training sessions on ${your favorite programming language} to learn how to do this the easy way.
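To illustrate the kind of mapping I mean, here's a minimal sketch in Python (the fields are invented, and a real version would pull the payload from a REST endpoint instead of a string literal):

    # Flattening JSON objects into spreadsheet-style rows.
    import csv
    import json

    payload = '[{"id": 1, "total": 9.99}, {"id": 2, "total": 4.50}]'
    orders = json.loads(payload)

    with open("orders.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "total"])
        writer.writeheader()
        for order in orders:
            writer.writerow({"id": order["id"], "total": order["total"]})  # one row per object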


Nicely put. It's actually rather agile: Excel becomes a prototyping tool to allow the business users to describe what a solution looks like, helping to guide development of that custom solution. Sadly few IT organizations are confident enough to trust their users and work like this, instead starting from zero with a pedantic requirements gathering process before building something less flexible and useful. The problem is partly a lack of domain knowledge in the internal apps team (which is understandable), and partly a kind of technology-first arrogance which prevents that team from making use of the intellectual capital originated by the business in their spreadsheets and processes (which is inexcusable really).

Ideally, an organization comes to understand that Excel is a fantastic tool at the frontier where the business needs to adapt rapidly, but once a process is fixed, replacing it with a fixed system is worth the tradeoff in reduced operational risk.

To your second para., much of that falls to internal apps to provide decent RESTful APIs across their systems. Some companies are doing this, in the process getting to a point where the Excel frontier is just analyzing and reporting on data, not acting as a source in its own right. Then you have traceability for every data point in the organization, and you're in a pretty sweet spot operationally.


Something can't be "basically" functional programming without first-class functions.

It seems more like a non-functional declarative vm to me.


It's more a one-way dataflow language.


Thanks to John Foreman's book "Data Smart" I now regularly test small-scale machine learning problems in Excel. Sure, a developer will translate the end result to scikit-learn or AzureML, but this is stuff that's easily available to even mildly-technical enthusiasts.


Two other more contemporary examples are the Android app Tasker, and the website IFTTT ( https://ifttt.com/ ).

There's something about calling it programming that turns certain people off. I remember a story about a freshman in a physical mechanics class who complained about all the MATLAB code they had to write. The professor's retort was that they were free to use a slide rule instead, and that particular freshman stopped complaining.

But you're right. The mere act of calling it programming is somehow a problem. It's as if doing programming pigeonholes you into being a programmer until the end of days.


> The mere act of calling it programming is somehow a problem.

There are some words that carry with them unsheddable connotations that people want to get away from so strongly that they will call themselves something else at the first possible instant. "Programmer" is one of them.

"Poet" is another. No one makes money from poetry because as soon as a poet makes money they are something else: a musician, a performer, a copy-writer, whatever.

Just don't call them a "poet", because "poets" are poor, sad people with no future, just as "programmers" are neckbearded nerdboys who smell bad, and no matter how many programmers (or poets) fail to live up to those stereotypes people will continue to impose them on reality come hell or high water.


Until one shoots out the top. Then he/she can be a "poet" again.


Kind of like how calling programs proofs seems to make most programmers uneasy. (Not that most programs you'll run into are proofs in any interesting sense.)


I concur with your assessment. The point is not to turn syntax into an AST into processor code. The point is to provide things of value, and 'easy' computing platforms targeting users who are not professional programmers create tremendous amounts of end-user value.

Sadly, when I point this out, professional programmers often go 'pffft - that's not real programming', as if being knee-deep in stack traces and gigantic code bases were something with intrinsic value.


I'm not sure I agree with you about Photoshop. Perhaps (probably) there are Photoshop macros or pipelines that are closer to programming, but most people use Photoshop purely in an interactive mode. They enter commands directly, and the logic stays in the users' heads, not in the computer.

Photoshop is more like a REPL tied to an image-processing library than it is a programming language.


A Photoshop "program" would consist of the composition of image layers, adjustment layers, blend modes, styles, masks, text blocks, shapes, paths, and so on. You "run the program" when exporting to a bitmap format.

It probably doesn't help to think about it that way when using Photoshop, but it might be a useful mental model for developing Photoshop, or as an example of how a general visual programming language UI might work. Importantly, Photoshop does not give you a bunch of little boxes with arrows crisscrossing everywhere like all the clumsy and disappointing visual programming experiments I've seen.

Maybe "visual programming" is like "AI". Whenever you make something that actually works, it goes by some other name.


Sometimes the boxes-and-lines metaphor for creative design is what you want... professional video products work like this:

https://www.blackmagicdesign.com/ca/products/fusion

It's like After Effects meets Simulink.


Maybe. But by that standard, using a Xerox machine is programming because you can layer some pieces of paper and transparencies together and then copy it onto an image on a single sheet.

I think to be programming, there has to be some kind of "logic" (conditionals, mathematical functions, loops, etc.) embedded in the structure (cf. https://en.wikipedia.org/wiki/Jacquard_loom), and I'm not sure Photoshop qualifies.


To me, the defining feature is automation. A "programmable" system is one where smaller actions can be composed into larger ones and saved for invocation later (by name, in response to some event, etc). I don't know much about photoshop, but judging from what people have said, it seems to meet this definition.

I will concede that "programming," in the sense of "I've been programming for the last couple hours" or "He isn't very good at programming" implies the use of a Turing complete language. Photoshop would probably fail here, along with more programmer-y things like writing html.
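A minimal sketch of that definition (all names invented): small actions composed into a larger one and saved under a name for later invocation, like recording a macro:

    actions = {}

    def define(name, *steps):
        # Save a composite action for later invocation by name.
        actions[name] = steps

    def run(name, value):
        # Apply each saved step in order.
        for step in actions[name]:
            value = step(value)
        return value

    define("shout", str.upper, lambda s: s + "!")
    print(run("shout", "hello"))  # HELLO!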


I have read elsewhere that programming is giving the computer instructions. So clicking the close button on a window or typing into a word processor is technically programming (although it is not coding). My search-fu was unable to find that though.


I don't think I would consider the arrangement of paper to be a program because it doesn't change the way the machine itself operates (though perhaps one could argue it does at the level of photons and toner molecules--or that a computer just blindly "goes through the motions" with its inputs as much as a copier does with its paper input). But inputting the number of copies, darkness, collation, etc. surely counts as programming in the familiar sense.


Re: pipelines, you can feed external data into a Photoshop doc [1]

[1]: https://helpx.adobe.com/photoshop/using/creating-data-driven...


Remember when search engines had Boolean operators?


Honestly, I'd never heard of search engines working with boolean operators. I just did a quick search and found an interesting article about search sites using AND, OR, NOT, Venn diagrams, and concept declarations to narrow down the search graph. How long ago did this search mechanism die out?


I... you're trolling, right? Before Google rose to fame, Altavista was probably the most popular, but in order to find what you wanted, you frequently had to add ANDs and NOTs to your searches, and still needed to dig a dozen pages in to find what you were really after. Then Google came along, and the rest of them died.
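For anyone who missed that era, a toy sketch of how those queries behaved (the documents are invented, and real engines used inverted indexes rather than scanning every document like this):

    # Query: paris AND flights NOT hilton
    docs = {
        1: "cheap flights to paris",
        2: "paris hilton news",
        3: "flights and hotels in rome",
    }

    def matches(text, required, excluded):
        words = set(text.split())
        return required <= words and not (excluded & words)

    required, excluded = {"paris", "flights"}, {"hilton"}
    print([i for i, d in docs.items() if matches(d, required, excluded)])  # [1]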


Possibly not trolling; if the parent post is the usual age of a new college grad (22 years old), Google had eaten the search market by the time they were 10.


I thought Google did it, too, for a while.



I'm pretty sure they did, because I never used a search engine before Google and I remember using boolean operators.


[deleted]


To me "programming" is text-based.

I'm afraid your interpretation of programming is an exceptionally heterodox one, then. It would make visual programming an oxymoron, which is at odds with most research.


I would say that Photoshop is a toolbox of predefined tools. If using that is coding, then people are also coding when they choose and use a screwdriver, some sandpaper, a hammer and some glue in succession from their physical toolbox to get a job done. That the operations happen electronically and that they are implemented as complex mathematical transformations of pixels doesn't change that.

Excel isn't much different in my view: most people are only using a very limited set of predefined tools to get a job done. Often badly: it is well known that there are many bugs in important, company critical Excel sheets. Excel seems like coding because it is mostly used to perform the fundamental mathematical operations we all associate with coding. But if that is coding, then so is constructing a Rube-Goldberg machine for a specific task from the parts you happen to have available. A nice exercise in problem solving under constraints. Which certainly has something in common with coding. But that doesn't make it coding.


Using Photoshop is instructing a computer to perform a series of operations, necessarily fairly high-level ones, but conceptually not that different from text-based coding. I agree it stretches the definition pretty thin (where's the control flow? conditionals?), but are those things really foundational characteristics of programming languages, or are they simply the equivalent 'predefined tools' that, say, C gives us?

Excel is much closer to traditional programming: it's basically a purely functional language (absent VBA), but instead of a linear description of the program in a text file, you're in effect embedding functional code inside a virtual machine's memory.

EDIT: I suppose an important question about Photoshop is, can you do computation in it?


So, Excel is not coding, but an Excel sheet can have bugs?


Yes, just like a piece of furniture you built can have bugs


Well, if you wanted to be absurdly pedantic, you could call instructions in an architecture predefined tools, or protons and electrons.

I don't think it matters if the tools are predefined - what matters is that they can be used together to build a system greater than the sum of its parts.


Yes, and that is exactly what very rarely happens when people use Photoshop or Excel and happens in coding all the time. Which isn't strange, because the former two aren't intended for that, while the latter is.

I think it can matter a lot whether the tools are predefined, because the exact nature of those predefined tools determines whether they are easily composed into something greater than the sum of its parts. You need iron ore, wood and a forge to construct a different hammer. Of course you can cobble something hammer-like together with the tools in your toolbox at home, but it won't be like the hammer forged afresh from more fundamental parts better suited for that purpose.


Yep, great examples. Also, pretty much any experienced Photoshop user will create their own actions to automate common operations. And then you have things like workflows in Alfred.


The trouble is that so much of what we call "programming" is actually the process of identifying all the implicit assumptions that go along with an idea and making them explicit. In other words, if you knew what to ask for in an unambiguous way then most of the "programming" would be done already.

I'm working on a longer essay but that's the short version.


This is a driving force behind the pedagogy of SICP -- to think in recursive functions. If you can break down your task into sub-problems and describe them clearly, you just have to put a few parentheses around it.

Programming isn't so much about the "code" or syntax; it's semantics and intent aligned with the machine.

If you can express your problem clearly then the program practically writes itself.
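A tiny example of that decomposition (in Python rather than SICP's Scheme, just as a sketch): state the problem in terms of a smaller version of itself, and the code follows.

    def total(amounts):
        if not amounts:
            return 0                            # base case: an empty list totals zero
        return amounts[0] + total(amounts[1:])  # first item + total of the rest

    print(total([3, 1, 4, 1, 5]))  # 14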


Sounds like I should sit down with a copy and finally get past the first chapter.


Not just the right-brained. I want that as a professional software developer. I want a computer that will do what I thought, instead of what I foolishly typed.

Basically, I want a computer as smart as a good junior dev so I can just yell my brilliant ideas at it, and it will do the dirty work for me.


That's only because you think your thought exists and is correct. Programming forces you to confront the fact that it isn't, and that there are many aspects of it that you've overlooked.


So ve wants a computer that can ask for clarification and point out edge cases, like a good junior dev can.


Also one that can figure out the edge cases on its own, because fuck that shit... it's a really simple idea, why can't you make it work?



That's not exactly it though. The mistake is in thinking that your idea is detailed when it merely feels detailed.

Somewhat similarly, entheogens do not really give you profound ideas so much as the feeling that the ideas you are contemplating are profound.


This may be the reason, but doesn't have to be and often isn't. If you can explain your idea to another person, and know when they've executed it, why is that not evidence of your thought existing? Why is being able to translate your idea into the unnatural constraints of programming languages as they exist in 2015 the arbiter of correct thinking?


As someone who has worked with junior devs before, that sounds like an absolute nightmare.

There is, of course, nothing wrong with being junior. But the rate that requirements are misinterpreted even by intelligent humans is, I think, a fundamental reason why programming isn't doable by the masses yet.

It's not because computers are hard, it's because knowing what we actually specifically want them to do is.


I remember someone referring to this as "intent driven programming". I once worked for a company that made a business rules engine. The idea was to describe a flowchart to the system, and the software would ask what you wanted to do at each step of the flowchart, in a top-down manner, until you'd fleshed out the whole program.


Top-down programming https://en.wikipedia.org/wiki/Top-down_and_bottom-up_design - the Silver Bullet programming methodology of the 1970s.


Charles Simonyi attempted to build a company around this a while ago: http://en.wikipedia.org/wiki/Intentional_programming


That's likely AI-complete.


   I still believe. Someday, somewhere, something incredible will emerge for the right-brained bourgeoisie and literati.

I don't see it ever happening, because the bar of expectations rises at the same rate at which the tools improve.

For example, think about what the NYTimes website looked like in February 1999:

   http://web.archive.org/web/19990202013312/http://www3.nytimes.com/

That probably took millions of dollars and a team of engineers back then. In 2015, a reasonably computer-literate person could do something close to that with SquareSpace or Wordpress.com in probably... a week? [1]

But that site would never pass muster in 2015. Something 10x (if not 100x) more complex is required for NYTimes.com in 2015, plus various native apps, plus a subscription service, and so forth. So you still need a team of engineers...

[1] I'm talking about the act of putting the articles onto a website, not reporting and writing the articles, obviously.


Scratch seems to me like an example of what a "real programming language for everyone" could look like.

Indeed, Lego Mindstorms is based on a similar principle, and it's used for programming robots!


I don't understand why people believe 'everyone' should be able to write code without much trouble. Virtually every activity requires focus and practice to learn. Example: you can't just take a hammer and start building furniture, or you will create a mess. If you want something nice and useful, you need to think about what you want to build, how you are going to construct it, which materials you need, which tools you need. You need to experience how the materials behave, try out certain subconstructions, research what specialized tools exist. I believe this idea that 'everyone' should be able to code is like expecting everyone to be able to build furniture. If you want to, you can learn how to build furniture, but it's not easy and will never be easy. Why would coding be any different?


I believe one cause of the excessive need for precision (most of us can agree that it is excessive) is that our current tools don't use context enough. Humans deal a lot in uncertainty, and context is what helps us.


Love it or hate it, Meteor has empowered a number of sales-types and small-business owners to create real tools to solve their own real problems.

I've seen it happen in front of my own eyes!


You omitted: ...and create so many new problems that there will be years of work cleaning up the mess.

Having been in "IT" for a few decades, I've seen it happen in front of my own eyes time and time again.

The only self-help tool that has lasted so far has been the spreadsheet, and even that has gone horribly off the rails in many companies.


Haha, agreed. I'd argue there's a strong chance of that happening with any tool meant to give non-professionals a 'professional' result.

WYSIWYG web editors, divorce 'kits', many DIY 'kits' for home improvement, etc. Something inevitably goes wrong and the lawyer / contractor wonders just why the hell the client didn't just call a professional in the first place.


The wrong people are working on this problem. The only ones who think about it are typically programmers who have made their peace with the machines as they are today. The interface of code seems simple and logical with little reason to try to improve on it. It would take a team of artists, musicians, human factors engineers, ethnographers and some clever computer scientists to do it. Such an enterprise would be high risk and very difficult to fund because of it.


The precision required in programming makes it hard for the right brained person who won't meet the computer at least part of the way.


I'm reminded of rms's anecdote about secretarial staff learning to write Emacs macros at MIT because they didn't realise it was programming.

This suggests that current, important and well-meaning attempts to get non-programmers to meet code head-on as code may be misguided. Programming is generally easier if you're not thinking about how much it isn't something you do.


I don't believe that there is any such thing as a right-brained person. I think this is a cultural myth. Same as "you only use 10% of your brain's power". These, and similar, are false memes - iow hokum.




Precision is not the only important element in programming. There's a level of abstraction, such as found in the field of semiotics, that is highly important to the world of computer science.


Not likely.

I've seen non developers try to write specs in whichever format they like: word, excel, drawings, hand written, in speech, mockup tools, anything. They decide exactly how they want to express their idea without any constraints. And yet, they always fail.

There are always too many edge cases they do not think of. They only cover the "happy path", and quite often not even that. Just take the email conversation from the article as an example: they didn't even touch the subject of implementation, and it was already gibberish even for a developer. You need someone to actually sit down and squint their eyes over something, do research, and run some test cases for a few hours before these emerge. Once you start doing this you are already by definition a software developer.


Taken to the extreme, could you not consider raising a child the ultimate programming exercise for humans?

Perhaps the "right-brained" are already very good at programming other people, working with faulty, non-deterministic, somewhat chaotic computing environments where "left-brained" patterns of software development fall short....


AI that takes natural language as input and spits out binaries for you =P


"something incredible will emerge for the right-brained bourgeoisie and literati."

Yes, a real quantum computer. As long as we're dealing with 1's and 0's, there's an insurmountable barrier for those who would get creative with computing.


I've always wanted to attempt this piece: to take all the many layers of abstraction that we deal with, parse them, convert them, and render them through my formidable linguistic talents into one elegant, beautifully constructed piece of prose that magically makes it all comprehensible to lay readers. I haven't yet attempted it, but I give props to Mr. Ford for trying. I'm not surprised he ended up with a novella.

Oh, and why does bloomberg.com want to use my web cam?


Worth noting -- it is roughly 38k words and is the longest piece ever published by Bloomberg.


A quick google search showed that about 300 words per minute is average for an adult reading pace. I'm a slow reader so I'm probably right around there. So that's ~127 minutes to read all of this, not including time spent playing with the great animations. Probably better for me to get a bit more work done before I tackle the rest of this one (only read section 1 so far).


Yes, it will tell you that at the end, and mock you if you arrive there too quickly to have read it all :).


It mocked me for having spent too much time reading it!


It can mock me all it wants. How is 10 words a second fake?


Does this brilliant webpage not know the concept of skimming and related approaches?


For some readers, skimming is just skimming, without grasping the true essence of the article, and some readers like to read aloud rather than with the eye. I believe every human has their own speed and capacity for any task. In this case, some people raise their reading speed after certain paragraphs (not skimming, still reading literally), then skim in the middle, and if the speed drops off, they read literally again. The flow goes like a pulse.


It knows and mocks accordingly.


> Oh, and why does bloomberg.com want to use my web cam?

To capture your photo in the certificate of completion.


> I've always wanted to attempt this piece: to take all the many layers of abstraction that we deal with, parse them, convert them, and render them through my formidable linguistic talents into one elegant, beautifully constructed piece of prose that magically makes it all comprehensible to lay readers.

This is my own, personal, incomplete work in progress in that vein: http://www.leannotes.com/


I think I like your approach better... and no Clippy.


I agree with this and below comments. It took... considerable dedication... to keep reading it all the way to the end. I can't even comment on it directly as it was all over the place but with a nice integration/flow of the sub-topics. All I can call it is an experience lol.

I downloaded my certificate. Might put it on my resume. Evidence of willpower if nothing else. :)


What an ambitious and beautiful piece!

A story like this is probably dangerous - it touches on so many ideas everyone will find something to gripe with, and it's hard to make a comprehensive and consistent story.

The last time I read something that so awesomely bridged high-level abstractions and low-level implementations with a human touch was Gödel, Escher, Bach (albeit with a very different feel). Well done.


Interesting, I wouldn't put this anywhere near the level of Gödel, Escher, Bach.

If you've read GEB and do software development, what actually did you find interesting and beautiful in the article? I've just finished reading the whole piece and I don't think I've learned much at all. Presumably as someone who already knows programming and theoretical CS I'm not the target audience, but then I'm also surprised why this has so many upvotes here on HN.


This article is a sociology of code, not a technical manual. You absolutely could get a lot out of it.

I get the GP's point about this being reminiscent of GEB, not in the sense that it covers the same topics or is at the same 'level', but in that it describes an intangible idea by approaching it from different angles and describing that same core concept from the shadow it casts in different directions. In GEB that core concept of self-reference was tackled from multiple perspectives so that an image of this common theme emerges as you read these different views onto it. Similarly, this article tries to conjure an image of 'code' as a cultural artifact, by portraying the shapes it casts in different directions - on the people who create it, the people who have to fund it, the tools and artifacts it generates. And it does so, like GEB, with wit and intelligence.


He does mention a "golden braid forever weaving", which is reminiscent of: "Gödel, Escher, Bach. An Eternal Golden Braid"


or 'a mind forever voyaging'.


It's 2015. The audience of this article shouldn't even exist. The reader, as described in the article, is a VP who has so little understanding about what it is his company does, that the only meaningful abstraction he can mentally picture is that of his employees "burning barrels of money".

Imagine an auto company VP who says "I don't know anything about engines and drivetrains and all that technical stuff. All I know is that when you guys are in a meeting talking about your variable valve timing system, all I smell is money burning!"

That would not be acceptable. Yet, here we are, over 30 years after the original IBM PC was released, and there's still a corner-office audience for "what is a computer?"


You're living in a pretty isolated world, my friend. I'd say 90% of my friends would be the audience for this. I've been coding 20 years, most of my friends are successful, grad-degree educated people in a variety of fields, and some of them are even my coworkers.

Who is supposed to teach people what code is? Our schools? Who with a CS degree and programming experience would willfully choose to teach in the USA's education system?

Or maybe the companies who make all their money from code? I think not - it wouldn't help the economic position of Apple, Google, FB, or Microsoft if everyone knew what code is and how it works. It strengthens the tech economy's stranglehold on society when code is treated as something inscrutable.

So there are really very few resources for people - even educated, successful, technically literate folk - to grok "what is code?"

Many coders would do well to read a similar article, if there was one, called, "What is Society?"


> Many coders would do well to read a similar article, if there was one, called, "What is Society?"

There was a thread a while back where software developers told me I was unreasonable to expect them to know who the vice president of the country they lived in was. I feel you here a whole bunch.


> There was a thread a while back where software developers told me I was unreasonable to expect them to know who the vice president of the country they lived in was.

It is absolutely unreasonable. The average person has no clue who the vice president happens to be ... so why should software developers?

We can complain about how clueless average people are, sure, but there's no reason to apply a higher standard to software developers.


Of course there is. Software developers (in general) are smarter and have had better educations than "average" people.

Most smart, educated people I know can name the vice president of the United States, and I don't even live in the US. I think it's completely unreasonable for someone not to know who the vice president of the country they live in is.


The corner office in the earlier example depends on the software being written to make money. If the software is bad, they will make less money.

By and large, knowing the vice president's name will not cause any appreciable dip or rise in how much money a company makes.

So, in the context of "why is it important for business people whose livelihoods partially depend on code to know what code is?", not knowing what programmers do is worse than not knowing the vice president's name.


Wait, what? The average person doesn't know who the vice president of their own country is? I'm Canadian and I still know who the US vice president is. Surely most American adults do too.


Nope: http://www.dailymail.co.uk/news/article-1368482/How-ignorant...

Lots of people hate politics and really avoid any study of it.


> Stumped: In the U.S. citizenship test, only 38 per cent of Americans passed

> Although the majority passed, more than a third - 38 per cent - failed,

Daily Mail is fucking useless.


LOL, that's funny. I'll admit I saw a few articles on this survey and the Daily Mail's was first on my results list. Does cringing while posting it lend forgiveness? ;)


From your link: "...who is the Vice President of America? (29 per cent did not know)"

So the majority of Americans do indeed know who the vice president is. (Although yes, a large minority don't.)


The data is from the US Citizenship test, so those taking it aren't "Americans" yet and have been specifically studying for the test.

I stand by my assertion that the majority of Americans have no clue who Biden is and I swear to you, I wish I didn't either. The whole political arena is absolutely revolting and I'm not surprised the majority of people are apathetic to it.


Actually it says they gave the citizenship test to 1000 Americans to see how they would do. 29% didn't know who the VP was (so 71% did). 1000 isn't a massive sample, but I expect it's enough to say that the majority of American adults do know. Anyway, it isn't a very interesting discussion regardless.


I'm Aussie, I know the vice president of the US off the top of my head, but I couldn't tell you the Australian vice prime minister. Or even if that's a real thing.


What's the Vice President relevant to? It seems to me that for most of us it would be more relevant to know the name of e.g. the head of the FTC - and I don't think that's something you'd expect of people. Even if you did want to know about the Vice President, knowing his party or policies is surely more important than his name. A name is just a trivial fact, not important to actual understanding - and you could look it up if you ever needed it. I'm reminded of http://unreasonable.org/Feynman_and_the_map_of_the_cat .


I think that Parent made the observation that it is sad that in our information society people (in responsible positions (regarding ICT)) don't know about the fundamentals of the information society. At least, that's what I took from it and I concur.

We, as a society, should have started integrating computational thinking (Wing, 2006) as a core competency in the k-12 curriculum from the late 1980s onwards. We didn't.

anecdote

In 1991, when I was in 5th grade, I saw the first computer enter the classroom in my primary school. It wasn't used except for some remedial mathematics training for a student or two, and I believe the teacher did a computer course with it.

In 2010 I became a high school computer science teacher. There were three computer rooms (about 30 computers each) for the whole school (of about 1500 students), running Windows XP + IE 6. Besides my class, the computer rooms were mostly used for writing reports and "searching for information". Some departments did have specialized software installed (most of which came with the textbooks), but used it sparingly at best. On top of that, this software was mostly simple, inflexible, non-interactive, non-collaborative, "pre-fab" instructional material. Often it was not much more than a "digitized" version of parts of the textbook with some animations, games, and procedural trainers mixed in.


My anecdote: high school student in a smallish (100k) northern Canadian city, circa 1992-1996. Computer science courses ran every year. We were all taught DOS, Windows, programming in QBasic and Turbo Pascal, and (!) building web pages / using the Internet (in 1995!).

The day Netscape 1.0 came out, the teacher had us all install it from a few CDs passed around, which he had burned after downloading it over the class modem. The classroom was networked (coax), and over the next few weeks we figured out how to get Trumpet Winsock to share the connection from the computer with the modem. By 1996 we had a frame relay connection.

There was no curriculum other than what these two teachers could envision and sell to the school board. Pretty thankful for that.


anecdote: I graduated from HS in 2008. When I graduated, the most advanced computer class offered was essentially: how to open up Photoshop, Word, and Excel; "a little bit of this is Adobe Flash, go have fun"; and making a pure HTML web page with inline styles - no CSS. Now that HS has a 3D printer. Once, the computer teacher sincerely asked me how to make an href open in a new window.

I graduated college in 2012, where I learned what CSS was and why it was better. I took an intro to Java class. I learned some things volunteering for NPOs doing website work. When I graduated, I knew I wanted to do something in IT, but not programming. Now I am a software dev.

Turns out it's easy to get a job and teach yourself how to code when nobody understands what coding is.


We do not live in an "information society", and people do not absolutely need to know the internal workings of computers, any more than a person in 1970 lived in a "radio society" and needed to know how radio or television waves propagated in order to hold a non-technical job.


Another anecdote.

In 2015 my teenager will start High School in a small, rural, school district and each student has a Chromebook. No, I don't know what they will be used for besides googling information, but times are changing.

I emphasized small and rural for a reason.


I know teachers at the local school district in my city. Chromebooks are used extensively in classes (elementary school, even down to 2nd grade) for researching (googling), typing practice, typing book reports, online testing, some educational games, reading, etc. They even have some educational websites they use to learn super basics of programming.

It's really amazing stuff. I wish these sort of programs existed in the US back when I was in elementary school... back then, we were lucky if we got 30 minutes a week to play with Claris Works in the school's only computer lab.


I see similar things happen around here (oddly enough often also in small rural schools?). As a pessimist, I wonder:

- are teachers able to choose, remix, adapt, or make new "computational" instructional materials like they are able to do with conventional instructional materials? (Stencilling and xeroxing for the win :-)

- are computers only used as supporting tools (i.e., typewriter, encyclopedia, drawing board, ...) or is computational thinking integral part of the curriculum?

- is there software like programming tools, CAS (such as Maxima, Matlab, or Mathematica), CAD, and other configurable and programmable professional tools available? (and will they be used beyond a two-week module here or there?)

- has the (core) curriculum changed at all, or are we still teaching topics like it is 1982? Computer technology makes it possible for students to tackle complex authentic problems instead of "school problems" (Death to linearity! Away with nice round numbers!)


You make a good, if harsh point.

But this:

"it wouldn't help the economic position of Apple, Google, FB, or Microsoft if everyone knew what code is and how it works. It strengthens the tech economy's stranglehold on society when code is treated as something inscrutable."

Is just cynical. Tech companies aren't so fragile as to depend on general ignorance among the human population. If more people understood code, these companies could create more code. I think you point to an underlying misconception that somehow technology is just a barrier to entry and doesn't provide intrinsic value. But I do not believe this to be true.


> Who is supposed to teach people what code is? Our schools? Who with a CS degree and programming experience would willfully choose to teach in the USA's education system?

Grade school teachers aren't expected to be specialists in the field they teach. They're expected to be specialists in teaching. Usually they have a bachelors (or masters) in education and maybe another degree, but it may or may not be the thing they teach (if they even teach only one thing at all).

To put it another way, this is a bit like asking what physicists would willingly teach in the USA's education system? The answer is obviously not very many, but that's beside the point. Physics still gets taught.


> Grade school teachers aren't expected to be specialists in the field they teach. They're expected to be specialists in teaching.

Yeah, and this breaks down pretty fast, particularly when it comes to teaching the stuff you can't easily hand-wave, e.g. science. Even at the middle/high school level, where teachers are supposed to have studied the topic they teach at the undergraduate level in some capacity, you find plenty of foreign language teachers who are terrible speakers of the language they teach, or math teachers who basically have the same level of math as their students, with the distinction that they have access to the answers for the exercises they give. I spent some time in a US state university for grad school, and the level of some students who majored in education and later went on to teach was abysmal. It's hard to tell if the hegemony of standardized testing is the root or a symptom of the problem, but the overall picture is bleak.

This is precisely why if we want a great education system, we need to incentivize people who are practicing professionals first to then go teach. My best teachers in high school all shared those traits: a historian who had spent many years doing field research teaching history/geography, a geologist who was also a researcher for a major lab teaching natural sciences, etc.

Not everyone is made for teaching, but we as a society need to become much better at encouraging and enabling the people who enjoy it to teach in parallel to their professional activity (I would happily teach math from 8-10am before my day job 2-3 days a week if there was the structure for it.).

The people who want to become teachers just because they like kids but don't have any deep knowledge/understanding of any particular subject can teach kindergarten.


This is incidentally a weakness in the US education system. You split schools into elementary-middle-high (for no especial reason) then treat it all as 'grade school'.

In every European country I've lived in, schools are generally split into primary and secondary. Primary school teachers are general educators whose primary skill is teaching but who do not need advanced knowledge in any particular subject. They usually have the same class all day, which gives them a deep insight into students' progress (although from a kid's point of view, if you don't get on with your teacher then school may suck). In secondary school, teachers may teach more than one subject, but they're required to have studied their primary subject at university and done some additional study in other subjects they teach, as well as additional study in educational methods. So your math teacher has a math degree, your history teacher a history degree, and so on. People who plan to teach usually develop themselves academically in two subjects, sometimes three if they're closely related or you have special experience. For example: http://www.teachingcouncil.ie/_fileupload/Registration/Gener...

California seems to be moving towards this model too: http://www.ctc.ca.gov/credentials/leaflets/cl560c.pdf

The downside of this is that if a particular teacher is sick and another has to take the class as a substitute, the substitute teacher just babysits, or comes into class with a homework assignment from the sick teacher that should be doable during the class period. The upside is that in general students are keenly aware that teachers know what they're talking about and are considerably more expert in the subject than would be possible using the assigned textbooks.

I've ranted on HN before about how destructive of educational ends the practice of using 'Teacher's Editions' of textbooks with scripts and answer keys is. It degrades teaching to a branch of bullshit artistry, and students who take a subject seriously can quickly detect a lack of true expertise.

Obviously this short comment isn't meant as an accurate descriptor of the whole state of the US education system, just the legacy of the non-specialization referred to in the grandparent comment.


The US works pretty similarly to how you indicate Europe works:

Grades 1-6 (or 7 in some places) are taught by a single teacher, who probably majored in education, with the whole class (with maybe a once-a-week art or music class taught by a specialist if the school is well funded).

Grades 7 or 8 through 12 are taught with 6-8 subjects a day (or in some places 3-4 subjects, with alternating days having different subjects), with teachers who teach a smaller number of subjects. At the HS level, in many places in the US it is expected that the teacher will have a bachelor's related to the subject they are teaching.


Thanks for the clarification. I should have addressed this in more nuanced fashion, given the reality of 50+ state educational systems and innumerable school districts with their own rules, rather than a single national standard. Instead, I critiqued the weakest instances of the current system as if they were the norm.


Also note that California needs to mandate degrees in related fields at least partly because the California school systems are in so much trouble. Pick a random school district in a suburb in the US Northeast, and despite the lack of requirement, it will tend to be the norm (many inner-city districts burn out teachers so fast that I doubt any set of requirements would succeed in getting quality teachers).

The general consensus is that California school systems are poor, and what little I've seen of them seems to hold true: I grew up in the Northeast, and communities that are comparably affluent in Southern California to there are much worse off school wise. The most common point to blame seems to be Prop 13[1], but I also notice that California has many more decisions centralized, and does also tend to have more parents who are non-native English speakers, so I don't know what is to blame.

1: http://en.wikipedia.org/wiki/California_Proposition_13_%2819...


> You split schools into elementary-middle-high (for no especial reason) then treat it all as 'grade school'.

As someone who went to school in the US, no, you're wrong.


Ironically, this sort of dogmatic assertion with no follow-through argument is exactly what I've come to associate with the US educational system.

While 'grade school' often refers to elementary school, surely you are aware of people's tendency to talk about 'k-12' education in a lump and the lax academic requirements for substitute teaching in many school districts even at high school level.


I'm sorry, I didn't realize that facts needed more backing.

You are simply factually incorrect about your assertion that higher grades are run the same as lower grades. I don't particularly care about your inability to accept that, but I do find it surprising that you would assert your inexperienced viewpoint is correct.

Also your insult was not missed. Thank you for that, it verifies to me that you have nothing of import to say.


At least it prompted you to explain what you meant, which was far from obvious in your first comment.


Your characterization of US education in middle and high school is largely incorrect. I don't want to take time to explain all the ways.


Exactly. I did my first instructed programming in a 6th grade computer class, in BASIC on Apple IIes. I think my teacher at the time taught typing as well.


> Who is supposed to teach people what code is?

'But no one told me I needed to know this' stops being an excuse as of adulthood, if not earlier. Anyone who cares what coding is only needs curiosity and an internet connection, the web is full of introductory material for almost every level from almost every angle. If a professional in the current world doesn't know 'what coding is', they just don't care.


What do you mean? There's tons, TONS of resources for learning the basics of programming.


Yes, but a lot of them go through the same hoops - Hello World, variables, conditionals, loops, arrays, functions, OK that's it take this pile of building materials and just turn it into a house mmkay.

There are two big problems for would-be programmers: there's a shortage of obvious standards on architecture/program structure (not least because it's hard to prove mathematically which structures are optimal), and endlessly proliferating options. For example, between HTML 5, CSS, and JS, it's quite complex to put together a web page these days.

I mean look at this page on the DOM: https://developer.mozilla.org/en-US/docs/Web/API/Document_Ob... there are hundreds of subtopics, and it's not obvious which ones are most important. The introductory page on the DOM is less-than-inviting to a non-programmer, not least because it presumes code is the optimal medium for production, when most people would rather work through a GUI and have the computer take care of the abstractions.

I wish sometimes that programmers were forced to decompose their latest and greatest algorithms into electronic circuit diagrams or something, to remind them that translating functionality between different paradigms is a Hard Problem and that many people do not like all the typing and syntactical overhead of text-based programming.


That page on the DOM is the reference manual. Do you complain that Gray's Anatomy is just too thick, that you can't navigate all that info to find which pill you should take?


We're talking about how it presents to novice programmers.


So how does that make this article a bad thing?

Also the TONS thing is kind of bad, because it becomes a lot harder for the layperson to filter out the bad resources from the good.


What? I imagine there are many, many auto manufacturing VP's who think, "I don't know anything about engines and drivetrains and all that technical stuff [– and I don't need to]."

Why should the VP of Human Resources need to know how a drivetrain works? Or the CTO? Or the CFO?

They are experts in their area of focus. It's ridiculous to expect every manager to understand everything about their business. Would you expect the CTO of Starbucks to be able to tell you how all of their drinks are made?

I wouldn't. And I wouldn't care if they could.

In general, a good executive doesn't need to know the minutiae. They need to know how to motivate people, how to keep projects on track, how to recognize talent, how to delegate, how to budget, how to distill information for other executives, etc.

Sure, knowing the minutiae usually helps. It's easier to sniff out all the BS people feed you, etc. But it's far from the most important knowledge and skills a great leader needs.

I've only skimmed the article so far, but the part that stuck out to me was this (technical manager talking to the VP): “My people are split on platform,” he continues. “Some want to use Drupal 7 and make it work with Magento—which is still PHP.” He frowns. “The other option is just doing the back end in Node.js with Backbone in front.”

Now, that's an example of a terrible trait for an executive. TMitTB clearly has very little ability to communicate with people outside of his area of expertise. The ability to convey complex ideas simply is crucial. Why would a non-technical executive care about the framework you're using? That's asinine. Worrying about the implementation is TMitTB's job. When meeting with the VP, TMitTB should talk about the business impact of options. This option is cheapest but doesn't give us these features that the marketing department says they must have. This option is best, but it's much more expensive to hire developers with those skills right now.


"When meeting with the VP, TMitTB should talk about the business impact of options."

I don't think he even knows what the business impacts are. TMitTB is just a tech guy who works for the new CTO. Presumably, the CTO (being an executive in charge of technology) can speak both the language of tech and the language of business and could make a business case to the VP, in terms he understands, as to why the company needs the new software. The CTO should not have sent her tech guy to talk to the VP.


TMitTB (in his "mid-30s") seems in over his head. And he's not a "tech guy", but a "Scrum Master". And he spends so much time (and money) at conferences that he is specifically taken to task for wastefulness ("he has apparently spent all of his time at conferences and no time actually working"). When he eventually delivers, months late, it's "a plain and homely thing" and he's still cagey about a go-live.

Yet at the end of all this, "TMitTB will get his bonus."

WHAT!?

What's the message here? Big companies are hard and inefficient places? Programmers and techies are confusing and dress funny?

Other than being technically illiterate, the VP seems to be the hero of this story. Not recognized as such, of course ("Money? Hours? Due date? Value? Bah!").

The CTO ("who has several projects on roughly the same footing [e.g., horribly mismanaged] scattered across the organization") and TMitTB should have been fired long before the 30,000 words came to a close.


This is the only valid answer, here. Why is the guy not talking in terms of technical debt, business impact, KPIs etc.?

If he really knows his shit, he should be expected to break his technical insight down into layman-friendly terms.

Hell, we're expecting just that from our doctors all the time.


> there's still a corner-office audience for "what is a computer?"

There's a technical audience for "what is sales?", and that's thousands of years old. Generalists, especially good generalists, are rare.


There are always going to be people like this, in any field. TSR (of Dungeons & Dragons fame) actually had a CEO who forbade her employees from playtesting their products during work hours, calling it "playing games on company time." http://1d4chan.org/wiki/Lorraine_Williams


Dear God, what a disconnect. How does someone with that line of thinking even get a job in the gaming industry?

Then again, I'm reminded of a boss I used to have who would ask me to fix the shipping calculator on our web server, then ten minutes later he would poke his head in the door and tell me to "quit playing on the goddamn computer and get some work done!" He didn't realize that "working on the web server" is done at a workstation, not physically taking the server (remotely hosted of course) apart and putting it back together. All he knew was the customers were complaining about the shopping cart module not calculating shipping correctly.


> How does someone with that line of thinking even get a job in the gaming industry?

The page I linked spells it out in pretty plain terms. In short: being in the right place (in terms of general business experience and family connections) at the right time.


Yet, they do: many VP's in this position are promoted from parts of the company that have nothing to do with tech. I even wonder if this is the majority. Not sure. A huge chunk of VP's out there, though.


It's got to be the majority, since most non-tech companies (i.e., the vast majority of companies in the world) only use tech as a tool - it's not the focus of their business. The executives of banks, retail businesses, airlines, etc. are not likely to have been promoted from the IT Department.


Sounds right. Makes even more sense how you worded it.


> auto company VP

The VP is not in charge of a software company. Presumably some sort of widget/manufacturing operation ("cycle reduction"), so it is unfair to accuse him of not knowing what is going on in IT (at the level of engines/drivetrains for an auto company).


This comment falls neatly into the category of my favorite trendy term of 2014/2015, the "hot take". No matter how informative or well-intentioned a piece of writing may be, some people will just immediately go looking for something wrong with it, reason or authorial intention be damned.

Maybe instead of ridiculing the majority of people who don't understand what we do, we should celebrate a piece like this that makes an effort to educate. Maybe instead of lamenting their ignorance, we should commend the VPs and everyone else with enough curiosity about code to make it through this behemoth of an article.


It's perfectly acceptable. Business optimises for least-knowledge-you-can-get-away-with.

Three generations of my dad's side of the family were in the print industry: everything from printing Vogue and Playboy to fancy art books to dull but well-paid corporate stuff (annual reports, mergers and stock issue documents—500 page books that the SEC make you print filled with legalese that nobody reads).

Most people working in big print companies know nothing about print. They don't know about how paper works or how ink works. They have no understanding of how colour works or why you can't print certain colours on certain materials, or how long certain types of print work takes. Not at the junior level and not at the management level.

Hell, if you took half the people in a big print management company and asked them to explain the basics of offset printing, they couldn't give you a "lead paragraph of Wikipedia"-level description. And that technology has been around since 1875.

For all but a small set of technical and management roles, a lot of businesses are far less interested in technical know-how than "soft skills". In a shocking number of places, the ability to build a tower out of rolled up newspaper and sticky tape in a team building exercise is valued over an ability to know the details of how the industry or its core technologies work.


I think you're taking their conceit of a VP audience too seriously. To me that just feels like a fun narrative choice, and the article is just intended for any adult dealing with and slightly bewildered by code.

A lot of people find themselves in this role, people who specialize in other fields but still need to interact with developers because software is eating the world, but more quickly it's eating their role.


Software doesn't only happen at software companies.


Actually, I've met a lot of programmers who are deeply resistant to learning anything about the core business of the firm they work for, especially if it can't be automated or if it resists automation because of strongly established legacy practices, as well as the tech short-sightedness that so often prevails - e.g. telling the putative listener they won't need $85,000 in Oracle licensing any more, oblivious to the fact that such licenses involve periodic contractual obligations.


> Imagine an auto company VP who says "I don't know anything about engines and drivetrains and all that technical stuff. All I know is that when you guys are in a meeting talking about your variable valve timing system, all I smell is money burning!"

His job isn't to know about drivetrains and engines and stuff. His job is to manage products, cashflow, audit requirements, stock market regulations, and marketing. That's the stuff he has to deal with day to day.


> what it is his company does

The company sells products. The website with a shopping cart is not the primary purpose of the company.

The company may in fact sell automobiles.


Most VP's do not sell computers (or even software). They sell goods and services that depend on computers. What you're saying is more like "a VP at UPS or FedEx should understand everything about engines." That's ridiculous, you don't need to understand how trucks work to know they move goods from point A to point B.


Writing software is not what (many) companies do. Companies exist to create things of value for their clients. Software is one aspect of the secondary functions that enhance this value creation.

In the article, the VP worked at a company that sold things on the internet. The things they produced were their primary function; internet platform development is a secondary function, much like marketing, hiring, business development etc etc.

So the theoretical VP's core competency should not have been software dev, or even technology - that's what CTOs and technical leads are for. In fact his core competency probably wasn't product development anymore, if it ever was. He was a manager, and his role was managing resources within the company to optimise their primary function.

Hence the disconnect, and hence why articles like this (and audiences for them) exist.


It's a caricature, much like the pointy haired boss in Dilbert. The reader is supposed to think "at least I'm not as dumb as that guy".

Also, the whole article is written as if it were 1997. The graphic design is rather HotWired-like, in a world where they had the tools we have now.


This article doesn't attempt to answer the dull, dry, boring, well-understood question "What is code?" in the sense of "What is it that computers use to make them do what they do?". It tries to answer the much more interesting, intangible, existential question "what is code?", in the sense of what does it mean for the world to depend on a culturally isolated priesthood of technologists who control what computers do? How does that work?


I've seen a lot of 50something execs who have been passed by on the technical superhighway.

I met a CIO of a large insurance company who didn't know what Python was.

I know a senior banker and educator who struggles with the basics of Excel and Word. (Yes, there are still people out there who are used to the days when secretaries did all the work.)

I knew an IT exec at a large consumer products company who didn't like using computers at all.


I work at a software company where no one likes computers too.


It should make sense that this article & its VP audience exist. There are 7 billion people in the world, and only 11 million code professionally, plus 7 million hobby coders, according to the author of the article and the IDC.

After auditing the software of at least 20 different startups, I'd have to say there will always be people in positions of power who know nothing. Just look at our politicians.


I thought the imaginary VP was pretty clearly not working in a software company. What if random companies suddenly needed to design and build their own automobiles in house? There would be a similar lack of comprehension about just what the automotive engineers were actually doing all day.

At least, I know I would be lost and would greatly appreciate a guide like this.


"It's 2015. The audience of this article shouldn't even exist."

Well, the audience most certainly does exist whether it should or not. The thing to think about is what to do about that.


> Yet, here we are, over 30 years after the original IBM PC was released, and there's still a corner-office audience for "what is a computer?"

As someone who got their start on DOS and BASIC back in the 80s, I say you raise a pretty good point. There are so few languages depicted in this web brochure that it does not illuminate anything.

So many people stopped learning in the 80s that the 50s are starting to catch up with them again.

This is the stuff that cardboard box forts are made of.


Source for accompanying interactives - https://github.com/BloombergMedia/whatiscode



I think of Minecraft as a visual representation of a database. Every block you see has a set of values, starting with the 3 that determine its location within the world (coordinates) and extending to include block type, which determines other values.

And the same goes for the open spaces, because Minecraft reminds you that every position holds a block. An open space is just a set of blocks whose block type is "open", which makes them both transparent to light from neighboring blocks and passable for player movement.
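A rough sketch of what one "row" might look like (purely illustrative; this isn't Mojang's actual data format):

    // one record per block; the coordinates act as the primary key
    var block = {
        x: 104, y: 64, z: -23,
        type: "open",        // "open" is just another block type
        transparent: true,   // derived from type: light passes through
        solid: false         // derived from type: players can move through
    };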


Most games are essentially just massive databases of pretend stuff with an enjoyable alternative to SQL as the interface.


I love how the page calls you out for skimming it instead of reading it.


Huh, is this what caused the whole page to go pixelated and ask for permission to use my camera each time I changed orientation on my tablet? I had to reload the page each time this happened and scroll down to where I was before. Very annoying.


I scrolled. I admit it. :)


I skimmed and didn't get mocked. I really am a slow reader :-P


Computer don't hurt me, don't hurt me, no more


The lyrics to the original actually make sense for this altered version:

Oh, I don't know why you're not there
I give you my love, but you don't care
So what is right and what is wrong
Gimme a sign

---

Oh, I don't know, what can I do
What else can I say, it's up to you
I know we're one, just me and you
I can't go on

source: http://www.lyricsondemand.com/onehitwonders/whatislovelyrics...


I'm happy to see that I'm not the only idiot singing this while reading the header XD


It's easy: the code is that part of the computer which can't be grabbed and slammed but only cursed.


It's easy to me: the code is the part of the computer which can't be grabbed or slammed but only crushed.

That's why I crush it. I crush code.


I did not expect this good of an article on this subject from a business publication. Well done.


bloomberg is mainly a technology company


We make the vast majority of our money from selling our software subscriptions and have ~4k employees in R&D. Depending on what data you use[1], if we were a public company we'd be the 4th largest in the world by revenue.

[1]: http://en.wikipedia.org/wiki/List_of_the_largest_software_co...

(My intention is not to bikeshed over who is or isn't in the "Software & Programming" industry or specific ranking, but to convey a sense of scale)


Time to follow @ftrain on Twitter then.


"C is a simple language, simple like a shotgun that can blow off your foot... Think of C as sort of a plain-spoken grandfather who grew up trapping beavers and served in several wars but can still do 50 pullups." Most certainly.


I kind of like my answer better: http://qr.ae/7NEnT9

The whole post is just a stream of consciousness brain dump that a layman would never understand. I believe it's possible to explain these things without circular reasoning.


"What are the major programming languages, and what are they used for?" is a different question than "Can you tell me what code is?" The second can include the first. But if your and Paul Ford's answers were swapped, both questioners would have good reason to say, "That's helpful but not what I was looking for." To extend your kitchen metaphor, you haven't mentioned anything about why or how your instructions would ever work. You've left out the compiler, interpreter, executor: the human.

Granted, I think Ford expanded the domain of his question a little further than he needed to, for the sake of what looks to be fun. And I think he occasionally picks a piece of jargon where a clearer, more ordinary word would have done just as well—though I'm not in the mood to dive back into the article to find a case.


In my opinion, this is what society today needs. I don't feel like we need everyone to be able to code, but rather just have a sense on some level that computers are nothing mysterious or magical, unconquerable or incomprehensible, but rather just machines of human creation.


Not sure if we were reading the same article or not, but this is not "what society needs." While there is a part of me that loves computing and wants to share it with the world, I also realize that the nuances of compilation or futures are entirely inside baseball and irrelevant to the vast majority of society at large.

The computer, for most people, is a tool. A means to an entirely unrelated end.


Holy CPU time! That site consumes 100% of my CPU (presumably 100% of one core) whenever it is in the front tab (Firefox/OS X).

Anyone else experiencing that or is it just my laptop running wild?


You'll see similar resource consumption when using event listeners tied to mouse movement. It's generally not noticed by the general populace, but it gives every developer pause. The page does seem to struggle at times.


Very informative, thanks. In native-app land we can listen to mouse motion, but it is more CPU-friendly to have a timer and poll the mouse position periodically, particularly if the location of the cursor causes further processing (like working out what data to display in a popup hint). The good thing with mouseEnter/mouseLeave is that you can stop the timer and only restart polling when the cursor enters again.

Is there a way of doing this on web pages or is it really still just callbacks for mouse motion?
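The closest web equivalent I can picture is keeping the mousemove handler trivially cheap and doing the expensive work on a timer; an untested sketch, where el is the element and updateHint stands in for the costly part:

    var last = null, timer = null;

    // cheap: just record where the mouse is
    el.addEventListener('mousemove', function (e) {
        last = { x: e.clientX, y: e.clientY };
    });

    // expensive work runs at a fixed rate instead of per-event
    el.addEventListener('mouseenter', function () {
        timer = setInterval(function () {
            if (last) updateHint(last);
        }, 100);
    });
    el.addEventListener('mouseleave', function () {
        clearInterval(timer);
        last = null;
    });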


It's like Myspace and Tumblr barfed all over Businessweek. I opened the article in Firefox with the mobile-emulation feature turned on. Because good god almighty this thing is a trainwreck otherwise.


Interesting, I was able to read the article just fine on my 2011 Kobo touch. 800 MHz ARM Cortex A8 and whatever Webkit was around in early 2011. The border animations are off but all the text and plain images work.

Whatever effects they're running, they did an impressive job with graceful degradation.


I've heard tales of AdBlock Plus consuming huge amounts of CPU, especially on large and complex pages.


I'm only using Ghostery. Disabling it doesn't help though.


Same here on my 2015 MBPr.


You need to Konami code this bad-boy.


wow. heh. strange world we live in when the result of that is on bloomberg. What a time to be alive.


Their 404 page gif is still their best work, but this is a close second.


Their 500 page is also pretty great


"That’s how change enters into this world. Slowly at first, then on the front page of Hacker News."

How meta.


I didn't really understand that part. Care to explain?


Aggregator sites like Hacker News and Reddit make distribution of news and ideas very quick and viral compared to more organic growth such as word of mouth and Google.


Not just distributed very quickly, but deemed worthy or unworthy very quickly by the pseudo-meritocracy that is the "upvote".


I had no idea they made an hour-long educational video on Windows 95 with the cast of Friends! That is awesomely 90's. This is a really cool write-up; clearly a lot of work went into it.


Imagine a world where everyone has their own social version of a Github page instead of a Facebook wall.


Imagine a world where Wikipedia isn't an encyclopedia, but a crowd-sourced collection of all the code that could be written for one language, meticulously indexed and documented.


Imagine there's no heaven (It's easy if you try)


Imagine a world where everyone read and writes.


This is the best write-up explaining software I have ever seen. Wow.


Smash the patriarchy! Check the console.


http://www.bloomberg.com/graphics/2015-paul-ford-what-is-cod...

> You know what, though? Cobol has a great data-description language. If you spend a lot of time formatting dates and currency, and so forth, it’s got you. (If you’re curious, search for “Cobol Picture clause.”)

https://www.google.com/search?q=%E2%80%9CCobol+Picture+claus...

What am I supposed to be looking at here?


Superb writing. I wish all professional writers could write this well.


Paul Ford really is in a class all by himself. Everything I've read by him is truly wonderful.

As a writer, it's both inspiring ("look how amazing nerdy non-fiction can be!") and soul-crushing ("look how much better someone else is at writing!"). I try to focus on the former, but, man, he really makes the rest of us look like Celine Dion showing up at your dive bar's shitty karaoke night.



What a strange, long, rambling novella on programming languages.


I like the idea, but is there really no way to mute the audio? Sadly I did not finish the article because of that.


"is there really no way to mute the audio?"

There are so many ways to mute my computer, I wouldn't even know how to list them all. But on top of the list I would start with the volume keys on the keyboard and then with the volume panel in the menubar and then with the audio panel of the system preferences. Towards the far end of the list I would cut the wires to the speakers.


Very funny... I currently have other audio playing on my computer that I don't want to stop.


If you are on Windows you can right-click on the volume tray icon and open the volume mixer, where you can lower the volume for specific applications, such as your browser.


But you can only change the volume of the entire browser, not specific tabs. If you're playing music in another tab, you can't just turn down one tab.


You can mute single tabs in Chrome easily:

http://fieldguide.gizmodo.com/mute-noisy-tabs-in-google-chro...

Type this in to the address bar: chrome://flags/#enable-tab-audio-muting

Then click Enable under "Enable tab audio muting UI control."


If you use chrome, you might enjoy this flag

chrome://flags/#enable-tab-audio-muting


There's audio? I should have unmuted my laptop.


What audio? There shouldn't be any sound unless you activate the konami code easter egg.


Well now you've got me to investigate. There seems to be a video in the article. However, it is only displayed as a white box in my browser (Safari), without any controls. I can't stop it, and I wouldn't even know it's there, except that it is playing audio. Strange...


A couple parts of it remind me a lot of JBQ's post on "dizzying but invisible depth": https://plus.google.com/+JeanBaptisteQueru/posts/dfydM2Cnepe


Part of me looks at this and thinks, "This is preaching to the choir"...because while the engineer in me appreciates all the layers and explorations...It must be incredibly bewildering to anyone who is not a coder, which is the ostensible audience given that the story starts off with, 'We are here because the editor of this magazine asked me, “Can you tell me what code is?”'

But then I see the interactive circuit simulation and think "Fuck it, who cares, this is awesome!". Designing circuits is one of those things that, if I were a self-taught coder instead of a comp. eng major, I would never have delved into... yet learning how to build an adder circuit and getting an appreciation of the most basic building block of computation (and how surprisingly complex it is to just add 1s and 0s) is a profound lesson that I think was essential for me, personally, to really grok programming (a quick sketch of a one-bit adder below).

All the sections about culture and conferences etc. are a little bit off-field for me... it's not that I don't think that code and life and human thought and behavior aren't intertwined... * I just think the discussion about conferences reads as if the author doesn't realize that all disciplines spawn conferences and conference culture. There's nothing particularly unique about code conferences. Not the sexism, not even the nerdiness.
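For the curious, here's roughly what a one-bit full adder looks like with JS functions standing in for gates; not how the article's simulation is implemented, just the textbook logic:

    // XOR, AND, OR as pseudo-gates; a full adder is just five of them
    function xor(a, b) { return a ^ b; }
    function and(a, b) { return a & b; }
    function or(a, b)  { return a | b; }

    function fullAdder(a, b, carryIn) {
        var partial = xor(a, b);
        return {
            sum:      xor(partial, carryIn),
            carryOut: or(and(a, b), and(partial, carryIn))
        };
    }

    fullAdder(1, 1, 0);  // => { sum: 0, carryOut: 1 }, i.e. 1 + 1 = 10 in binary

Chain the carryOut of one into the carryIn of the next and you've got an adder for numbers of any width.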

I would love to see the OP's editor respond in a not-quite-as-lengthy essay. What did they learn about code after reading the piece that they didn't understand before?

edit: * I'm emphatically not arguing "Oh but everyone does conferences shittily so tech conferences shouldn't be shamed". Just that having it in this "What is Code" essay makes it seem as if it's a notable "feature" of programming...but that understates the problem by an order of magnitude. Sadly, it's a feature in most every discipline, and the inherent feature is the gender imbalance, not the topic of the conference.

edit: Also, I wish the section on Debugging were much higher than it is... Robert Read's "How to be a Programmer" [1] makes it the first skill, and that's about the right spot for it in the hierarchy of things. Maybe it gets overlooked because it has the connotation of something you do after you've fucked up. But, besides the fact that programming is almost inherently about fucking up, the skill of debugging really underscores the deterministic, logical nature of programming, the idea that if we have to, we can trace things down to the bit to know exactly what has been fucked up in even the most complex of programs. And that's an incredibly powerful feature of programming... and not very well emphasized to most non-coders.

[1] http://samizdat.mines.edu/howto/HowToBeAProgrammer.html


Not to your main point, but the circuit simulation reminds me of Silon by SLaks: http://silon.slaks.net/

Edit: Also, as a late-bloomer and self-taught (self-teaching) programmer, I am on the other side of the paradigm you're talking about. Petzold's Code is one of the first books a self-taught programmer should pick up. It is an awesome introduction.


One of the few worthy things I felt I got out of school was the moment I grokked the whole stack, from sequential logic to the program counter and control logic of a CPU, how each clock tick formed a new circuit. That was really mentally expanding. I got it from reading a prescribed book for a class I wasn't taking, from a professor who was a tool, so it is possible to learn these things outside of class. In fact, that's where the real learning, IMO, happens.


What book? For those of us not there yet :)


I dug around and can't find it; I graduated a while ago. But the more I thought about it, it was actually 2 books: one on how to design a CPU on an FPGA, similar to this one: http://www.amazon.com/VHDL-Implimentation-16-bit-microproces... And another book on digital design, specifically digital sequential circuits. If you google that term you will find a few links to PDFs to study. Finally, "Computer Organization and Design" by Patterson and Hennessy is highly recommended.


One of the most memorable weeks in my Engineering degree was using Cadence to build a CPU from the ground up. Every transistor, every connection, the ALU, etc was laid down by someone in our little group of students, and then wired together to make a thing with a few thousand transistors. And it friggin worked.

It also showed how the chip itself would be laid out, where the dopants would be and such.


> Part of me looks at this and thinks, "This is preaching to the choir"...because while the engineer in me appreciates all the layers and explorations...It must be incredibly bewildering to anyone who is not a coder, which is the ostensible audience given that the story starts off with, 'We are here because the editor of this magazine asked me, “Can you tell me what code is?”'

I completely agree. I got a third of the way through it before I just couldn't stand the obfuscation and decoration any further.

What's sad (as I [tweeted][1]) is that there's a 1972 article by Stewart Brand, published in Rolling Stone of all places, that does a better job of actually explaining what computers can do, without resorting to jargon and jive: http://stuartpb.github.io/spacewar-article/spacewar.html

[1]: https://twitter.com/stuartpb/status/609035295002984448


baby don't hurt me baby don't hurt me no more


For reference: What is love by Haddaway https://www.youtube.com/watch?v=K5G1FmU-ldg


Created an account just to reply to this.


Was the bit about PHP standing for Personal Home Page a joke? I always thought it was "PHP Hypertext Preprocessor". The coolest thing about PHP is the infinite recursion in its name!


https://en.wikipedia.org/wiki/PHP#Release_history It was PHP 1.0, then PHP/FI 2.0, then became the PHP Hypertext Preprocessor at PHP 3.0 I believe.


Nope, it wasn't. The recursive name was tacked on later (probably still before 1.0).


The article ended up just yanking my Firefox session to somewhere in the middle of the page, followed by some Clippy expy nagging me about how fast I'm supposedly reading the article.


I think of it as a beautiful, colourful, crystalline structure.


Very interesting read... I enjoyed it.

One thing I noticed though is that the author is definitely stuck in the old "Microsoft is the great Satan" mindset. If he ever finds out about all the open-source stuff MS is doing these days under Satya Nadella, I think his head would probably explode.

He doesn't know what to say to a C# developer (nothing in common), but automatically trusts a Python developer? Really? sigh


I like Paul Ford -- first thing that made me subscribe to his Medium feed was this piece about brief, remembering and old computers: https://medium.com/message/networks-without-networks-7644933...


Anyone know what happens when you allow bloomberg.com to use your camera when you finish reading the article?


It adds a photo of you to the certificate, and then allows you to download it.


Can't be good, if Bloomberg is for it


Do you consider a photo for your certificate of achievement to be good?


Are you naive enough not to believe the photo will be saved, along with a host of other metadata, to Bloomberg's servers?


Well, you could read the source of course, to see whether the photo data is sent to the server or whether the cert is generated client-side. And then there is this: https://github.com/bloombergmedia/whatiscode/


I could be convinced either way.


The scroll performance was bothering me so much I had to add transform: translateZ(0); to the #background-canvas element of the page to stop the screen repainting on every fking scroll, so I could continue reading in peace without my eyes bleeding. Great article though :)
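(For anyone else suffering: roughly this from the dev console seemed to do it. It promotes the canvas to its own compositing layer, so scrolling no longer repaints it.)

    document.getElementById('background-canvas').style.transform = 'translateZ(0)';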


I had to switch to view/source to read the article. Halfway through there was a shopping cart on wheels obstructing the text (ironic).

    <div class="videoWrapper">
      <div class="videoWrapper2">
        <script src='//cdn.gotraffic.net/projector/latest/bplayer.js'>BPlayer(null, {"id":"P4_i7PihRGiWcPh3gdNMhg","htmlChildId":"bbg-video-player-P4_i7PihRGiWcPh3gdNMhg","serverUrl":"http://www.bloomberg.com/api/embed","idType":"BMMR","autopla...</script>
      </div>
    </div>

Also, I have no CPU activity at all, so presumably some plugins that are running for others aren't being executed in my copy of Chrome.


Sounds like a better experience than on Firefox, which fails to load anything, even text, past the first video.


> Smalltalk’s history is often described as slightly tragic, because many of its best ideas never permeated the culture of code. But it’s still around, still has users, and anyone can use Squeak or Pharo. Also—

>

> 1. Java is an object-oriented language, influenced by C++, that runs on a virtual machine (just like Smalltalk).

> 2. Objective-C, per its name, jammed C and Smalltalk together with no apologies.

> 3. C# (pronounced “C sharp”) is based on C and influenced by Java, but it was created by Microsoft for use in its .NET framework.

> 4. C++ is an object-oriented version of C, although its roots are more in Simula.

>

> The number of digital things conceived in 1972 that are still under regular discussion is quite small. (It was the year of The Godfather and Al Green’s Let’s Stay Together.) The world corrupts the pure vision of great ideas. I pronounce Smalltalk a raging, wild, global success.

Except that these examples are "object-oriented" in almost none of the ways Smalltalk was object-oriented: http://www.paulgraham.com/reesoo.html

The specious reasoning on display in this paragraph is almost offensive in its glib incomprehension. Calling Smalltalk "a raging, wild, global success" because modern programming languages call themselves "object-oriented" is like saying women in technology are well-represented because Ada Lovelace was the first programmer.

I get that it's supposed to be tongue-in-cheek, but like the rest of the writing in this article, it's supposed to be tongue-in-cheek in a way that gestures toward what the author actually thinks. In this case, what it's gesturing at is the notion that Smalltalk has had a large-scale tangible influence (if not wholesale adoption) on modern programming languages, which, if you actually take the time to understand the subject, is just not true.



The Charlie Rose interview about this piece: http://charlierose.com/watch/60575137


> “No,” I said. “First of all, I’m not good at the math. I’m a programmer, yes, but I’m an East Coast programmer, not one of these serious platform people from the Bay Area.”

seriously?


Well, in the same article he says "A computer is a clock with benefits." So I believe that no, he isn't serious.


That jumped out at me too. It colored the remainder of my reading experience.


yeah ... trying to put it behind me. this has otherwise been a great high-level introduction to coding.


It's a joke.



I would have loved the new Safari mute features on that page....

How dare you pollute my ears with garbage without a mute option.


Fun article. Even to this day sometimes I look at the software we have all built and wonder:

> It’s amazing any of it works at all.


Resizing my browser window (Firefox 38) in the middle of reading causes a section of the story to loop infinitely.


Code is a demonstration; the hypotheses are the prerequisites. Actually, the test class is the proof!


very cool article

although to a layman I would try to answer "what is code" more simply: code is just instructions.

instructions for how to tie a Windsor knot or cook a recipe or play a piano piece can be thought of as "code" executed by the human.
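a toy sketch of the same idea, with a loop standing in for the human (hypothetical, obviously):

    // "code" is just instructions precise enough for something
    // dumb enough to follow them exactly
    var recipe = [
        "crack two eggs into a bowl",
        "whisk until smooth",
        "pour into a hot pan",
        "stir for three minutes"
    ];

    for (var i = 0; i < recipe.length; i++) {
        console.log("step " + (i + 1) + ": " + recipe[i]);
    }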


I viewed source to see how they did the custom skin and noticed this:

    if (!console.log) console.log = function(){}
shouldn't it be if (window.console)?
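i.e. something more defensive, like:

    // covers browsers where console itself is undefined, not just console.log
    if (!window.console) window.console = {};
    if (!window.console.log) window.console.log = function () {};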


> It’s a comedy of ego, made possible by logic gates.


Anyone else find the easter egg ?

(..old school video game code)


What a cpu and memory hog that page is!


Many disclaimers:

1. This clearly took A LOT of work, and I have not finished reading it. I intend to, but as another comment calculated below, that will take around 127 minutes. This comment is simply about the beginning.

2. I'm not 100% certain yet what the intended goal of this article is, so I may just be off base. That being said, my criticisms should be interpreted more as questions, since I'm deeply fascinated with how to make programming more accessible. I hope they are taken as such, and that people share their experiences/successes/failures in getting people to understand "what we do". Again, like other commenters here I have suffered the fate of parents not really understanding what I do (unlike the at-least-superficial understanding people have of what a physicist does).

3. People learn differently; this is me pretending to not know anything and reading this article. It is thus flawed on two axes: I can't know for sure how I would have taken it in, and even if I did, it may be great for most people but bad for me.

All that being said, I had a few issues with this article('s beginning) if the goal is to make programming seem understandable to non-programmers. It seems to jump around a lot at the beginning and focus on just how complex everything is. If the goal is "programmers are justified in their work, look how complex everything they deal with is!", then this may be an OK approach. However, if the goal is to help them understand what we do day to day, it may not.

Some examples:

1. The early references to math. I once upon a time thought math was a pre-requisite to programming. I have now met enough awesome programmers that are absolute rubbish at math that I no longer believe that to be true. I believe referring to the "math" of things a lot scares people off (makes it seem like "one of those math things math people do" and inaccessible, when in reality your everyday programmer does not do a lot of (complex) math).

2. The early references to circuits, compilation, and keyboard codes. This is a tremendous amount of scope that is unnecessary in my opinion, and it just makes everything seem obtuse. Showing keyboard codes goes a long way in conveying how much a computer does, but I feel it is very confusing in relation to programming. I don't deal with "keyboard codes". We could also get into, for example, the actual hardware and how even having to deal with debouncing a key is hard! But I think everyone would see why that isn't great for the (introduction of a) programming explanation.

3. The circuits I believe are pretty and let you do things interactively, but I have a hard time believing they convey any information to people not familiar with programming. No one knows what XOR means (which you can flip the gates to), and just furthers the idea that code is this weird incantation we do. More putting them in "awe" of programming than understanding it.

Then again, I've been criticized for relying too heavily on analogy. My explanation would probably start with a lot of hand-waving: "let's tell the computer to get a sandwich, shall we?", then trying to get deeper bit by bit, etc. Others have probably tried this and failed, so I am genuinely curious whether people walk away from this article feeling like they have a better understanding of things.


"If you’re old enough to remember DOS, you know what a command line is."

This is a joke right?


why is it a joke? it's true.


it's funny because it's true.


You may be surprised how many programmers don't use a CLI. I have clients on Windows machines who saw me running git commands in a terminal and said "So glad we have a GUI to manage this for us".


TLDR, good lord


I would try to explain it as levels of abstraction and how they extend beyond the computers that execute the code. You can go down through the levels of abstraction, 1 by 1, until the point is made rather than attempting to start from the bottom and work up.

So, for example, when talking to the non-technical executive, the first level of abstraction is the technical expert who tries to explain complex technical issues. Below that, there might be a technical management layer that deals with technical issues on a more granular level, but still isn't looking at the code. Below that there are the actual developers who are writing code and are concerned with the actual logic the computer is executing. Below that are the framework authors who abstract away the common parts of writing an application of a certain type. Below that are the language platform authors who write compilers or interpreters that translate the code typed by the programmers into a format that either the computer or a lower-level abstraction (LLVM, etc.) deals with. At this point, it's probably not necessary to go any lower, but you can go all the way down to the CPU/machine architecture level, if necessary.

The key point is that even highly technical people have to trust the layers of abstraction below the point where they have full understanding. I've been coding for over 20 years and I still have only a cursory understanding of how my compiler translates the code I write into machine code, let alone how the actual hardware runs that code. I took EE courses in college and understand the theory, but the implementation by the folks at Intel and other hardware vendors is opaque to me and I'm forced to trust that it works.

The coders employed by your company may be able to dig into framework code, but the chances are that they're fully trusting the runtimes that they work with. That trust may be the result of a well-earned reputation or through testing that the claims made by the language runtime are empirically true, but it's still trusting something that they're unequipped to verify themselves. This need to trust bubbles all the way up to senior management. The systems are just too complex for anyone concerned with the finished product to understand the whole picture.

That means that, as an executive, you're likely trusting your senior technical leadership. The only way you avoid doing that is to dig in and better understand the abstraction layer they're providing. You can also make that trust easier by doing the same sorts of things that a coder does with their language runtime...give tasks to your abstraction layer and test whether they're completed successfully. And, when those tasks are not completed successfully, don't accept techno-babble responses, dig in to understand the wheres and whys of where things broke down. Likely, the chain of trust of those abstraction layers was broken at some point...figure out where that point was so you can prevent it from happening again.

Every abstraction layer adds uncertainty to the system. A CPU engineer can tell you how long a small task will take to within a ns or so. A compiler engineer can tell you how many CPU cycles an expression will result in and compute an approximate time for a given processor to within microseconds. And it continues as you go up the chain, until you're talking to senior management and they're giving you swags with a margin of error of months. Understanding this goes a long way towards explaining the behaviors that are so confusing to the non-technical executive. It's intimidating, but the good news is that many of the skills of a good manager are exactly what's needed to do the demystifying. The way you begin to understand these layers of abstraction is through inquiry. Ask the right questions and, over time, you'll understand more and more of how software development happens.


awesome


Ignoring the content, the structure of this article is amazing. It feels like an entire magazine in a single essay. The background animations that change as you scroll, the contextual content (try scrolling really fast). I'm not even all that keen on the bright oversaturated aesthetic, but it's just so cool. I'd love to see a short piece on how they made it.


Have you seen their error pages?

http://www.bloomberg.com/lookathis



> I'm not even all that keen on the bright oversaturated aesthetic

That's right.


I'm thrilled that they dare to have an aesthetic that hasn't been proven on a billion other sites.


I was rather disappointed. I wanted someone to point a finger and yell HAAAAXXXXX!!!


It is, in fact, an entire magazine in one essay. It's the only thing in the most recent issue of BW.


Intro articles like this do a lot to reveal biases and misunderstandings. Like with Java.

The article says "Java = enterprise", but I can tell you the best user experiences I ever saw delivered over the web were those done with Java Web Start (not applets, but applications launched in a JVM from the web). I developed several back in the day that continued to run for years, because users loved them and they were safe and secure.

Why Web Start didn't take over, I have no idea. It was also a superb platform for mobile delivery.


> Intro articles like this do a lot to reveal biases and misunderstandings.

This is one of the reasons I barely recommend any intro articles in Lean Notes (http://www.leannotes.com/): almost every single one is just a stream of incomplete and incorrect statements about how the world works, based on the author's myopic personal experiences.

Rather than properly generalizing and consolidating what needs to be said to convey a full understanding of the topic, most intros settle for the first example they can think of that could be remotely construed as related to the words they've previously used for whatever subject, regardless of whether it has meaning in any context. (Example: saying that type safety prevents you from trying to "multiply seven by cats".)

It seems like a pretty Dunning-Kruger thing: the less broad your knowledge is, the more justified you feel in writing an introductory text to the field.

The only case I can immediately recall of somebody actually qualified to write an introductory text actually doing so is Charles Petzold's [Code: The Hidden Language of Computer Hardware and Software][Code] (although I suspect, from the few excerpts of it I've seen, that Brian Kernighan's "D is for Digital" is good, too).

[Code]: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...


I can't understand the rationale behind this gaudy redesign job that Bloomberg carried out. I just can't wrap my head around it. It just violates everything that I know about web design and usability for news/corporate websites/portals.

Maybe they were trying to pull off a Craigslist here but still I can't really stomach these changes.


This is not the main BBG website, but yeah, I don't like the new BBG home page design either. This post's design (it's a post for bbg/graphics) is actually really awesome, and its main purpose is to be stereotypically "nerdy".


Good article. Subpar web design.


Watch out when capitalists incessantly push people to learn coding. They're trying very hard to cut the costs of their input "materials", and they will do everything they can to devalue us in every way possible.

So, if you're a talented and competent dev, be super aggressive with these predators and take everything your hands can grab before they have the upper hand and show us their true colors.

Happy Coding!


This is a really ugly, selfish attitude. It's like opposing literacy because it will put pressure on jobs for those who can read and write. It's circling the wagons around people who had the privilege and opportunity to learn these things before everybody else.


Agreed, we need more Bloomberg articles that increase the supply of financial and negotiating skills.


> It's circling the wagons around people who had the privilege and opportunity to learn these things before everybody else.

The underprivileged Bloomberg readership.


http://www.forbes.com/sites/venkateshrao/2012/09/03/entrepre...

"...the balance of power between investors and entrepreneurs that marks the early, frontier days of a major technology wave (Moore’s Law and the Internet in this case) has fallen apart. Investors have won, and their dealings with the entrepreneur class now look far more like the dealings between management and labor (with overtones of parent/child and teacher/student). Those who are attracted to true entrepreneurship are figuring out new ways to work around the traditional investor class. The investor class in turn is struggling to deal with the unpleasant consequences of an outright victory..."


This sounds contradicted by the mathematical/empirical/quantitative observations. Startups built on texting two alphabetic characters are attracting million-dollar financing rounds. Startups based on texts that erase themselves after a timeout are declining 3-billion-dollar acquisition offers. VCs are trying to win deal flow by building a reputation for being the most helpful to entrepreneurs. Interest rates are at historic lows, and hundred-billion-dollar pension and mutual funds are pouring money into every 1st- to 3rd-tier VC to chase returns. tl;dr: this sounds like some bs.


The privilege level of Snapchat's founder would place that particular example closer to the investor segment. Quantitative observations can be complemented by fundamental analysis. tl;dr the articles provide 10 pages of relevant context.


"Privilege level". I guess that means because his dad was already rich, the valuation of billions of dollars for a disappearing text service means nothing, along with the fact that forbes inflated a two-page article into 10 means something. Ok.


It means the "balance of power between entrepreneurs and investors" is different when your monthly allowance runs into the thousands, i.e. this very successful and commendable outlier is not representative of the norm.


I don't see that... or rather, if it's true here it's true to a significantly greater extent in most other industries.

Money continues to be available, and often lots of it. It's available on better terms than most others in most other professions can even imagine receiving.

To put it bluntly: in most industries you are meat and own nothing and never have any chance of owning anything. This has been the condition for nearly all human beings who have ever lived, today and in the past.

There are also more alternatives to VC today: larger angel rounds, crowd funding, etc. It's also easier to bootstrap since everything (but people) has fallen in price. Those two things together have made the funding environment more competitive for VCs -- they have to offer more value or compete at the higher end.


Let us keep the sacred arts secret, brothers.

EDIT: Just being sarcastic. You are clearly incompetent, coasting along in your job, and afraid of someone with 6 months of experience being better than you.


Seems to have worked well enough for doctors and lawyers. They're unionized (through the AMA and ABA), upper-class professionals who command much more respect from the general public than we do, and whereas our salaries tend to max out at around $150k, theirs can easily exceed $500k (in the case of medical specialists or law firm partners).


Hmm. Yeah, I don't know. It's possible to make substantially more than $150K as a high-school-dropout software engineer. And of course there is, these days, a cash-out option for some.

I'm okay with giving respect to someone who spends 7 years in school - 4 + 3 for law school, or more in the case of doctors, to learn to be professionals in their field. And I have no qualms with what they make either.


The AMA and ABA aren't labor unions; outside of government service -- where programmers are also often unionized, too -- doctors and lawyers generally don't have labor unions.


How exactly are they not unions? They lobby for legislation to erect barriers to entry to reduce competition and secure higher wages for their members. The ABA was actually the target of an anti-trust investigation by the Justice Department that resulted in them pleading guilty and paying a fine: http://en.wikipedia.org/wiki/American_Bar_Association#Antitr...

The AMA (and AAMC, and the licensing boards) should be dealt with the same way, ideally even more harshly.

All your post did was underline just how effective these professions are relative to programmers at securing higher incomes and greater prestige and shaping their public perception.


Lobbying for legislation is a tangential part of what a labor union does, and lots of organizations that aren't labor unions also do that. Labor unions first and foremost do collective bargaining on behalf of employees with the business owners who employ them. ABA and AMA don't do that.

ABA and AMA are professional associations, like ACM, that happen to have effective lobbying arms, and their members are often independent business owners rather than employees.


You must not know many doctors.


Everything I've heard about the lives of doctors and lawyers says that by and large (discounting the handful of percent of outliers) those professions make living on welfare or working at McDonald's for minimum wage look like brilliant career moves. I wouldn't look to them for models of how to do things.

I do, mind you, think it is a good idea to encourage programmers to improve their negotiating skills and understand a little about business and politics.


And you are an idiot.


Capitalists only want to reduce unnecessary costs, not all costs.

A programmer is more like factory equipment than a factory worker. If a company invests in superior equipment, they can produce better quality products and net higher profits.

If we're going down in the name of efficiency, the managerial and legal professionals are going first.


In this industry it seems that product success is only loosely related to product quality.


The accountants are in even bigger trouble than the lawyers, fwiw.


> Watch out when capitalists incessantly push people to learn coding. They're trying very hard to cut the costs of their input "materials", and they will do everything they can to devalue us in every way possible.

Did you actually read this article? The article doesn't aim to teach Bloomberg's audience (which consists of VPs, SVPs, and managers, as implied in the first couple of paragraphs) how to code or replace the average developer.


This is egregious if you replace "coding" with "writing" or "reading".

Historically, any perceived detriments of mass education have been significantly outweighed by benefits.


That's true, but it's also true that you can only add so much stuff to the syllabus, and then there'll be too much school.

You've got to draw the line at some point. Why not draw it at boring computer stuff that barely anybody needs to know?


Capitalists want people to be generally happy. Happy people buy more things. People who have useful, well-paid jobs are generally happy (and can buy more things). Everybody knows that there will be a huge drop in available jobs over the next few years, but software development skills will still be in high demand. Hence, capitalists want more people to learn how to code, so they can keep their jobs, so they can be generally happy and buy more things.

Capitalism _is_ profit-oriented, but happy people bring more profit.


Indeed. Labor costs.


Why did Bloomberg ask to use my camera while I was reading the article?


So it could take a photo of you for your certificate of achievement


"How often are you going to be multiplying sevens and cats? Soooo much."

Where the fuck does this meme of "fundamental type mismatches come up all the time in ordinary code" come from? What kind of defective system are people writing where it's normal for strings and numbers to be interpreted relationally (even accidentally)?

It sounds like the author is trying to demonstrate the significance of things like syntax transformations and format conversions (like transforming an email address to a mailto link), but that's nothing like "multiplying sevens and cats". It's manipulating things that aren't inherently incompatible - if anything, it's multiplying sevens and "7"s.
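In JS, at least, here's the mundane reality:

    7 * "7"    // 49     -- the string is silently coerced to a number
    7 * "cat"  // NaN    -- no error, just a poisoned value flowing downstream
    7 + "cat"  // "7cat" -- and + doesn't even get that far; it concatenates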

All these batshit insane contrived examples in asides like http://www.bloomberg.com/graphics/2015-paul-ford-what-is-cod... do is make code seem less accessible and comprehensible to anybody who isn't already intimately familiar with what's safe to interpret as sarcasm or hyperbole and what's not, which goes exactly contrary to the stated thesis of the article.


It can happen accidentally quite easily. Someone new to a codebase starts hacking in a feature and mistypes a variable as 'value' instead of 'values'. They fail to realize there's already a 'value' variable in the global namespace (perhaps it's a gigantic spaghetti code mess of a file). They don't have good test cases that exercise this exact line and fail to see the bug. Code ships to production, three months later the line runs and explodes.
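A contrived but concrete JS version of that:

    var value = "tabby";  // lurking in the global spaghetti, set somewhere far away

    function total(values) {
        var sum = 0;
        for (var i = 0; i < values.length; i++) {
            sum += value[i] * 7;  // typo: meant values[i]; yields NaN, not an error
        }
        return sum;  // NaN quietly ships to production
    }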


Your example is quite good, although there are far more bulletproof ways than exhaustive test cases to make sure this doesn't happen.


On the web it's sort of all strings, so it's not hard to be in a situation where you have "length=7" & "cat=tabby" and get into a problem. Beyond that, many developers are in the habit of using primitives for everything, which makes these sorts of errors much more common.


Yes, it's possible to have strings for two different things. In what world are those strings going to be cross-evaluated?


I'm glad I came here to read the comments that urged me to read on, because I stopped at the point where the VP was whining that his job was on the line and the software guy's wasn't. Made me a little sick to my stomach. In what company is that ever the case? Even if the VP's job is lost (a rare occurrence in my experience), the severance package is more than the software person's salary for a year.


My understanding was that the development manager in the taupe blazer was an IT consultant brought in to run the project, making it a little easier for that person to disappear to the next gig no matter how disastrous the project turned out.


That's what I was thinking too. But then they seem like employees in the rest of the story. A little ambiguous.


Hmm, that did come out quite negative. I'm sorry. Personal stress coming through.



