"But It Doesn't Mean Anything" – A demonstration of the value of computer code (ex-parrot.com)
86 points by inglesp on Feb 13, 2014 | 56 comments



Paxman's point about code being 'meaningless' was to emphasise that the end goal of this initiative is stupid. It's not "learn to think analytically", it's not "learn to express your ideas precisely", it's not "learn to instruct a computer", it's not even "learn to use a computer", it's "learn to code".

Computers are clearly important, and Paxman would not deny it, but that doesn't make school IT classes any less pointless. They are terrible because they teach which buttons to click to make Microsoft Word 97 display text in bold, rather than the ability to think about problems; arguably they teach against thinking about problems (like "how do I make text bold?"), in favour of rote learning and hand-holding. Likewise, teaching kids which punctuation marks to press in Notepad to make an HTML element turn green is also a terrible idea, because it's focusing on the code. This is also the most common complaint I've heard about undergraduate computer science courses (teaching one particular language's syntax rather than how to solve problems).

I saw this summed up best on Slashdot: http://developers.slashdot.org/comments.pl?sid=4771525&cid=4...

"The problem here is that "code" is being used as a synonym for "computer magic"."

This captures the essence of this interview exactly. Paxman made a provocative remark, and did Dexter argue against it? No, she agreed with him and said some nonsense about code being for computers:

"It doesn't mean anything to you, or to me yet, but it's the set of instructions that you type into a computer to get an output."

In other words "computer magic". What Dexter doesn't realise, probably because she's not spent a significant amount of time programming, is that code is for people to understand and only incidentally for machines to execute. If programming were only about satisfying the computer, our keyboards would only have two big buttons: 0 and 1.

Now, this was essentially true in the early days of computing (punched cards), but we've progressed beyond this. The reason we use code is to allow people to discover, understand and build on ideas.
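
To illustrate (a made-up sketch in Python; neither function is from the interview): the machine is equally happy with both of these, and the difference exists only for the humans who read the code.

    def f(x):
        return x * 9 / 5 + 32

    def celsius_to_fahrenheit(temp_celsius):
        """Convert a temperature from Celsius to Fahrenheit."""
        return temp_celsius * 9 / 5 + 32

    print(celsius_to_fahrenheit(100))   # 212.0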


"The reason we use code is to allow people to discover, understand and build on ideas"

The reason we use WORDS is to do the same thing...

But somehow nobody will ever insist that:

"Likewise, teaching kids which letters to press in Notepad to make a sentence is also a terrible idea, because it's focusing on the characters."

or that teaching spelling, punctuation and grammar are "meaningless".


The analogy's not quite the same though; it's good to teach spelling, punctuation and grammar, but they're not the end goal. Indeed my Google-fu can't find any reference to a "Year of Spelling", a "Year of Punctuation", a "Year of Grammar" or, indeed, a "Year of Text".

I can, however, easily find links for "Year of Reading", "Year of Writing" and "Year of Literature".


spell eng

Punch You! Hey Sean

and granma

ain't meaningless but take a backseat to phonetics and typography, and contemporary pedagogical practice in language arts emphasizes the act of writing rather than compliance with standards from the Victorian age.


Sometimes we keep standards for a very long time because they're really effective. Written English is one such example.


Poiesis does not require standards. Shakespeare is from the age before standardized English and "Poiesis" fails my browser's spell check. It's not just e e cummings and Ginsberg's howlings.

The obsession over standardized spelling in particular is an accident of the English language. There aren't spelling bees in German or French because phonetics is the important point of reference in ordinary human language.

The standards of the English language are useful, but for matters of taste and social differentiation, not communication, e.g. the tenses of "read", which is about as important as a word can get in the context of written communication.


"The standards of English language are useful but for matters of taste and social differentiation not communication"

There are people living in the same country as me, speaking the same language, less than 200 miles away, whom I struggle to understand in conversation. We spell and write the same way, so we can communicate through the written word. If we relied on phonetics to be able to communicate, we wouldn't be able to. Having standards means we can communicate; I suppose what I'm saying is that I just plain think the statement above is wrong, based on my own personal experience.


There aren't spelling bees in France, but there are dictation competitions, graded on spelling, punctuation, and accents [grave, acute, etc]. And the French (at least the Academy) are notoriously sticklers for the purity of the language.


What do you think is the absolute minimum requirement to use a computer to its full potential?

A computer is basically a universal machine. It can do anything.

To use it, you need to specify your need with no ambiguity whatsoever. Code is just that: a non-ambiguous, Turing complete formal specification. C, Haskell, boxes you drag and drop, it doesn't matter. If you don't learn to construct and manipulate such code, you will never realize your computer's full potential, and will be at the mercy of those who do.
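
A toy illustration of that lack of ambiguity (my own example, in Python): the English instruction "discount items over 100 by 10%" leaves open whether "over" includes 100; the code cannot.

    def apply_discount(prices):
        # "Over 100" is settled here once and for all: strictly greater than.
        # English lets you stay vague; the code forces a decision.
        return [p * 0.9 if p > 100 else p for p in prices]

    print(apply_discount([50, 100, 150]))   # [50, 100, 135.0] -- 100 untouched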

I'm not against division of labour, but this is different. Computers are the first machines that can have all our mental powers. The only reason they don't is because we don't know how most of our powers work. The day we do, however… http://intelligenceexplosion.com/

"Computer magic" you say? That's more true than you think: programming its awfully close to hermetic magic as found in fantasy settings: scribbling bizarre abstract signs on a surface does wonders beyond the understanding of most mortals, learning to make such scribblings often takes years of dedicated practice, and a single mistake in those scribblings, however small, can result in catastrophes, up to and including fatalities.

Coding is an Arcane Power from the Ancients. We should treat it with the same respect. Heck, everyone wants to be a Wizard. Why doesn't everyone want to know some actual magic?


In practice, however, our contemporary computers are far more complex than `send input/get output; master computers`.

Not because the basic mechanics have changed, but because our software ecosystem is vastly complex. The days of having a relatively simple and grokkable system are long over. Our computers are powered by extremely complicated, intricate and numerous interacting subsystems, where learning everything is akin to being a polymath, and people naturally need to specialize.

With this in mind, not all code is created equal. You're still far more at the mercy of others with a visual programming language than with a textual one.

Yes, programming is essential to realizing the full potential of your computer. Yet there is still so much more. The code is the main arbiter, but there is still a lot beneath, dealing with computation, discrete mathematics, information theory and so forth.

You're still at someone's mercy when you just blindly type instructions guided by your interpreter and it spits back programs with horrible performance, because you have no idea how data structures work, can't optimize for shit and don't really know what your interpreter does behind the scenes anyway.
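
To give one concrete example of what gets glossed over (my own illustration, in Python): a membership test on a list scans every element, while a set hashes straight to the answer.

    import time

    items = list(range(1_000_000))
    items_set = set(items)

    start = time.perf_counter()
    found = -1 in items          # O(n): scans the whole list
    list_time = time.perf_counter() - start

    start = time.perf_counter()
    found = -1 in items_set      # O(1): a single hash lookup
    set_time = time.perf_counter() - start

    print("list: %.6fs, set: %.6fs" % (list_time, set_time))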

I appreciate the nice fantasy gobbledygook about ancient arts and black magic, but I think you're getting full of yourself here.

Ultimately, in order to cope with complexity, one must first learn to operate at a level above coding itself (system administration).

This is one of the aspects where the "learn to code" movement is shortsighted. When you operate on very high level abstractions, you get false ideas. Of course, abstractions aren't bad and ultimately every new language we use strives to wrap more away and give us a simpler interface. Yet being ignorant of the low-level details is still a curse, a disability, if you will.

That and IMO, it's just fucking stupid to have kids who can write console apps in C#, but can't use the shell to debug an OS issue. What use is programming when you don't even know your environment?


> Yet being ignorant of the low-level details is still a curse, a disability, if you will.

That very much depends on how leaky the abstractions are.

The things you consider "low-level" are themselves quite high level and abstract. A "register" and a "cpu instruction" are abstractions too. Yet you can take them for granted without worrying about the microcode inside the CPU, or the clock propagation, or quantum leakage in the logic gates. Etc.

We need good abstractions, and when they're actually good we shouldn't worry ourselves about what's going on inside the box, unless for the pure fun of hacking.

In practical terms, I agree with you that most people who are trying to write software today would benefit from knowing more of the layers below them. But this is only necessary because of the shortcomings of our abstractions.

If we achieve a glorious future where all people can wield the full power of general purpose computers, it won't be because everybody learned all the arcane layers. It will be because we put together really powerful abstractions.


That's fine. I still think one should know how to do sysadmin work before coding.


> Our computers are powered by extremely complicated, intricate and numerous interacting subsystems

Which is one of the big mistakes of the last decades. I understand market forces and Worse is Better, but the result is still way worse than what we could have gotten if we lived in Ponyland.

Complexity is overrated. Current home desktop systems (OS + browser + office suite + mail + drawing app) are over 200 million lines of code. Now guess how much it really takes to build an equivalent.

About 20,000 lines, including the self-implementing compiler collection.

http://vpri.org/html/work/ifnct.htm

http://www.vpri.org/pdf/tr2011004_steps11.pdf


I too dislike the obsession with learning to code because, unless you're doing it voluntarily and letting your own passion guide you to backfill missing bits of knowledge, it becomes a rote learning exercise devoid of understanding. As you state, the focus isn't on how to solve problems, which is the raison d'être of programming, so the effort is wasted.

The problem I see is that we treat computer science as something you eventually get round to learning once you've already taken some preliminary steps with coding, and even then it's something to be looked at late into high school or at university. Code is the awesome tool that motivates learning, but I wonder if the process is taught the wrong way round. Could it be done the other way: a little compsci first, then code?

I started a series of blog posts trying to introduce the non-technical reader to the fundamentals of computer science in plain English: http://lifebeyondfife.com/tag/compsci-in-plain-english/. It's still a work in progress; I'm trying to keep it interesting and explain why computer science is important at the same time.


Learning to problem solve would be a much more important lesson than learning to code.

Problem solving transfers across disciplines and is a useful lifeskill.

It could be combined with how to assess the quality of information or how to find decent sources of information.


There isn't really such a skill as "problem solving"; problem solving is just applying intelligence and experience to something that you have been taught to do.


This is for children, so it's very basic. Read the question. Reword the question if that helps. Gather the information you have. List the information you need, and how to find that information. Sit and think about it a bit.

There are plenty of people who just throw their hands up in the air and give up if you ask them something they don't know.


Precisely this.

The "learn to code" hysteria is just the authorities trying to sweep over their incompetence in compulsory schooling by seemingly appearing to be in the times.

I can only expect progra-sorry, coding, will be awfully taught.


You say that now - and the inevitable procurement carve-up bonanza hasn't even started yet! I for one can't wait to see what overpriced proprietary toy language my hypothetical children will be pre-emptively de-skilled in.


> code is for people to understand and only incidentally for machines to execute

This was a clever quote that has been taken way too far and simply isn't true. Even Hal Abelson backed off of that quote in SICP when pressed about it: http://www.gigamonkeys.com/code-reading/. Code just is not primarily for people to read – it is primarily to make computers do things. Just think for a second about what you're saying:

> The reason we use code is to allow people to discover, understand and build on ideas.

Really? The reason most people write code is to express an idea with that code? I don't think so. People code to make computers do things. The things the computers do may facilitate communicating ideas (e.g. wikipedia), but the code itself is not primarily about ideas. Honestly, the only exception I can think of is when I'm writing example code to convey ideas about programming.


"What Dexter doesn't realise, probably because she's not spent a significant amount of time programming, is that code is for people to understand and only incidentally for machines to execute."

Lottie Dexter doesn't know how to code. She said it in the interview.


She did not say this:

"It doesn't mean anything to you, or to me yet, but it's the set of instructions that you type into a computer to get an output."

She said:

"It doesn't mean anything to you, or to me yet, ^^^because I do not know how to code.^^^ But it's the set of instructions that you type into a computer to get an output."

Is ex-parrot.com affiliated with Paxman or this initiative?


What basis are you using to know what Paxman was thinking? I agree there's room for potential misunderstanding (although I still think that sequence was horribly executed), but do you know that? Or is it just conjecture about what his meaning was? You could be projecting your thoughts onto him.


Paxman's whole shtick is that he argues with the people he interviews. Usually it is quite effective, because mostly he interviews politicians who are full of shit and he points it out in a way that more reverential interviewers don't. However, he probably doesn't mean everything he says: he just takes a contrarian view to whatever the interviewee says.


Quite right. I don't assume that Paxman has any of the opinions he appears to espouse; he's trying to get at the truth of the matter in a way that people unfamiliar with the topic can understand.

It's part of an interviewer's job to ask the questions that the random person on the street would like answered. Sometimes an aggressive style is the best way to force the interviewee to get to the heart of the matter and make their best arguments (which is part of why lots of Colbert's interviews are actually very good).

Perhaps it's cultural. I've noticed before that when Stephen Sackur interviews people from some cultures (including the USA) on HardTalk, and he asks them a difficult question, or confronts them with the opinions of their critics, they assume that he is personally attacking them. Interviewers are there to get the interviewees to talk and answer the questions that the viewers will have. It doesn't matter what their personal opinions are, and you shouldn't expect to be able to infer them based on the questions they ask.


My issue is that if you're going to play devil's advocate then you should come up with stronger arguments than "it doesn't mean anything". If there are legitimate questions about the program, that isn't one of them.


I don't think his job is just to argue against the theory; that remark, and the fact that the interviewee was unable to confidently rebut it, shows that the people responsible for it are ill-prepared, and that is a good argument against the practical implementation of the program.


I thought he was simply putting forward the comment that some people make - "what's all that gibberish on the screen?", and seeing what she has to say to that.


Symbols on a computer don't mean anything. They're only interesting for their manipulation of outputs that can be used as physical control systems, or for the values that we project onto them (the true content of endless font discussions that don't involve dyslexia or blindness).

I'm comfortable with this, and it doesn't feel like a threat.


I recently spoke to a guy who sells swaps for a major investment bank.

The basic idea is that he comes up with some trading strategy, and writes the code that computes the profit/loss associated with that strategy if it was executed at market fixing prices (typically the market open or close each day).
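
In sketch form (hypothetical names and numbers, in Python; the real thing is vastly more involved): the core of such a calculation is just positions times price moves between daily fixings.

    def strategy_pnl(fixing_prices, positions):
        """Daily P/L of a strategy marked at market fixing prices.

        fixing_prices: price at each day's fixing (e.g. the close).
        positions: units held from one fixing to the next.
        """
        return [pos * (fixing_prices[i + 1] - fixing_prices[i])
                for i, pos in enumerate(positions)]

    # e.g. long 10 units, then short 5, across three daily closes
    print(strategy_pnl([100.0, 102.0, 101.0], [10, -5]))   # [20.0, 5.0]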

The bank sells a total return swap on this strategy, which means that in return for a fee from the client, they pay the client the stream of profits (or losses) each month that would have resulted from trading this strategy. Behind the scenes, the bank trades the strategy (or something like it) in order to hedge the risk associated with the swap, so that they can earn the fee approximately risk free.

There's an obvious problem with this, which is that the client basically has to take the bank's word for what the profit/loss of the strategy is. To counter that, the guy who came up with the strategy now has to write a 100+ page document in legalese, which outlines exactly how the profit/loss on the strategy is calculated. This has to be sufficiently detailed that someone could re-implement the code themselves to check it. The guy I spoke to said that this documentation takes up >50% of his time.

I'm sure everyone here will appreciate how incredible it is that a bank will pay someone six figures every year to spend more than half their time writing documentation that literally does nothing more than reproduce a piece of code, except about 50 times more verbosely.

Different symbols, but it doesn't mean anything.


It's important to remember that the "year of code" initiative is not related to the new curriculum coming into force in September 2014. My wife is writing a book for the new curriculum and, from what she's told me, it sounds quite good.

At Key Stage 1 (ages 5 to 7) children are taught about algorithms, that they are a way of breaking tasks into a sequence of steps that can be reused to solve problems.

They will also be taught about the pervasiveness of software - about the different kinds of devices and appliances that rely on software.

Finally, they will be taught about privacy. I'm not sure of the content but I assume it'll be about managing information about themselves.
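
To make the Key Stage 1 idea of an algorithm concrete, here's a toy sketch (my own example, in Python; not from the curriculum or the book): the same sequence of steps, reusable for any input.

    def make_sandwich(filling):
        # The same sequence of steps works for any filling; that reuse is
        # what makes it an algorithm rather than a one-off recipe.
        return [
            "take two slices of bread",
            "spread " + filling + " on one slice",
            "put the other slice on top",
            "cut in half",
        ]

    for step in make_sandwich("jam"):
        print(step)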


It's great that they're being taught algorithms. It gives a context and reason to math and fights the "when am I ever going to use this" problem. Math should be taught alongside algorithms to give the techniques context throughout education.

I don't think kids benefit much at all from being taught web stuff. HTML is just a lot of boilerplate, and there are better languages than JavaScript for reducing the barrier between code and things happening on screen.

If trigonometry is being taught alongside making an asteroids game, I think that could enlighten kids on what it can be important for. It's not just filling in values on a triangle, it's direction, it's how things move, it's integral to physics, and it's important.
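
A minimal sketch of that asteroids idea (my own example, in Python, no game framework assumed): heading and speed become (x, y) motion via cos and sin.

    import math

    def move(x, y, heading_degrees, speed):
        # Trigonometry turns "which way am I pointing, and how fast?"
        # into "where am I next frame?".
        angle = math.radians(heading_degrees)
        return x + speed * math.cos(angle), y + speed * math.sin(angle)

    print(move(0.0, 0.0, 90.0, 5.0))   # roughly (0.0, 5.0): straight "up"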


When good books written by knowledgeable and well-meaning subject specialists are taught by teachers who do not and do not wish to understand the material, the result is a predictable disaster. Doubly so since most teachers are not particularly smart.

I know that sounds mean, but look at http://www.statisticbrain.com/iq-estimates-by-intended-colle... and notice how students going into education strongly tend to have lower IQs than most college majors. Smarter than the average adult, sure. But well below most people who get to college.


I agree with you for a middle or high school level course, but I would hope a teacher could wrap their heads around a 5-7 year old level explanation of algorithms if they're capable of teaching math.


Based on informally asking around, most teachers who are supposed to be teaching fractions cannot figure out whether 2/3 is bigger than 3/5 or vice versa.
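
(For anyone wondering: cross-multiplying settles it in one step, since 2×5 = 10 and 3×3 = 9, so 2/3 > 3/5.)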

Given that fact, I am not optimistic about what you hope.


If that's the case, I don't believe dumbing down the curriculum to the lowest common denominator of teachers is the solution to the problem.


So much of that is hard to watch.

Particularly the teaching of jQuery to people who are just starting out. And the multiple assertions that one can "pick it up in a day", whether it be programming or teaching skills.

Seems to be about throwing together a web page in a day rather than learning to code.

And it seems the usefulness of computer programs doesn't extend beyond ecards, web sites, and "apps".


"Code" is a horrible word.

I prefer "source documents", or, if pressed, "Formal descriptions of the program".

Using the word "code" implies that the document is "encoded" somehow, which is plainly undesirable and wrong.

With some notable (1) exceptions, the primary consumer of a "source document" is a human being, not a machine.

The machine's purpose is to ensure the formal validity and correctness of the document - but the majority of the informational content of the document (variable names, structure, comments) is exclusively directed to human developers.

We will never program in a "wild" natural language, but many programming languages (2) make a deliberate effort to support expressions which simulate or approximate natural language usage, albeit restricted to a particular idiomatic form.
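
Python is a good example (a generic illustration of my own, not tied to any particular codebase): much of it reads like the restricted, idiomatic English described above.

    # Each line is valid Python and close to how you'd say it aloud.
    admins = {"alice", "bob"}
    user = "carol"

    if user not in admins:
        print(user + " is not an admin")

    words = ["code", "is", "for", "people"]
    print(" ".join(word for word in words if len(word) > 2))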

There will always be a tension between keeping a formal language simple enough to reason about and permitting free, naturalistic expression - but this is the same tension that makes poetry and haiku so appealing as an art form.

So many source documents appear to be "in code", not because this is a necessary part of programming, but because it is very very difficult to write things which combine sufficient simplicity for easy understanding, and the correct representation of a difficult and complex problem. In most of these cases, clear understanding is baffled not by the programming language, but by the complexity of the real world.

The rigidly deterministic nature of the computer forces the programmer to deal with a myriad of inconsistencies and complications that the non-programmer is able to elide or gloss over with linguistic and social gymnastics. The computer forces us to confront these complications, and to account for them.

In the same way that Mathematics isn't really about numbers, but about the skill and craftsmanship of disciplined thought, programming isn't really about computers, but about what happens when you can no longer ignore the details within which the devil resides.

(1) Assembler & anything involving regular expressions. (2) Python


>What angers me here is Paxman's attempt to make a virtue of his own ignorance.

It's important to be aggressively ignorant when interviewing people who are making a case for something. It's a hand-waving and "common sense" repellent.

You can't just say that everyone should learn to code because coding is important and everyone needs to know how to do it.

Other than the central argument, though, I actually love this post :) Paxman shouldn't mind being made a strawman for a pretty exploration like this. More, please.


This made me realize just how much the media's visual portrayal of computer programming is a problem. When he said "It doesn't mean anything" he's gesturing over to a stereotypical background image representation of code which is all tilted, blurred, overlapping, clashing colors, etc. It's purposefully skewed in every dimension to make it incomprehensible, and he actually uses that as evidence about something.


The interviewer is partly right, and students who have a natural aptitude[1] for programming often recognize this: Code is meaningless. At the application level, you may have something that looks like an image. Underneath, you may represent it with numbers. Numbers, however, are also a fiction represented with bits. The meaning is based entirely on one's interpretation. Creating a mapping between meaningless formalism and meaningful interpretation is the principal obstacle to learning to program.
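
That interpretation point is easy to demonstrate (my own example, in Python): the same four bytes mean completely different things depending on the reading you choose.

    import struct

    data = b"\x42\x48\x65\x79"

    print(struct.unpack(">i", data)[0])   # as a big-endian int32: 1112040825
    print(struct.unpack(">f", data)[0])   # as a 32-bit float: about 50.1
    print(data.decode("ascii"))           # as ASCII text: "BHey"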

I understand the point the author is trying to make, and the interviewer in the video has not grasped the above. To answer his concern, ask him what meaning does the letter 'd' have? Next to none, but in composition it can provide quite a lot of meaning. (This is an argument from Hofstadter.)

[1]: I had a study/source for this, but I can't find it.


So, they are teaching people to code, so they can make ecards? It is probably much easier to use something already written to create an ecard... And that is my whole point with this "learn to code." People think that everyone in the future will need to know how to code. No, in the future it will be important to know how to use the computer, but not everyone needs to know how to code. Just like not everyone needs to know how to do everything else in the world. You can teach people how to use computers without teaching them to program. And teaching them to program won't necessarily teach them how to use computers.


I don't think that Paxman's comment was as flippant as this article is making out. If you watch the video, his comment follows Lottie Dexter vaguely waving at the graphic of code behind him in a desperate attempt to describe what code actually is when she isn't quite sure herself. What he sees is a blurry mess of code pulled from who-knows-where that blatantly isn't of any use in its current form. I'm not trying to downplay some of the awesome things done with code in the article, but I think the comment it's based on has been misunderstood.


Since the interview is seen by the general population, he was just asking questions that an average computer-illiterate person would ask, and at the same time challenging and testing her on the subject.


Two of the most common questions I get asked when someone's curious about coding are:

1. So is it all just 1s and 0s?

2. So do you know how to hack?

Sometimes I cynically answer the first question with "Yes, 1s and 0s all day. Sometimes I don't even use 0s, just 1s", which I'm pretty sure I stole from a Dilbert comic, but it's funny seeing the expressions I get.


wait wait wait your company can afford ones?!


One of the few benefits to working for a larger company rather than a startup - they can afford 1s and the more successful companies can afford 0s too!


Fucking Oracle...they'll let me use hex but it's a pretty hefty pricetag. >:(


"Code doesn't mean anything" is so mind boggling ignorant, it seems more like a cynical and rather insulting caricature of what the average computer illiterate person would ask.


Ah, you obviously never worked technical support! :)


Ah, tech support invokes a selection bias though. ;)


I found it funny that she said "But it doesn't mean anything" when lots of artists nowadays make a career out of using the FFT in Autotune...

Did I say that out loud?
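
The joke lands because the tech is real. A bare-bones sketch of the first step behind Autotune-style pitch correction (my own toy example, assuming numpy is available): find the dominant frequency with an FFT.

    import numpy as np

    sample_rate = 44100
    t = np.arange(sample_rate) / sample_rate
    signal = np.sin(2 * np.pi * 440.0 * t)    # one second of A440

    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    print(freqs[np.argmax(spectrum)])         # ~440.0 Hz: the detected pitch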


I recommend that anyone who doesn't see the value of learning the fundamentals of writing code read the book "Player Piano" by Kurt Vonnegut, which depicts a dystopian future where most human labor is replaced by machinery and software, and engineers and managers rule society.

http://en.wikipedia.org/wiki/Player_Piano_(novel)


If the general populace doesn't need to learn about programming because it won't mean anything to them, then the general populace really doesn't need to study the vast majority of history that we're forced to sit through and regurgitate in unimaginative fashion.

I, on the other hand, will sit on the side of the liberal arts and say that any and all learning is worthwhile.


The "Russel Brand" link (just before the footnotes) is pointing to the wrong url.


Paxman is a twat.



