Originally, "code" referred to machine language. "Programming" meant designing the computation, and "coding" referred to translating it to machine language. "Automatic coding" was the term for translating a high level program to machine code, and it covered compiling and interpretation.
You can read all about it in Grace Hopper's 1955 paper Automatic Coding for Digital Computers.
"The
analysis
is
the
responsibility
of
the
mathematician
or
engineer,
the
methods
or
systems
man."
[...]
"The
job
of
the
programmer
is
that
of
adapting
the
problem
definition
to
the
abilities
and
idiosyncracies
of
the
particular
computer.
He
will
be
vitally
concerned
with
input
and
output
and
with
the
flow
of
operations
through
the
computer.
He
must
have
a
thorough
knowledge
of
the
computer
components
and
their
relative
speeds
and
virtues."
[...]
"It
is
then
the
job
of
the
coder
to
reduce
the
flow
charts
to
the
de-
tailed
list
of
computer
instructions.
At
this
point,
an
exact
and
comprehensive
knowledge
of
the
computer,
its
code,
coding
tricks,
details
of
sentinels
and
of
pulse
code
are
required."
I don't think that's accurate. It used to be that "he" and "man" were understood to be gender neutral in abstract and impersonal cases, as in "to boldly go where no man has gone before", "mankind", and "mailman". During Hopper's life, English language changed to expunge the ambiguity.
I have to concede that I still talk in this now-outmoded manner, and I have to make a conscious effort to switch to modern gender-neutral language (e.g. "they") when speaking or writing to anyone outside my family and friends.
> English language changed to expunge the ambiguity.
I think it's more accurate to say it adopted different ambiguities. The use of "they" can create ambiguities of number, and merely alternating between "he" and "she" leaves the same ambiguity (is the gendering intentional or not?), albeit in a gender-balanced way.
As a side-note, "he" was used to refer to people, who could be regarded as interchangeable (man or woman), and "she" was reserved for "uniquely individual" things, which is why countries and ships (to take two examples) are referred to as such. At least, this was my understanding growing up, and I've never lost the habit.
This is a beautiful distinction (one that I'm aware of), but it appears to have been lost today.
Consequently, these terms are now used interchangeably, leading many people to conflate these two processes.
However, this distinction is indicative of how getting a computer to do something has two components:
1. Figuring out what needs doing, and
2. Expressing this in a form that the computer can carry out.
The distinction is lost because in many cases it isn't relevant anymore. You don't need to write flow charts, state diagrams, or class diagrams before you start writing code. Today's tooling allows for more iterative approaches and leaves room for making syntax errors without having to wait hours or days for "compiler" errors.
But I agree it's still interesting to know the distinction, and to know that the two can require slightly different skill sets. So when you have someone who is very good at planning and has the experience to lay out complex systems in advance, you can assign them an architect title, while with someone who knows a programming language inside and out, you might be better off keeping them in the "driving" seat.
The point here is that when higher level languages were introduced, writing the higher level program was understood to be part of "programming"; and its compilation or interpretation became "automatic coding".
Another thing worth paying attention to in Grace Hopper's paper is this: higher level program code is in fact called "code", but specifically "pseudo code".
That which we call "pseudo code" today doesn't execute. But we sometimes remark that certain high level languages are almost like "executable pseudocode".
At that time, the task of the compiler/interpreter writer was to take "pseudo code", formalize it more precisely, so that the machine could then "automatically code" from "pseudo code" to "code".
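That division — human-readable "pseudo code" above, machine-oriented "code" below, with a mechanical translation in between — is still visible in any modern language. A rough sketch (my own illustration using Python's standard `dis` module, nothing from Hopper's paper):

```python
import dis

# A high-level definition, close to what we'd call pseudocode today...
def average(xs):
    return sum(xs) / len(xs)

# ...which the interpreter has already "automatically coded" into
# lower-level bytecode instructions:
for ins in dis.get_instructions(average):
    print(ins.opname)
```

The exact opcodes printed vary by Python version — itself a reminder that the lower-level "code" is a detail the high-level programmer rarely needs to see.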
At the birth of Computer Science there is this notion that it's (as we might say today) turtles all the way down. Kurt Gödel, Alonzo Church, and Alan Turing, in the first half of the twentieth century, come up with this weird stuff at the edge of the philosophy of mathematics to answer maddening questions: what is "computing", and what makes something "computable"? It seems to have no answer that isn't self-referential. The systems they imagine are endlessly defined in terms of themselves. But they are all theoreticians, and in theory this is perfectly satisfactory.
When actual digital computers get built, the engineers responsible for using them are not at first much interested in the theory of Computer Science; their machines resolutely do not come with an infinitely long paper tape, and the problems they're solving are very specific and finite. These are practical people.
But Grace _applies_ the theory. Since the process of turning the high level (e.g. assembly language) into low level (e.g. holes in a paper tape) is mechanical, why not have the machine do it? And if this can be done, why not recurse, and produce even the assembly language from some higher level program. Productivity is improved enormously. And this is at the heart of everything today. Even our microprocessors are really full of microcode that implements more complicated instructions in terms of smaller ones, so that the "machine code" isn't the final form executed by the hardware in practice either.
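That leap — since the translation is mechanical, let the machine do it, then stack translators on top of one another — can be sketched with a toy example (entirely my own invention, not from the paper): a tiny "high level" expression form is mechanically "coded" into instructions for an imaginary stack machine, which a second function then executes.

```python
# Toy illustration of mechanical translation. The instruction set
# and expression format are made up for this sketch; only "+" and
# "*" are supported.

def compile_expr(expr):
    """Translate a nested tuple like ("+", 1, ("*", 2, 3)) into a
    flat instruction list for a stack machine."""
    if isinstance(expr, (int, float)):
        return [("PUSH", expr)]
    op, left, right = expr
    # Post-order: operands first, then the operator.
    return compile_expr(left) + compile_expr(right) + [(op, None)]

def run(program):
    """The 'hardware': execute the instruction list on a stack."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if op == "+" else a * b)
    return stack.pop()

program = compile_expr(("+", 1, ("*", 2, 3)))
print(run(program))  # same answer as evaluating the expression directly
```

The same trick recurses at every level: `compile_expr` plays the role of Hopper's "automatic coding", and `run` plays the hardware — which, as noted above, may itself be microcode running on something smaller still.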
You may or may not say "systems man" about a woman (depending on the language etc.), but I don't see that the sex of the _speaker_ has anything to do with it.
I don't think this really achieves what it sets out to do, unless it was meant to be aimed at already very tech-literate people who just don't program yet. It fails my usual test for laypersons' explanations: would my mum understand it?
My mum is a smart but non-techie person. I think this article — jumping off from its irrelevant take on business needs, through a (in my opinion) backwards diversion that fails to actually explain how computers process and display text, only then touching on algorithms before racing off to talk about conferences and how programmers feel about things — doesn't come close to a solid laypersons' explanation. Soon it's asking you to evaluate chunks of code when, if you are reading this article as someone who wants to know "what is code?", you don't even know what a loop is yet.
This seems more like "What is tech culture?" for people who are already a part of tech culture. I think people like it for its animations and interactive elements, but when you strip those away there's very little actual content there.
Considering this is Bloomberg the intended audience is likely business people. The "irrelevant take on business needs" is far from irrelevant in this context. On the contrary, it's describing the type of person this article is aimed at.
It succeeds in the goal of helping programmers understand how they are perceived by business people, and how to be effective in a business environment.
The 'scrum master' guy in the first chapter is the sort of terrible communicator who gives technical people a bad name. He either can't or won't express himself in terms familiar to his audience, instead throwing out a stream of terms that he knows mean nothing in a bid to appear sophisticated.
Amusing and whimsical. So.much.content. I enjoyed the end, though. I spent about 17 minutes scrolling through, reading different parts, and got to the end. It called me out for reading nearly 2k words per minute to finish in that time. haha.
So you listened to the Tomorrow podcast rerun as well?
I think I skimmed this when it first came out, but after two years working at a consulting firm where our clients are borderline tech-illiterate, this seems like a pretty decent primer to me. It's not that our clients are idiots (mostly), it's just that they're missing 20 years of context. This does a great job of filling in some of the gaps.
A favorite article of mine! Paul Ford has continued to write good explainers for Bloomberg Businessweek (most recently on the Github acquisition: https://www.bloomberg.com/news/articles/2018-06-06/github-is...) but I've been waiting for him to write something long form again...
I wish they would tell the story further, including how the man in the taupe blazer would deliver the project, whether it would be a success or not, and what would happen after delivery.