Claude Shannon Invented the Future (quantamagazine.org)
259 points by theafh on Dec 22, 2020 | 68 comments



I really enjoyed the James Gleick book, "The Information", which covered a lot of this material. It spent time on both Shannon and Wiener, as well as what was happening at the time.

https://en.wikipedia.org/wiki/The_Information:_A_History,_a_...


One of my favorite non-fiction books. It was great to visit the Computer History Museum in Mountain View and see the recreation of the analytical machine mentioned in the book.




I saw him at a book reading in Sydney for The Information; he was fantastic. He did a straw poll of who had heard of Claude Shannon, and out of about 200 people, four of us put our hands up. I still find it sad that so few people have heard of him.


Me too. One of my favorite books of all time.


It's cool.


The audiobook is great to listen to.


Don't forget Nippon Electric Company:

http://museum.ipsj.or.jp/en/computer/dawn/0002.html

Claude Shannon inventing switching theory is kind of like the American belief that the Bessemer process originated in Pittsburgh, or that the Intel 4004 / 8008 was designed by Intel and not N.E.C. This technical-superiority propaganda is a relic of the Cold War, and we are seeing the same thing now with the rise of Chinese technology firms. The Edison/Tesla story is similar.

I'm not positing that one side is better, more that US technological superiority is mostly a facade. That's especially problematic because it creates complacency.


I was shocked when he spoke at my commencement, I was already a fan but felt like I was the only one there that knew who he was. He was extremely unpretentious and just wanted to joke the whole time. I remember reading how his family only gradually became aware of his celebrity in the technical world.


Is there a record of the address he gave? I would love to read it.


There is a biography of Shannon, for anyone who is interested, that goes into more detail about his accomplishments and his life in general -- https://www.amazon.com/Mind-Play-Shannon-Invented-Informatio...


It’s a really good biography of a fascinating life. I’m particularly fond of the fact that his _master’s_ thesis laid the foundation for rigorous circuit design. It’s been described as “possibly the most important, and also the most famous, master's thesis of the century”.



Sadly, Soni's book reads like a precocious high school student's summer book report, and he and his coauthor (who doesn't seem any more qualified than Soni to write such a thing) manage to misunderstand a bunch of obvious things about Shannon's work and life.

The wiki entry on Shannon's life is more interesting, readable, and accurate, and wasn't written by soul-sick former management consultants.


Hey, it seems like you have a bit of an agenda, posting this comment twice to different commenters who recommended this book. This isn’t an attack on you; what would be good is if you could explain your view a bit more than just an opinionated dismissal of the book. Is it the writing that sucks? Is it factually incorrect (and if so, how)? Some of us are interested in reading more about the history of information theory, and it helps to have considered dissenting opinions to decide what to read and what to trust :)


Because I paid for the book, paid with hours of my life reading it, and am hopping mad that it is total garbage which wasted my time and money with a puerile, long-winded stack of drivel.

I'll eventually take the time to write about it in detail; you can figure out some of the details on why the movie was trash by googling my name.


> you can figure out some of the details on why the movie was trash by googling my name.

For those interested: https://scottlocklin.wordpress.com/2020/12/07/bit-player-des...

TL;DR: There are no substantive criticisms, but apparently the style wasn't to the reviewer's taste.

Here's an extract:

Finally there is the title. “Bit player?” Shannon is the most important applied mathematician and inventor of the late 20th and early 21st century. Nobody else comes close. What would they call a documentary on Maxwell? “Waves haggis-man?” What would their clever title for Newton be? “Calculus apple-noggin sperdo?” Napoleon’s documentary is “big hat frog midget.” How about calling your documentary of Freud “Pervert Schlomo” while you’re at it. What fucking pinhead planet are you from calling a documentary of Shannon “Bit player?” Die in a fire, you fucking philistines!

(Apparently the author missed that "Bit Player" is a play on the idea that Shannon played with bits, and is unfamiliar with the idea that associating a title with a common saying makes people more comfortable with the idea that they'll enjoy watching it.)


Yeah, I didn't miss their clever little double entendre, nor did I miss that they portrayed Shannon as a doddering old fool. This was not a docudrama; it was a desecration.

Are you actually defending the thing, or do you just take offence at "Calculus apple-noggin sperdo?"


I haven't watched it so can't comment on it.


I beg to differ, I thought the book was great without being too technical.


I think this is a shockingly incorrect take on the book. It isn't meant to be totally technical; that isn't the audience or the point. It is a historical biography of a person for a generally lay, or technically inclined lay, audience. (Just a counter opinion.)


Most will know that Claude Shannon was also a pioneer in artificial intelligence. He did more than theorize; he implemented his ideas in hardware. Games were the proving ground for theorems in AI, and Shannon built several games that included AI.

I compiled a list of them here:

https://boardgamegeek.com/geeklist/143233/claude-shannon-man...


I am filled with a sense of wonderment that I have never, until now, encountered an article written as a sequence of forum posts!

(Yes, I have seen countless long comments split over a reply thread because the content was too long for a single comment/post. That's... different)

Thanks for posting (and writing) this


The Geeklist format seems to be unique to the Board Game Geek site. The site, at its core, is a large database of board games, and they have made it easy to collect entries into a list and comment on each one. Thanks for your encouragement; I agree, I will collect these into an article some day.


> I agree, I will collect these into an article some day

To be clear, I was not encouraging you to do this. (I'm not discouraging it either. I'm ambivalent.)

I just enjoy seeing people use media in new/unanticipated ways.

(That said, Twitter threads suck beyond a certain size (sixish). This is especially the case if the author intentionally decomposed an entire blog post/article into a sequence of 140-character chunks. Write a fscking post/article and link to it from a Tweet, perhaps with a couple of threaded pull quotes.)


It's amazing how successful Shannon's followers were and continue to be at depicting him as the 'father of the information age', by means of the erasure of Norbert Wiener.

Ronald Kline's The Cybernetics Moment: Or Why We Call Our Age the Information Age charts this historical boundary work in fascinating detail. https://jhupbooks.press.jhu.edu/title/cybernetics-moment


> by means of the erasure of Norbert Wiener

Is it not possible for you to sum it up here? What's with the tease? Articulate why, and post a sentence or two explaining it.

edit: and in your bio, you have "my PhD on Norbert Wiener", ffs


Shannon was Wiener's student, and they both published their equivalent mathematical theories of information simultaneously - Shannon as an article split into two parts, published in Summer and Autumn 1948, and Wiener in Cybernetics in Autumn 1948 too. Both presented talks at a conference in Spring 1947 where they used the concept (von Neumann presented on the computer). Wiener claimed that his concept of information predated this by months to a year. Regardless, they came at it from different but complementary angles - Shannon from cryptography (securing Roosevelt and Churchill's line), Wiener from time series and neural nets. For all intents and purposes it should be called the Shannon-Wiener theory of information.

This said, von Neumann considered information theory to originate in Leo Szilard's work from the '30s (the Maxwell's demon analogy), but that's a bit of an outlier position, as it's quite obtuse.

Wiener is partly written out because his communication theory (a term Shannon, like Wiener, used as a synonym for information theory, but which has come to identify Wiener in distinction to Shannon) was considered too philosophical. It accounts for things which cannot necessarily be measured and tested objectively - the quantity of meaning, as opposed to cliche, in a sentence, for example. Cybernetics also suffered a huge fall from fashion in the US during the second half of the '50s, even though its ideas spread everywhere: the coining of 'artificial intelligence' at Dartmouth to escape the cybernetic baggage of 'learning machines', and the invention of HCI, the personal computer, and the internet, for example. It re-emerged in the late '60s as 'second-order cybernetics', with a focus especially on the organism, but by this point the computer world had long dissociated itself from the organic and the social, in the sense found in Wiener's cybernetics. As Bateson, Mead and Stewart Brand put it, many of the original figures from the circles of Wiener and Shannon 'went off into input-output'. Computer science.[1]

Follow from this the popular, commercially driven jargon of 'information technology' and the 'information revolution' of the 1960s, which reduced Shannon and Wiener's rich concept to a catch-all for anything vaguely related to digital computers, and Shannon comes out a much simpler founding father than Wiener.

If you've read this far I'd suggest you now go and read Kline's brilliant Cybernetic Moment ;)

[1] http://www.oikos.org/forgod.htm



I have never heard of such a contention before. Both Shannon and Wiener were geniuses IMO, with very pivotal contributions.


I'm not sure how well either is known to the public, notwithstanding Gleick's book, which highlighted Shannon. Personally I'm not sure I'd immediately name one or the other as generally better known. But then I went to school somewhere that had a display about Wiener on a highly-trafficked corridor. Shannon does seem to be more in vogue these days, though.


I agree with this sentiment. "Dark Hero of the Information Age" https://www.amazon.com/Dark-Hero-Information-Age-Cybernetics... lays out how both Wiener and Shannon created information theory independently, but Wiener took it further, incorporating feedback as a way of lowering the entropy and helping lay the path for today's ML revolution. The book goes on to lay out how Wiener's growing popularity in the Soviet Union and his socialist views (he feared automation leading to unemployment) led to the blacklisting of his contributions within the academic community. It's no wonder that the first trained deep (more than three layers) neural network was published in a Soviet cybernetics journal in 1970.


TIL. I've always wondered why Wiener is not a geek hero like Feynman and Turing.

What I've always admired about Wiener is precisely his sensitivity to the social ramifications of his work and area of research. Rare bird, indeed.

(p.s. thanks for book ref.)


You may enjoy one of his books, "The Human Use of Human Beings" https://monoskop.org/images/5/51/Wiener_Norbert_The_Human_Us...


I'm not sure whether I'd agree with that. If you look at Shannon and Wiener using Google's ngram viewer, Wiener has always been more 'popular'. They've been getting closer since the 90s, but it seems absurd to argue that Wiener has been erased in any sort of fashion. If anything, it seems that it's in recent times people have started appreciating Shannon.

People have been writing more on Norbert Wiener than on Claude Shannon, and I can only assume that it's mostly in the context of the information age.

To make another comparison: Alan Turing outpaced Shannon in 1976 and is now more than twice as 'popular' as Wiener.


Shannon loved to juggle and make nonsensical contraptions. One was a box with a button (or a lever? I don't remember; let's go with button) that, when pressed, would open the lid. A mechanical arm would then pop out of the box and press the button to close the lid again.

What a hero.


It's called the Ultimate Machine. See it in action: https://www.youtube.com/watch?v=G5rJJgt_5mg


There's something rather Monty Python about it.


Shannon is yet another lion from the old Bell Labs. The value of the things that sprang from the people in that place is incalculable.


Put smart people together and don't manage them.


Perfectly said!

Again and again I have seen well-functioning teams ruined because a "Manager" was appointed to "Manage" technical people. Middle management in particular is the worst, because its very existence depends on it.

Give technical folks a caring, nurturing environment, set clear boundaries, protect them from management BS/politics, and let them be. That's it.


And the magic of capitalism delivered very little of that which is calculable to the owning company or to the scientists who invented those things.


One of my favorite Claude Shannon facts is that he (and co-authors Elias and Feinstein) independently discovered Ford-Fulkerson/maxflow-mincut [1] shortly after Ford and Fulkerson.

[1] https://ieeexplore.ieee.org/document/1056816


Oh wow, I love that. Thanks for that fact


> whatever the nature of the information — be it a Shakespeare sonnet, a recording of Beethoven’s Fifth Symphony or a Kurosawa movie — it is always most efficient to encode it into bits before transmitting it

I had always assumed there were practical reasons for using bits - e.g. it being easier to store in digital memory, or it being too hard in some other technical way to have a higher base than 2.

I didn’t realise bits are preferable because it’s more efficient to transfer information that way. It seems counterintuitive.


I don’t think that’s right. What Shannon’s channel capacity theorem says is that a channel with a certain bandwidth and SNR can carry a certain rate of information. It doesn’t matter how the source information is encoded. Actual modern signals on a transmission line, like gigabit Ethernet or your cable modem, are not just on-and-off pulses; they are complex analog waveforms.

The reason for bits being the normal representation of information is that they’re easy to do math on, and in a computer, digital logic is much more efficient in binary than in higher number bases.
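
For a rough sense of scale, here is a minimal sketch of that capacity result in the standard Shannon-Hartley form, C = B * log2(1 + S/N); the 3 kHz / 30 dB numbers are just illustrative assumptions:

    import math

    def channel_capacity(bandwidth_hz, snr_linear):
        # Shannon-Hartley: C = B * log2(1 + S/N), in bits per second.
        return bandwidth_hz * math.log2(1 + snr_linear)

    # A telephone-like channel: ~3 kHz of bandwidth at 30 dB SNR (linear ratio 1000)
    snr = 10 ** (30 / 10)
    print(channel_capacity(3000, snr))  # ~29,900 bits/s -- the dial-up modem regime

Note that the formula says nothing about how the source is encoded; it only bounds how many bits per second the channel can carry.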


Is it the source you think is incorrect or have I misinterpreted its meaning?


The trick is what the encoded sequence of bits represents. You have lossless "compression" built into the encoding of almost any kind of information, because it by definition has repetition of some kind (if it didn't, it would be random noise).
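
To make that concrete, here is a toy Python sketch (my own illustration, not from the thread) of the empirical Shannon entropy H = -sum(p * log2 p); repetitive messages need fewer bits per symbol, which is exactly the slack compression exploits:

    import math
    from collections import Counter

    def entropy_bits_per_symbol(message):
        # Empirical H = -sum(p * log2(p)) over observed symbol frequencies.
        counts = Counter(message)
        n = len(message)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    print(entropy_bits_per_symbol("abababab"))             # 1.0 -- highly repetitive
    print(entropy_bits_per_symbol("the quick brown fox"))  # ~3.9 -- much less so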


I always had the impression digital encoding was less efficient than analog, because transforming sound waves to electromagnetic waves was more straightforward.


With digital encoding, you have a defined bandwidth you have to care about (the audible range of frequencies), which puts a limit on how often you need to represent the state of the wave.

Also, digital transmission does fancy things with phase, amplitude, and frequency combined to represent multiple bits at once.
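
(For reference, the limit in question is the sampling theorem, also associated with Shannon: a signal band-limited to B hertz is completely determined by samples taken at a rate f_s >= 2B, which is why audio capped at roughly 20 kHz is sampled at 44.1 kHz on CDs.)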


> less efficient

In what sense?

> because transforming sound waves to electromagnetic waves was more straightforward

This sounds like you might mean power efficiency. I don't know if that is true or not.

But in this case they are talking about bandwidth efficiency.


The article suggests that using binary digits (0, 1) is the most efficient means of communication. I think I've heard something to this effect elsewhere. Why is that? Wouldn't a ternary digit (0, 1, 2) be more efficient? And a quaternary digit (0, 1, 2, 3) more efficient still? At some point noise would make it difficult to distinguish between the states, but I would think it would be most efficient to use a digit whose number of states is chosen in relation to the noise floor. (More noise, fewer states; in the case of no noise, analog would be most efficient. Although I guess by this argument no noise would imply infinite capacity, which either means there's a problem with my argument or is a reason why noise must exist.) Am I missing something?


Base 3 encoded messages are shorter because this base is closer to the number e (2.718). Binary is the next best approximation, but easier to understand. I want to believe that DNA is encoded in base 4 because it provides more redundancy.


Technically, yes. I think what you're getting at is called "Radix Economy" (https://en.wikipedia.org/wiki/Radix_economy)

Ternary is the most efficient (integer) base, and binary and quaternary follow. It drops from there pretty quickly. The explanation I've always heard is it's dramatically easier to deal with two-state electronics.

Further reading: https://sweis.medium.com/revisiting-radix-economy-8f642d9f3c...
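
A quick way to see that ranking, using the asymptotic cost measure from the radix economy article (for large numbers the cost is proportional to b / ln b, minimized over the reals at b = e):

    import math

    def relative_radix_economy(base):
        # Asymptotic cost of a base-b representation: digits needed times
        # states per digit ~ b / ln(b). Lower is better; the real-valued
        # minimum is at base = e (~2.718).
        return base / math.log(base)

    for b in (2, 3, 4, 8, 10, 16):
        print(b, round(relative_radix_economy(b), 3))
    # 2 2.885, 3 2.731, 4 2.885, 8 3.847, 10 4.343, 16 5.771

Binary and quaternary tie exactly, since 4/ln 4 = 2/ln 2.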


This is surprising to me because of the use of quadrature amplitude modulation in wireless communication, which uses 2^n quadrature states. Of course these states encode groups of n bits, but the thing actually getting transmitted over the channel has many more than two states. I haven't read this article yet though, I'm probably missing something.
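
One way to reconcile the two ideas (my gloss, not from the article): radix economy is about the cost of representing digits with b-state elements, while a QAM constellation is a way of packing bits into a single analog channel symbol. The M states of M-QAM carry log2(M) bits each, so the information is still counted in binary:

    import math

    # Each M-QAM symbol carries log2(M) bits; more constellation points
    # mean more bits per channel use, at the cost of noise margin.
    for m in (4, 16, 64, 256):
        print(f"{m}-QAM: {int(math.log2(m))} bits/symbol")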


1 is the smallest n such that n > 0 i.e. can be used to transmit information.

If you had something where you could represent base-whatever as easily as binary then it would be (naively) more efficient, but we don't.


I count myself lucky that I was able to study under Glen Langdon and David Huffman. Claude Shannon came up often in working with Glen. David wasn't a fan of discussing his namesake coding method. In "office hours" David would rather talk about origami and paper folding than anything really related to courses, etc. (if one was doing well).

edit: I was also fortunate to get to speak at both of their memorials, hosted at UC Santa Cruz.

Here is an example of his paper folding that was on display at Xerox PARC -- http://www.graficaobscura.com/huffman/


A good documentary about him was released last year too.

The Bit Player

https://m.youtube.com/watch?v=E3OldEtfBrE


It is good. I had the good fortune at CMU of showing Shannon how to use a joystick to control a 3D pogo stick (designed by Marc Raibert, of future Boston Dynamics fame; I did the EE & SW). This shows the robot and me driving; unfortunately, we have no video of Shannon 'driving' the machine, but I can tell you the smile on his face was ear-to-ear. https://youtu.be/Bd5iEke6UlE



That movie is a travesty and the people who made it should be punished.

Soni's book is also pulpy trash which reads like a precocious high school student's book report, and manages to make Shannon ... possibly the least boring human of the 20th century ... boring. Gleick's book is reasonable.


> That movie is a travesty and the people who made it should be punished.

I also detested the movie, though I'm not certain I want the creators punished ;-) Also not a fan of the book


Jimmy Soni (https://en.wikipedia.org/wiki/Jimmy_Soni) lived in the room across the hall from me one year in college. He cowrote the Shannon biography that came out last year.

I haven't read it yet, but I will say that I remember Jimmy being one of the smartest people I encountered in college.


I came here to post this but you beat me to it.

I attended CodeMesh in London a few years ago, and that's where I came to know about this book.

His panel was really good. I didn't imagine Shannon had made a trumpet that could spit fire just because it was fun (if I remember correctly he made it to entertain his children)!

The video can be found here:

https://dev.tube/video/elwKUJg4-Ko


From Amazon Prime: "In 1948, Claude Shannon introduced the notion of a 'bit,' laying the foundation for the Information Age. His ideas power our modern life, influencing computing, genetics, neuroscience and AI. Mixing contemporary interviews, archival film, animation and dialogue from interviews with Shannon, 'The Bit Player' tells the story of an overlooked genius with unwavering curiosity."

https://thebitplayer.com/

Released in 2018; 85 minutes.

https://www.amazon.com/gp/video/detail/B08D291YQS/ref=atv_wl...


In 1943 Shannon and Turing worked in the same building in NY[1].

[1] https://www.scienceopen.com/document_file/94e7a73e-3f21-4da5...


He also figured out how to beat roulette https://boingboing.net/2017/07/27/wearable-computing.html


Also how to beat human attempts at random behaviour, using a (very small) model: https://literateprograms.org/mind_reading_machine__awk_.html



