Enigma was broken by Polish mathematicians in 1938, then improved by the Germans, then broken again. The Poles gave all their discoveries to France and Great Britain; they also invented the first "bomba", which worked mechanically.
It's a complicated story, and these people deserve credit along with Turing. But until the 1970s it was classified information, and to this day everybody talks only about Turing.
Maybe it's my nationalist bias at work, but I'd prefer it if the whole story were better known.
I agree. Also, this article only makes it worse. It claims that (a) no one was working on Enigma before Turing and (b) Turing worked on it because he 'liked solitude'. WTF? Given the total bollocks in that part of the article, I have a hard time reading the rest. Note that Bletchley Park does a very good job of covering the Polish contribution to the cryptanalytic effort, with examples of how the Polish attacks on Enigma were performed, a Polish memorial, and a special annual Polish day.
If you find the conflict between frequentist and Bayesian methods entertaining, check out Jaynes' Probability Theory: The Logic of Science, which recounts lots of great arguments and profiles some great personalities. You might also learn a few things about maxent, statistical physics, and my favorite: the mind-projection fallacy.
Does anyone have a combined PDF? While I appreciate the author putting the PDF out, splitting it into separate files for individual chapters and figures makes it harder to read in one sitting.
At what level is the presentation of statistics in his book? I'm interested to learn more about the application of statistics and Bayesian methods (as a physics major I intend to deal with lots of data), but I haven't taken formal courses on statistics yet.
(Of course, being a physics major, I don't mind having to hurt my brain a bit to get through it, so long as it's good.)
I've recently started going through http://www.amazon.com/Data-Analysis-Bayesian-Devinderjit-Siv... and can highly recommend it. The benefits of Bayesian reasoning can be grasped fairly easily without a detailed knowledge of the math, but at the end of the day scientists and engineers still need to learn how to use it and write programs with it to actually do stuff! One review mentions that the book dives right into the subject, and it does; I believe Bayes' theorem appears on page 6. Some people find a brisk pace hard to follow; personally I like it, since I can always fill in the gaps with other material if needed (like Jaynes, or http://uncertainty.stat.cmu.edu/ ).
It has always bothered me that Bayes' theorem is a named result, instead of just being called "the law of conditional probability", say. As soon as you formally define conditional probability, it is obvious.
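(To spell out the "obvious" part: the entire proof is writing the definition of conditional probability in both directions and eliminating the joint probability,

    P(A \mid B)\,P(B) \;=\; P(A \cap B) \;=\; P(B \mid A)\,P(A)
    \quad\Longrightarrow\quad
    P(A \mid B) \;=\; \frac{P(B \mid A)\,P(A)}{P(B)}, \qquad P(B) > 0.

That's all there is to it.)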
To be fair, though, I think he discovered it before probability theory was formalized, and I can see how it may have been non-obvious at the time. Sometimes the best progress makes challenging insights appear obvious in hindsight - nice work, Laplace!
It's interesting that the Reverend Bayes' theorem was resisted for so long, and not on objective grounds. We seem to have a tendency, even amongst the academic elite, not to keep the most honest of mental apparatus, especially where the matter at hand would require us to reverse a position, or to acknowledge our ignorance more than our ability.
I don't know, but it does seem to be sometimes evangelised with more fervor than I'm comfortable with. Us v Them is always a suspicious narrative.
Even in this article, stuff like:
Finally, in 1983 the US Air Force sponsored a review of NASA's estimates of the probability of shuttle failure. NASA's estimate was 1 in 100,000. The contractor used Bayes and estimated the odds of rocket booster failure at 1 in 35. In 1986, Challenger exploded.
makes me uneasy. The rhetoric is too clearly marked.
Yeah, that has nothing to do with Bayesian vs frequentist, and everything to do with massaging numbers to get the results that were expedient. I'm sure the people at NASA would have been plenty smart enough to work out how to get the 1 in 100,000 result using Bayesian analysis if they had really wanted to.
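(A toy illustration of the point, with entirely made-up numbers: put a Beta(a, b) prior on the per-flight failure probability and suppose k failures are observed in n flights; the posterior mean is

    \frac{a + k}{a + b + n}

so with k = 0 and a sufficiently "optimistic" prior, say Beta(1, 100000), the reported figure comes out at roughly 1 in 100,000 almost regardless of how few flights have actually been flown.)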
(Not trolling, I'm honestly curious. My mental association for Bayesians is people who want to tag documents, diagnose illnesses, or similar AI tasks.)
There seems to be some sort of variant of Godwin's law surrounding Bayesian analysis. Any time a topic anywhere touches anything related to statistics or probability, chances are someone whose total understanding of the field is a couple of misunderstood Wikipedia articles will jump in and say "Pah, you're only saying that because you're all dirty frequentists. If only you'd use Bayesian analysis, everything would be obvious".
I knew a doctor at a noted research hospital who was using Bayes to fine-tune cancer treatments. I still wonder whether he was on the right track with his research or whether he was, as you put it, a "crank".
I think you're misunderstanding: there is nothing about using Bayesian analysis that makes you a crank. Using Bayesian analysis where it makes sense is good, sensible, and standard practice among just about everybody in the field (although there are some arguments about the size and shape of the set of problems where Bayesian analysis makes sense). Cancer treatment would be a perfect example of an area where Bayesian analysis makes sense.
However, for some unknown reason, Bayesian analysis has also become a trendy buzzword among a huge number of crazy internet trolls who seem to think it's a magic formula that can solve all problems, and who have some paranoid delusion that "they" are trying to suppress the knowledge of Bayesian analysis.
On the other hand, there's not a lot of Bayesian analysis taught in high school, statistics 101, or any of the other places non-stats-nerds are likely to encounter the subject. It's useful for a lot of things, and it's easy and intuitive (hence its popularity amongst statistical laymen). So why isn't it taught?
The costs of teaching and learning Bayesian analysis are low (it's just not as hard as, say, the method of moments), and it does have benefits.
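As a minimal sketch of how little machinery it takes (my own toy example, not from any particular textbook or course): a conjugate beta-binomial update for a coin's bias fits in a few lines of Python, assuming SciPy is available.

    # Toy Bayesian update for a coin's bias -- the kind of exercise that could
    # sit alongside the usual pirates/beads/rats problems in an intro course.
    from scipy.stats import beta

    prior_a, prior_b = 1, 1    # uniform Beta(1, 1) prior on the unknown bias
    heads, tails = 7, 3        # observed data: 7 heads in 10 flips

    # Conjugacy: the posterior is simply Beta(prior_a + heads, prior_b + tails)
    posterior = beta(prior_a + heads, prior_b + tails)

    print("posterior mean:", posterior.mean())                # ~0.67
    print("95% credible interval:", posterior.interval(0.95))

The update rule itself is one line of algebra, which is exactly the sort of thing a stats 101 course could cover without breaking stride.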
My old stats 201 book (Wackerly, Mendenhall and Scheaffer) covers Probability, Discrete Random Variables, Continuous Random Variables, Multivariate Distributions, Functions of RVs, the Central Limit Theorem, Estimation, Properties of Point Estimators and Methods of Estimation, Hypothesis Testing, Linear Models / Least Squares, Designing Experiments, Categorical Data, and Nonparametric Statistics. That's 15 topics (including the introduction), and Bayesian analysis isn't mentioned. Bayes' Law is (of course), but only as a theoretical tool for solving toy problems about pirates, beads, and rats in the second chapter.
You wouldn't take an engineering analysis book seriously if it didn't mention FEA, but statistics courses can hold their heads up while completely ignoring a useful and easy-to-teach tool.
Of course, good statisticians and mathematicians will learn about it (later, or on their own), but there are legions of economists and engineers coming out who will never bother wrapping their heads around it.
Of course, it's entirely possible that it's not so much a conspiracy spearheaded by old-guard frequentists as introductory stats courses being focused on teaching a core of theory (LS and MoM) rather than teaching practical tools to the people who will use them. You could also accuse introductory math courses of ignoring useful, fun, and easy material (scaling?) while focusing on an old, predefined, widely accepted body of theory.
I find that most of the Bayesian 'militants', if you will, aren't actually doing research using Bayesian methods, but are more recent converts writing blog posts about it. Most researchers I know who use Bayesian methods in research aren't even particularly dogmatic about it, and don't go around writing essays about the Evil Frequentists and Oppression of Bayesianism.
In fact, most (at least in my circle) also use frequentist methods, as well as methods that don't fall comfortably into either camp. In ML it's particularly common for the same researcher to use any/all of kernel density estimation, Bayesian graphical models, SVMs, etc., depending on the problem.
For those (possibly a small number on HN) not aware - I first encountered Bayes' Theorem via pg's articles on Spam - http://www.paulgraham.com/antispam.html