
This is brilliant. The whole causal inference thing is something I only came across after university, either I missed it or it is a hole in the curriculum, because it seems incredibly fundamental to our understanding of the world.

The thing that made me read into it was a quite interesting claim from LessWrong: that the common idea that correlation does not imply causation is actually wrong. It's not wrong in the face-value sense; it's wrong in the sense that you can in fact use correlations to learn something about causation, and there turns out to be a whole field of study here.




When did you go to university? The terminology here came from Pearl 2000, and it probably took years and years after that to diffuse out.


I thought Pearl was writing from 1984 onwards?

I was at university around the millennium.


Causality (2000) made the topic accessible (to students and lecturers) as a single book.


Rigorous causal inference methods are only now starting to diffuse into the undergraduate curriculum, after gradually becoming part of the mainstream in a lot of social science fields.

Judea Pearl is in some respects a little grandiose, but I think he is right to express shock that it took almost a century to develop to this point, given how long the basic tools of probability and statistics have been mature.


"correlation does not imply causation is wrong"

That's a specific instance of a more general problem with the "logical fallacies", which is that most of them are written to be true in an absolutist, Aristotelian frame. It is true that if two things are correlated you cannot therefore infer a rigidly 100% chance that there is a causative relationship there. And that's how Aristotelian logic works: everything is either True or False, and if there is anything else it is at most "Indeterminate", with absolutely, positively, no in-betweens or probabilities or anything else.

However, consider the canonical "logical fallacy":

    1. A -> B.
    2. B
    3. Therefore, A.
It is absolutely a logical fallacy in the Aristotelian sense. Just because B is there does not mean A is. However, probabilistically, if you are uncertain about A, the presence of B can be used to update your expected probability of A. After all, this is exactly what Bayes' rule is for!
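To make the Bayesian point concrete, here is a minimal sketch. The `posterior` function and all the numbers are made up for illustration; the only real content is Bayes' rule itself. Observing B raises your credence in A exactly when B is more likely under A than under not-A:

```python
def posterior(prior_a, p_b_given_a, p_b_given_not_a):
    """Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)."""
    p_b = p_b_given_a * prior_a + p_b_given_not_a * (1 - prior_a)
    return p_b_given_a * prior_a / p_b

prior = 0.10  # P(A): initial credence in A (hypothetical)
p = posterior(prior, p_b_given_a=0.9, p_b_given_not_a=0.2)
print(round(p, 3))  # 0.333 -- seeing B raised P(A) from 0.10 to ~0.33
```

So "A -> B; B; therefore A" is invalid as a deduction, but "A -> B; B; therefore A is now more probable" is perfectly sound inference, provided B is more probable given A than given not-A.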

Many of the "fallacies" can be rewritten to be useful probabilistically, and aren't quite as fallacious as their many internet devotees fancy.

It is certainly reasonable to be "suspicious" about correlations. There often is a "there" there. Of course, whether you can ever figure out what the "there" is is quite a different question; the fact that everything is correlated (https://gwern.net/everything) really gets in your way. (I also recommend https://gwern.net/causality .)

The upshot is basically: 1. the glib dismissal that correlation != causation is, well, too glib and throws away too many things, but 2. it is still true that you generally can't assume causation either. The reality of the situation is exceedingly complicated.


I don't disagree with the substance of your comment, but want to clarify something.

Lesswrong promulgated a seriously misleading view of Aristotle as some fussy logician who never observed reality and was unaware of probability, chance, the unknown, and so on. This is entirely false. Aristotle repeats, again and again, that we can only seek the degree of certainty that is appropriate for a given subject matter. In the Ethics, perhaps his most-read work, he says this, or something like it, at least five times.

I mention this because your association of the words "absolutist" and "Aristotelian" suggests your comment may have been influenced by this.

ISTM that there are two entirely different discussions taking place here, not opposed to each other. "Aristotelian" logic tends to be more concerned with ontology -- measles causes spots, therefore if he has measles, then he will have spots. Whereas the question of probability is entirely epistemological -- we know he has spots, which may indicate he has measles, but given everything else we know about his history and situation this seems unlikely; let's investigate further. Both describe reality, and both are useful.

So the fallacies are entirely fallacious: I don't think your point gainsays this. But I agree that, to us, B may suggest A, and it is then that the question of probability comes into play.

Aquinas, who was obviously greatly influenced by Aristotle, makes a similar point somewhere IIRC (I think in SCG when he's explaining why the ontological argument for God's existence fails), so it's not as if this is a new discovery.


I consider Aristotelian logic to be a category. It is the Newtonian physics of the logic world; if your fancier logic doesn't have some sort of correspondence principle to Aristotelian logic, something has probably gone wrong. (Or you're so far out in whacky logic land you've left correspondence to the real universe behind you. More power to you, as long as you are aware you've done that.) And like Newton, being the first to figure it out labels Aristotle as a certifiable genius.

See also Euclid; the fact that his geometry turns out not to be The Geometry does not diminish what it means to have blazed that trail. And it took centuries for anyone to find an alternative; that's quite an accomplishment.

If I have a backhanded criticism hiding in my comment, it actually isn't pointed at Aristotle, but at the school system. It may teach some super basic logic at some point and accidentally teach people that that's all logic is, in much the same way a stats class accidentally teaches people that everything is uniformly randomly distributed (because it makes the homework problems easier, which is legitimately true, but does reduce the education's value in the real world). That leaves people fairly vulnerable to the lists of fallacies they may find on the internet, and unequipped to realize that those fallacies only apply in certain ways, in certain cases. I don't know that I've ever seen such a list that points out the fallacies have some validity in a probabilistic sense. There are also fallacies that are just plain fallacious even so, but I don't generally see them segmented off or anything.


> ...it took centuries for anyone to find an alternative...

Pedantry: s/centuries/millennia/ (roughly 21 of the former, 2 of the latter?)

EDIT: does anyone remember the quote about problems patiently waiting for our understanding to improve?


I liked the way Pearl phrased it originally: a calculus of anti-correlations implies causation. That framing makes the nature of the analysis clear and doesn't set off the classical mind's alarm bells.


Unfortunately this calculus is exceedingly complicated and I haven't even seen a definition of "a causes b" in terms of this calculus. One problem is that Pearl and others make use of the notion of "d-separation". This allows for elegant proofs but is hard to understand. I once found a paper which replaced d-separation with equivalent but more intuitive assumptions about common causes, but I since forgot the source.
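For what it's worth, the simplest instance of d-separation can be checked by brute force. In a chain a -> b -> c, conditioning on b blocks the only path, so d-separation predicts a and c are independent given b, though dependent marginally. All the conditional probabilities below are invented for the sketch; only the factorization structure matters:

```python
from itertools import product

# Hypothetical chain a -> b -> c; joint factorizes as P(a) P(b|a) P(c|b).
P_A = {0: 0.7, 1: 0.3}
P_B_given_A = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}
P_C_given_B = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}

def joint(a, b, c):
    return P_A[a] * P_B_given_A[a][b] * P_C_given_B[b][c]

def prob(event):
    """Sum the joint over all (a, b, c) satisfying the predicate."""
    return sum(joint(a, b, c) for a, b, c in product((0, 1), repeat=3)
               if event(a, b, c))

# Conditioning on b = 1 blocks the chain: P(a, c | b) = P(a | b) P(c | b).
p_b1 = prob(lambda a, b, c: b == 1)
p_ac_given_b = prob(lambda a, b, c: a == 1 and b == 1 and c == 1) / p_b1
p_a_given_b = prob(lambda a, b, c: a == 1 and b == 1) / p_b1
p_c_given_b = prob(lambda a, b, c: b == 1 and c == 1) / p_b1
print(abs(p_ac_given_b - p_a_given_b * p_c_given_b) < 1e-12)  # True

# Without conditioning, a and c remain correlated through the chain.
p_ac = prob(lambda a, b, c: a == 1 and c == 1)
p_a = prob(lambda a, b, c: a == 1)
p_c = prob(lambda a, b, c: c == 1)
print(p_ac - p_a * p_c > 1e-6)  # True: marginally dependent
```

Of course this enumeration only works for toy graphs; the point of d-separation is that it reads the same independence facts straight off the graph structure, without touching the numbers.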

By the way, there is also an alternative to causal graphs, namely "finite factored sets" by Scott Garrabrant. Probably more alternatives exist, though I don't know much about their (dis)advantages.



