the first two things you said are preposterous strawmen, the third I already mentioned, the fourth is wrong. you don't know what will happen in 10 years, maybe some new coal extraction technology will be invented, maybe new sources will be discovered, maybe it becomes possible to economically extract the CO2 from the smokestacks and coal becomes green and gets tons of subsidies. yeah it's unlikely, but it's not something you can "fact-check". it does violence to the meaning of the term.
imagine telling someone ten years ago that oil prices would go negative, which they did last year.
and apart from all of this, for every example you can give me of an obvious black-and-white issue where you really could fact-check it 10 years in advance, there will be 99 others where it's really not so clear-cut, but partisans want there to be fact-checker approved talking points for their side. and the market will fill this demand. for subjects outside your domain-expertise, good luck telling the difference.
> yeah it's unlikely, but it's not something you can "fact-check". it does violence to the meaning of the term.
One of the points of fact-checking is to point out the "yeah, it's unlikely" to people who would not otherwise know it.
Lots of claims are made about stuff, in particular climate change and energy supplies, that completely fall into the "yeah, it's unlikely" zone, and yet most ordinary readers and viewers would not know this.
It's always going to be in the interest of someone to say "this might happen by the year XXXX". There's generally no shortage of black-swan boosterism. Having someone who actually knows the field step in and point out that yes, it might happen, but it almost certainly will not, is of incredible value.
Your response reminds me of the situation in the current satirical movie "Don't Look Up", where because the probability of an asteroid colliding with earth is only 99.7%, not 100%, the fictional US president decides it's OK to "sit tight and assess". I mean, sure, it could miss, and "yeah, it's unlikely, but...."
> One of the points of fact-checking is to point out to "yeah, it's unlikely" to people who would not otherwise know.
Saying some prediction about the future is unlikely to be correct is not fact-checking. That's the whole point. Predictions aren't facts. Unlikely predictions aren't false facts. They're unlikely predictions.
> because the probability of an asteroid colliding with earth is only 99.7%, not 100%, the fictional US president decides it's OK to "sit tight and assess".
Saying that it is a good idea to act on predictions that are overwhelmingly likely is not the same as saying that those predictions are facts.
If you want to improve other people's critical thinking skills, you need to make sure yours are good. Calling predictions facts and acting as if they're the same thing is not good critical thinking.
> Saying some prediction about the future is unlikely to be correct is not fact-checking. That's the whole point. Predictions aren't facts. Unlikely predictions aren't false facts. They're unlikely predictions.
Our culture has become filled with a certain kind of noise in which people who frequently don't know what they are talking about make predictions about the future. I don't really care what you want to call a counter-balancing trend to that - I would agree that "fact-checking" for things that are clearly predictions is likely not the best term, but it's not the worst either, since frequently the process of pointing out just how ridiculous the predictions are will involve using actual facts. So in that context, "fact checking" does not mean "check that the facts claimed are correct", it means "check the facts underlying the prediction".
But whether or not you call it what it should be called, it's still a valuable act.
> Our culture has become filled with a certain kind of noise in which people who frequently don't know what they are talking about make predictions about the future.
Focusing on "fact-checking" in general, let alone expanding it to include "prediction checking", worsens the huge amount of noise in our culture of supposedly authoritative pronouncements being made that turn out to be wrong. The Facebook "fact check" that is the subject of the article we are discussing is a case in point. If Facebook weren't so fixated on trying to remove "noise" through "fact checking", they wouldn't be going overboard all the time and removing things that aren't noise at all, but useful dissent.
Also, the very term "fact checking", as it is being used in our culture now, is a Russell conjugation (someone else brought up Russell conjugations elsewhere in this thread). Facebook is "fact checking" (actually their outsourced third parties who remain anonymous and unaccountable are doing it, but let that pass); those who support Facebook (and other "fact checkers") are "helping to spread authoritative information"; those who question Facebook (and other "fact checkers") are "questioning authority" (even if they cite actual facts).
In short, while I agree that our culture is filled with noise, I don't think all the noise is from individuals who don't know what they're talking about; I think a lot of it is from organizations who don't like to have their power and authority questioned.
I would agree that "fact checking" (at least of several varieties) is not unambiguously good, and may in fact turn out to be harmful, quite possibly for reasons not directly related to the content of "fact checking" itself. I would also agree that the case discussed in TFA is a good example of "fact checking" that likely does more harm than good.
However, that doesn't mean that the concept of "fact checking" is inherently problematic. It could be that there is no way in our current culture of doing anything remotely like what "fact checking" probably needs to be. To me, that's still not an argument against the concept, even if it is necessary to accept, for now, that the actual execution issues force us all to be profoundly skeptical about it.
> that doesn't mean that the concept of "fact checking" is inherently problematic
Perhaps not, but I think there are wrinkles in it that you might not be considering.
First, if "fact checking" just means "consulting other sources of information to see if they say the same thing", then you have to deal with the question of the credibility of those other sources of information. No source of information is always right. Nor is any "fact checker" always right in judging the relative credibility of sources of information. Ultimately, unless you have your own personal knowledge of some fact, any "fact checking" is going to come down to which sources you trust and which sources you don't. Those are always judgment calls and there will always be some degree of residual skepticism, so citing "fact checks" as if they were authoritative is problematic.
Second, if you try to go beyond that and actually do things like independent experiments to check claims (for example, when scientists try to replicate experiments or studies), then you're not really checking on previous facts, you're creating new facts, which you are then going to use to judge the validity of previous claims. But those previous claims were not factual claims but theoretical ones (for example, doing study B to help in judging the claim "study A shows that treatment X is effective against illness Y"). And again, these kinds of comparisons are judgment calls (sure, sometimes you uncover strong evidence that, for example, the data in study A was fabricated, but study B alone won't tell you that).
> It could be that there is no way in our current culture of doing anything remotely like what "fact checking" probably needs to be.
The critical problem I see with the Facebook case is that their "fact checking" results in something more than just publishing whatever Facebook's ultimate judgment is on some website (as, for example, Snopes and other "fact checking" sites do). Facebook's "fact checking" has other consequences, such as blocking access to things people have posted. And since our current culture seems to be fixated on using "fact checking" in this way, not just to arrive at judgments which are then published as speech, for the reader to take or leave, but to take actions that amount to filtering, restricting, or blocking other speech, yes, I think our current culture is not really capable of doing the limited kind of things that "fact checking" properly done would consist of.
> And again, these kinds of comparisons are judgment calls (sure, sometimes you uncover strong evidence that, for example, the data in study A was fabricated, but study B alone won't tell you that).
Making statements like "these kinds of comparisons are judgement calls" is precisely what I'd consider to be a part of any good "fact checking".
> Ultimately, unless you have your own personal knowledge of some fact, any "fact checking" is going to come down to which sources you trust and which sources you don't. Those are always judgment calls and there will always be some degree of residual skepticism, so citing "fact checks" as if they were authoritative is problematic.
If you follow through on this as far as possible, you vanish in a cloud of solipsism. If it is not possible to establish some ground rules for epistemological truth, then really things have just completely fallen apart (which, indeed, to some extent they have).
> Making statements like "these kinds of comparisons are judgement calls" is precisely what I'd consider to be a part of any good "fact checking".
But the very fact that judgment calls are involved means that it isn't "fact checking"; it's not just reporting facts and giving obvious "true" or "false" labels to statements.
> If you follow through on this as far as possible, you vanish in a cloud of solipsism.
Oh, please. Saying that other people might not be trustworthy as sources of information is not at all the same as saying that other people don't exist.
> If it is not possible to establish some ground rules for epistemological truth
The problem isn't "epistemological truth". The problem is that people have many reasons for not telling the truth, either because they have incentives to deliberately lie or because they have incentives to fool themselves.
In theory the idea of "just tell the truth as best you know it", independently of any incentives to do otherwise, sounds good. But in practice it never works out that way. The present time is not exceptional for the low level of trustworthiness of information; it's exceptional for how widespread the consequences of that are. Our culture has a belief that if only everyone would just listen to the "right" authorities, everything would work out fine. The idea that there are no "right authorities" at all and never have been--that every adult human being needs to have their own set of critical thinking skills, and that if some piece of knowledge is important to you, you have to make the effort to verify it for yourself, and that there is no way to avoid this by any form of social organization--is not one that our culture wants to consider. With what results, we see.
I haven't provided any information at all. I provided some opinions. If you're actually a fact checker, that would seem to be behavior that plays directly into the critiques of such work. However, I suspect this remark is just a play on the "they are stubborn, I am persistent" trope, in which case I don't see the relevance at this point in this sub-thread.