Between that and simple order of magnitude approximations, it's distressing how often seemingly carefully prepared materials contain errors that can be spotted in a 5 minute reading.
> it's distressing how often seemingly carefully prepared materials contain errors that can be spotted in a 5 minute reading.
It really is remarkable how many mistakes like this happen. One I've seen a lot is not knowing how percentages work, particularly when it comes to changes between them. Going from 10% of something up to 15% is not a 5% increase--it's a 5 percentage-point increase, and a 50% relative increase.
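A tiny sketch (with hypothetical helper names) of the two different numbers hiding in that sentence:

```python
def point_change(old_pct, new_pct):
    """Absolute change between two percentages, in percentage points."""
    return new_pct - old_pct

def relative_change(old, new):
    """Relative change of new vs. old, as a percent of old."""
    return (new - old) / old * 100

# Going from 10% to 15%:
print(point_change(10, 15))     # 5 (percentage points)
print(relative_change(10, 15))  # 50.0 (percent)
```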
Putting aside the "darker" forms of communication whose intentions are to hide, obscure, or mislead, there are manifold reasons why well-intentioned communicators use quantities (e.g. statistics) differently.
One significant underlying reason is that different authors make different value judgments and want to tell different narratives.
For a given set of values and narrative, there are many questions to consider, such as:
- Should a change be summarized as one scalar?
Percentages are useful because they communicate scale without deep context. Who knows what we are talking about at all with regard to 0.0015? What units are we using? Most people don't have the context necessary to evaluate what it means that we started at 0.0015 foos and are now up to 0.002. But the percentage abstracts most of that away: on some metric we care about, this intervention increased the output by 33%. That immediately suggests the scale of the effect, without requiring anyone to walk out into the weeds.
Obviously this can be and constantly is abused, but it is undeniably useful.
If everyone in the room implicitly understands all that elided context, sure. But I think there are very few contexts in which stripping away all context is useful when the audience doesn't actually understand it. Then you're not really just streamlining things, you're removing all meaning. Is 50% more Foo a lot? Who the hell knows. But it sounds like a big deal.
Or trying to figure out the basis for the percentage. 90% faster! And how about a percentage of percentages. The APR will go up by 10%. Condition A saved 10% but Condition B saved 50% more. (Is that 15% then?)
My pet peeve is that something that used to be 200 units and is now 300 units is accepted to be “150% larger!” despite that being nonsensical mathematically. (IMO, it's "50% larger" or "150% as large" but not "150% larger".)
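In code (hypothetical helpers), the two phrasings correspond to two different formulas:

```python
def percent_larger(old, new):
    """How much larger new is than old, as a percent of old."""
    return (new - old) / old * 100

def percent_as_large(old, new):
    """new expressed as a percent of old."""
    return new / old * 100

# 200 units -> 300 units:
print(percent_larger(200, 300))    # 50.0  -> "50% larger"
print(percent_as_large(200, 300))  # 150.0 -> "150% as large"
```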
This is a great trick to use if you're trying to find out a percentage mentally and get stuck. For example, if you can't immediately come up with 14% of 50, it turns out that you can just do 50% of 14 and arrive at the same answer thanks to the commutative property.
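A quick sketch of why the trick works: "a% of b" is (a/100)*b, and multiplication commutes, so it equals (b/100)*a:

```python
import math

def percent_of(p, x):
    """p percent of x."""
    return p / 100 * x

# 14% of 50 is awkward mentally; 50% of 14 is easy -- and they're equal
# (up to floating-point rounding).
print(percent_of(50, 14))  # 7.0
print(math.isclose(percent_of(14, 50), percent_of(50, 14)))  # True
```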
The issue is that, especially with very small percentages, the absolute change (in percentage points) is often more important to the discussion than the relative change, particularly when we're talking about risk.
In particular I remember there were a number of breathless articles about covid vaccine side effects where they were talking about a 50% increase in incidence - from 2 in 100,000 to 3 in 100,000*. That's not something the average person needs to factor into their risk model, and headlining it as a 50% increase makes it seem more significant than it is.
*not the actual numbers, but around the right order of magnitude, fit to the example.
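With those illustrative (not actual) numbers, the two framings look like this:

```python
def incidence_change(base_per_100k, new_per_100k):
    """Return (absolute, relative) change in incidence per 100,000."""
    absolute = new_per_100k - base_per_100k
    relative = absolute / base_per_100k * 100
    return absolute, relative

absolute, relative = incidence_change(2, 3)
print(f"{relative:.0f}% increase")           # 50% -- the headline
print(f"{absolute} extra case per 100,000")  # 1 -- the risk you'd actually weigh
```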
A similar thing that drives me nuts is when people report quantities that are only meaningful as a flux over some time period, like e.g. tax cuts or investment costs, and they just report the dollars. Is the tax cut 1 trillion USD per year, per decade, or per century? "1 trillion USD" on its own is a meaningless statement. If we switch to AWS, we save how much per month? Not just how much.
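A minimal sketch of why the period matters (the figures are made up):

```python
def per_year(total_usd, years):
    """Convert a headline total into an annual rate."""
    return total_usd / years

trillion = 1e12
# The same "1 trillion USD" headline, spread over three different periods:
print(per_year(trillion, 1))    # 1 trillion USD/year
print(per_year(trillion, 10))   # 100 billion USD/year (per decade)
print(per_year(trillion, 100))  # 10 billion USD/year (per century)
```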