
Another super-power is dimensional analysis. People make sloppy mistakes all the time by ignoring this. ( https://en.wikipedia.org/wiki/Dimensional_analysis )

Between that and simple order of magnitude approximations, it's distressing how often seemingly carefully prepared materials contain errors that can be spotted in a 5 minute reading.
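As a toy illustration (mine, not the parent's): a few lines of Python are enough to make a program refuse to add quantities whose dimensions disagree. The Quantity class and the "L"/"T" dimension labels are made up for this sketch.

    class Quantity:
        """A value tagged with base-dimension exponents, e.g. {"L": 1} for length."""
        def __init__(self, value, dims):
            self.value, self.dims = value, dict(dims)

        def __add__(self, other):
            # Addition is only defined between quantities of the same dimension.
            if self.dims != other.dims:
                raise TypeError(f"dimension mismatch: {self.dims} vs {other.dims}")
            return Quantity(self.value + other.value, self.dims)

        def __mul__(self, other):
            # Multiplication adds the dimension exponents.
            dims = dict(self.dims)
            for d, e in other.dims.items():
                dims[d] = dims.get(d, 0) + e
            return Quantity(self.value * other.value, {d: e for d, e in dims.items() if e})

    length = Quantity(3.0, {"L": 1})
    time = Quantity(2.0, {"T": 1})
    print((length + length).value)   # 6.0, fine
    try:
        length + time                # adding a length to a time is the classic sloppy mistake
    except TypeError as e:
        print("caught:", e)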




> it's distressing how often seemingly carefully prepared materials contain errors that can be spotted in a 5 minute reading.

It really is remarkable how many mistakes like this happen. One I've seen a lot is not knowing how percentages work, particularly when it comes to changes between them. Going from 10% of something up to 15% is not a 5% increase--it's a 50% increase (or a rise of 5 percentage points).
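A two-line sketch of the distinction, using the comment's numbers (variable names are mine):

    old_pct, new_pct = 10, 15                        # 10% of something -> 15%
    points = new_pct - old_pct                       # change in percentage points
    relative = (new_pct - old_pct) / old_pct * 100   # change of the percentage itself
    print(f"+{points} percentage points, +{relative:.0f}% relative increase")
    # -> +5 percentage points, +50% relative increase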


That still trips me up from time to time. I'm not sure why anyone ever uses percentage increase to communicate anything.

Just say the quantity increased "from 0.0015 to 0.002" or something.

It's particularly prevalent in news announcements of scientific studies. "New drug decreases x by 350%!"
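One way to get the best of both is to report the endpoints and the relative change together; a quick sketch (describe_change is a name I made up):

    def describe_change(before, after, unit=""):
        # Give the raw endpoints for scale and the relative change for magnitude.
        rel = (after - before) / before * 100
        return f"from {before}{unit} to {after}{unit} ({rel:+.0f}%)"

    print(describe_change(0.0015, 0.002))   # from 0.0015 to 0.002 (+33%)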


Putting aside the "darker" forms of communication whose intent is to hide, obscure, or mislead, there are manifold good reasons why well-intentioned communicators use quantities (e.g. statistics) differently.

One significant underlying reason is that different authors make different value judgments and want to tell different narratives.

For a given set of values and narrative, there are many questions to consider, such as:

- Should a change be summarized as a single scalar?

- As a difference? As a division? As a percentage change? (Read https://en.wikipedia.org/wiki/Relative_change_and_difference for many ways to do it; you might be surprised at the number of ways)

- Should the initial and final quantities be specifically mentioned?

- But what are the appropriate "initial" and "final" quantities? These are tied into the goals (above).

Is a relative change best expressed as a ratio, a percent change, or a log change? The linked article covers these and more.
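To make the menu concrete, here is a sketch of the usual candidates (function name mine; see the linked article for the full zoo):

    import math

    def change_summaries(initial, final):
        # Several standard one-number summaries of the same change.
        return {
            "difference": final - initial,                       # absolute change
            "ratio": final / initial,                            # "1.33x as large"
            "percent_change": (final - initial) / initial * 100,
            "log_change": math.log(final / initial),             # symmetric and additive
        }

    print(change_summaries(0.0015, 0.002))
    # {'difference': 0.0005, 'ratio': 1.33..., 'percent_change': 33.3..., 'log_change': 0.287...}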


Percentages are useful because they communicate scale without deep context. Who knows what we are talking about at all with regard to 0.0015? What units are we using? Most people don't have the context necessary to evaluate what it means that we started at 0.0015 foos and are now up to 0.002. But the percentage abstracts most of that away: on some metric we care about, this intervention increased the output by 33%. That immediately suggests the scale of the effect, without requiring anyone to walk out into the weeds.

Obviously this can be and constantly is abused, but it is undeniably useful.


If everyone in the room implicitly understands all that elided context, sure. But I think there are very few contexts in which stripping away all context is useful when the audience doesn't actually understand it. Then you're not really just streamlining things, you're removing all meaning. Is 50% more Foo a lot? Who the hell knows. But it sounds like a big deal.


Or trying to figure out the basis for the percentage: 90% faster than what? And how about a percentage of percentages? The APR will go up by 10%. Condition A saved 10%, but Condition B saved 50% more. (Is that 15% then?)
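Spelled out with concrete numbers (the 5% starting APR is my assumption; both readings below are defensible, which is the problem):

    apr = 5.0                       # suppose the APR starts at 5%
    print(f"{apr * 1.10:.1f}%")     # 5.5%  -> "up by 10%" read as a relative change
    print(f"{apr + 10:.1f}%")       # 15.0% -> "up by 10%" read as percentage points

    saved_a = 10.0                  # Condition A saved 10%
    print(f"{saved_a * 1.5:.1f}%")  # 15.0% -> "B saved 50% more", relative to A's 10%
    print(f"{saved_a + 50:.1f}%")   # 60.0% -> "B saved 50% more" read as points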


My pet peeve is that something that used to be 200 units and is now 300 units is accepted to be “150% larger!” despite that being nonsensical mathematically. (IMO, it's "50% larger" or "150% as large" but not "150% larger".)
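In code form, with the comment's numbers:

    old, new = 200, 300
    print(f"{(new - old) / old:.0%} larger")   # 50% larger
    print(f"{new / old:.0%} as large")         # 150% as large
    # "150% larger" would mean old * 2.5 = 500 units, which is not what happened.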


Just solve this one, and you will love percentages forever:

Is x% of y the same as y% of x?


This is a great trick to use if you're trying to find out a percentage mentally and get stuck. For example, if you can't immediately come up with 14% of 50, it turns out that you can just do 50% of 14 and arrive at the same answer thanks to the commutative property.
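The identity is just commutativity of multiplication, since both sides equal x*y/100; a one-liner to convince yourself (pct is my name for it):

    def pct(x, of):
        return x * of / 100     # x% of y and y% of x are both x*y/100

    assert pct(14, 50) == pct(50, 14) == 7.0
    print(pct(14, 50))          # 7.0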


In my head I convert percentages into simple multiplication.

So 14% of 50 becomes 1.4 * 5 == 7

Or to simplify even more, digit by digit…

1 * 5 = 5

4 * 5 = 20 -> remove the “0” to get 2 (since the 4 is really 0.4)

5 + 2 = 7


Is that easier than half of 14?


Percentages were created for clickbait.


The issue is that, especially with very small percentages, the absolute change (in percentage points) often matters more to the discussion than the relative change, particularly when we're talking about risk.

In particular I remember a number of breathless articles about COVID vaccine side effects that talked about a 50% increase in incidence--from 2 in 100,000 to 3 in 100,000*. That's not something the average person needs to factor into their risk model, and headlining it as a 50% increase makes it seem more significant than it is.

*not the actual numbers, but around the right order of magnitude, fit to the example.
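With the comment's illustrative rates, both framings side by side (variable names mine):

    base, new = 2 / 100_000, 3 / 100_000   # illustrative incidence rates
    relative = (new - base) / base * 100   # the headline number
    absolute = (new - base) * 100_000      # extra cases per 100,000 people
    print(f"+{relative:.0f}% relative, but only {absolute:.0f} extra case per 100,000")
    # -> +50% relative, but only 1 extra case per 100,000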


More generally, the slightly gauchely named "Street-Fighting Mathematics" http://streetfightingmath.com/

http://web.mit.edu/6.055/

* Divide and conquer

* Abstraction

* Symmetry and conservation

* Proportional reasoning

* Dimensional analysis

* Easy cases (plugging simple values into complex formulas)

* Lumping (discrete approximations of continuous functions)

* Probabilistic reasoning

* Springs (approximate complex systems as simple oscillators)
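As a taste of one technique, here is lumping on a classic example (my rendition, in the book's spirit): approximate the bell-curve integral of e^(-x^2) from 0 to infinity as a single rectangle of height 1 (the peak) and width 1 (roughly where the integrand has decayed to 1/e), then compare against the exact value sqrt(pi)/2.

    import math

    # Lumping: replace the smooth decay of e^(-x^2) with one rectangle,
    # height 1 (its maximum) and width 1 (where it falls to about 1/e).
    lumped = 1.0 * 1.0
    exact = math.sqrt(math.pi) / 2    # known closed form, ~0.886
    print(f"lumped={lumped}, exact={exact:.3f}, off by {abs(lumped - exact) / exact:.0%}")
    # -> lumped=1.0, exact=0.886, off by 13%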


A similar thing that drives me nuts is when people report quantities that are only meaningful as a flux over some time period, like tax cuts or investment costs, and they just report the dollars. Is the tax cut 1 trillion USD per year, per decade, or per century? "It's 1 trillion USD" on its own is a meaningless statement. If we switch to AWS, we save how much per month? Not just "how much".
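A trivial sketch of the fix (annualized and the ten-year window are made up for illustration):

    def annualized(total_usd, years):
        # A headline total only means something once you attach the window.
        return total_usd / years

    print(f"{annualized(1e12, 10):,.0f} USD/year")   # 100,000,000,000 USD/year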



