In addition to being nicely divisible, 72 has an important advantage which most people don't realize: It's on the right side of 100 ln(2).
The exact "rule" for N% interest is N log(2) / log(1 + N/100), which has Taylor series 100 log(2) + (log(2)/2) N + O(N^2) ≈ 69.315 + 0.34657 N - 0.0005776 N^2 + ...
For N approaching 0, the exact "rule" becomes the "rule of 100 log 2"; for larger N it increases slightly. The "rule of 72" is exactly correct at ~7.84687% interest, and for 15% interest it only gets as far as a "rule of 74.4".
That said, the power series gives us a way to get a significantly more accurate result: Divide the annual percentage interest rate into 832 months, then add 4 months. For any interest rate between 1% and 40%, this result will be accurate to within 3 days.
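A quick way to check these figures (a sketch assuming annual compounding; the month counts follow from the series above):

```python
import math

def exact_years(rate_pct):
    """Exact years to double at rate_pct% annual compound interest."""
    return math.log(2) / math.log(1 + rate_pct / 100)

def rule_of_72(rate_pct):
    return 72 / rate_pct

def month_rule(rate_pct):
    """Divide 832 months by the rate, add 4 months; answer in years."""
    return (832 / rate_pct + 4) / 12

for r in (1, 8, 15, 40):
    err_days = abs(month_rule(r) - exact_years(r)) * 365
    print(f"{r:>2}%: exact {exact_years(r):6.3f}y  "
          f"72-rule {rule_of_72(r):6.3f}y  "
          f"month-rule error {err_days:.1f} days")
```

Running this confirms the month-rule error stays under 3 days across the whole 1%-40% range.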
Maybe I'm obtuse but I'm not sure I understand what you mean by the "right side". You definitely want to use a number that's slightly larger than 100 ln(2) ~= 69.3 to account for the linear factor, but is there some inherent reason to use 72 rather than say 70 or 74, other than assuming that 8% is some useful midpoint for the type of growth rates you're likely to be interested in?
72 is better than 70 or 74 because it's divisible by more small factors. My point was that it's fortunate that 72 is slightly more than 100 log 2 rather than slightly less; for "how long will it take for money to triple" (100 log 3 = 109.86) taking the closest number with lots of divisors would lead you to take a "rule of 108" but since 108 is slightly less than 100 log 3 instead of slightly more, it will produce larger errors for the common range of interest rates.
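To see the asymmetry numerically (a small sketch; the formula is just the exact "rule" from earlier in the thread):

```python
import math

def exact_rule(multiple, rate_pct):
    """Rate times the exact years to grow by `multiple`."""
    return rate_pct * math.log(multiple) / math.log(1 + rate_pct / 100)

# Doubling: the exact constant at 8% is just over 72, so 72 works well.
print(exact_rule(2, 8))   # ~72.05

# Tripling: the exact constant at 8% is ~114.2, well above 108,
# so a "rule of 108" undershoots noticeably at common rates.
print(exact_rule(3, 8))   # ~114.2
```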
I never understood how ten percent errors, when applied to decades by creatures who expire in decades, were acceptable. This is a great approximation trick, but as with most things, please understand its derivation.
Most of the time when I see the "Rule of 72", it's in Excel spreadsheets. I passed on investing in an infrastructure project by changing a Rule of 72 calculation to:
It's a great trick for approximating a value with mental maths. When my boss says he thinks the team can double revenues over 5 years, I can pretty quickly see what sort of year-on-year growth he's magically assigned (72/5 ≈ 14.4%).
When you're in a goddamned fancy calculator, you do the full form calculation because what the hell. I mean, I don't understand why you'd be deriving NPV from "years to doubling", but you can sure as hell get it exact.
(Also, as someone who's never actually invested, do people really judge investments on that number rather than internal rate of return, payback period, etc etc?)
> do people really judge investments on that number rather than internal rate of return
Internal rate of return (IRR) is the rate at which the net present value (NPV) of a set of cash flows is zero. NPV is, in essence, a scaled IRR. If you have a budget of capital to deploy, NPV is preferred. For example, if you have a $100MM budget with a cost of capital equal to zero and two investment options: a $10MM project yielding 20% and a $100MM project yielding 10%, and you cannot purchase parts, only wholes (see the appeal of securitisation?), then IRR would guide you to the former while NPV to the latter. When dealing with securities or any other sufficiently quantised investment, IRR becomes a good enough approximation.
You say IRR is the rate, but not every project has a unique IRR, e.g. if there are negative cash flow periods interspersed with positive cash flow periods.
Your assertion about NPV being useful when you have a budget confused me. If you have no budget then NPV is useful. You just do every possible NPV-positive project.
If you do have a budget, as in your example, you could not pick between those two projects without considering some missing information (the length of time). You can't just pick the highest IRR project, or the largest project. The NPV of your larger project with a slightly lower yield could be lower than that of the smaller, higher yielding project. Consider if the larger one had a length of 1 month, and couldn't be repeated, whereas the smaller one would take 5 years. Or you could flip it around. The information you gave is not enough to choose, regardless of whether or not you're budget-constrained.
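To make that concrete, here's a toy NPV comparison; the five-year and one-year durations below are invented to fill in the missing information, and swapping them flips the ranking:

```python
def npv(cash_flows, discount_rate):
    """Net present value of cash_flows[t] received at the end of year t."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

r = 0.0  # cost of capital, as in the example upthread

# $10MM compounding at 20%/yr for 5 years, paid out at the end
small = [-10] + [0] * 4 + [10 * 1.2 ** 5]
# $100MM returning 10% after one year, not repeatable
large = [-100, 110]

print(npv(small, r))  # ~14.9: the smaller, higher-yield project adds more value
print(npv(large, r))  # 10.0
```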
> Also, as someone who's never actually invested, do people really judge investments on that number rather than internal rate of return, payback period, etc etc?
Given that it made the difference between a profitable and unprofitable investment, I'm guessing that whoever was selling the investment just kept fiddling with the payoff metrics until they found a way to make it look attractive.
It boggles my mind that this would appear in a spreadsheet. Even if the result holds, you should head for the hills. I've found it to be (at best) a cheat to mentally understand compounding.
It does get taught, but I think it's one of those things that people don't internalize until they've been on the wrong side of it, or just get a little older (mid-20's). Aka a maturity thing.
Strange, so it's not taught in the USA? I always assumed compound interest to be a basic topic taught universally. I learned about it probably in 7th or 8th grade.
I think every time I've heard someone say 'this should be taught in public school' it's in reference to something that already is taught in public school. The problem is we don't internalize things that we haven't had to experience yet very well, and we forget.
On the flip side of your point, I find myself taking weird positions on issues of what should be in curricula. It seems to me that most people carry around a mental model of schooling such that schools just pour knowledge onto a hard drive that then retains it for the rest of that person's life. But this doesn't stand up under five minutes' examination if you look around in the world. All those polls about how 60% of $FIRST_WORLD_COUNTRY can't even name the number of $SUBDIVISION they have (state, province, whatever)? They all went to school, too. Is it really a crisis if we remove something from the curriculum that 90%+ of 40-year-olds can't recall anymore? Curricula debates should be grounded in reality, not some fantasy where we apparently spend 12 years downloading a couple dozen megabytes of text into people's brainharddrives and set the read-only bit.
I'm with you on this. I would rather have schools spend more time on problem solving and critical thinking, and less on memorizing facts that can be looked up later.
The major obstacle is that it is much more difficult to teach thinking than it is to teach facts. So it's harder to find, train, support, and retain teachers who can do that well.
"Common Core" math (the stuff you might see mocked on Facebook) is trying to move toward teaching kids how to think mathematically, rather than memorizing equations and formats. But it's hard going (again, see the mockery on Facebook), and may not actually be the right direction.
I think you might be right and I was wrong, or at least didn't put it correctly.
Compound interest should be taught in terms of examples people will likely use. So when they first get a credit card offer at a certain APR, or a chance to invest, they immediately pull out the rule of 72 and do some quick estimath.
I can't speak for anyone else, but in my case at least, that's how it was taught. I remember the teacher going through problems on the chalkboard - "you want that brand new stereo for $150 (this was the nineties) and you decide to put it on your credit card and then make the minimum payments. What's that stereo actually going to cost you?"
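That chalkboard problem is easy to sketch in code; the 20% APR and $5/month minimum below are assumed figures, not the teacher's:

```python
def minimum_payment_cost(balance, apr, payment):
    """Months and total paid when making a fixed monthly payment."""
    monthly_rate = apr / 12
    months, paid = 0, 0.0
    while balance > 0:
        balance *= 1 + monthly_rate   # interest accrues first
        amount = min(payment, balance)
        balance -= amount
        paid += amount
        months += 1
    return months, paid

months, paid = minimum_payment_cost(150, 0.20, 5)
print(months, round(paid, 2))  # 42 months, ~$210 for the $150 stereo
```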
In addition, it's printed on my credit card statements. There's a little box that prints the total time and total cost if I make minimum payments, and if I make payments that are some larger round number.
Same, however I also taught a math course at a world renowned business program and the students (college freshmen) absolutely hated any stuff to do with compound interest. They were much happier with integrals.
Most likely. My experience with learning math in the US is that it's all taught in a very boring manner and they generally don't care about showing you how it's applicable to daily life, so it seems abstract and useless to most students. It was only slightly better in university.
Yes! Perhaps we should have some kind of district-invariant ("standardized") test to verify that students are learning vital technical knowledge before the school recognizes them as having achieved sufficient mastery for a diploma -- and the school as having accomplished one of its critical functions.
(Actually, that's exactly what we do, and when it fails, people complain about "teaching to the test" and suggest dropping the tests instead of the systematic failure to achieve core objectives of the educational infrastructure.)
Compound interest is taught in school, but the important part that is left out is how it can affect things like school debt, housing debt, your retirement savings, etc.
Meh? I mean retirement savings: over the course of 30 years you can expect about double what you put in. "Oh, if you put in 5000 now, when you're 70 you'll have 10000" is not really that impressive. (That works out even if it's not a one-time sum; assuming you don't retire just before or after a market collapse, you're going to see about double what you put in. That's also assuming we're talking actual investments; the .01% interest on a savings account is just an insult.) It's basically the reverse with housing and student loans: you end up paying about double what you take out. A) You don't really need easy facility with compound interest to be made to understand that. B) "About double" is, despite Einstein claiming otherwise, not actually that powerful.
Credit card debt is really the only thing most people will interact with that has interest rates high enough to be worth worrying about, and like I said elsewhere, the principal on CC debt is going to have constant significant swings, which makes analysis entirely dependent on spending habits, not the interest.
But...those things aren't driving the economy. Those examples are mostly driving the profits of credit card companies; if people saved more responsibly and used their money responsibly, they could likely (and ironically?) afford to buy more goods, not fewer.
Instead of large chunks of their income going to credit card companies (via interest), it could instead be spread out among more stores.
For me, the "free" service of a credit card is not worth my neighbors being saddled with debt.
edit: That cartoon bothers me more the longer I think about it; I know it's supposed to be humorous, but does the author really think that lottery tickets are driving the economy?
>they could likely (and ironically?) afford to buy more goods
While that seems true it's messier than that. In your scenario people have to postpone consumption until they accumulate the cost of goods which would have a huge negative effect on the overall economy. The entire concept of interest rates is to smooth out this issue, rewarding those who can postpone consumption.
> edit: That cartoon bothers me more the longer I think about it; I know it's supposed to be humorous, but does the author really think that lottery tickets are driving the economy?
Well, Wall Street is pretty much a giant casino, so it's not too much of a stretch to say that gambling defines a large portion of our economy.
>For me, the "free" service of a credit card is not worth my neighbors being saddled with debt.
Fine. But that means you'll pay an average of 2-3% more for items.
>Instead of large chunks of their income going to credit card companies (via interest), it could instead be spread out among more stores.
Yes. They would gain under a new system. People who are responsible now would lose under that system. It's a transfer, with credit card companies taking a cut.
People go under on credit cards because they can't stomach the ascetic lifestyles their (newly worsened) financial situations demand, and credit lets them hold on to their standard of living a little while longer.
I don't think better education about how badly it harms them would help much. We see present upside much more strongly than long-term future downside (see: procrastination), even if we are familiar with the nature of that downside.
You don't get any rewards (1% is standard, plenty already give 2% back)?
You also need to take into account the bank's cost of funds, the cost of administering the account, the cost of fraud, and the cost of disputes (do you ever do a chargeback?).
I thought the cash equivalent of the rewards is charged to merchants on top of the fee? Square etc. shelter that from their merchants (which partially explains their losses quarter to quarter).
The marginal cost of servicing another person when they already have a consumer base of people with debt is very small (close to zero). I'm trying to say it's still profitable, but not where they make the bulk of the revenue.
Amex charges higher fees (many of their cards charge the consumer directly, and their merchant fees are higher) and doesn't charge interest on many cards (charge cards). Their business model is different.
Knowing how interest works probably won't stop people from racking it up.
Also, I remember spending around a month learning about interest and finance-related math somewhere around 8th grade, so at least in some US schools it was being taught over a decade ago.
I didn't write that they couldn't understand it. I wrote that it isn't being taught, so they don't understand it.
You could probably learn many things you don't already know quite easily; but I don't think "there are many things you shouldn't be doing" just because you haven't thought to learn them. For example: empathy.
I went to school in Delaware and Pennsylvania between 2nd and 12th grade and I don't remember this being taught (unless I missed the compound interest day in 1st grade ;))
Maybe it isn't being taught universally or well enough, or maybe it just needs to be reinforced more often.
You're misremembering, that's all. And things have changed since you were in that age bracket, anyway; the US common core standards mandate compound interest lessons as early as the seventh grade.
Spending all of your time with people in the top few percent, intellectually, tends to skew your perceptions of what's normal. Most humans struggle with concepts like compound interest. There's a reason hating high school maths is a trope, despite how trivial most of it is.
No offense intended to math teachers, but I think in the US that's as much because of ineffective math teachers & textbooks as students.
I realize self-selection bias in that it did work for me, but I went to a decent school and still had both wonderful and terrible math teachers.
Math to me is somewhat of a fractal, and if a teacher or textbook can't effectively answer "Why?" in addition to teaching blind computation then a good outcome is unlikely.
Who are these 'top few percent intellectually' and how does one tell them apart from how shall we say the 'intellectually challenged'? How does one join this group?
Is there someone, or perhaps a group, that decides who the top few percent intellectually are? Maybe a fraternity who initiate other members, who can then come on HN and declare with confidence that they are part of this special group, no validation required? Does one need to be a coder, or maybe just an HN reader?
This kind of hubris from a self-appointed intellectual elite is seen far too often on HN for comfort; apart from promoting small-minded bigotry, it sets up a dangerous kind of thinking, letting these self-appointed groups jump to flawed and silly conclusions about other human beings.
Billions of human beings pass exams testing basic maths like compound interest and much, much more every day. It's not an achievement of any kind. There are thousands of professions in the world, of which software engineering is just one.
Being a software engineer does not make you magically better than other human beings in any way. Is a civil engineer or a doctor allowed to think HN is full of idiots because they don't have a clue about their fields, or that they are an intellectual elite because of their specialised knowledge in their chosen field? How come this kind of thinking is rife here? How many engineers and doctors have you heard gloating about reading trade mags?
I am prepared to believe that the majority of HN readers are in the top 20%ish globally, but the top 1-2% is a stretch. Tying that to a particular skill is, of course, ridiculous, but I think a lot of us here have consistently performed at the top of our class or at the top of our paygrade in the working world (and when we haven't, we were probably already in a very selective environment). I imagine the same can be said of doctors and civil engineers, most of whom probably understand compound interest anyway.
I don't understand this mentality. Top 20% of what? For what purpose? How is it measured: wealth, education?
The ability to educate oneself is a privilege of one's environment, background and upbringing; the ability to do well in one specific area could be a measure of one's interests, provided one has the freedom and means to follow it. None of these is a measure of one's intelligence. Individuals could go on to distinguish themselves in their respective fields and then could be part of a group of, say, Nobel winners or some other specific measure.
This 20% seems completely arbitrary and self-serving, putting you in a group simply because of your specific interests or chosen profession. This is not in the least scientific or valid, and there appears to be no useful purpose other than hubris. The bigger problem is why the great need to feel better than others and clutch at shadows? This is how bigotry works.
Surely every human being has the ability to learn and be adept at what they do given the right conditions and this is a problem the world grapples with, to provide these conditions.
An individual struggling with compound interest is an opportunity for those who know, to try to teach them in a way they can understand, not an opportunity to deride and run them down as idiots. That's juvenile and mean spirited.
I don't think there will ever be a scientific method for measuring or ranking intelligence, and I understand that life circumstances play a huge role in being able to acquire the social markers of intelligence. Still, some people are clearly smarter than others. Saying "top 20%" is a reasonable shorthand for "decently above average," which I think is about as accurate as we can get outside of people like von Neumann who could recite novels that he had read years previously. (Note that I'm not saying something like "people who can't graduate high school are dumb." There are plenty of smart high school dropouts and plenty of dumb Ivy League graduates.)
There are people around here who believe that their interest in computer science is one of the things that makes them smarter than other people, but that is not a belief I am defending. In fact, I believe that is a small minority of people here. Most of the posts I saw in this thread were criticizing the education system that leaves many people without an understanding of compound interest, not calling those people idiots.
Most people are comfortable with the idea that some people are at the bottom of the intelligence heap. It is scientifically verifiable that some individuals are unable to remember new information or identify patterns. Why shouldn't this be true at the top of the heap as well? Based on my life experiences, I believe that my intelligence is significantly above average, although I realize that I have been aided by fortunate circumstances. However, I am not strongly invested in the idea: I am more interested in being happy, in being a disciplined worker, and in having strong relationships with people than I am in being a smart person. Why are you so invested in the idea that no one is smarter than anyone else?
Yes, and even if the claim was reliable it still wouldn't mean much to me because IQ is just one aspect of being an intelligent person. Even with the dubiousness of the claim, though, I think we can probably agree that there are some individuals who are off-the-charts intelligent (von Neumann is my example of choice).
Compound interest gets taught in like middle school. The same time we get taught percents and stuff.
That said, "compound interest" and any sort of interest math is kind of futile if, like a credit card, the principal is constantly changing. It's also a bit hard to scare people into frugality by telling them that if they put these steaks on credit they'll have to pay $12 more later. The numbers don't get scary until well into "they already knew they shouldn't do it" territory.
I think the concept is easy to grasp, but seeing the impact of 30% with actual numbers is very different. It makes the financial impact more comprehensible.
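For instance, here's a sketch of an untouched $1000 balance compounding at 30% (annual compounding is a simplification; real cards compound monthly on a shifting balance):

```python
balance = 1000.0
for year in range(1, 6):
    balance *= 1.30
    print(year, round(balance, 2))
# By year 5 the debt has nearly quadrupled ($3712.93), roughly matching
# the rule of 72's prediction of a doubling every 72/30 ≈ 2.4 years.
```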
I just read about the Rule of 72 in a book titled The Intelligent Asset Allocator: How to Build Your Portfolio to Maximize Returns and Minimize Risk by William Bernstein.
It's the one thing I remembered from my high school US Government teacher. (Or was it Sociology? Same teacher, two classes. What's up, Mr. Swigert?) I always thought it was really cool that he taught us that and it stuck. It had nothing to do with the lesson of the day, he just did it.
Glad to see this rule come up on a tech site. This is Finance 102 stuff and it's good for everyone to know.
Just in case this hasn't been said elsewhere, the rule is easiest to apply for rates that divide 72 evenly, and it's most accurate for mid-range rates (it's exact near 8%). So medium numbers, for lack of a better way of saying it: not 50, not 1, but 12, 8, etc.
It's good to have a feel for the rate of increase when reading news articles or having a vague idea what an interest rate really means, just like knowing scientific notation helps you get a quick feel for the difference between a government program costing 550 million and one costing 5.3 billion, compared to a country's GDP of 1.2 trillion.
By the way, betterexplained.com is where I finally, after memorizing my way through just enough calculus to get a BS in comp sci, began to understand calculus. Seeing how the area of a circle was derived was wild - I'd easily memorized all that stuff, but never understood. We wait way too long to begin teaching calculus concepts.
For a great many business discussions, a close enough answer that's technologically unaided is way more effective than a precise answer that takes an extra 45 seconds and the incorporation of a smartphone (or one person per smartphone) into the discussion.
I think of it as akin to the fluidity of a conversation between two native speakers versus a conversation with a translator in the room.
I can do calculus in my head, but I struggle with arithmetic. My colleagues who can do fast mental math but can't do any algebra to save their lives end up looking much more competent in meetings.
The rule of 7% is more potent and easier to understand.
It tells you that if a market grows by 7% annually, it's going to move more volume in any ten-year span than in all the years before, combined.
For example, if a country's electricity consumption grows by 7% YoY, it's going to consume more electricity in (any) ten years than it ever consumed before that, grand total.
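A quick check with steady growth shows this rule is approximate: with discrete yearly sums the next decade comes out just under all prior history (1.07^10 ≈ 1.97, slightly shy of 2), while continuous 7% growth (e^0.7 ≈ 2.01) just clears the bar:

```python
rate = 1.07  # 7% year-on-year growth; consumption in year y ∝ rate**y

history = sum(rate ** y for y in range(-300, 1))    # everything up to now
next_decade = sum(rate ** y for y in range(1, 11))  # the next ten years

print(next_decade / history)  # ~0.97: roughly equal, just shy of the claim
```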
I fail to see how you can compare the two rules of thumb. For starters, the rule of 72 is simply talking about doubling (not about aggregate consumption), and the rule of 72 isn't fixed to a single interest rate (7%).