If your job pays you, say, $50/hour, but you have to spend one additional hour each month implementing this system (figuring out what to buy, where, and when; managing coupons; gaming the system; etc.), that alone costs you $50 in opportunity time, cancelling out some of the supposed savings. The problem gets worse if it takes multiple hours per month, or if your wage is higher.
In short, I think the whole extreme min-maxing effort makes the most sense if you have a very low-income job and the most important thing to minimize is your dollar outflow. Most professional programmers should not have to do this.
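To make the break-even arithmetic above concrete, here is a minimal Python sketch; the wage, hours, and savings figures are hypothetical placeholders for illustration, not numbers from the article.

```python
# Back-of-the-envelope break-even check for an extreme grocery-optimization
# scheme. All figures below are assumptions for illustration only.
hourly_wage = 50.0       # value of an hour of your time, in $/hour
hours_per_month = 1.0    # extra time spent planning, couponing, etc.
claimed_savings = 75.0   # hypothetical grocery savings per month, in $

opportunity_cost = hourly_wage * hours_per_month
net_benefit = claimed_savings - opportunity_cost

print(f"Opportunity cost: ${opportunity_cost:.2f}/month")
print(f"Net benefit:      ${net_benefit:.2f}/month")
# With these numbers the scheme nets $25/month; at two or more hours of
# effort, or a higher wage, the net goes negative.
```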
But for folks who already don't watch mindless TV, or who have other projects or investments that monetize better, the argument still holds. For example, instead of putting an extra hour into your day job, you could put an hour into building a new product or service that earns you back more than his extreme food-cost-reduction scheme would have saved. Or you could learn a new skill.
But who are we kidding? The extra hour will surely be spent reading HN. :)
That assumes any new product I could make, or new skill I could learn, in the time this takes would earn me back more than I would have saved. I'm not sure that's a given. Time doesn't tend to slice up so neatly.
Nicely said, and I totally agree. I just like to point out examples of the opportunity cost fallacy. After all, the opportunity-cost argument also overlooks the maxim that 'a penny saved is a penny plus taxes and expenses earned'.
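As a rough illustration of that maxim (the 30% marginal tax rate here is an assumption, not a figure from the thread): a dollar saved is worth the larger pre-tax amount you would have had to earn to keep that dollar.

```python
# 'A penny saved is a penny plus taxes earned': pre-tax income needed to
# keep a given amount after tax. The 30% marginal rate is assumed.
marginal_tax_rate = 0.30

def pretax_equivalent(amount_saved: float) -> float:
    """Pre-tax earnings required to net `amount_saved` after tax."""
    return amount_saved / (1 - marginal_tax_rate)

print(f"${pretax_equivalent(1.00):.2f} earned ≈ $1.00 saved")  # ~$1.43
```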