I've never liked how people say that statistics is useful but calculus is not. I do not believe that you can actually understand statistics without understanding at least some calculus. So much of statistics is about areas under curves!
The problem with this is that the first classes in calculus are usually focused on continuous functions, which don't really exist in statistical datasets. The math has a lot in common, but most people don't really see or use that to their advantage, as evidenced by the literature on "transfer of learning".
Have you actually studied calculus-based probability/statistics, though? Your comment sounds characteristic of my own former thinking from when I had only taken an algebra-based intro stats course (AP Statistics) and hadn't yet learned it the calculus-based way a few years later.
There is a lot of cool stuff you miss out on in the basic stats course because of having to dumb it down to avoid the calculus. Some I remember off hand:
- the proof of the central limit theorem, which gives the shocking result that if you sum several independent uniform distributions you get rapidly better approximations of the normal distribution, whose density is proportional to exp(-x^2/2) once standardized. This central result is the foundation of statistical sampling. It's why, in real life, if you see something follow a normal distribution you can guess it is probably caused by a moderate-to-large number of roughly independent factors, and vice versa. This is genuinely useful, but if you don't know it you won't miss it
- the Poisson distribution, which relates the mean time between events to the probability of seeing a given number of them in a fixed interval. Obviously very applicable to a lot of real-life things
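For what it's worth, both of those claims are easy to check by simulation. Here's a minimal Python sketch (sample sizes, tolerances, and variable names are my own choices, not anything canonical):

```python
import math
import random
import statistics

random.seed(0)  # make the simulation reproducible

# CLT demo: the sum of 12 Uniform(0,1) draws has mean 6 and variance
# 12 * (1/12) = 1, so (sum - 6) is already approximately standard normal.
N = 100_000
sums = [sum(random.random() for _ in range(12)) - 6.0 for _ in range(N)]

mean = statistics.fmean(sums)          # should be close to 0
stdev = statistics.pstdev(sums)        # should be close to 1
within_one = sum(abs(x) < 1 for x in sums) / N  # normal gives ~0.68

# Poisson demo: if gaps between events are Exponential with rate lam,
# the count of events in a window of length t is Poisson(lam * t).
lam, t, trials = 2.0, 1.0, 100_000
counts = []
for _ in range(trials):
    clock, n = random.expovariate(lam), 0
    while clock < t:
        n += 1
        clock += random.expovariate(lam)
    counts.append(n)

p0_empirical = counts.count(0) / trials
p0_theory = math.exp(-lam * t)  # Poisson gives P(N = 0) = e^{-lam * t}
```

Running this, the sum-of-uniforms histogram matches the normal distribution remarkably well even at n = 12, and the empirical probability of a quiet window agrees with the Poisson formula.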
Also, pretty much every fancy formula you learn in Stats 100/AP Stats that looks weird but is very useful can be derived and proved using calculus. Without calculus you just have to take it on faith, and you may not get as intuitive an understanding of why it's true or what the terms signify.
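As one concrete example of the kind of derivation you skip without calculus: the mean waiting time of the exponential distribution (the waiting-time law behind the Poisson process) drops out of a single integration by parts:

```latex
E[X] = \int_0^\infty x \,\lambda e^{-\lambda x}\,dx
     = \left[-x e^{-\lambda x}\right]_0^\infty + \int_0^\infty e^{-\lambda x}\,dx
     = 0 + \frac{1}{\lambda}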
The same is the case in basic physics. No, V does not equal IR in general, nor does F = ma. Those are the simplifications they give us so they can explain a reduced version of the theory. The correct equations have derivatives in them, and are thus differential equations.
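For instance, Newton's second law is really a statement about a derivative of momentum, and circuit elements beyond an ideal resistor are defined by derivatives:

```latex
F = \frac{dp}{dt} = m\,\frac{dv}{dt} \quad \text{(for constant } m\text{)},
\qquad
i_C = C\,\frac{dv}{dt},
\qquad
v_L = L\,\frac{di}{dt}
```

F = ma is what's left when mass is constant, and V = IR is what's left when the circuit contains only resistors.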
Look, nobody needs calculus, but nobody needs to read, either. After all, you could hire someone to read everything out loud to you. All knowledge is like this.
I hate to pull rank, but I've seen this enough to realize that almost everyone saying "statistics but not calculus" is simply ignorant of what calculus (or analysis) entails. It is true that the classes as taught focus on continuous (smooth, actually) functions, but data need not be continuous for you to use calculus; otherwise the field would be useless in real-life applications and wouldn't be part of high school education, the way number theory isn't.
If there is any issue, it is in how calculus is taught: I feel there is too much focus on symbolic manipulation. The algebra essentially prepares you to take a physics course, and that's about it. The underlying concepts, however, lead you to things like optimization and approximation (which is fundamentally what calculus is anyway), and that needs to be communicated to students somehow.
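To make "optimization and approximation" concrete: Newton's method is maybe the simplest place where a derivative does real computational work. A minimal sketch (the function names and tolerances here are my own, not from any particular textbook):

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method: repeatedly follow the tangent line toward a root of f."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)  # the derivative tells us how far to move
        x -= step
        if abs(step) < tol:
            break
    return x

# Approximate sqrt(2) as the positive root of f(x) = x^2 - 2.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
```

The symbolic manipulation (computing df by hand) is the small part; the concept, that the derivative is the best local linear approximation, is what makes the iteration converge so fast.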
I actually do a lot of discrete math (numerical analysis) and statistical analysis for work, and completely agree that there is too much emphasis on symbolic manipulation in school math. That said, I'm trying to address the reality of what exists in high schools, and in that reality a little extra discrete statistics and a lot less continuous calculus seems like a good trade-off to me.
Alright, sorry to assume less of you. It is a sentiment across the thread, but you do know that this is all analysis, and that calculus is just analysis? Really, the calculus of continuous smooth functions is a special subset of calculus.
The thing I vaguely remember is that when I was first taught calculus, we "took limits" by hand, numerically, including derivatives; then we learned the formulae and spent the rest of the time on busywork like difficult trigonometric integrals and integration by parts. The thing is, as you go on to proof-based classes such as real analysis, you come back to the original concept and learn that it was the important and actually useful piece after all, since most of life's data cannot be well modeled by analytic solutions you can write down.
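That "taking limits by hand, numerically" is also exactly what you do on real data, where no closed-form derivative exists. A tiny sketch of the central-difference version (step size h is a judgment call I'm making here, not a universal constant):

```python
import math

def num_deriv(f, x, h=1e-5):
    """Central difference: the 'limit taken by hand' made concrete.
    Approximates f'(x) as (f(x+h) - f(x-h)) / (2h)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Sanity check against a known answer: d/dx sin(x) at 0 is cos(0) = 1.
d = num_deriv(math.sin, 0.0)
```

Swap `math.sin` for an interpolation of your dataset and the same two-line idea becomes a practical tool; no symbolic manipulation required.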
I think this is what I contend the problem is. Unfortunately, I don't have much contact with people who actually teach high school calculus, but almost every mathematician and physicist I know (apart from the theorists, maybe) agrees with me: at the end of the day, there is a lot of value in the concepts underlying calculus, because they are general and inform both the naive models of data in your head and, eventually, the statistical and numerical (read: computational) models that vastly more people use. The trigonometric substitutions are much less useful, really only mattering if you're going to become a theoretical physicist (or at least get a physics degree, where you'll need to do derivations).