Here's another exercise (resp. exam question) that tests understanding: given a sketch of a curve in a graph, roughly sketch the derivative (or integral). The number of otherwise good students who go "but I can't do the derivative without the formula?" suggests we need more questions like this.
I remember there being a distinct lack of "closing the loop" on concepts in university math.
Day 1 of the class: The derivative calculates the slope of a function
Day 2: The integral calculates the area under the curve of the function
Days 3-89: Rote exercises differentiating and integrating increasingly obscure functions
Day 90: Final Exam
Spending a few days at the end re-exploring the "big picture day-1" to tie together all of the various strands of knowledge you accumulate over the semester would have made all of it so much more effective.
It's interesting because I was going to argue with you until I applied the same idea to a subject I couldn't care less about if I tried in school, but now think is deeply fascinating: History.
See, I loved math, so all through calculus I could easily keep the big picture in mind. Every time I practiced an integral, I imagined curves and the areas under them, the visual problem, the relevance of what the curve represented in real life, and the massive number of real-world applications, and it all lit up my neurons like fireworks in the sky.
(Then I went into computer science, and wound up never needing calculus again, based on the type of work I happen to be doing, but alas)
But where I really would have wished a constant reinforcement of the "big picture" is History. Cuz it always seemed so pointless and useless. Why are we studying these old dead people, and everything they did. Who cares? They're old, and they're dead, and nothing they did matters to us anymore.
Until you grow up, and go from your 20s to your 40s, and suddenly realize oh shit we're living THROUGH history. We're creating history NOW. We're making choices, and we're making mistakes just like those old dead people in history. Old dead people that weren't really any less developed or evolved primates than us. Just equally victims of their circumstance like us, and also agents of change like us.
Funny enough, I ended up with the opposing view - history doesn't matter, narratives do. During the last year I learned much more about propaganda, psy ops and myths than I knew before. And at the risk of sounding like an edgy teenager, I can vouch for the phrase "History is written by victors", which stopped being a platitude for me and became a part of my worldview.
Frankly, this is spoken like somebody without a history education.
Not only would "narratives matter" be an uncontroversial statement among historians, they'd tell you that all historical writing is narrative construction. And they'd hand you a book on historiography and teach you about the methods that historians use to understand and honestly present narratives in their writing.
Note that this does not mean that historical writing is bullshit. A lot of engineers seem to come up against these observations in the humanities and then just assume that nothing can be done and that entire fields must be discarded while the people working in those fields have been living with this stuff for their entire careers.
I prefer the existentialist "history is written by those who write history." Often, the victors give themselves a better justification for their conquest than admitting that someone bigger and badder may eventually dethrone them. However, some losers write histories praising their new overlords and their incredible military might, not out to gain favor with the new king but to preserve the dignity of the old king. The defeat was inevitable because of the invader's futuristic military powers aided by the Gods (instead of admitting the old king had no strategy).
You're not wrong but what's the difference between history and narratives?
"History" is the study of different narratives to TRY to come to a semblance of truth, but even for recent events this is almost impossible.
Pick an example like "Did the US dropping the nuclear bomb on Japan ultimately save lives, or was it unnecessary" and it's impossible to find the truth between 2 conflicting narratives, each fairly justifiable.
Another one is, most Soviet Anti-American Propaganda was true.
The real history is often a horrible mess; narratives are built by carefully cherry-picking facts to support a worldview. This is similar to what happens with mass media - they don't need to lie, just present only those facts that support their position.
I think history is potentially even easier. Everyone asks "why?" about things being the way they are, or can be interested in it if you remind them that things could be otherwise. History is the answer to "why?" for a huge number of things with both small and huge life and death consequences. Put another way, "why" is a good framing for a lot of big picture history stuff.
I wish I could give this comment gold. The lack of real-world application in lower-level calc straight up put me off math, with an ADHD brain craving meaning and substance.
I learned basics of calculus as a teenager making animations with a pirated copy of Adobe After Effects. It’s a motion graphics package where you can animate any property of an element over time using both absolute values and velocity curves (i.e. the derivative). It shows both curves next to each other, so you can tweak either and see how the other changes.
Seeing graphics animate according to a derivative that you just plotted yourself is really useful to develop practical intuition about what it means.
After Effects is too expensive and complex for high schools, but maybe some kind of modern Logo-style environment that combines coding and animation could be useful for calculus beginners. (And linear algebra too — another field where the basics have a direct intuitive application in computer graphics.)
The problem is that most teachers cannot come up with questions that go outside the small number of cases which the student trains on. I mean, coming up with fundamentally new questions is very hard work once the low hanging fruit is gone.
That or the point of the class is to prepare for a test set by someone other than the teacher (such as in UK A-levels), where you know the questions will mostly be of the form "derivative of 3x^2-5x+1" or similar.
Intuition tells me e^sin(x) would look similar to an ordinary sinusoid, except its range would be between e^-1 and e, and its shape would not be as smooth as a sinusoid's. I have no idea what the shape would look like, and I'm a visual learner when it comes to mathematics.
I think most of these questions are not measuring intuition per se, but rather whether the tested person has previously seen such functions plotted on a graph.
Either that, or my mathematical intuition has got rusty from years of code monkeying.
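For anyone who wants to settle the e^sin(x) question above by actually plotting it, here is a minimal sketch, assuming Python with NumPy and Matplotlib (the plotted range is an arbitrary choice):

    import numpy as np
    import matplotlib.pyplot as plt

    # Plot sin(x) and exp(sin(x)) one above the other to compare their shapes.
    x = np.linspace(0, 4 * np.pi, 1000)
    fig, (ax_top, ax_bottom) = plt.subplots(2, 1, sharex=True)
    ax_top.plot(x, np.sin(x))
    ax_top.set_title("sin(x)")
    ax_bottom.plot(x, np.exp(np.sin(x)))
    ax_bottom.set_title("exp(sin(x))")  # range [1/e, e]; peaks sharper, troughs flatter
    plt.tight_layout()
    plt.show()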
In fact, I was taught the graphical intuition via a few days of these sorts of exercises, before being introduced to the formulas. It worked really well, at least for 9th grade students in an acceleration program at the University of Minnesota (UMTYMP).
That's one of many ways to check one's answer. The whole point is to be able to intuit a rough sketch of what the graph should be, not to use software tools to build a pixel-perfect render of the plot.
I recall a similar exercise on my AP Calc exam from years ago.
Instead of sketching the derivative based on the graph of a function, we had to sketch the function based on a table of data which described the function as well as its first and second derivatives in terms of value, existence, and sign at various points and intervals.
My school teacher gave us this exact exercise several times. He always made sure to link abstract concepts to real applications, as well as showing us how the mathematical concepts were discovered.
(note: this was not in the US, but in the early 2000's in a small European country)
I did well in high school math. These days, when something involving algebra, trigonometry, geometry etc comes up I feel like I have a good understanding of it but my calculus seems weak to non-existent. I'm not sure if it's how I was taught, how I studied it or something else but calculus always seemed like a huge step change in difficulty.
That said, I love how this article gives practical hints on how to replicate the insight and solve the question, rather than just the insight itself.
I find personally that my math ability is set to approximately 2-3 levels "below" the highest level math I completed, and I've seen it hold for others. I have an applied math bachelor's, so I've taken analysis, dynamical systems, and other high-level math classes, but I find that the stuff I actually remember at a level to pass undergrad exams is up to linear algebra, or maybe a little more advanced. Of course if I were to relearn it, it'd be much faster, but years of being a software engineer have caused me to forget all that stuff.
I was literally about to write this same comment. I did physics so I finished some group theory and complex analysis, but 20 years later all that stuff is gone because I never applied it in other classes. Only the stuff that kept coming up, like calc 1/2 and first order diffeq, really stuck.
With that said, does anyone know of a good method to relearn math efficiently? I found it to be really hard to self-learn any math topic, most books repeat everything from the basics at the beginning like what a set is, and then suddenly turn into ultra-advanced with “the proof is trivial” all around.
I think this experience is typical of self teaching from a math textbook. It's extremely difficult to find a good book that leaves no gaps while simultaneously explaining everything that might be difficult to understand. The key thing when encountering this for me is to expand my horizons and begin looking for videos or other supplementary materials. A teacher would show you the proof, or at least help you along the way, the internet can be used for this as well, up to a point.
While it's frustrating and time consuming, self teaching a difficult subject is just like that unless you're a god amongst men (and few of us are). Sometimes you'll want to fight through the strange unproven thing by thinking hard for a couple weeks about it while googling intermittently to find key steps. Sometimes you'll have to give up on it and keep moving. If it's foundational then fighting through can be highly beneficial, but a lot more things are presented as foundational than actually are.
I'd recommend finding good books by searching relentlessly on reddit and other forums for opinions, dedicating the time necessary to self teach something difficult (it can take upwards of a year to get through a smaller textbook if you have other things in your life going on), and if you really want it then fight for it. Give it everything you have, really let the problem consume your thoughts because eventually you'll wake up at 3am and know exactly what to do. And finally, move on if you don't want to do that. Try just keeping moving. Review from time to time but don't let a hard first couple chapters prevent you from ever learning the concepts. Or you could find something you want to know and work backwards through every term that's used until you're at a concept and then attempt to apply it to the larger idea.
In general, self teaching math is extremely difficult, and only really works if you're willing to dedicate the time to fight through ideas.
This is a great comment. To add on to a point that really aligns with my experiences:
> "Review from time to time but don't let a hard first couple chapters prevent you from ever learning the concepts."
This is a very good approach, and I wish I started doing this earlier. Even in my university math courses, the professors sometimes skipped ahead to have students focus on a few later chapters before coming back, or told the class to skip several pages in the book. I also found that working on later exercises in a textbook would sometimes help me better understand concepts introduced in earlier chapters.
Lastly (though this may not be completely relevant to studying mathematics), I've been taught in various language courses (explicitly for audio courses and implicitly for in-person university courses) that it's okay to move ahead if I know at least 80% of the material. The percentage may be higher for studying math topics, but especially for someone self-learning out of interest or for a specific application, it's much preferable to move forward and revisit earlier exercises as needed, instead of quitting the book. If you find yourself getting lost in later chapters, there is no problem with revisiting earlier chapters. You'd also likely be no worse off (possibly even better) than many undergraduates studying the textbook for a course for the first time.
The most important thing is just to not quit the habit of consistent study. Perfectionism in understanding is a pitfall for self-directed studies, which consistency in studying beats every time.
> it's okay to move ahead if I know at least 80% of the material.
One of the worst feelings is moving on in a textbook and realizing you are indeed totally lost.
Here’s my own list of necessary but not sufficient conditions for deep learning:
- Motivation. Easy to overlook; hard to get if you don’t already have it.
- Frequent experience of “I have no idea how to solve this,” followed by hours or days of playing with the problem, followed by a eureka moment. You can’t be sure you’ve learned the thing unless you’ve constructed the solution yourself. Builds confidence too.
- Seeing the same material in different contexts or presented in different ways. It’s like looking at an object from different angles.
And for bonus points:
- Teach the concept to a curious friend. Their questions will lead you to deeper understanding.
One of my professors used to say that “even a horse can do derivatives. Integration is the real deal”, another one said that you integrate by “look at it, deeply, deeply, deeply; and then solve it”.
The point is, many parts of high school math are actually really "algorithmic". I was one of the few in my class who absolutely loved coordinate geometry over "normal" geometry, because I simply felt really comfortable with equations: once you have the equation down, you can basically solve it, even if it is harder than the "notice this and that" elegant solution.
Most integration problems require this intuition-based solution which has a certain elegance to it.
It was especially humbling to me that Wolfram Alpha fails at most of the interesting calculus problems I encountered during my analysis classes, yet after a while I managed to solve most of them. But that skill unfortunately does disappear after not using it for a time.
I actually hit this personally, because right up UNTIL calculus, math was the Feynman method for me. Everything always just clicked, made perfect sense, and I saw the beauty in it, and it was great fun.
Then for calculus, we learned concepts, like what a derivative is, and I understood that, and understood it conceptually (as in, what everything "means" and what it tells you), but I could never take that conceptual knowledge to the practice problems with me.
I could follow along as the professor walked us through a problem, showing us what heuristics helped and what patterns to follow and how to manipulate the functions to get to something that followed one of the patterns to pull an answer out of your ass, but I could never carry those heuristics and patterns over to novel examples. It's weird because I was great at doing the exact same thing for physics: taking a novel and purposely opaque problem and finding which pattern it corresponds to.
To be honest, we would probably better serve our students in general by presenting integration as a numerical approximation, doing enough symbolic stuff to demonstrate that some integrals can be solved that way, doing enough other stuff to demonstrate why there are a ton of integrals that have no closed-form solution in terms of conventional functions, and then moving on to more productive things rather than blowing a full semester grinding out integrations. Integration by parts is useful as a method for exploring the concept more deeply, and there are a few other such tricks to be used primarily for ensuring the concepts are understood better. But I'm not convinced there's a lot of value in all these integration tricks.
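A minimal sketch of what "integration as a numerical approximation" could look like for students, assuming Python with NumPy (the test function is an arbitrary choice, not something from the article):

    import numpy as np

    def trapezoid(f, a, b, n=10_000):
        """Approximate the integral of f over [a, b] with n trapezoids."""
        x = np.linspace(a, b, n + 1)
        y = f(x)
        h = (b - a) / n
        return h * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)

    # Sanity check against a known answer: the integral of x^2 on [0, 1] is 1/3.
    print(trapezoid(lambda x: x**2, 0.0, 1.0))  # ~0.3333333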
Derivatives are friendly enough that I feel like a certain amount of grinding is justifiable, and it's justifiable as practice for symbolic manipulations in general. But we're leaving a lot of useful stuff on the table while we're jamming down how to integrate with trig identities and other such things.
But it's all pie in the sky anyhow. The Curriculum Must Not Be Changed. The Curriculum Is Perfect. Nothing Can Be Dropped From The Curriculum. I don't know what miracle would have to be worked to get people to reconsider the curriculum from some sensible perspective of what students should be taught rather than the way that question happened to be answered about 100 years ago when the curriculum froze into place, but it probably involves the total destruction of the school system at this point. I can't even get people to process the idea that shoving incomprehensible combinations of 450-year-old words in what is effectively another language at children and telling them this is High True Art is a bad idea, what chance is there of prying away the utterly vital fact that cos(θ/2) = SqRt((1 + cos(θ))/2) out of The Curriculum?
Maybe if colleges continue dropping the SAT and the ACT we can start actually fixing these curricula.
Most integration problems require numerical methods. The ones with an analytic solution are just the ones where someone at some point found a way of solving them.
I think this is really important for good teaching. It's not enough to show the student how to solve the problem. One needs to also show the student the patterns of thought that could have led them to the solution. And it's not enough to show how _someone_ could have been led to the solution, one has to show how _this particular_ student could have figured it out, knowing what they know and being who they are.
I'm a professor at a big university in western Europe. Students don't want to be led to thinking of the solution. They want the same exercises that they did during the tutorial, and they want to know in advance how to solve all the exercises. Any attempt to digress from this is met with glassy eyes.
I'm in pretty much the same boat re: calculus, but I think a lot of it has to do with problems just like this. For me, early in my experience with calculus I always looked for the "graph it out"/non-calculus solution. So problems like water leaking out of a bucket, rocket acceleration, and other integrals where the underlying process is in some way linear always fell to non-calculus-based analysis. And thus when I got to problems where actual calculus was required, my non-grounding in the basics pushed me toward rote memorization which (of course) didn't stick over years of non-utilization.
This reminds me of an exercise I'll never forget from my Math Methods course: finding the derivative of arcsin(x).
It seems almost impossible because, just looking at it, there seems to be nothing you can do to simplify it. Then, out of sheer nothing-else-to-do-ism, you take the sin() of it and realize sin(arcsin(x)) = x. Take the derivative of both sides, apply chain rule and draw a right triangle and you have the answer.
Like the words the author uses for the integral, it's all valuable technique.
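Spelled out (standard steps, not a quote from that course), the derivation described above goes:

    \frac{d}{dx}\,\sin(\arcsin x) = \frac{d}{dx}\,x
    \;\Longrightarrow\; \cos(\arcsin x)\cdot(\arcsin x)' = 1
    \;\Longrightarrow\; (\arcsin x)' = \frac{1}{\cos(\arcsin x)} = \frac{1}{\sqrt{1 - x^2}},

where the last equality is the right-triangle picture: if \sin\theta = x with hypotenuse 1, the adjacent side is \sqrt{1 - x^2}.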
One technique for finding the derivative of the sin function is to find the derivative of arcsin first. The arcsin function can be expressed as the area of a certain figure, and therefore admits an expression as an integral. From this, the derivative of arcsin is immediate. Finally, apply your technique to arcsin(sin(x)) = x and obtain the derivative of sin.
A similar technique finds the derivative of exp(x) from ln(x), by defining the latter as the integral of 1/x.
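Sketching that second example in the same spirit (the standard construction, stated loosely): define

    \ln x := \int_1^x \frac{dt}{t} \;\Longrightarrow\; (\ln x)' = \frac{1}{x},

and if y = \exp(x) is the inverse, i.e. \ln y = x, then differentiating both sides gives y'/y = 1, so y' = y = \exp(x).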
The OP uses the derivative of sin to find the derivative of arcsin. My point was that the other way round is sort of natural as well. Thought I'd share.
The broader point is that you can obtain all the classic definitions and identities of the elementary functions [1] by combining only (i) the four arithmetic operations, (ii) integration, (iii) taking inverses of functions, and (iv) the constants 0 and 1. I guess that's a bit cool.
This works fine over the real numbers. Over the complex numbers, this might require Riemann surfaces to work. e^z in complex analysis is usually defined by its Taylor series, instead of as the inverse function of the natural log (whose domain is actually a Riemann surface).
In turn, this reminds me a bit of a calculus problem I saw freshman year of college that I ended up sharing with my old high school calculus teacher because it similarly looks intractable until you make a "simplification". The problem was to find the integral of `x/(x + 1)` (I forget the exact bounds, presumably from 0 to x with respect to x). The trick is that this is the same as `(x + 1 - 1)/(x + 1)`, which you can then split into `(x + 1)/(x + 1) - 1/(x + 1)`, which you can then integrate much more easily.
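Written out (standard algebra, ignoring the exact bounds, which the commenter doesn't recall):

    \int \frac{x}{x+1}\,dx
      = \int \frac{(x+1) - 1}{x+1}\,dx
      = \int \left(1 - \frac{1}{x+1}\right) dx
      = x - \ln|x+1| + C.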
But that's not the usual form. Can you show that the denominator is sqrt(1 - x^2)? Doing that requires you make the same observation as in the parent post.
> Can you show that the denominator is sqrt(1 - x^2)?
That's a trig identity that is easily derivable: draw a right triangle with hypotenuse 1 and one side being x. Its one angle will be arcsin(x) by definition. Then by Pythagoras the remaining side is sqrt(1-x^2), which is the cosine of that angle.
I guess GP's point was that the implicit differentiation trick by using the inverse is generalisable and not only restricted to trigonometric functions. Obviously, which simplifications you can then additionally make will depend on the function in question.
Current Calc 2 student here. I would be braindead approaching this problem honestly, I don't think I'd even know how to begin; I'm hoping that's normal.
Why would the exponent x/2 - floor(x/2) be equal to x/2 on the interval [0, 2)? And how does the graph of x/2 - floor(x/2) imply anything about the behavior of e^(x/2 - floor(x/2))? I'm hoping I just haven't learned enough yet?
It's definitely a tricky problem for a student. The reason that everyone likes it is that this is exactly the kind of problem you run into when you need to solve an integral in the real world.
Once a year at least I run into a math situation like this. Obviously in some professions it will be much more (or less) often.
Exponents, floor, ceiling, and absolute value are very frequently part of the problem.
Graphing the function, breaking it into components, and seeing if any of them are periodic are all important steps toward a solution (more so than the symbolic manipulation, because that might either be a big mess or even unavailable).
Often you'll end up using numerical methods to approximate the solution, but if you can come up with a closed form solution that's much nicer.
> Often you'll end up using numerical methods to approximate the solution, but if you can come up with a closed form solution that's much nicer.
Also, even with the example in the post, thinking about the problem first and exploiting its periodicity, rather than naively integrating numerically over the entire range, affords much faster computation. It might not be specific to this example, but utilising such symmetries can sometimes turn a numerically intractable problem, or one that requires days or hundreds of gigabytes, into something one can do on a laptop in hours.
Another way to think about it: floor rounds down, so there are intervals of numbers which all have the same floor, so you can break the problem up into intervals where floor(x/2) is constant. On [0, 2), floor(x/2) = 0, on [2, 4), floor(x/2) = 1, on [4, 6), floor(x/2) = 2, etc. So in each interval you are integrating e^(x/2) multiplied by e^0 or e^-1 or e^-2 etc. Effectively the same integral 1000 times, and the different limits and constants balance out in the end (try it to see that).
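Written out, assuming the integral in the article is \int_0^{2000} e^{x/2 - \lfloor x/2 \rfloor}\,dx (which is what the "1000 times" remarks here suggest):

    \int_0^{2000} e^{x/2 - \lfloor x/2 \rfloor}\,dx
      = \sum_{k=0}^{999} \int_{2k}^{2k+2} e^{x/2 - k}\,dx
      = \sum_{k=0}^{999} e^{-k}\,\Big[\,2e^{x/2}\,\Big]_{2k}^{2k+2}
      = \sum_{k=0}^{999} e^{-k}\cdot 2\,(e^{k+1} - e^{k})
      = 1000 \cdot 2\,(e - 1) \approx 3436.56.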
I think this kind of problem is less about maths and more about how one might approach an unfamiliar problem. Not knowing where to begin is normal. What you're looking to do is to build up the intuition for how you can break down the problem into smaller pieces so that you can investigate its properties.
x/2 - floor(x/2) is the natural place to start because it's the smallest independent piece of the equation. Take a couple of minutes to plot this on a graph for a small range of values, like 0 <= x <= 6 (deciding what range to check is also part of your problem solving skillset).
With this, you can calculate and sketch out e^(above result) on a graph. Finally, knowing the principle that a definite integral calculates the area under the curve, you should be able to use your sketch to reason out how to calculate the entire original integral.
Hopefully you can see how solving this kind of problem isn't about knowing anything about this particular problem, but simply investigating it without any prior expectations, which is why the author thinks this is an interesting exercise for students.
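If you want a numerical sanity check of that sketch, here is a throwaway script, assuming SciPy is available and that the article's integral runs from 0 to 2000 (as other comments here suggest):

    import numpy as np
    from scipy.integrate import quad

    f = lambda x: np.exp(x / 2 - np.floor(x / 2))

    one_period, _ = quad(f, 0, 2)  # the integrand is just e^(x/2) on [0, 2)
    # Sum the smooth pieces [2k, 2k+2] so quad never has to cross a jump.
    full = sum(quad(f, 2 * k, 2 * (k + 1))[0] for k in range(1000))

    print(one_period)           # ~3.4366, i.e. 2*(e - 1)
    print(full)                 # ~3436.6
    print(2000 * (np.e - 1))    # closed-form value, for comparison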
You don't need graphs or area at all, of course. All you need is to notice that the frac(x/2) expression is a periodic function that is most easily integrated piecewise, and each piece is easy to integrate with a change of variables, or a handwave to that effect.
It's actually gnarly to write out a formal proof as a new student would do (it requires the principle of induction to handle all the pieces), but easy for an expert to breeze through as trivial.
That makes it a bit of an unfair problem for students trying to follow the rules of math. This is a very common challenge for students making the transition to higher math, when they are taught rigorous proofs but before they learn that professional mathematicians are rarely rigorous (except when there is disagreement about the truth of an "obvious" claim).
I was guilty of the other extreme - I had a very hard time understanding the symbol pushing, so I tried to find numerical tricks all the time, which didn't work out too often.
I would classify a trick as something that happens to work but isn't rigorous. Like treating dy/dx as a fraction sometimes works, but only under certain conditions.
In my mind, a trick is something that applies to unusual and specialised cases, whereas a method is something that can apply to a broad, well-defined class of problems.
What's your favourite example of where it doesn't work? Physics is full of quasi-infinitesimal quantities and I always like good counterexamples (ideally without invoking something like the blancmange function or similar...).
It basically works without issue in 1D, simply because dx and dy can be considered differential forms and dy is exactly dx times the derivative dy/dx. You can even put an integral sign in front of them and calculate the corresponding integral.
Where this doesn't work is if you have more than 1 dimension. Then you need to deal with the added complexity of integrating differential forms and the fact that in 2D you don't have df = (df/dx) dx but df = (df/dx) dx + (df/dy) dy. The chain rule also changes into a matrix product, rather than a simple dz/dx = (dz/dy)(dy/dx).
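For concreteness, the two situations side by side (standard formulas, nothing specific to this thread):

    \text{1-D:}\quad dy = \frac{dy}{dx}\,dx
    \qquad\text{2-D:}\quad df = \frac{\partial f}{\partial x}\,dx + \frac{\partial f}{\partial y}\,dy,

and for compositions the chain rule becomes a product of Jacobian matrices, D(g \circ f)(p) = Dg(f(p))\,Df(p), which only collapses to the familiar "fraction" form dz/dx = (dz/dy)(dy/dx) in one dimension.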
No, a trick is a shortcut with respect to the more tedious method. By definition anything that works also formally works, otherwise it wouldn't.... work.
First rule of integrating non-analytic functions: If they're analytic everywhere in the interval in question except a finite number of points, split the integral and compute it one analytic segment at a time!
(Second rule: If the function is non-analytic at an infinite number of points, you probably still want to compute it one segment at a time, but adding them back together afterwards may get messy.)
My introduction to calculus was “Calculus Made Easy” by Silvanus P. Thompson and I always liked math profs who actively worked to show math for what it is: useful, beautiful but not about the symbols or the jargon. “Any fool can calculate!” I think is what he says in the book.
I did some math in college and when I started knowing how to analyze the behavior of functions (and developing the mental math tools to imagine what they look like without having to actually draw them) that’s when I felt like I was kinda getting it
Great post! It really drives home the point that understanding the core concepts in calculus is way more important than just memorising formulas and mechanically applying them. The example problem shows how visualising and breaking down a seemingly complex integral can actually reveal its simpler underlying structure.
This reminds me of the need to be adaptable and versatile when tackling math problems, since relying solely on known techniques can limit your ability to solve more complex or unfamiliar problems.
Educators should help students focus on developing a deep understanding of math concepts and honing problem-solving skills, rather than just bogging them down in calculations.
Ha ha, sadly this can be transformed into a symbol manipulation answer as well. I know because this (stated slightly differently) is one of the questions in my 12th standard (senior year high-school equivalent) Mathematics I class.
You have to spot the period, but x - floor(x) is called the "fractional part of x" where I come from and is a named function which everyone is familiar with. Then, without knowing the area-under-the-curve interpretation, one can blindly apply another symbol-manipulation tool: the rule for summing an integral over its periods.
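The tool in question, for an integrand f with period T (standard statement, with the limits of the article's integral again assumed to be 0 and 2000):

    \int_0^{nT} f(x)\,dx = n \int_0^T f(x)\,dx, \qquad n \in \mathbb{N},

so here T = 2 and \int_0^{2000} e^{x/2 - \lfloor x/2 \rfloor}\,dx = 1000 \int_0^{2} e^{x/2}\,dx = 2000\,(e - 1).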
Floor functions trigger my fear instinct, but at least with this question I could just sit and visualize what the graph looks like, e.g. the basic sawtooth function from first year engineering.
GPT: The expression ⌊x/2⌋ represents the greatest integer that is less than or equal to x/2. It is called the floor function of x/2. For example, if x=5, then ⌊x/2⌋ = ⌊5/2⌋ = 2. If x is an even integer, then ⌊x/2⌋ = x/2. If x is an odd integer, then ⌊x/2⌋ = (x-1)/2.
I expect Google is stripping the ⌊ ⌋ brackets out as punctuation in the search, so that you're effectively only searching for "x2", hence the "x squared" results.
I was aware of the floor function (and the corresponding ceiling function) since I’m a software engineer. But I wasn’t aware that you could graph it. It never came up in high school or college math. And I never thought about it. Of course, it makes sense now that I’ve seen it.
No. I think the term “function” is overloaded here.
My point was that I viewed it solely as a _programming_ function and not a _mathematical_ function (even though it exists in math libraries), hence my last sentence “Of course, it makes sense now that I’ve seen it.”
Out of all the functions in math libraries that I’ve used, floor/ceiling are the only ones where I had this idea for some reason. It was obvious to me that Math.sin(x) and Math.abs(x) can be graphed. I’ve seen those graphs over and over again. But whenever I used the floor or ceiling functions, I just thought in terms of rounding up or down with a predetermined rule to finish whatever piece of code I was working on.
But as others have pointed out, I have actually worked with the floor function as a mathematical function in several math classes. I have seen the graphs. They just weren’t called floor or ceiling functions.
I just never made the connection that they were the same and I don’t recall any computer science professor or TA “bridging the gap” to what was learned in the math classes.
The thing I dislike about many maths problems (including many proposed in this thread) is taking the wrong initial approach can make it take forever. Finding the right trick to solve something can feel enlightening, but in my experience it feels mostly frustrating if you waste 10x the time by taking one wrong step in the beginning.
After watching Michael Penn's YouTube channel [1] for some time now (and he loves the floor function), I recognized what was going on - and wondered how I could prove this is 1000 times the simple integral beyond just stating it.
I loved maths in school and unfortunately didn't pursue it, but I solved this problem while sipping coffee and listening to Cornfield Chase, and I realised why I loved maths. The solution is so simple and so intuitive if you solve it with the graph.
This is surely a stupid question: In the article, the graph sure looks like a right triangle, with a base of 2 and a height of 1. Wouldn't the area under this curve (from 0-2) be ~1?
This is why I enjoyed doing math contests. You always got these problems that illuminated how things actually work, and the answer is always some set of basics applied to elegantly solve it.
I'm actually looking for a set of such problems, as I think it's a lot better than grinding out hundreds of quadratics or polynomial derivatives and such. I found the AoPS stuff already; I wonder if there are other good sources.
We used to get a lot of such tricky stuff during preparation for the IIT-JEE here in India, and I'm telling you, if you don't understand that the integral is the area under the curve, you can't touch most of the questions. But I get your point: if you are interested in such questions, you should check out IIT-JEE mathematics questions, you'll love them.
I am from a rural region where IIT-JEE is not that popular, and I liked working these physics and maths problems initially.
Unfortunately the competition has become so intense that you practically need coaching (which is expensive) and have to dedicate a lot of time, and at that point it becomes grunt work. There are many "tricks" and "shortcuts" taught in these coaching classes which don't exist in the normal NCERT syllabus. Needless to say, I didn't do very well.
It's floor(_) - as in, floor(1.999) = 1, but floor(2.001) = 2. If you look carefully, the upper flange of the [] square brackets is missing, which makes it a floor.
It's easy to fall into the trap of relying on rote memorization of integration rules, but problems like (⋆) force students to truly understand the concepts behind the math.
I don't think that's true. Floor is a piecewise function, so you follow the rule for integrating piecewise functions and break it into a sum of integrals of each piece, then follow the rules for those (they're all basically the same, so you don't need to do 1000 of them). You don't need to think about periodic functions at all.
> You don't need to think about periodic functions at all.
except you just described exactly what a period is... so it's actually good for a student to notice these things, and use the correct term, so that they can associate the name with the idea.
I don't think many would notice doing it that way, as it's disguised by the different limits and constant factors. The indefinite integrals are what is basically the same on the surface. There's more to periodicity than different sections being similar on the surface, and there's certainly no need for them to use the correct terms.
> If some expression looks complicated, try graphing it and see if you get any insight into how it behaves.
This is not always a good idea. Some functions have complicated behavior that makes them either plain hard to draw (e.g. sin(1/x) near 0), or reach very high values but also be near 0, or be otherwise tricky.
When I saw the equation referred to as (*), I had a flashback to those problem sets with *hard and **harder problems. ** problems often required some real out-of-the-box thinking. I wasn't always able to solve those, but it was so satisfying when I did (usually after an hour or two of struggle).
I also like labeling formulas with playing card suits. I think they are easier for the reader to distinguish when they are looking back for the labeled formula.
A while back I had the idea of marking erroneous formulas with a red spade. I'm going to try doing that again because it's hilarious.
Nope, good teachers understand when the question they wrote thinking to test certain concepts falls flat, because of some unforeseen oversight or quirk. Something being clear and intuitive to a math professor is not the same as something being clear and intuitive to the students.
The best professors strike and ignore questions that "failed", ie didn't accurately test what they thought they would.
>the person who wrote it knows that you have the knowledge to find the answer.
This isn't always true. The person who wrote it only knows what they taught, not what you have learned. The point of the test is to find what you have learned.
Not a very interesting nitpick then. I guess I should have written "knows that you should have the knowledge to find the answer". If we could teach and be sure that students had learned, we wouldn't need to test them.
I think this exercise is dumb. Not interesting, not challenging, not useful, not anything. I'd feel offended, if someone actually approached me with it.