
>We didn't invent fractions out of nowhere.

But we did. When mathematicians provided a rigorous definition of fractions (rational numbers), they separated them from the real world. Rational numbers do not exist in the real world. The real world does not have infinities. It does not have negative values. In the real world, 1/3+1/3 does equal 2/6 in the way the fourth-grader applied the analogy.

>The moment where you have to stop relying on intuition is a pretty delicate matter

I didn't argue that metaphors and analogies shouldn't be used. I argued that analogies and metaphors are intrinsically flawed, and this article provides a great example. At some point, you have to give up on the analogy and fall back on the underlying axioms. You can't do "1/3+1/3=2/6" not because it doesn't make sense for tables of boys and girls (because it does) but because it's against the rules for adding fractions.
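
To make "the rules" concrete: fraction addition goes through a common denominator, whereas the fourth-grader's move adds numerators and denominators separately, which is a different (also well-defined) operation that number theorists call the mediant. In LaTeX:

    \frac{1}{3} + \frac{1}{3} = \frac{1 \cdot 3 + 1 \cdot 3}{3 \cdot 3} = \frac{2}{3}
    \qquad \text{vs.} \qquad
    \operatorname{mediant}\left(\frac{1}{3}, \frac{1}{3}\right) = \frac{1+1}{3+3} = \frac{2}{6}

Both are legitimate operations on pairs of fractions; they just aren't the same operation, and only the first is what "+" denotes for rational numbers.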




> You can't do "1/3+1/3=2/6" not because it doesn't make sense for tables of boys and girls (because it does) but because it's against the rules for adding fractions.

It is pedagogically superior to take the route implied by the comments calling this a type error.

That is, if you teach the students to "type" all those fractions (e.g., 1/3 of this blue table, etc.), you gift them a tool they can use to map between the real world and basic unitless mathematical notation. (I'd even add explicit operator definition to that.)
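
To make that concrete, here is a minimal Python sketch of the idea (UnitFraction and join are hypothetical names of my own, not anything from the article): "+" is only defined for fractions of the same whole, and combining two different wholes is a separate, explicitly named operation.

    import math

    class UnitFraction:
        """A fraction tagged with the whole it refers to, e.g. 1/3 of the blue table."""
        def __init__(self, num, den, unit):
            self.num, self.den, self.unit = num, den, unit

        def __add__(self, other):
            # '+' means combining parts of the SAME whole; anything else is a type error.
            if self.unit != other.unit:
                raise TypeError(f"can't add a fraction of {self.unit!r} "
                                f"to a fraction of {other.unit!r}")
            num = self.num * other.den + other.num * self.den  # common denominator
            den = self.den * other.den
            g = math.gcd(num, den)  # reduce to lowest terms
            return UnitFraction(num // g, den // g, self.unit)

        def join(self, other):
            # Pool the parts and pool the wholes (the fourth-grader's operation).
            # Deliberately left unreduced so the pooled counts stay visible.
            return UnitFraction(self.num + other.num, self.den + other.den,
                                f"{self.unit} + {other.unit}")

        def __repr__(self):
            return f"{self.num}/{self.den} of {self.unit}"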

For example, such an educated student could hear your ascetic declaration that "it's against the rules" and quickly grasp something like the following:

1. "1/3+1/3=2/6" doesn't have any units, but it must somehow map to operations with units.

2. If unitless math can be applied regardless of units, then perhaps "1/3+1/3" may mean "1/3 of the blue table + 1/3 of the red table, where + means joining the two tables." That would equal 2/6 of the joined tables. But "1/3 of the blue table + 1/3 of the (same) blue table" would give 2/3 of that blue table, with + mapping to adding two fractions of the same table. (Both mappings are sketched in code after this list.)

3. 2/3 does not equal 2/6, so unitless math can't map to both operations.

4. macspoofing said that 2/6 is wrong.

5. Therefore, unitless fraction addition implies addition of things of the same units, and not joining two different things together and finding the new fraction of the new joined unit thingy.
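
Continuing the hypothetical UnitFraction sketch from above, the two candidate mappings in step 2 come out as different operations with different answers, which is exactly the disambiguation in step 5:

    blue = UnitFraction(1, 3, "blue table")
    red = UnitFraction(1, 3, "red table")

    # Mapping A: '+' joins two different tables -> 2/6 of the joined tables
    print(blue.join(red))  # 2/6 of blue table + red table

    # Mapping B: '+' adds parts of the same table -> 2/3 of that table
    print(blue + UnitFraction(1, 3, "blue table"))  # 2/3 of blue table

    # Mixing wholes without an explicit join is rejected outright:
    try:
        blue + red
    except TypeError as e:
        print(e)  # can't add a fraction of 'blue table' to a fraction of 'red table'

Since 2/3 and 2/6 differ, the unitless "1/3+1/3" can only map to one of them, and macspoofing's rule picks mapping B.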

If, on the other hand, a student of your apparent method of declaring rules for unitless math came to a class that had practiced explicitly mapping unitless <-> unit math, they wouldn't have any tools to understand the mapping. (Well, at least if the teacher made a similarly ascetic declaration regarding mapping.)

I offer into evidence this very article to show what happens when a student of your apparent method becomes the teacher and encounters the most trivial of unit -> unitless mapping errors.

Edit: clarification


>That is, if you teach the students to "type" all those fractions (e.g., 1/3 of this blue table, etc.),

Sure. You can certainly extend the analogy ad hoc to bring it in line with the mathematical rules. But at that point, you do hit a higher level of complexity. Your simple analogy is gone and you're slogging through the weeds. That was my point: analogies are flawed. The teacher started with a very simple rule that worked well and communicated the ideas under certain constraints, and then those rules were extended in a logical but incorrect manner by a fourth-grader ... and now the complexity that was hidden in the abstraction is leaking out.

Your explanation is more confusing to me and wouldn't be grasped by the vast majority of fourth-graders. At some point, simply stating that fractions have different rules is the simplest (and correct) explanation.


"But we did. When mathematicians provided a rigorous definition of fractions (rational numbers) they separated them from the real world"

This seems totally absurd to me. The concepts of proportion, ratio, etc. preexisted any kind of formal definition of fractions.

They may have invented a definition of fractions, with the correct notation and the correct set of rules, after a long series of trial and error (as with a lot of mathematical "rigorous definitions"), but they always had in mind that the concept they were trying to define should "work" when manipulating ratios, proportions, etc.

I may be wrong, but it seems to me that purely mathematical concepts spawning out of exploration of the pure mathematical world is a very modern (19th century at the earliest) concept.



