For Roman numerals to make sense you need to remove access to convenient writing tools. In a world without paper and pens, Arabic numerals are the ones that seem unwieldy.
I was on a multi-day hiking trip with a friend where we just had a pack of cards and some spiced rum to keep ourselves entertained in the evening. The first night we decided to play cribbage, but we didn't have paper or the like to keep score on.
At first I tried using a stick to scratch out numbers in the dirt. It was doable, but very awkward. The low light of the campfire made it hard to see, and I quickly ran out of undisturbed soil within arm's reach.
I then gathered a few twigs to shape into numbers, using the patterns you'd see on an LED clock. That worked reasonably well, but took more effort than I liked.
I switched to Roman numerals thinking that the simpler shapes would be easier to work with. This turned out to be true, but I discovered it had the added benefit of being really easy to increment numbers.
In most cases, incrementing is incredibly simple. To go from 0 -> 1 -> 2 -> 3, you simply add a stick each time. To go from 3 -> 4, you pinch together the bottoms of the 2nd and 3rd sticks. Then you take the first stick away, then later drop it on the other side of the V. Add another stick, cross some sticks, etc.
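(Not the stick version, obviously, but if anyone wants to play with the increment idea off the trail, here's a quick Python sketch of the same thing done symbolically, using the standard subtractive forms; the helper names are mine.)

```python
# A rough sketch (not the commenter's stick method): keep a score in standard
# subtractive Roman numerals and increment it by going through an integer.
VALUES = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
          (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
          (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n):
    out = []
    for value, symbol in VALUES:
        count, n = divmod(n, value)
        out.append(symbol * count)
    return "".join(out)

def from_roman(s):
    single = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
    total = 0
    for cur, nxt in zip(s, s[1:] + " "):
        # Subtractive rule: a smaller symbol before a larger one is subtracted.
        total += -single[cur] if single.get(nxt, 0) > single[cur] else single[cur]
    return total

def increment(score):
    return to_roman(from_roman(score) + 1)

assert increment("III") == "IV"    # pinch two sticks into a V, move the third
assert increment("VIII") == "IX"
```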
There are a lot of things that seem poorly done at first glance, but make total sense when you understand the environment they were developed in. I've heard younger folks wonder why old TV shows are so poorly written. When you've always been able to watch on demand (or rent episodes on DVD), it's hard to understand the restrictions on writing when there was no way for your audience to watch previous episodes if they hadn't seen them when they had aired. And of course this is true for most software I've ever worked on.
I like your thought experiment to illustrate why Roman numerals make sense for the time. I think you're right! It is more practical given the tools than Arabic numerals.
To respond to the hiking story: I would opt for binary if I were in the same situation. I think it's a superior system to use, especially out in the woods. It's hard to read for folks unexposed to binary, but it lets you encode a large range of numbers with only a few items. For example, a twig rotated 0 degrees could signify 0, and a twig rotated 90 degrees could be 1. Five twigs gets you 0 through 31!
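If it helps to see the encoding spelled out, here's a tiny Python sketch of the twig reading (my own illustration; the "|" / "-" notation is just made up for this):

```python
# Hypothetical illustration: read five twigs as a 5-bit number, leftmost twig
# as the high bit.  "|" = upright twig (0), "-" = twig rotated 90 degrees (1).
def read_twigs(twigs):
    bits = "".join("1" if t == "-" else "0" for t in twigs)
    return int(bits, 2)

print(read_twigs("|||||"))   # 0
print(read_twigs("|-|-|"))   # 0b01010 = 10
print(read_twigs("-----"))   # 0b11111 = 31, the most five twigs can show
```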
To bring it back to the Romans... I realize Binary wasn't commonly used in the Ancient World and that is something I find a little baffling. Base 2 seems like it would be more useful in a pre-pen and paper world, especially when you have to etch something. The closest thing I can find is the Inca Quipu ("talking knots") but even that encodes base 10 numbers into base 4 knots.
If base 2 were widely used by human beings, our ten fingers could have encoded 1024 distinct values.
Gray code takes a lot of work to learn. It's not very likely that people would have discovered it as a way of counting on their fingers (or with other objects) prior to the development of other number systems. If they had, it would have been extremely hard to teach, learn, or verify that someone was using it properly, without other number concepts to use as a reference!
I learned to count in binary on my fingers in high school, and I can confirm that it's more physically challenging than counting the conventional way, and also harder to remember and communicate numbers without translating them into language.
Although you can only represent 10 values instead of 1024 values, remembering one of those values is also correspondingly about 100 times easier.
Plus, it's easy to teach the direct correspondence between fingers and objects even to children or to illiterate or innumerate adults. We can imagine people using fingers this way for a straightforward matching task: "I saw this many dogs!" "I'll give you this many melons!"
And then it leads to separate "uninterpreted" names for each number of fingers that a person can hold up. After that there's a natural route toward place value with units of either one or two hands.
Even the shapes of the Roman numerals may have originated from chunking "hands" as units: V as a single hand of five, and X as two hands (two V's joined point to point).
(although the Romans did use further multiplicative combinations of 5s and 10s, they didn't come up with implicit ways to continue the process indefinitely)
> To go from 0 -> 1 -> 2 -> 3, you simply add a stick each time. To go from 3 -> 4, you pinch together the bottoms of the 2nd and 3rd sticks.
The subtractive part of Roman numerals is an innovation the Romans generally didn't use. (Which is why clocks say IIII and not IV.) It adds significant complexity for no benefit.
Hadn't seen those before! Yeah, I suppose they are, although could you call Roman numerals decimal? I don't know enough about the technical definition to say. I'm assuming "decimal" and "base 10" are synonymous, and that the base is the number of distinct digits you have to work with. Roman numerals don't really work that way, though. Man, I remember when I first learned about different bases; it took me a bit to wrap my head around the idea that base 10 is completely arbitrary. It doesn't make more sense than the other bases, it's just what we standardized on. Now I'm realizing that maybe bases themselves aren't some rule of the universe.
It's not a decimal system, no. The decimal system is a positional system (just like binary, hexadecimal, sexagesimal ...). In a positional system, part of the information of a symbol is in its position in the whole string: each symbol is a digit from 0 to (base - 1), and its position tells you which power of the base it multiplies. That's why every base is base-10 when represented in itself.
However, most human languages encode numbers as an algebraic expression: "one {times} thousand {plus} three {times} {one} hundred {plus} thirty {plus} eight" = 1338. Some languages are more strict/consistent in this than others. English has its quirky unique words for 11 and 12 (instead of one-ten and two-ten), Danish and French enjoy adding multiplications and divisions to the mix (due to retaining more of their 20-based origin): "quatre-vingt-dix-sept" = 4 * 20 + 10 + 7 = 97, "halvtreds" = halvtredsindstyve = (3 - 0.5) * 20 = 50.
The Roman numeral system works likewise; it's just a shortened form of saying the number. Which means you need a symbol for every word for a magnitude, and your expressible range is limited to the words your language has for ever greater quantities. But in our daily lives most humans rarely need to precisely number more than, say, a few tens of thousands of things. Beyond that "myriad"/"a lot"/"uncountable" suffices.
The simple form of the Roman system is just additive: MDCLXVI. You could add subtraction: XC instead of LXXXX to make it faster to write. But then you're already on the path to a positional system.
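As a rough illustration of the difference (my own sketch, not anything from the thread): in a positional system a digit's value comes from where it sits, while the spoken and Roman forms spell the expression out symbol by symbol.

```python
# Sketch of what "positional" means: a digit's contribution depends on where
# it sits in the string, digit * base**position.
def positional_value(digits, base):
    return sum(d * base**i for i, d in enumerate(reversed(digits)))

print(positional_value([1, 3, 3, 8], 10))  # 1338
print(positional_value([1, 1, 0, 0], 2))   # 12; same digits, different base

# The spoken form is an algebraic expression instead: "quatre-vingt-dix-sept"
print(4 * 20 + 10 + 7)                     # 97
```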
Romans used a device called a "calculator" which was a flat plate with vertical channels cut or pressed into it. It resembles what we would now call an "abacus".
get rid of the subtractive rule, and Roman numerals are just a way of writing the state of a calculator, with each letter representing a single calculus (counter), and the value of the letter representing which column it is in.
most abacuses are base 10, with a special segment in each column for a stone that represents "5" in that position.
Thus, each roman numeral is either a power of 10, or half a power of 10.
with that context in place, the method of writing it isn't a positional system, exactly, but it's a way of serialising a positional system by naming each position explicitly.
edit:
here's an image of a replica calculator; note how the columns are labeled.
edit2: it's also easy to see in this context how this relates to our system of money, and to coin values and names such as "cent"; if Roman numerals aren't decimal then neither are coins.
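To make the "serialising a counting board" idea above concrete, here's a rough Python sketch; the column layout and helper names are my own guesses at the scheme, not a historical reconstruction:

```python
# Rough sketch of "serialising the counting board": each letter is one counter,
# and the letter itself names the column the counter sits in.
ORDER = "MDCLXVI"                               # columns, highest value first
VALUE = {"M": 1000, "D": 500, "C": 100, "L": 50, "X": 10, "V": 5, "I": 1}

def board_to_numeral(board):
    """board: dict mapping column letter -> number of counters in it."""
    return "".join(letter * board.get(letter, 0) for letter in ORDER)

def numeral_to_value(numeral):                  # additive form only, no IV/IX
    return sum(VALUE[letter] for letter in numeral)

board = {"M": 1, "D": 1, "C": 4}                # 1000 + 500 + 4 * 100
print(board_to_numeral(board))                  # MDCCCC
print(numeral_to_value("MDCCCC"))               # 1900
```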
> English has its quirky unique words for 11 and 12 (instead of one-ten and two-ten)
Of course, consistency with other number names above 20 would lead to "ten-one" and "ten-two", not "one-ten" and "two-ten"; or consistency with other number names below 20 would lead to "oneteen" and "twoteen", or some variant (as with "thirteen" in place of "threeteen").
OK, valid point about whether it's fair to call Roman numerals decimal. They're similar because all the symbols (except I) are multiples of 5 and 10, but it's not really the same thing as true base 10.
Swiss Jass games (which typically require adding up three digit scores until you reach a four digit goal) use a specialized writing system that is easy to increment as well:
Normally you have a board with pegs for each player, and rounds have you move your pegs up a number of holes for your score. First to 121 wins. Increments are anywhere from nothing to 29 (excluding 19, 25-27).
Personally I'd just make a few differently sized sticks/rocks to represent 5s, 1s, and 30s. Maybe 2s since those are common. Maybe -1s as well, and consolidate if I want during shuffling/dealing. (30s being convenient because if you lose by multiples of 30 it counts as a skunk / an extra loss.) (Edit: Actually I might just make a board in the dirt with sticks to designate every 5 points, like a real board, and each player just puts down 1-5 rocks in the segment they are in. But that gets dangerously close to just finding 120 small rocks... On the plus side, that's almost enough rocks for one side in a game of Go.)
People invented positional number systems that were also easy to write and increment well before Roman numerals. Those look good on monuments but that's about it.
Duh... because we all have decades of experience working with the former representation and almost no experience at all working with the latter one.
If you wrote a number using arbitrary different symbols and (say) base 14 instead of base 10, but still a positional number system, we would still find it incredibly difficult to interpret written numbers, because we wouldn’t be used to it.
Indeed we find exactly this happens when people who are not experienced programmers try to read hexadecimal numbers written using digits 0123456789ABCDEF, and that's even in a case where we are deeply familiar with all of the symbols and their order, and in a very straightforward power-of-2 base.
Romans or medieval Europeans had absolutely no problem reading numbers written like the latter example. Indeed, this is a much more straightforward and intuitive representation, and is very easy to interpret for anyone used to working with a counting board, since it is a direct translation of the positions of counters on the counting board to symbols written on paper. [Edit to add: Note that Roman numerals were not a system for practical arithmetic – people didn't use writing for this, instead calculating by moving counters around on a counting board – but only a written record of input data and calculated outputs.]
I really hate how much people bash on Roman numerals without any significant experience working with them, and without any discussion or even acknowledgement of the historical or practical context.
I agree that it's a sort of weak argument, and I'm a bit surprised by it. Usually the claim isn't that Roman numerals are hard to read and write, it's that they're hard to calculate with. And I think that's a stronger argument. Probably you can learn algorithms for doing calculations on Roman numerals, and we're familiar with our algorithms because of decades of experience. But the algorithms we learn in school actually do work marvelously for other bases.
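As a quick illustration of that last point (my own sketch, not anyone's claim in the thread), the grade-school column-addition algorithm genuinely doesn't care what the base is:

```python
from itertools import zip_longest

# The grade-school column addition algorithm, with carries, in any base.
# Numbers are lists of digits, most significant digit first.
def add_in_base(a, b, base):
    result, carry = [], 0
    for x, y in zip_longest(reversed(a), reversed(b), fillvalue=0):
        carry, digit = divmod(x + y + carry, base)
        result.append(digit)
    if carry:
        result.append(carry)
    return list(reversed(result))

print(add_in_base([4, 9], [2, 8], 10))  # [7, 7]      49 + 28 = 77
print(add_in_base([6, 5], [3, 4], 7))   # [1, 3, 2]   "65" + "34" in base 7
```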
People never used Roman numerals for performing written arithmetic, so that is not a fair comparison. Calculations were done mentally, with fingers, or for anything complicated or serious on a counting board.
I believe the ancient Romans would disagree? They had complicated rules for borrowing and simplifying, but of course it had to be done or they had nothing.
> complicated rules for borrowing and simplifying [..] or they had nothing
Maybe you can explain what you are referring to here?
Ancient Greeks and Romans simply did not do arithmetic with pen and paper. They used a counting board with counters or pebbles (“calculi”), or for simpler problems used their fingers or did the work in their heads.
Doing computation was literally called “placing pebbles”, and to settle accounts with someone you would “call them to the pebbles”.
Roman numerals were not a tool for computation. They were a tool for recording data or results. The purpose of Roman numerals was to faithfully record in writing the placement of counters on a counting board. The position of the counting board (or perhaps in some contexts a pile of coins of various denominations) was the primary representation of numbers, and Roman numerals were secondary.
That would be fair if all you cared about was calculation.
Comparing our numbers with Roman numerals is fair, I believe you're saying, if all you care about is representation. They both require practice to read and write.
But we care about both representing numbers and computing with them. So I don't think it makes sense to say it's not fair to point out the drawbacks of a system that can't do both.
Doing pen and paper arithmetic is not part of the “system” of Roman numerals.
If people simply said “paper is good, we should use paper instead of counting boards” that would be fine. But instead they always make up nonsense about how pen and paper long division is stupidly cumbersome if your only tool is pen and paper and you write your numbers as Roman numerals, or whatever.
How is that nonsense? Is it not cumbersome to do long division with Roman numerals if your only tool indeed is a pen and paper?
Or perhaps you are saying it is nonsense to start the discussion with the implicit assumption that the available tool is pen and paper, and not a counting board.
> nonsense to start the discussion with the implicit assumption that the available tool is pen and paper
Yes, that is right. Roman numerals do not make much sense in a paper-centric context where we have ditched counting boards. Any algorithms you might invent for working with Roman numerals on paper are anachronistic. Comparing invented straw-man uses of Roman numerals against real uses of Hindu–Arabic numerals is not fair or useful.
It’s like making an argument that Lassie and Perry Mason are “better” than Verdi operas on the basis that the opera set designs are too elaborate and colorful, the action is too spread out, and the commercial breaks don’t flow with the story when you try to show the opera on small black and white TVs from the 1950s.
This is an interesting perspective, but you have to assume that deprecating the system happened for a reason, probably due to ergonomic disadvantages. For simple counting tasks, unary tally marks are still fashionable.
Edit: I never used an abacus or the like, but maybe that's what Roman numerals are analogous to, and it got superseded by long division and the like.
There are certainly advantages to the more abstract positional Hindu–Arabic number system (or e.g. the ancient Mesopotamian system for writing sexagesimal numbers, which was also positional). I am not trying to dispute that.
A symbolic positional system saves space on the page. It makes recorded numbers more difficult to falsify later. Doing scratch work on paper leaves a better record of calculations (the position of a counting board is inherently ephemeral) which potentially makes it easier to spot errors. It is easier to describe pen and paper arithmetic in printed books which helps spread knowledge more easily than an oral tradition. It extends more easily to larger or (after the advent of decimal fractions) smaller numbers without needing an inordinately large number of different symbols. It is more easily extended by the use of pre-computed tables of function values, and especially by slide rules and similar devices. Once we start decimalizing our metrical system it becomes very regular to work with many different measurements in a consistent way. Perhaps most importantly the use of writing leads naturally to the development of symbolic algebra and ultimately simpler formulations of higher mathematics.
But we can carefully describe the advantages and historical process of replacement etc. without unfairly disparaging the more direct representation.
MCM requires subtraction, two directions; 1900 is additive, maybe multiplicative if you want to be strict, but either way only one direction. I'm not sure what's "more direct", either is linear (I might even think of 15:45 as a quarter before 4 [1]), but I'd argue that subtraction is conceptually a level higher.
Nevertheless, a higher-level representation might be an advantage. Maybe an interesting parallel regarding cognition is the 19.95 pricing scheme, which has apparently proven to be confusing to the consumer. I'm just not sure which numeral system suffers more from that.
[1] which marks an isogloss between east and west Germany, the latter saying "dreiviertel vier" (three-quarters four); German also has a weird order for reading decimal places: 121 = hundred-one-and-twenty
Well MCM is not how Romans would write this except in rare cases. That one did not become a standard form until Medieval Europe.
MDCCCC is absolutely a more direct representation of the counters on a counting board than 1900. Each letter precisely represents one counter and its position.
But if you want, you can make a counting board with a line down the middle and use one side for negative values, in which case MCM would be a pretty direct representation of that. (It is unclear whether this was the origin of such numerals; we don’t have a whole lot of evidence about how calculation on ancient counting boards was done, because it was largely part of oral culture.)
Betterexplained is a game changer. Understanding math was nearly impossible when I was a schoolboy; teachers could never (and hardly ever tried to) explain anything in a way that would make sense, other than "just write it down, memorize it, and use it to substitute values". And now, with resources like betterexplained, I get in seconds everything that I wasted years on with no success. This ought to replace the way people are taught at schools and in colleges; the old way should be damn outlawed (;,;)
The problem is that we let experts design curricula. Experts don't see a problem with the old way, because that's how they learned it, and look at them - they're experts!
It's the same reason a top-down effort to use tau instead of pi can't get momentum. People who struggle with math find it a game-changer, but mathematicians just shrug and say they don't see the point.
I expect a similar dynamic explains why open source software has such terrible user interfaces, and why legalese is so impenetrable: the people with the power to change it, don't see the problem.
Indeed. Experts in teaching should design curricula and write textbooks, not experts in the subjects (of course, subject experts should be asked to check that everything is valid, but not to decide how to teach).
Nevertheless, it has been a long time since I last saw an open source program with a terrible user interface. I'm writing this in Chrome (which is open source too, and its GUI is quite OK) on KDE Plasma 5 (which gives me an aesthetic and ergonomic orgasm the whole time I use it, although I always hated how old versions of KDE looked and felt) on Manjaro, while making notes in the Atom editor (which is ridiculously resource-inefficient yet perfect aesthetically, and very intuitive and comfortable to use) and managing files in Krusader (which isn't as feature-rich as Total Commander, yet looks much more eye-candy and is almost as convenient).
Also, legalese is only so impenetrable in English-speaking (and probably some other) countries. Legal English really does seem like a language distinct from what ordinary people speak, but in many countries laws read as easily as anything else.
The challenge with Roman numerals was definitely computing with them and really "thinking with them". Abacuses helped, but it was still a poor UX in many ways, and it took lots of training & practice to get familiar.
Algebraic notation also requires training & practice but has a much quicker learning curve. Almost every literate adult uses equations in some form or another on a weekly basis. Both the medium (pencil & paper) as well as the representation (algebraic notation) were critical to this (as well as other things like education system / literacy, etc).
The benefits & disadvantages of different representations are massively understudied and misunderstood. Data visualization, semiotics, and some other fields focus a bit on this but still fall short in many ways.
The intuition is as simple as it gets; r·e^{iθ} is the complex number with radius r and angle θ. pi is just the angle that puts you on the negative half of the real number line.
For why that identity holds, you do simple algebra on the non-simple Taylor series of those three functions (e^x, sin x, cos x). It is, to say the least, much less intuitive.
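For reference, the series manipulation being alluded to (sketched informally, not a rigorous derivation):

```latex
e^{i\theta} = \sum_{n=0}^{\infty} \frac{(i\theta)^n}{n!}
            = \underbrace{\left(1 - \frac{\theta^2}{2!} + \frac{\theta^4}{4!} - \cdots\right)}_{\cos\theta}
            \;+\; i\,\underbrace{\left(\theta - \frac{\theta^3}{3!} + \frac{\theta^5}{5!} - \cdots\right)}_{\sin\theta}
            = \cos\theta + i\sin\theta,
\qquad\text{so } e^{i\pi} = \cos\pi + i\sin\pi = -1.
```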
The linked article mentions how helpful it is to have the right analogy (e.g. a number line, which you reference in your explanation of its intuitiveness). That's the main point: given the right analogy, things that otherwise seem arbitrary or nonsensical become clear and even intuitive. :)
For intuition on complex numbers, you could try playing with http://wry.me/math-toys — though it’s in a hacky state I haven’t had the energy to fix. There are much better complex-function plotters, but I was aiming for a more tangible UI.
The process of thinking has no strict rules. Anything goes. Even random search.
All the requirements for consistency and good form of argumentation apply only to the "final product". When you finally present your new result it should be as solid as possible.
Learning about Church and Scott encoding was much more interesting than I thought it would be. I was expecting it to be tedious and banal, but I came out of it feeling like I had some sort of revelation. Imagining naturals, lists, trees, etc. as function application was mind-bending at first, but it eventually clicks into place. Lambda calculus is so incredible in its combination of simple semantics, expressiveness, and abstraction/compositionality.
To go a step further, think about how you would implement a predecessor function on the Church encoding of the naturals. It is much more complicated than one might expect, and perfectly motivates the introduction of Scott encodings and the fixed-point function.
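For anyone who wants to see it spelled out, here's the classic pair-trick predecessor written with Python lambdas (a sketch of one standard construction; the names are mine):

```python
# Church numerals as repeated function application: n(f)(x) = f(f(...f(x)))
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

def to_int(n):
    # Interpret a Church numeral by counting how many times f is applied.
    return n(lambda k: k + 1)(0)

# Predecessor via the pair trick: step (a, b) -> (b, b + 1), iterate n times
# starting from (0, 0), then take the first component.
pair   = lambda a: lambda b: lambda f: f(a)(b)
first  = lambda p: p(lambda a: lambda b: a)
second = lambda p: p(lambda a: lambda b: b)

pred = lambda n: first(
    n(lambda p: pair(second(p))(succ(second(p))))(pair(zero)(zero))
)

three = succ(succ(succ(zero)))
print(to_int(pred(three)))   # 2
print(to_int(pred(zero)))    # 0 (predecessor of zero stays zero)
```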
The Frege/Russell definition of numbers is pretty cool from that sort of perspective. The general style of their definitions is to temporarily sidestep the question of what X type of number is, and first ask what sort of thing it applies to, and what the relation of having the "same number" is among such things. Then that sort of "number" is just defined as the equivalence classes of whatever the identified "same number" relation is.
There are two different definitions of the naturals provided: finite cardinal numbers and finite ordinal numbers.
The first applies to sets, grouping them by "same cardinality" or same size. Thus, the cardinal number n is identified with the set of all sets with cardinality n.
The second applies to well-ordered binary relations, and groups them by what we would call order-isomorphism. This might sound complicated, but the end result is that the ordinal number n becomes the set of all well-orders of length n (you can sort of think of this as the set of sequences of length n, at least in the finite case).
Amusingly, both of these notions are too general to correspond to just the natural numbers, since they don't discriminate between finite and infinite numbers. Thus, the 'natural number' portion of each is actually defined as the smallest initial subset for which mathematical induction is valid.
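In modern notation that clause reads roughly like this (my paraphrase, not Principia's own symbolism):

```latex
\mathbb{N} \;=\; \bigcap\,\bigl\{\, S \;:\; 0 \in S \,\wedge\, \forall n\,\bigl(n \in S \rightarrow \operatorname{succ}(n) \in S\bigr) \,\bigr\}
```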
Anyhow, the Principia Mathematica is pretty fun if you're into that sort of thing. It builds up a lot of neat and weird representations of all sorts of different numbers, up from the naturals to the integers, ratios, and reals. It even provides its own weird definition of vectors, and gives a kind of analysis of "signed magnitudes" (like weight, height, temperature, etc.) and how their definition of real numbers as pure mathematical objects relates to them, providing a kind of abstract interpretation of what we mean when we measure something in the real world.
Interestingly, in the modern approach to set theory there are still both ordinal and cardinal numbers, but you can't say that the cardinal n is the set of all sets with cardinality n, because that is a proper class. Instead we choose a particular representative of this class, which also happens to be well ordered (this is in ZFC; in ZF you can still pick representatives for equipotence classes via Scott's trick, but they are not necessarily well ordered or even well orderable).