
Here’s what I wrote elsewhere about this:

Why was 1997 easier for us to read than MCMDCVII?

Duh... because we all have decades of experience working with the former representation and almost no experience at all working with the latter one.

If you wrote a number using arbitrary different symbols and (say) base 14 instead of base 10, but still a positional number system, we would still find it incredibly difficult to interpret written numbers, because we wouldn’t be used to it.

Indeed we find exactly this happens when people who are not experienced programmers try to read hexadecimal numbers written using the digits 0123456789ABCDEF, and that’s even in a case where we are deeply familiar with all of the symbols and their order, and in a very straightforward power-of-2 base.
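To illustrate the point, here is a minimal Python sketch (mine, not from the thread) of the positional algorithm: the very same loop renders 1997 in decimal, hexadecimal, or any base you supply digit symbols for. Only practice makes one of the outputs easy to read.

```python
def to_base(n, digits):
    """Render a non-negative integer n using the given digit symbols.

    The base is simply len(digits); the algorithm is base-agnostic.
    """
    base = len(digits)
    out = ""
    while True:
        n, r = divmod(n, base)
        out = digits[r] + out
        if n == 0:
            return out

print(to_base(1997, "0123456789"))        # familiar: "1997"
print(to_base(1997, "0123456789ABCDEF"))  # hexadecimal: "7CD"
```

The same call with fourteen made-up symbols would produce a perfectly valid base-14 numeral that nearly everyone would find unreadable.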

Romans or medieval Europeans had absolutely no problem reading numbers written like the latter example. Indeed, this is a much more straightforward and intuitive representation, and is very easy to interpret for anyone used to working with a counting board, since it is a direct translation of the positions of counters on the counting board to symbols written on paper. [Edit to add: Note that Roman numerals were not a system for practical arithmetic – people didn’t use writing for this, instead calculating by moving counters around on a counting board — but only a written record of input data and calculated outputs.]

I really hate how much people bash on Roman numerals without any significant experience working with them, and without any discussion or even acknowledgement of the historical or practical context.

I would recommend anyone interested in this topic start by reading Netz’s paper “Counter Culture” about Greece 2500 years ago, http://worrydream.com/refs/Netz%20-%20Counter%20Culture%20-%...




I agree that it's a sort of weak argument, and I'm a bit surprised by it. Usually the claim isn't that Roman numerals are hard to read and write, it's that they're hard to calculate with. And I think that's a stronger argument. Probably you can learn algorithms for doing calculations on Roman numerals, and we're familiar with our algorithms because of decades of experience. But the algorithms we learn in school actually do work marvelously for other bases.
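As a hedged sketch of that last claim (my own example, not part of the comment): the schoolbook carry-based addition we learn in base 10 works unchanged in any base, because the base is just a parameter of the algorithm.

```python
def add_digits(a, b, base):
    """Schoolbook addition on little-endian digit lists in the given base."""
    out, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
        carry, d = divmod(s, base)  # same carry rule in every base
        out.append(d)
    if carry:
        out.append(carry)
    return out

# 9 + 7 in hexadecimal: 0x10, i.e. little-endian digits [0, 1]
print(add_digits([9], [7], 16))  # [0, 1]
```

Our fluency in the base-10 instance comes from practice, not from anything special about ten.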


People never used Roman numerals for performing written arithmetic, so that is not a fair comparison. Calculations were done mentally, with fingers, or for anything complicated or serious on a counting board.


I believe the ancient Romans would disagree? They had complicated rules for borrowing and simplifying, but of course it had to be done or they had nothing.


> complicated rules for borrowing and simplifying [..] or they had nothing

Maybe you can explain what you are referring to here?

Ancient Greeks and Romans simply did not do arithmetic with pen and paper. They used a counting board with counters or pebbles (“calculi”), or for simpler problems used their fingers or did the work in their heads.

Doing computation was literally called “placing pebbles”, and to settle accounts with someone you would “call them to the pebbles”.

Roman numerals were not a tool for computation. They were a tool for recording data or results. The purpose of Roman numerals was to faithfully record in writing the placement of counters on a counting board. The position of the counting board (or perhaps in some contexts a pile of coins of various denominations) was the primary representation of numbers, and Roman numerals were secondary.


The Romans heavily relied on the abacus. It used a base-10 positional representation of numbers.


I'm not sure what your position is. What would be a fair comparison?


The fair comparison would be a written arithmetic with Hindu–Arabic numerals vs. a counting board.

https://upload.wikimedia.org/wikipedia/commons/1/13/Houghton...


That would be fair if all you cared about was calculation. Comparing our numbers with Roman numerals is fair, I believe you're saying, if all you care about is representation. They both require practice to read and write.

But we care about both representing numbers and computing with them. So I don't think it makes sense to say it's not fair to point out the drawbacks of a system that can't do both.


Doing pen and paper arithmetic is not part of the “system” of Roman numerals.

If people simply said “paper is good, we should use paper instead of counting boards” that would be fine. But instead they always make up nonsense about how pen and paper long division is stupidly cumbersome if your only tool is pen and paper and you write your numbers as Roman numerals, or whatever.


How is that nonsense? Is it not cumbersome to do long division with Roman numerals if your only tool indeed is a pen and paper?

Or perhaps you are saying it is nonsense to start the discussion with the implicit assumption that the available tool is pen and paper, and not a counting board.


> nonsense to start the discussion with the implicit assumption that the available tool is pen and paper

Yes, that is right. Roman numerals do not make much sense in a paper-centric context where we have ditched counting boards. Any algorithms you might invent for working with Roman numerals on paper are anachronistic. Comparing invented straw-man uses of Roman numerals against real uses of Hindu–Arabic numerals is not fair or useful.

It’s like making an argument that Lassie and Perry Mason are “better” than Verdi operas on the basis that the opera set designs are too elaborate and colorful, the action is too spread out, and the commercial breaks don’t flow with the story when you try to show the opera on small black and white TVs from the 1950s.


Because MCMXCVII is actually 1997.

MCMDCVII is just wrong. D is 500, X is ten.
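For reference, a small Python sketch (mine, not from the thread) of the standard subtractive reading, in which a symbol smaller than its right-hand neighbor is subtracted rather than added, confirms the corrected value:

```python
ROMAN = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_value(s):
    """Naive subtractive reading; assumes s is a well-formed numeral."""
    total = 0
    for i, ch in enumerate(s):
        v = ROMAN[ch]
        if i + 1 < len(s) and ROMAN[s[i + 1]] > v:
            total -= v  # smaller symbol before a larger one subtracts
        else:
            total += v
    return total

print(roman_value("MCMXCVII"))  # 1997
```

The same reading also handles the purely additive medieval/ancient forms, e.g. MDCCCC for 1900.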


Kind of a nice proof of his point! We're not used to the numbers, people would be far less likely to make the mistake with 1997.


Yes that was a typo.


This is an interesting perspective, but you have to assume that deprecating the system happened for a reason, probably due to ergonomic disadvantages. For simple counting tasks, unary tally marks are still fashionable.

Edit: I never used an abacus or the like, but maybe that's what Roman numerals are the analogy to, and it got overcome with long division and the like.


There are certainly advantages to the more abstract positional Hindu–Arabic number system (or e.g. the ancient Mesopotamian system for writing sexagesimal numbers, which was also positional). I am not trying to dispute that.

A symbolic positional system saves space on the page. It makes recorded numbers more difficult to falsify later. Doing scratch work on paper leaves a better record of calculations (the position of a counting board is inherently ephemeral), which potentially makes it easier to spot errors. It is easier to describe pen and paper arithmetic in printed books, which helps spread knowledge more easily than an oral tradition.

It extends more easily to larger or (after the advent of decimal fractions) smaller numbers without needing an inordinately large number of different symbols. It is more easily extended by the use of pre-computed tables of function values, and especially by slide rules and similar devices. Once we start decimalizing our system of measurement, it becomes very regular to work with many different measurements in a consistent way. Perhaps most importantly, the use of writing leads naturally to the development of symbolic algebra and ultimately simpler formulations of higher mathematics.

But we can carefully describe the advantages and historical process of replacement etc. without unfairly disparaging the more direct representation.


MCM requires subtraction, two directions; 1900 is additive, maybe multiplicative if you want to be strict, either way only one direction. I'm not sure what's "more direct", either is linear. I might even think of 15:45 as a quarter before 4 [1], but I'd argue that subtraction is conceptually a level higher.

Nevertheless, a higher-level representation might be an advantage. Maybe an interesting parallel vis-à-vis cognition is the 19.95 pricing scheme, which has apparently been shown to confuse consumers. I'm just not sure which numeral system suffers more from that.

[1] which marks an isogloss between east and west germany, the latter saying "dreiviertel vier" - three quarters four; German also has a weird order for decimal place reading: 121=hundred-one-twenty


Well, MCM is not how Romans would write this except in rare cases. That form did not become standard until medieval Europe.

MDCCCC is absolutely a more direct representation of the counters on a counting board than 1900. Each letter precisely represents one counter and its position.

But if you want, you can make a counting board with a line down the middle and use one side for negative values, in which case MCM would be a pretty direct representation of that. (It is unclear whether this was the origin of such numerals; we don’t have a whole lot of evidence about how calculation on ancient counting boards was done, because it was largely part of oral culture.)
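A sketch (my own, not from the comment) of the additive form described above: greedily emit symbols largest-first, one symbol per counter, with no subtractive pairs.

```python
# Symbol values, largest first, matching the rows of a counting board.
VALUES = [(1000, "M"), (500, "D"), (100, "C"), (50, "L"),
          (10, "X"), (5, "V"), (1, "I")]

def to_additive_roman(n):
    """Write a positive integer in purely additive Roman numerals."""
    out = ""
    for value, symbol in VALUES:
        while n >= value:
            out += symbol  # one written symbol per counter
            n -= value
    return out

print(to_additive_roman(1900))  # "MDCCCC"
print(to_additive_roman(1997))  # "MDCCCCLXXXXVII"
```

Each emitted letter corresponds to exactly one counter in one position, which is the sense in which this notation is a "direct" transcription of the board.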




