
It's true that humans didn't discover zero until relatively recently in mathematical history. Should we reject these "new" developments in numeracy and go back to Roman numerals? One could make a pretty strong case that Roman numerals are more intuitive, more human.



I take your point here - but no development in numeracy involves counting objects from zero.


That's because making the switch from 1-indexing to 0-indexing is extremely counter-intuitive. But take a hypothetical person who learned 0-indexing socially from birth: I'd imagine they'd be just as confused by 1-indexing.


When you use a count variable in programming, do you initialize it to 1?


This argument supports my general point: that high-level programming languages ought to count as humans do.


My point was that it's rather awkward to count from 1 when doing programming.

For instance, could you rewrite this function where int count = 1; and have it still be intuitive?

  int count_occurrences(const std::vector<int>& vec, int value) {
    int count = 0;
    for(std::size_t i = 0; i != vec.size(); ++i) { // using 0-based index
      if(vec[i] == value)
        ++count;
    }
    return count;
  }
My attempt is: (look how ugly!)

  int count_occurrences(const std::vector<int>& vec, int value) {
    if(vec.empty())
      return 0;
    int count = 1;
    if(vec[1] != value) // take back the head start if the first element doesn't match
      --count;
    for(std::size_t i = 2; i <= vec.size(); ++i) { // using 1-based index
      if(vec[i] == value)
        ++count;
    }
    return count;
  }


As another commenter pointed out, it's only the indices that are one-based. I think it would look something like this:

  int count_occurrences(std::vector<int> vec, int value) {
    int count = 0;
    for(int i = 1; i <= vec.size(); ++i) { // using 1-based index
      if(vec[i] == value)
        ++count;
    }
    return count;
  }
Off the top of my head, this has a number of benefits in higher-level languages. For instance, in JavaScript, here are some common inconveniences caused by zero-based indices:

  var lastEl = arr[arr.length - 1];

  if(arr.indexOf(someEl) !== -1) {
    // do something knowing that someEl is in the array
  }
and if we lived in a one-based world, here's what they would look like:

  var lastEl = arr[arr.length];

  if(arr.indexOf(someEl)) {
    // do something knowing that someEl is in the array
  }
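
To see why the `!== -1` guard is unavoidable today, here's a runnable sketch in current (0-based) JavaScript, using a hypothetical `arr` like the one above: `indexOf` returns 0 for an element in the first position, and 0 is falsy, so a bare truthiness check silently misses it.

```javascript
var arr = ["a", "b", "c"];

// "a" is at index 0, which is falsy, so this branch never runs:
if (arr.indexOf("a")) {
  console.log("found via truthiness"); // skipped for the first element
}

// The explicit sentinel comparison is what actually works:
if (arr.indexOf("a") !== -1) {
  console.log("found via !== -1"); // runs as expected
}
```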


It's only `i` (the index) that's meant to begin from 1.

`count` is a quantity, not an index, and there are such things as "zero apples". But you start counting them from the first one.
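
A quick sketch of that distinction in today's (0-based) JavaScript, with hypothetical example data: the loop visits the apples starting from the first one, while `count`, being a quantity, starts at zero.

```javascript
var apples = ["gala", "fuji", "gala"];

var count = 0; // a quantity: "zero apples" is a valid count
for (var i = 0; i < apples.length; i++) { // visiting items starting from the first
  if (apples[i] === "gala")
    count++;
}
// count is now 2
```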


I agree that it's not at the same level, but there certainly are developments.


Computers aren't a development in numeracy?



