
The problem is that the usefulness of primes comes from the ability to break a number down into a unique product of primes, and including 1 as a prime defeats that usefulness, since any factorization could then be padded with extra factors of 1. While your understanding of primes is an elegant idea, in the sciences usefulness trumps elegance.
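To make that concrete, here is a small Python sketch (the helper name prime_factors is just for illustration, not anything from the thread) showing that 12 factors one way into primes greater than 1, but that admitting 1 as a prime would let you pad that factorization endlessly:

    def prime_factors(n):
        # Trial division: collect the prime factors of n, all of them > 1.
        factors, d = [], 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    from math import prod
    assert prime_factors(12) == [2, 2, 3]  # the unique factorization into primes > 1
    assert prod([2, 2, 3]) == prod([1, 2, 2, 3]) == prod([1, 1, 2, 2, 3]) == 12
    # If 1 counted as a prime, 12 would have infinitely many "prime factorizations".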

From a programmer's point of view, a similar thing happens with Lisp. Lisp is a very elegant language because of its minimal structure, but, judging by how rarely it is used, not a very useful one.




It almost seems to me that there are two very similar, overlapping mental models* of "primeness".

The first is: "numbers that can be multiplied together to get all other numbers". This mental model by necessity excludes 1, since multiplying by 1 never produces a new number.

The second is: "numbers that cannot be 'made' by multiplying other, smaller numbers together". This mental model is "looser," and ends up including 1, and maybe 0 and -1.

The first mental model tends to be more mathematically useful, so it is the one that becomes the official definition of prime.
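As a rough sketch of the difference (the function names are mine, not standard terminology, and only positive integers are considered), the two mental models can be written as predicates that disagree only at 1:

    def is_prime_model_1(n):
        # Model 1: a building block for all other numbers,
        # i.e. n > 1 and n has no divisor strictly between 1 and n.
        return n > 1 and all(n % d for d in range(2, n))

    def is_prime_model_2(n):
        # Model 2: n cannot be made by multiplying two smaller positive
        # numbers together.  1 passes, since nothing smaller is available.
        return not any(a * b == n for a in range(1, n) for b in range(1, n))

    print([n for n in range(1, 50) if is_prime_model_1(n) != is_prime_model_2(n)])  # [1]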

\* Note that I explicitly use "mental model" here instead of "definition," because I am discussing different ways that different humans try to _understand_ different sets of numbers.



