Hacker News

There's another approach, from Boyer and Moore, who built up mathematics from constructs at the Peano axiom level (zero, add1, etc.) plus recursive functions that must terminate. It's constructive mathematics; there are no quantifiers, no ∀ or ∃. [1] They built an automatic theorem prover in the 1970s and 1980s that works on this theory. (I recently made it work on GNU Common Lisp and put it on GitHub, so people can run it again.)[2]

In Boyer-Moore theory, all functions are total: you can apply any function to any object. Types are predicates. Here's a definition of ORDERED for a list of numbers:

    (DEFN ORDERED (L)
      (IF (LISTP L)
          (IF (LISTP (CDR L))
              (IF (LESSP (CADR L)
                         (CAR L))
                  F
                  (ORDERED (CDR L)))
              T)
          T))
If L is not a list, it is considered to be ordered. This makes it a total function, runnable on any input, even though the result for the "wrong" type is not useful. This provides the benefits of types without requiring a theory of types. It's a very clean way to look at the foundations of mathematics. It's simpler than Russell and Whitehead.
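To make the total-function idea concrete, here is a Python sketch of the same ORDERED definition (my own translation, not part of nqthm; the names listp and ordered mirror the Lisp above). Every input is accepted; non-lists count as ordered:

```python
# Python sketch (not from nqthm) of the total function ORDERED above.

def listp(x):
    # Type-as-predicate: "is this a non-empty list?" (LISTP in nqthm)
    return isinstance(x, list) and len(x) > 0

def ordered(l):
    # Total: defined on ANY input, mirroring the nqthm DEFN above.
    if listp(l):
        if listp(l[1:]):
            if l[1] < l[0]:        # (LESSP (CADR L) (CAR L))
                return False
            return ordered(l[1:])  # (ORDERED (CDR L))
        return True                # single-element list
    return True                    # not a list: vacuously ordered

print(ordered([1, 2, 3]))  # True
print(ordered([3, 1]))     # False
print(ordered(42))         # True -- "wrong" type, but no error
```

The last call is the point: applying ORDERED to a number is legal and returns a (useless but well-defined) answer, so no type system is needed.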

When you prove things using definitions like this, there's a lot of case analysis. This gets worse combinatorially; a theorem with four type-constrained variables generates at least 2⁴ = 16 cases, only one of which is interesting. Most of the cases, such as when L is not a list, are trivial, but they still have to be analyzed. Computers are great at this, and the Boyer-Moore prover deals with those cases without any trouble. But it went against a long tradition in mathematics of avoiding case analysis. That made this approach unpopular with the generation of pre-computer mathematicians. Today, it would be more acceptable.
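The blow-up is easy to see mechanically. A hypothetical sketch, just counting cases: each of four variables either satisfies its type predicate or not, and the prover must dispatch every combination:

```python
from itertools import product

# Hypothetical sketch of the case explosion: four variables, each either
# satisfying its type hypothesis (True) or not (False). Only the all-True
# combination is the "interesting" case; the rest are trivial.
cases = list(product([True, False], repeat=4))
interesting = [c for c in cases if all(c)]

print(len(cases))        # 16 cases in total
print(len(interesting))  # 1 -- only this one needs real work
```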

(It's fun to run the Boyer-Moore prover today. In the 1980s, it took 45 minutes to grind through the basic theory of numbers. Now it takes a few seconds.)

[1] https://www.amazon.com/Computational-Logic-Robert-S-Boyer/dp...

[2] https://github.com/John-Nagle/nqthm



If every operation can have any type arguments applied to it and does something sensible with no compiler or run-time error ... good luck debugging, surely.

What happens with OOP? Every class has to understand how to "bark", not only the dog class?

If any class can somehow "bark" without throwing an exception, that may not be in alignment with the programmer's intent, or promote the furtherance of his or her goals in any way.

For instance, the programmer may have wanted to ask the local variable dog to "bark", but misspelled it as ndog, so the integer which counts the number of dogs was asked to bark instead.

There is much value in identifying the problem that an integer doesn't bark.
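The hazard can be sketched in Python (bark, dog, and ndog are the illustrative names from the comment above, not anything from nqthm). Under total-function semantics the typo does not crash; it silently returns a default:

```python
# Sketch of the debugging hazard: in a theory where every function is
# total, asking a number to "bark" returns a default instead of failing.

def bark(x):
    # Total function: "dogs" (here, dicts carrying a sound) bark properly;
    # everything else falls through to a default value -- no error raised.
    if isinstance(x, dict) and "sound" in x:
        return x["sound"]
    return "nil"  # default result for "wrong" types

dog = {"sound": "woof"}
ndog = 7  # the integer counting the dogs

print(bark(dog))   # woof
print(bark(ndog))  # nil -- the misspelling goes undetected
```

A type checker (or a run-time error) would have caught the misspelling at the call site; totality trades that away for a theory in which every term denotes.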


Boyer-Moore theory has "shells", which are like structures. See page 39 of [1]. Since this is a pure functional language, values cannot be altered. Shells have constructors, a type predicate, and can have restriction predicates on values.

    Shell Definition.
    Add the shell ADD1 of one argument
    with bottom object (ZERO),
    recognizer NUMBERP,
    accessor SUB1,
    type restriction (NUMBERP X1),
    default value (ZERO), and
    well-founded relation SUB1P.
It's not intended that you run programs in Boyer-Moore theory, although you can. It's a proof tool.
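As a rough model of what that shell declaration means (my own Python sketch, not the prover's internals): ADD1 is the constructor, (ZERO) the bottom object, NUMBERP the recognizer, and SUB1 the accessor, with SUB1 total so the bottom object maps to itself:

```python
# Rough Python model (mine, not nqthm's) of the ADD1 shell above.

ZERO = ("ZERO",)

def numberp(x):
    # Recognizer: is x built from (ZERO) by ADD1?
    return x == ZERO or (isinstance(x, tuple) and x[0] == "ADD1")

def add1(x):
    # Constructor with type restriction (NUMBERP X1): non-numbers are
    # replaced by the default value (ZERO) before being wrapped.
    if not numberp(x):
        x = ZERO
    return ("ADD1", x)

def sub1(x):
    # Accessor: peel one ADD1. Total, so (SUB1 (ZERO)) is (ZERO).
    if isinstance(x, tuple) and x[0] == "ADD1":
        return x[1]
    return ZERO

two = add1(add1(ZERO))
print(numberp(two))             # True
print(sub1(sub1(two)) == ZERO)  # True
```

The well-founded relation SUB1P is what lets recursive definitions that count down by SUB1 be accepted as terminating.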

[1] https://www.cs.utexas.edu/users/boyer/acl.pdf


I will have to read this to understand what we can prove with this; or rather, what kinds of wrongs in a program under this theory are usefully proved to be false.

---

Ouch; did you see that "overfull hbox" that got rendered out in the first line of paragraph 3 of the Preface? :)



