Hacker News

> That goes in assert statements. It's run-time checkable

And runtime checks help because...?

> But you'd rather prove it once so the run-time checks are unnecessary.

No. I'd rather prove it to rule out the program being wrong. The runtime check is totally beside the point.

> With a few simple theorems, such as (...)

I think you mean “proposition”. It's not a theorem until it has been proven.

> Think of proof support as being an extension to optimization of assertions.

I never use runtime-checked assertions, so I never need to optimize them away.

> Users need never see those proofs.

But, you see, I want to see the proofs. How am I supposed to maintain a program I don't understand?




    defined(a,i,j) and defined(a[j+1]) implies defined(a,i,j+1)
is a theorem. It looks like this in Boyer-Moore theory:

    (PROVE-LEMMA arraytrue-extend-upward-rule (REWRITE) 
        (IMPLIES (AND (EQUAL (arraytrue A I J) T) 
                 (EQUAL (alltrue (selecta A (ADD1 J))) T)) 
                 (EQUAL (arraytrue A I (ADD1 J)) T)))

     Name the conjecture *1.


We will try to prove it by induction. There are three plausible inductions. They merge into two likely candidate inductions. However, only one is unflawed. We will induct according to the following scheme:

      (AND (IMPLIES (LESSP J I) (p A I J))
           (IMPLIES (AND (NOT (LESSP J I))
                         (p A (ADD1 I) J))
                    (p A I J))).
Linear arithmetic informs us that the measure (DIFFERENCE (ADD1 J) I) decreases according to the well-founded relation LESSP in each induction step of the scheme. The above induction scheme leads to three new formulas:

    Case 3. (IMPLIES (AND (LESSP J I)
                          (ARRAYTRUE A I J)
                          (EQUAL (ALLTRUE (SELECTA A (ADD1 J)))
                                 T))
                     (ARRAYTRUE A I (ADD1 J))),
which simplifies, rewriting with ARRAYTRUE-VOID-RULE and SUB1-ADD1, and opening up ARRAYTRUE and LESSP, to the following eight new conjectures:

...

That finishes the proof of *1. Q.E.D.

"arraytrue" is defined recursively:

    (DEFN arraytrue (A I J)
        (IF (LESSP J I) T                           ; the null case is true
            (AND (EQUAL (alltrue (selecta A I)) T)  ; next element is true
                 (arraytrue A (ADD1 I) J))))        ; and rest of array is alltrue
And yes, there's a machine proof that this terminates.
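For comparison, here's a hypothetical Rust transcription of the same definition (the snake-case name and the test data are mine, not from the Boyer-Moore script). The asserts only spot-check one instance of the extension lemma at runtime, which is exactly the weaker guarantee being contrasted with the machine proof:

```rust
// Sketch: arraytrue(a, i, j) holds when a[k] is true for every i <= k <= j.
// Mirrors the recursive DEFN above; assumes j < a.len() whenever i <= j.
fn arraytrue(a: &[bool], i: usize, j: usize) -> bool {
    if j < i {
        true // the null case is true
    } else {
        a[i] && arraytrue(a, i + 1, j) // next element is true, and rest of range
    }
}

fn main() {
    let a = [true, true, true, false];
    // One instance of the lemma: arraytrue(a, 0, 1) and a[2] imply arraytrue(a, 0, 2).
    assert!(arraytrue(&a, 0, 1) && a[2]);
    assert!(arraytrue(&a, 0, 2));
    // The extension stops at the false element.
    assert!(!arraytrue(&a, 0, 3));
}
```

Checking three concrete instances is all a runtime assertion buys you; the prover establishes the implication for every A, I, and J at once.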


> defined(a,i,j) and defined(a[j+1]) implies defined(a,i,j+1) is a theorem. (proof)

Sure. My point is just that a theorem is a proposition equipped with a proof. If the user just enters a proposition into the system, then they aren't entering a theorem. They're entering a proposition that the system can turn into a theorem.


> > But you'd rather prove it once so the run-time checks are unnecessary.

> No. I'd rather prove it to rule out the program being wrong. The runtime check is totally beside the point.

That's the same thing. It's just a matter of it being automated.

> > Think of proof support as being an extension to optimization of assertions.

> I never use runtime-checked assertions, so I never need to optimize them away.

If you've ever used a language with bounds-checked array access, you have. IIRC, you've used Rust some, so you've used bounds-checked array access, except where Rust was able to prove that out-of-bounds access was impossible and elide the check.
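A minimal Rust sketch of that distinction (the function names are mine, for illustration): indexed access compiles with a bounds check that the optimizer may or may not prove away, while iterator-based code never introduces a check in the first place.

```rust
// Indexed access: each v[i] carries a bounds check in general; the compiler
// can often prove i < v.len() from the loop bound and elide it.
fn sum_indexed(v: &[i32]) -> i32 {
    let mut total = 0;
    for i in 0..v.len() {
        total += v[i];
    }
    total
}

// Iterator access: no index appears, so there is no bounds check to elide.
fn sum_iter(v: &[i32]) -> i32 {
    v.iter().sum()
}

fn main() {
    let v = [1, 2, 3, 4];
    assert_eq!(sum_indexed(&v), 10);
    assert_eq!(sum_iter(&v), 10);
}
```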

> > Users need never see those proofs.

> But, you see, I want to see the proofs. How am I supposed to maintain a program I don't understand?

There's a difference between need and ability. Just because it's stated that you don't need to see something doesn't mean you can't. If you want to see a proof, you dig in and find it, or find where it's accepted canon that we don't need to reproduce it (do we need proofs for integer addition? That seems of limited use to me, but maybe a language formally proved as far down as that level has some interesting benefits).


> That's the same thing. It's just a matter of it being automated.

Re-read what he said. He presented runtime checks as an acceptable but non-optimal scenario for performance reasons. My reply was that runtime checks don't prevent a wrong program from being wrong, and it's this wrongness itself that I consider unacceptable.

> IIRC, you've used Rust some, so you've used bounds checked array access, except where Rust was able to prove that out of bounds access was impossible and elide the code that checks.

Runtime bounds-checked array manipulation is literally the single thing I hate the most about the languages I like the most (ML and Rust). It introduces a control flow path that shouldn't be reachable in a correct program, and hence shouldn't exist. This is particularly painful because beautiful array-manipulating algorithms have existed since, like, forever, yet in 2017 I still can't express them elegantly.
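To make that complaint concrete, a small Rust sketch (the data and index are made up): whether you index directly or use `get`, the source must account for a failure path that a correct program never takes.

```rust
fn main() {
    let a = [10, 20, 30];
    let i = 1; // suppose an invariant elsewhere guarantees i < a.len()

    // Explicitly checked access: the unreachable arm still has to be written.
    match a.get(i) {
        Some(x) => assert_eq!(*x, 20),
        None => unreachable!("never taken in a correct program"),
    }

    // Plain indexing hides the same check behind an implicit panic path.
    assert_eq!(a[i], 20);
}
```

With proof support, the invariant `i < a.len()` could be discharged statically and both the `None` arm and the panic path would simply not exist.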

> If you want to see a proof, you dig in and find it, or find where it's accepted canon that we don't need to reproduce (do we need proofs for integer addition?)

I don't need to rebuild arithmetic from scratch, because I've already done it at some point, and once is enough as long as you understand the process.

On the other hand, when I'm first confronted with an already existing program, I don't understand why the program is correct (if it is even correct in the first place), so I do have to set some time aside to properly study it.



