No misunderstanding about NP here, for sure. As I said, this is about as much a thesis as the Church–Turing thesis is about what can be computed.
I have no clue about CiC, Lean, and whatnot. It was never my field, and I don't doubt there are some very weird things you can do with fancy logical systems that are rarely discussed by non-logicians.
What I'm claiming is that anything a human could prove without a computer could be done by a nondeterministic machine in polynomial time, under a very reasonable assumption about proof length. I'm baking in the assumption of proof length, so I can claim something about NP...
Let me put it this way: if you can write a paper with your proof and I can check it with only my brain, without a computer helping me, then a polynomial-time algorithm could also check that proof. Unless you believe something exotic about how the brain computes, how would that not be the case?
> Let me put it this way: if you can write a paper with your proof and I can check it with only my brain, without a computer helping me, then a polynomial-time algorithm could also check that proof. Unless you believe something exotic about how the brain computes, how would that not be the case?
What does "without a computer" have to do with anything, and what do you mean by "in polynomial time" when restricted to a specific instance of the problem? To talk about solutions being checkable in polynomial time in the size of the input (which is what NP actually means, more or less, since every problem in NP can be encoded as a 3SAT formula with at most polynomial blowup), you have to first specify some (infinite) class of problems; otherwise "in polynomial time" doesn't really mean anything. Then you have to specify a single polynomial-time verification algorithm that works for every instance in the class.
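To make "checkable in polynomial time" concrete: for 3SAT, a certificate (a variable assignment) can be verified by a trivial loop that is linear in the number of clauses. This is a minimal illustrative sketch, not any particular library's API:

```python
def verify_3sat(clauses, assignment):
    """Check a 3SAT certificate in time linear in the formula size.

    clauses: list of tuples of nonzero ints; a positive int is a variable,
             a negative int is its negation.
    assignment: dict mapping variable number -> bool.
    """
    for clause in clauses:
        # A clause is satisfied if at least one literal evaluates to True.
        if not any(assignment[abs(lit)] == (lit > 0) for lit in clause):
            return False  # this clause is unsatisfied, so the formula is too
    return True

# (x1 or not x2 or x3) and (not x1 or x2 or x3)
clauses = [(1, -2, 3), (-1, 2, 3)]
print(verify_3sat(clauses, {1: True, 2: True, 3: False}))   # True
print(verify_3sat(clauses, {1: True, 2: False, 3: False}))  # False
```

The point of the thread is that nothing like this cheap, uniform check exists for CiC proof terms: typechecking a term can take time wildly out of proportion to its length.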
The problem is that "proofs of length N" in CiC doesn't have a polynomial-time verification algorithm (for checking proof correctness) in the size of the proof (and as I mentioned elsewhere, attempts to translate to an encoding that does can result in exponential or much worse blowup in the proof size, which does not allow you to put them in the same complexity class in this context). Moreover, it's very unclear how you'd even begin to carve out the complexity class of "proofs that can be verified quickly," because that pretty much requires an algorithm to determine the complexity of the proof, which includes determining the complexity of arbitrary functions in CiC (which, as you can imagine, is not easy in such an expressive system! AFAIK, the totality proof requires axioms beyond ZFC, which means that the proof is not constructive and cannot provide any specific upper bound on the number of steps taken by an expression).
This means that there is no polynomial function f(n) such that a nondeterministic Turing machine could always terminate after f(n) steps and (correctly) report "yes" when there is a proof with n or fewer steps and "no" otherwise. It is also rather unlikely that there is a decision procedure to restrict attention to "proofs that run in polynomial time in the size of the proof input," which you would need in order to apply your claim just to that subset of proofs (I don't know whether the absence of such a decision procedure has actually been proven, but generally speaking problems like this are very rarely decidable for interesting programming languages like CiC, even if they are total). None of this means that a human, or a human with a computer, couldn't come up with a clever short proof in O(n) tokens that takes longer than f(n) steps to check on some particular instance of the problem for some particular f... because why would it?
I think what you're trying to say (to make your argument hopefully match what you had in mind) is that if humans can verify a proof in "a reasonable amount of time," then a nondeterministic Turing machine could produce the proof in "a reasonable amount of time." Which might well be true, but the point is that in CiC, how long a proof takes to verify has very little to do with the size of the proof, so this claim has nothing to do with proof verification being in P (and therefore has nothing to do with proof search being in NP, which it isn't for CiC). More to the point, this is pretty much true of any problem of interest--if it takes more than a reasonable amount of time in practice to verify whether a solution is correct (where verification can include things like "I ran polynomial-time UNSAT on this instance of the problem" in a universe where P=NP), it isn't practically solvable, regardless of what fancy methods we used to arrive at the solution and regardless of the length of the solution. So I don't find this particularly insightful and don't think it says anything deep about humanity, mathematics, or the universe.
Anyway, I'm mostly just pointing out that even if P=NP, you can't really leverage that to discover whether a short proof exists in CiC in polynomial time. More concretely, there isn't an algorithm to convert an arbitrary CiC formula to a 3SAT formula without greater than polynomial size blowup (we know this because we can run non-elementary algorithms in CiC!). So even if we had a polynomial time 3SAT algorithm, which would solve a great many things, it wouldn't solve proof synthesis, which is another way to phrase "proof synthesis in CiC isn't in NP."
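To see why a short term can denote an enormous computation, here is a toy analogy in Python (CiC's reduction behavior is far more extreme, but the shape is the same): a one-line definition of iterated exponentiation whose output, and hence whose evaluation cost, is non-elementary in its input.

```python
def tower(n, base=2):
    """Iterated exponentiation: tower(3) = 2**(2**2) = 16, tower(4) = 2**16, ...

    The definition is tiny, but the work needed to evaluate it grows as a
    tower of exponentials -- the length of the expression tells you almost
    nothing about the cost of checking/evaluating it.
    """
    result = 1
    for _ in range(n):
        result = base ** result
    return result

print(tower(3))  # 16
print(tower(4))  # 65536
# tower(5) = 2**65536 already has roughly 20,000 decimal digits,
# all from a definition a few tokens long.
```

This is the sense in which "proof length n" places no useful bound on verification time for a sufficiently expressive proof language.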
First of all, thank you for a thorough response. I'll need to take time to read it (and the refs in your other reply) with the care it deserves.
Basically I'm talking about the subset of proofs that could be done with pen and paper and pass a careful refereeing process. And you got that interpretation in your reply. You say that it is not insightful and that is of course an opinion you are entitled to.
To me, it is an interesting observation that NP is a complexity class powerful enough to basically make us obsolete. The context of this conversation/thread is using Lean to formalize a proof, and whether integrating AI models would in the near future give us novel tools for discovering new mathematics. We routinely solve NP-hard problems in practice for many instances, so I think drawing this parallel here was relevant.
I do agree with you that there would still be out of reach proofs even with an efficient algo for NP, but as some other person replied, it would be a nice problem to have...
> Basically I'm talking about the subset of proofs that could be done with pen and paper and pass a careful refereeing process. And you got that interpretation in your reply. You say that it is not insightful and that is of course an opinion you are entitled to.
Well, that part was me being subjective, but the objective part is that this subset of proofs doesn't necessarily have a polynomial-time verification algorithm, which is what would be needed to put proof search "in NP." A very short proof whose verification takes time doubly exponential (in some sense) in the size of the proof could still feasibly be checked with pen and paper, because the proof is short even though the verification time grows very fast (or the verification is long due to a technicality--I describe an example below), depending on the exponent and the proof size. At the same time, a short proof whose verification takes time polynomial (in some sense) in the size of the proof could take longer than the age of the universe to verify and be out of reach for both humans and computers, depending again on the degree of the polynomial and the proof size (and, since proof verification in CiC is not reducible to 3SAT, proof search that incorporates non-polynomial functions plausibly might not benefit from any speedup even if P = NP!). I say "in some sense" because it's unclear exactly what function f to fix, since pretty much any decidable, total function you can think of grows more slowly than "max steps needed to typecheck a term of length n in CiC" (outside of other meta-level functions that similarly don't provide practical bounds on running time).
(To give a concrete example of where human verification might be much faster than computer verification of the same step: human mathematicians can easily use "tricks" when they know two results will be the same, without actually proving them equivalent, modifying the proof, or imparting extra knowledge to a verifier. For example, in Coq, by default you use Peano naturals, which are convenient for proofs but have very poor complexity--exponentiation actually takes exponential time to compute with them. Humans verifying a proof step will recognize this and just do exponentiation the normal way regardless of which inductive definition of naturals is being used, while for a computer to do so the person writing the proof has to deliberately switch out Peano naturals for binary naturals. It is entirely possible that the proof size when using non-Peano nats will be larger than the proof size using Peano nats, since induction on non-Peano nats is much more tedious, while the verification time is exponentially lower using non-Peano nats! So this is an example where searching for a short proof and searching for a fast-to-verify proof may be in conflict. It's also an example where, even though it's not explicitly stated in the proof, and even though different formulations of nats can't exactly be proved "fully equivalent" in CiC, humans verifying a proof understand that "in spirit" it doesn't really matter whether you use Peano nats or not.
Indeed, if you work with formal verification a lot, you will find you spend significant amounts of time optimizing your proofs, and learning tricks like whether to make an input to an inductive definition indexed or parametric, which are things that human mathematicians don't care about at all but which are very significant to the machine verifier).
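The Peano-vs-binary point above can be sketched outside Coq. This is a Python analogy (not Coq code): unary numerals, like Peano nats, force exponentiation to materialize a structure as large as the result's value, while ordinary binary integers compute the same value while touching only logarithmically many digits.

```python
# Unary numerals as an analogy for Coq's Peano nats:
# the number n is represented as a tuple of n placeholders,
# so every arithmetic step pays in the *value* of the numbers involved.

def peano_add(a, b):
    return a + b                     # concatenation: |a| + |b| cells

def peano_mul(a, b):
    out = ()
    for _ in a:                      # repeated addition: |a| * |b| cells
        out = peano_add(out, b)
    return out

def peano_pow(a, b):
    out = (None,)
    for _ in b:                      # repeated multiplication: |a| ** |b| cells
        out = peano_mul(out, a)
    return out

two = (None, None)
ten = (None,) * 10
# Computing 2^10 in unary builds (and walks over) 1024 cells...
assert len(peano_pow(two, ten)) == 2 ** 10
# ...while binary integers do the same computation on ~10 bits.
assert 2 ** 10 == 1024
```

The gap is exponential: unary exponentiation does work proportional to the result's value, which is exponential in the length of its binary representation. That is exactly the trade-off the verifier sees when a Coq proof computes with Peano `nat` instead of the binary `N`.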
In other words, I don't think being practical to verify can be described completely by either the complexity of verification or the size of the proof, and I don't think it's that useful to conflate this with a problem being in NP since (1) CiC proof synthesis isn't actually in NP, or even in the polynomial hierarchy, and (2) being in NP has a much more precise technical definition that only somewhat overlaps with "practical to verify."