but we do have alcoholic beer and non-alcoholic beer, and it is nice to be able to say which one you want. And yes, there is a separator here, too; it is called a space.
Formal can be foolish, too. If you don't believe that, then I have a set for sale, with the property that it contains all sets that don't contain themselves.
The problem you’re referring to arose precisely due to lack of formalism. It was a problem back when mathematicians were toying with naive set theory, one not based on axioms, but instead on intuitive terminology. Moving to axiomatic, more formal, set theory solved it.
The problem is the same, no matter if you look at it formally or informally. You could get your set theory axioms wrong, for example, but you would still be formal. Oh wait, you have a proof that set theory is consistent, right?
Which axioms are interesting? And why? That is nature.
Yes, proof from axioms is a cornerstone of math, but there are all sorts of axioms you could assume, and all sorts of proofs to do from them, but we don't care about most of them.
Math is about the discovery of the right axioms, and proof helps in establishing that these are indeed the right axioms.
Who was it that said, "Mathematics is an experimental science"?
> In his 1900 lectures, "Methods of Mathematical Physics," (posthumously published in 1935) Henri Poincaré argued that mathematicians weren't just constructing abstract systems; they were actively testing hypotheses and theories against observations and experimental data, much like physicists were doing at the time.
Whether to call it nature or reality, I think both science and mathematics are in pursuit of truth, whose ground is existence itself. The laws and theories are descriptions and attempts to understand that what is. They're developed, rewritten, and refined based on how closely they approach our observations and experience of it.
Damn, local LLM just made it up. Thanks for the correction, I should have confirmed before quoting it. It sounded true enough, but that's what it's optimized for. I just searched for the quote and my comment shows up as the top result. Sorry for the misinformation, humans of the future! I'll edit the comment to clarify this. (EDIT: I couldn't edit the comment anymore, so it's there for posterity.)
---
> Mathematics is an experimental science, and definitions do not come first, but later on.
— Oliver Heaviside
In 'On Operators in Physical Mathematics, part II', Proceedings of the Royal Society of London (15 Jun 1893), 54, 121.
---
Also from Heaviside:
> If it is love that makes the world go round, it is self-induction that makes electromagnetic waves go round the world.
> "There is a time coming when all things shall be found out." I am not so sanguine myself, believing that the well in which Truth is said to reside is really a bottomless pit.
> There is no absolute scale of size in nature, and the small may be as important, or more so than the great.
'Postulating' a bunch of axioms is how math is taught. Eventually you go on to prove those axioms in higher math. Whether there are more fundamental axioms is always a bit of a question.
Yes, I remember somebody doing a PhD in the US on theorem proving for a few years, but in his last year he focused on doing deep learning work, did his PhD thesis on it, and got hired by OpenAI.
I don't think universities should become industry. I mean, that is exactly what we have industry for. If you want to be put in a pressure cooker under leadership focused on business outcomes, great, do industry.
The problem really is that universities are treated as if they have the same mandate as industry. Government people shouldn't tell a professor what kind of research is interesting. They should let the best people do what they want to do.
I remember an acquaintance becoming a professor, promoted from senior reader, and he was going to be associated with the Alan Turing Institute. I congratulated him, and asked him what he was going to do now with his freedom. He answered that there were certain expectations of what he would be doing attached to his promotion, so that would be his focus.
This way you don't get professors, you turn good people into bureaucrats.
Yes. The demand for increasing control, driven by the "taxpayer's money!" lot evident in this thread, strangles almost all state-funded research because it demands to know up front what the outcome will be. Which instantly forces everyone to pick only sure-bet research projects, while trying to sneak off to do actual blue-sky research in the background on "stolen" fractions of the funding. Like TBL inventing the WWW at CERN: that wasn't in his research brief, I'm sure it wasn't something that was funded in advance specifically for him to do.
Mind you, it was evident to me even twenty years ago, when briefly considering a PhD, that CS research not focused on applying itself to users would not be applied, and would languish uselessly in a paper that nobody reads.
I don't have a good answer to this.
(also, there is no way universities are going to come up with something which requires LLM like levels of capital investment: you need $100M of GPUs? You're going to spend a decade getting that funding. $10bn? Forget it. OpenAI cost only about half of what the UK is spending on its nuclear weapons programme!)
No, a proof is not a value of the proposition you are proving. That is a very convoluted way of thinking about proofs, and just because you can (using some type theory based logic), doesn't mean you should. It is like saying that 2 is the same as {∅, {∅}}, which is true in some set-theoretic formulations of the natural numbers, but which is not how we usually think about 2.
I'm not seeing how those are at all the same, or how it's convoluted to think of/require a proof to be an actual demonstration of the thing you are setting out to prove.
If you want to prove to me that `A => B`, what other way is there to do so besides giving me logical steps to get from `A` to `B`? And would such a proof not then be exactly a function that takes your proof-of-A and returns a proof-of-B?
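In Lean this reading is literal, not an analogy (a minimal sketch; `A` and `B` here are arbitrary propositions):

    -- A proof of A → B is a function from proofs of A to proofs of B,
    -- and modus ponens is just function application:
    example (A B : Prop) (f : A → B) (a : A) : B := f a

    -- Chaining implications is function composition:
    example (A B C : Prop) (f : A → B) (g : B → C) : A → C := fun a => g (f a)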
A proof is already logical steps to get from proposition A to proposition B. What you're requesting is that it should be replaced with logical steps to get from a value of type A to a value of type B, where types A and B aren't ordinary object-class types such as "pencil" or "employee" or "goblin", nor ordinary data-structure types such as "pair(int, list(string))", but (in most cases) some completely abstract labels that you made up in order for this mapping to work. You have a cool demonstration of disjunction elimination, which of course is no simpler than actual disjunction elimination, but how will you prove something non-constructive like "x mod 2 = 0 → not (x mod 4 = 1)" in your system, and how will it not be more complicated than proving it without your system?
You mentioned passing in axioms as extra assumptions, like the law of the excluded middle. In programming terms, the law of the excluded middle is a magical function that, for every type T, returns an Either<T,Not<T>> - Not<T> being one of those made up types that only exist to make the correspondence work and don't actually mean anything. If you're working with Not you presumably also want a function that takes any Not<Not<T>> and returns a T. In programming terms this is completely unimplementable and meaningless so I'm not really sure why you think it would be useful.
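For what it's worth, in Lean this "magical function" is exactly something you assume rather than implement: the axiom-backed `Classical.em`. Once you have it, the `Not<Not<T>> → T` eliminator is derivable (a sketch):

    -- excluded middle, assumed rather than implemented: for any proposition p, p ∨ ¬p
    #check (Classical.em : ∀ (p : Prop), p ∨ ¬p)

    -- double-negation elimination, by case analysis on em:
    example (p : Prop) (nn : ¬¬p) : p :=
      (Classical.em p).elim id (fun np => absurd np nn)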
They are values though! That's my whole point! These languages are constructed and written about in a way that often obscures that. A lot of them are simple things like tuple(x: int, y: int, h: x<y). Here's a constructive proof of your proposition:
    import Mathlib.Data.Nat.ModEq

    example (h : x ≡ 0 [MOD 2]) : ¬(x ≡ 1 [MOD 4]) := λ xeq1mod4 ↦ by
      have h24 : 2 ∣ 4 := ⟨2, by tauto⟩
      have h' := (h.congr (xeq1mod4.of_dvd h24)).mp (by rfl)
      contradiction
You can see at one point, I make a proof that 2|4, which is a data structure that needs to remember that 2 is the solution to 2*n=4 as the proof. Then I pass that to a couple functions (lemmas) that require that assumption, and get an Iff, which is a pair of implications (lambda functions) going each direction. I pull out one of them and call it on a trivial proof of `x=x`. This all gives me a proof that 0=1 mod 2. Finally I can ask it to expand that definition until it reaches a contradiction.
The whole thing is a proof that if you give me a proof that x=0 mod2 and x=1 mod4, I can build you a contradiction. Exactly like you'd expect in a proofs class. Since I'm a programmer, I wrote the whole thing as a big lambda function.
Note that Not[T] is also not some arcane thing. It's just Function[T,Void]. Essentially, it's a function you could never possibly call (because the whole point is that you can't get a T to call it), but if you did, it would just crash the program/throw an exception. It has an obvious purpose if you're trying to formally prove a negative.
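In Lean this reading is definitional: `¬p` unfolds to `p → False`, so a sketch of the "function you can never call":

    -- Not p is, by definition, p → False
    example (p : Prop) : ¬p ↔ (p → False) := Iff.rfl

    -- holding both a p and a ¬p lets you "call" it, and from False anything follows:
    example (p q : Prop) (hp : p) (hnp : ¬p) : q := absurd hp hnp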
The use of a reader monad to get a constructive-ish proof of things that need AC or LEM is just what programmers might call dependency injection. It's a normal pattern (in programming). It only "doesn't mean" something if you don't believe in those axioms at all, but even then many proofs don't need the full axiom and will work if you can bring your own choice function (e.g. you already have a basis for your vector space). Just like many (most?) real-world programs basically don't need a Turing machine, and actually work fine as a DFA with a trivial event loop or something.
Fwiw, you can easily construct a ¬¬T from a t: T: `λ f↦f(t)` (so make a closure that remembers your t:T and passes it into any ¬T someone gives it). LEM essentially says all ¬¬T are of this form, so you can extract the T that's hidden inside. Sort of like how functionals in finite dimensions are all secretly hiding a vector to do a dot product.
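That closure type-checks directly in Lean, with no axioms needed (a sketch; the name `dni` is mine):

    -- double-negation introduction: remember t, and feed it to any ¬T you're given
    def dni {T : Prop} (t : T) : ¬¬T := fun f => f t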
Still sounds like a lot of programming that's not useful for programming but rather just adds extra complexity to proving by pretending it is programming. I don't need a function that returns undefined if I pass it an object that's 0 mod 2 and 1 mod 4 at the same time.
You're in a thread about proof assistants. I'm not saying it's useful to prove number theory facts in your business code. I'm saying it's useful to embrace programming and intentionally think like a programmer to do proofs. You're the one who asked about number theory proofs.
It is also useful to think like a mathematician to do programming, and I find that e.g. scala's ecosystem is a lot easier to work with for business problems than others like go or php because things tend to have more structure and predictable design, but that's a separate issue.
I have contributed to proof assistants, and while programming is part of mathematics, it is not all of it. Obviously, a proof assistant is a program, but its concepts don't need to be all about programming. If they are, that is to the detriment of the mathematics done in the proof assistant.
Another comparison you won't like: Insisting to phrase everything in terms of electrical engineering is quite useless for most applications running on a computer.
I didn't say you always need to phrase things in those terms, merely that it can be useful to not try to avoid them (and I wish tutorials wouldn't obscure it). It's like how geometers use d/dx as an element of the tangent space, but they don't treat it as some quirk to ignore. They actually use the fact that it's an operator too. It makes things easier. Obviously you don't need to do that with every vector space you'll ever encounter.
My original reply was to someone who thought proofs are all unit valued, which is the sort of confusion you get when these systems try to shy away from the programming part. The point was that proofs are things that a programmer ought to feel to be concrete, not just some trickery with type aliases of unit.
That is an obscure way of writing it. If you use the normal notation you will have that 2 is {0,1} which makes some sense at least: we often see 2 (albeit often bolded or double struck as U+1D7DA) used to mean "the set with two elements" and those elements are often called 0 and 1.
If you don't like the meaning of equality here, that is exactly my point.
My point is not that you cannot make sense of 2 = {0, 1}. You can, and this is a cornerstone of encoding numbers in set theory, and it certainly describes an aspect of numbers (the set-theoretic aspect). But it is not how we usually understand numbers.
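For reference, the standard (von Neumann) encoding being discussed here, in which each numeral is the set of its predecessors:

    0 = ∅
    1 = {0} = {∅}
    2 = {0, 1} = {∅, {∅}}
    n + 1 = n ∪ {n}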
And the same is true for the "Curry-Howard" way of understanding proofs.
Coming up with something that is new, original, and actually works better than anything before it is quite hard. It does not happen often. When it happens, it should be embraced and cherished. Because without these discoveries, science is not worth anything at all. These discoveries are what science is all about. Yes, making sense of the discoveries is also science, but that is the easy part of science.
It happens every day. The job description of an engineer is basically to apply their knowledge of fundamentals to design and improve products, not simply copy the existing products. Products in every field must incrementally and monotonically improve in some way in order to compete. It is often not even science at all, just development.
We are talking about different things. Incremental engineering can definitely be part of it, but at some point something different happens: the discovery.
I don't think that this is what GP wrote about. They complain about the paper and not the software. Imho, most scientific papers are really badly written, because most scientists (and most people) are bad writers.
If you want beautiful writing, read poetry. You should give the best paper award to the paper with the best invention / discovery. Because as much as I like a well-written paper, I like a paper with a great invention / discovery even better, even if badly presented. Remember, you get what you incentivise. There are too many papers out there that, while nicely written, do not move the needle at all.
I would certainly appreciate beautiful writing, but I want something far more basic: Good writing. I want a text that is pleasant to read. That does not mean that the text should be dumbed down or that I expect it to be easy. A good text avoids unnecessary jargon and is simplified to make it very clear what the reader should take away from reading the text. A lot of the technical details can be delegated to the supplement so that the main text can remain clear and focused.