What is it like to understand advanced mathematics? (quora.com)
252 points by maverick_iceman on Aug 14, 2016 | 92 comments



One curious thing I've noticed after completing my master's degree in maths is that the jargon of maths sometimes lets mathematicians communicate with far fewer words than non-mathematicians. Irritatingly, I can't remember any off the top of my head, though I know I've experienced this a few times. Here are some contrived and ineffective examples, where I put context in [square brackets]:

- "To first order, [the intervention turned out badly]" - "Modulo [this error, everything is going fine]" - "[The situation is] symmetric on interchanging [me and you]" - "[They are in] the same class under the equiv. rel. of…"


I think this extends to jargon from scientific fields in general, e.g. "on the margin" from economics, "failure mode" from engineering, "significant" vs "substantial" (statistics), "system 1/system 2" (psychology), etc.


See also 100+ Concepts To Add To Your Cognitive Toolkit (including all comments): https://news.ycombinator.com/item?id=10819355


Every discipline and even hobbyist/fan-group seems to have their own little “Darmok and Jalad at Tanagra”s.



This is true, but mathematical jargon is more broadly applicable. (Though at this level of detail I would roll stats in with probability and hence pretend it is part of maths.)

I've been a physicist and am now a software engineer. I can use the maths based jargon in both those fields, but not vice versa.

That's partly because everyone in a STEM field had to learn at least some maths. But more importantly: maths is about the logical interconnection of things, while the things themselves are substitutable. So it is no surprise that its jargon can be applied in many fields.


https://xkcd.com/435/

Made me remember this. If only the 'upper' fields adopted mathematical naming where applicable :D


Just curious, what's the difference between significant and substantial?


Significant, as in "statistically significant", usually means that you've got sufficient evidence to conclude that your result is not simply due to random chance.

Substantial means that your result is big enough to have any practical import. In other words, is there any substance to the result?

For an example of a result that is significant but not substantial, suppose you find after surveying millions of people that members of demographic group X score 1/10 point higher than average on an IQ test.

For an example of a result that is substantial but not significant, sales figures are often so variable that it's impossible to determine with confidence whether even a 100% jump in revenues is due to a recent ad campaign or just a random fluke.
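To put rough numbers on the distinction, here's a minimal sketch in Python (the effect sizes, standard deviations, and sample sizes are all invented for illustration):

    import math

    def z_test(mean_diff, sd, n):
        # Two-sided z-test of a mean difference against zero.
        se = sd / math.sqrt(n)
        z = mean_diff / se
        p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p

    # Tiny effect (0.1 IQ points, sd 15) over millions of people:
    # significant, not substantial.
    print(z_test(0.1, 15.0, 5_000_000))   # z ~ 14.9, p ~ 0

    # Big effect (revenues double) over a handful of noisy months:
    # substantial, not significant.
    print(z_test(1.0, 2.0, 4))            # z = 1.0, p ~ 0.32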


Significant - it matters. Substantial - of large magnitude.


Signal v. Noise is also priceless.


This phenomenon is just a facet of having a robust, shared vocabulary. Words can be shorthand for complex ideas. Knowing more words allows you to express intricate matters with ease. This fact is why in 1984 the regime is trying to shrink the dictionary and simplify the language.


The totalitarian authority in 1984 actually has the exact opposite goal. It wants to control creativity, self-expression, and the thinking of new ideas as much as possible, and one of its methods for achieving this is to reduce the number of words in the public's general vocabulary.


The same is true of programming. I also can't remember specific exchanges off the top of my head but I know I've used "it's a FIFO" and "do a binary search" to refer to non-programming related tasks.


My partner has (she says) treasured my comment to her once that the second derivative of her emotional plane is very low. I meant that she is very stable and doesn't suddenly fly into some weird emotional state. I couldn't think of any other way of putting it.


There's a joke in there about third derivatives of displacement (jerk).


What's the term for encountering some relatively unlikely concept multiple times in a short period? I was just looking at the Wikipedia page for the physics-variant of "jerk".


Baader-Meinhof Phenomenon.


That's the same one. I meant the third derivative of position, which is the derivative of acceleration, which is "jerk".


The latency/throughput tradeoff is the one I always seem to be trying to explain to non-CS people.
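A made-up toy model of the tradeoff (all numbers invented): batching more items per trip raises throughput but stretches the latency of every item in the batch.

    overhead, per_item = 10.0, 1.0   # ms of fixed cost per trip, ms per item

    def stats(batch_size):
        trip = overhead + per_item * batch_size
        throughput = batch_size / trip   # items per ms: grows with batch size
        latency = trip                   # ms before anything in the batch arrives
        return throughput, latency

    for b in (1, 10, 100):
        print(b, stats(b))   # bigger batches: more throughput, worse latency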


I'm frequently frustrated when I'm trying to do chores, or something else, with someone who doesn't understand the concept of parallelization.


Haha, I once got into a rather frustrating discussion with my partner when discussing the optimal order and partitioning of laundry loads so that we spent the least amount of time babysitting the washer and dryer. After our third digression into understanding the concept of pipelining I decided that saving 30 minutes per week wasn't worth it :)
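For the curious, a minimal sketch of the laundry arithmetic (the stage times are invented):

    wash, dry, fold = 30, 45, 10   # minutes per load for each stage

    def sequential(loads):
        # Finish each load completely before starting the next.
        return loads * (wash + dry + fold)

    def pipelined(loads):
        # Start washing load k+1 while load k dries; the slowest stage
        # (here the dryer) sets the steady-state rate.
        return wash + dry + fold + (loads - 1) * max(wash, dry, fold)

    print(sequential(3), pipelined(3))   # 255 vs 175 minutes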


Laundry is the exact chore I was thinking of. Mine folds (e.g.) a few pairs of shorts, and then takes them to the dresser and puts them away. Then later, she folds more shorts, and puts them away again.

I fold everything, once, and then make one trip per handful-of-piles-of-items. Precious seconds saved! Seconds that I can use to have pointless arguments about saving precious seconds with my wife!


Yeah, but with her agile, small batch approach, if she makes a mistake on any one folding, she minimizes the amount of rework.


Similar to your third example: "Without loss of generality, [ our argument can proceed with the following assumption... ]"


Same goes for the field I know well, Poetry.


Would you mind giving a couple of examples (even if they're hastily-constructed like mine)?


Take the word Love, which immense amounts of ink have been used to describe. When it is used by a Poet, he or she tends to have a very specific meaning that other poets understand without need of elaboration. Poetry has been in conversation with itself for so long that it has developed strong general abstractions. I can see that in Maths as well, but I am an outsider so I could be off.


I am a poet and I have learned some advanced math. It seems like what you mean by "love" in poetry is standardized only within the subset of people you are friends with, rather than being a general term anyone can use the way the poster above uses "modulo".


Oh cool, I don't normally meet math trained poets. Any links to your work? (I am not a poet myself, just highly involved, publishing, editing, etc).


This is a book of poetry that I have self published. It is available on Kindle unlimited or you can get a sample to your device. https://www.amazon.com/Within-all-us-Joshua-Herman-ebook/dp/...


Equivalence classes aren't just jargon, they really are an important nonobvious concept.
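A minimal illustration (my own example, not from the thread): the relation "same remainder mod 3" partitions the integers into three equivalence classes, and working "modulo 3" means reasoning about the classes instead of the individual numbers.

    from collections import defaultdict

    classes = defaultdict(list)
    for n in range(10):
        classes[n % 3].append(n)   # n and m are equivalent iff n % 3 == m % 3

    print(dict(classes))   # {0: [0, 3, 6, 9], 1: [1, 4, 7], 2: [2, 5, 8]}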


Bullet point #6 is worth emphasizing.

>biggest misconception that non-mathematicians have about how mathematicians work is that there is some mysterious mental faculty

Popular perception would have you believe that advanced math is like weight lifting, where you need the brain muscle to process it. This weight is measured in IQ, and above a certain measure you are a genius. Einstein is the champion, and so if "you're no Einstein" then you have a perfect excuse to be ignorant and incapable of any advanced topics in math.

Of course, the reality as it should be perceived is that any advanced scientific topic is more like a journey, and that it never truly gets more difficult. It just becomes more specific and hence technical. We just go deeper and deeper into the rabbit hole of our choosing. Unfortunately, with this view, there are no excuses for being bad at anything. It's more about dedication and obsession.

And Einstein himself rejected any notion of genius; he is often quoted as saying, "It's not that I'm so smart, it's just that I stay with problems longer."

Except that didn't work, because the former group has written the quote off as the words of a genius.


I like your answer because there is something strange about generalizing a group of people who have dug deep enough down that rabbit hole. Are there signs in a newbie, "person A", that they would always have more potential to take on a problem in a better way than "person B"? What if Einstein's work had never involved mathematics and he had instead written mysteries about a detective, and from his books we could tell that this person understands taking on problems, learning from failure, etc.?


There are more ways of looking at what appears to be obscene amounts of talent... some suggest the media also has a hand, since they create the celebrities (monsters). Einstein was definitely celebrated.

But it's odd how popular perception can end up so far-fetched. There still exists this notion that knowing everything, memorizing everything, and being able to compute like a computer are all superpowers. If only.

Except, we've been able to write things down forever, the internet is in our pockets now, and we have supercomputers crunching obscene amounts of data. As it turns out, none of these are "super" powers. Their capacity is easily enhanced with simple tools, which we don't even bother to take out most of the time.

You'd definitely get better grades if your biological capacity were higher, since using tools during exams is cheating. But we're starting to think it's silly to base tests on these things.


Neat answer, although the bullet points come off a bit like mathematical horoscopes (perhaps because of the use of the second person, "you"). I often wonder if mathematics would be more popular if the beauty of the works of its masters were more accessible to those without the mastery, as it is, e.g., in the arts or music. Anyone can look at a painting, see a movie, or listen to Bach, and at least have an opinion - not true for any random person flipping through, say, a functional analysis book.


We must remember the frame of reference for these works. The audience.

Most art, and certainly most paintings, movies, and most of Bach's work, is written for a general audience. Works of advanced, pure math however are written for a very specialized audience, purposefully.

This does not mean, though, that math cannot be written for a general audience and be understandable or even "beautiful". As an example, Cantor's diagonalization is understandable by a 10 year old. The profoundness of the question it asks is also understandable, as is its initially seemingly mind-bending inscrutability. What's more, the solution and the logic of the proof are perfectly understandable, and some of the easiest-to-understand explanations literally use a child's method of counting (mapping fingers to numbers). This is a beautiful work of math. And much math could be written for broader consumption. It's not done frequently only because that isn't the task most mathematicians set themselves to.
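For readers who haven't seen it, here is a minimal sketch of the diagonal trick (representing infinite 0/1 sequences as functions is just one convenient encoding, my own choice):

    # Given any claimed enumeration of infinite 0/1 sequences, build a
    # sequence that differs from the n-th one at its n-th digit -- so it
    # cannot appear anywhere in the enumeration.
    def diagonal(enumeration):
        # enumeration(n) is the n-th sequence, itself a function i -> digit
        return lambda i: 1 - enumeration(i)(i)

    # Toy "enumeration": the n-th sequence is constantly n % 2.
    enum = lambda n: (lambda i: n % 2)
    d = diagonal(enum)
    print([d(i) for i in range(5)])   # [1, 0, 1, 0, 1]: differs from sequence i at digit i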

Moreover, even outside the actual proof, many math results can be described in understandable and even beautiful ways. And the derivative results of mathematical discoveries are often easy to describe, understand, and appreciate as impactful and beautiful.

So, I think this is a function of choice and focus. Perhaps math could indeed be more popular if greater efforts were made here, but mathematicians as a group on the whole haven't set out to popularize it.

Now, no matter how much effort is applied, math will probably never quite be as enjoyable to most people as music or movies. But with such efforts math may be more appreciable to a wider audience, as say paintings or poetry are.


Cantor's diagonalization is a great example; I'm definitely not a "math person", but I was awestruck the first time I learned about it. People had explained the idea of uncountability to me before, but I never really "got" it until then. I can't think of a better word to describe its effect on me other than "beautiful".


You can do Cantor's diagonal for the positive integers. It's still true but seems ugly to most people. Higher math is filled with just as many ugly as beautiful ideas, but they are simply less talked about.

Infinity MOD X is undefined... How sublime.


Here is a scifi short story in which a machine is built that allows anyone to directly perceive mathematics similarly to music. The mathematician inventors hoped that this would increase the popularity of math but the opposite was true: http://www.rudyrucker.com/transrealbooks/completestories/#_T...


You are right that a lot of math is pretty inaccessible to the mainstream, but so is a lot of music. Sure, an average person can nod their head at the beauty of a Bach piece just as they might marvel at an (to them) unintelligible equation, but just as they do not understand the higher math they may also lack an understanding of counterpoint, canon in inversion, fugue, etc.

Likewise for avant garde cinema, modern art, poetry, and more!


It's qualitatively different. The nature of math work is analytic; that of music is more expansive. Someone can appreciate the beauty of Bach intuitively, without consciously breaking it down or even being aware of its construction; it depends on their intuitive senses. In math the information is encoded and mostly can't even start to be processed by the layman.


> A theoretical physicist friend likes to say, only partly in jest, that there should be books titled "______ for Mathematicians", where _____ is something generally believed to be difficult (quantum chemistry, general relativity, securities pricing, formal epistemology)

At least one of these does exist:

https://www.amazon.com/General-Relativity-Mathematicians-Gra...

Given that the author seems to be most familiar with geometry, judging by the kind of examples that she or he gives, I'm surprised that they're not already familiar with this book.


It would be interesting to know what a book on advanced theoretical physics would look like if the author assumed that the intended reader already knows everything there is to know about the mathematics involved. In other words, what does theoretical physics minus math look like?


You might wanna look at Folland's Quantum Field Theory: A Tourist Guide for Mathematicians [1] as an example.

[1] http://bookstore.ams.org/surv-149


As an aside: Folland's real analysis is a great second-level textbook in analysis.


Yep, try this: "The Algebra of Grand Unified Theories" by John C. Baez and John Huerta.

https://arxiv.org/abs/0904.1556


To some extent, this reminds me of Feynman's Lectures on Physics. The lectures were originally given to undergraduates at Caltech (so not "advanced theoretical physics"), but they rely more heavily on intuitive explanations than mathematical derivations compared to a conventional physics textbook. The books were highly regarded by physics experts, but were essentially a failure at their intended purpose of teaching undergrads and preparing them for future work.


I've seen many glowing references to Feynman's Lectures on Physics, but this is the first time I've seen them called a failure. Could you share a bit more about why they're considered a failure?


The lectures are well-regarded by many, but it's been argued that they're ineffective for teaching undergraduate students the process of solving physics problems.

Feynman himself was an exceptional mathematician and used a very mathematical approach to solve problems. But once he arrived at the solution, he identified a concise intuitive explanation, which he then presented to others. Everyone was impressed with the brilliant intuition, but it didn't accurately reflect the more laborious and mechanical approach he used to solve the problems [1]. The Lectures on Physics are similar: a series of intuitive explanations that would be difficult to discover independently without actually working through the math.

[1] http://www.stephenwolfram.com/publications/short-talk-about-...


The problem with these lectures is that they require too much work for a layman to benefit from them; at the same time, they are too elementary to be useful for a seriously interested person (say, an engineer). The book may still be good for schoolchildren who are interested in physics. I find more traditional courses, such as the Berkeley Physics Course, more informative.


Some facts regarding this matter can be read here: http://www.feynmanlectures.info/popular_misconceptions_about...


Please read again. There aren't really any facts on this page that demonstrate whether or not LoP is an effective undergraduate physics textbook.

The page addresses two points: Feynman's fluctuating opinion of his own work, and the drop-off in student attendance (Feynman's opinion is then regurgitated as the third and final point). In fact, Feynman's opinion has no bearing on whether or not LoP is an empirically effective textbook. And regarding student attendance, the page argues against one person's recollection of dwindling attendance with another person's recollection (years after the fact) of steady attendance. Again, no evidence, just two contradictory and possibly flawed memories that have no bearing on whether or not LoP is effective.

The page provides as evidence an undated photograph of Feynman in front of a full classroom. When was this photo taken? Was it on the first day of class, or in the middle of the semester? Was it during a regular class, or a special lecture? We don't know.

Perhaps the LoP approach really is an effective way to teach undergrads (despite Caltech abandoning it and, to my knowledge, nobody else successfully adopting its methodology), but the cited page doesn't provide relevant facts regarding the matter.


But the claims that it sucks for undergrads are also folklore, as far as I've seen. I wouldn't privilege them.

I took Caltech's intro physics sequence in the mid-80s -- IIRC my freshman year was when they switched from the Feynman Lectures to Goodstein teaching from his new The Mechanical Universe. (Goodstein was one of the pair with the negative quote above.) I thought at the time that the new textbook wasn't bad, but that Feynman was amazing. IIRC much of why they wanted a new book and video series was to take advantage of 3d computer graphics; I don't remember that making much of a difference to me.

(Added: I also attended a Feynman lecture live, but can't fairly rate him vs. others because I'm hard of hearing and rarely get much from lectures. The lecture hall was indeed packed with non-freshmen, but it didn't seem like any of us frosh skipped it. This isn't very relevant since it was a one-off.)


Yeah, that would be amazing. I have a friend who has a PhD in math but has never learned any physics. Nevertheless, he was able to pick up some pretty advanced physics topics in a few weeks, to everyone's amazement.

On the other hand, perhaps the "math for physics" needs a different emphasis than general math-for-the-sake-of-math. E.g. perhaps for mathematicians certain topics are covered without much detail as they are relatively uninteresting, but a math-for-physics course would put extra emphasis on them because of some applications. Don't know enough to give concrete examples.


Still quite interesting. Physics is a way of giving an interpretation to a physical system. Only based on an interpretation can we answer questions about internal and external consistency, as well as completeness.

Trivial example:

a @ b

b @ c

c @ a

Makes "sense" if @ is = but not if @ is <.


Makes sense if @ is reflexive, more generally speaking :) I have no problem redefining < to make c < a true; that's just a matter of interpretation.


Or we might be working in a preorder which has that particular a<b<c<a loop.


As a theoretical physicist by training, I would say what remains would be the immutable facts about the physical world ... after all, math is only a language, though a very convenient and powerful one ...


> What is it like to understand advanced mathematics?

It's really nice! Get to understand a lot of stuff.

Some of the advanced math is really powerful for applications.

And, can look back at the math I saw in physics, where I wondered if the physics profs really understood the math, and conclude: right, they didn't, or at least not very well.

E.g., recently I saw a physics lecture where they explained more than once that a quantum mechanics wave function was differentiable and "also continuous" as if there was some question, doubt, choice, or chance otherwise. Of COURSE it is continuous! Every differentiable function is continuous!

There's a lot more on why it's nice to understand the advanced math!


> Every differentiable function is continuous!

Under whose definition? The Heaviside step function is discontinuous, but its derivative is usually considered to be the Kronecker delta function. Both of these are used extensively in physics and engineering.


Did you mean Dirac delta instead of Kronecker delta?


Yes, my mistake. Kronecker is used often in quantum mechanics, but when working in Hilbert space rather than real space.


> Under who's definition

W. Rudin, Principles of Mathematical Analysis.

W. Fleming, Functions of Several Variables.

For the set of real numbers R, a function

f: R --> R

For x in R and d in R, the function f is differentiable at x with value d provided that, for h in R,

lim_{h --> 0} (f(x + h) - f(x))/h = d

In this case we write

f'(x) = df(x)/dx = d

In that case, in particular,

lim_{h --> 0} (f(x + h) - f(x)) = lim_{h --> 0} h * ((f(x + h) - f(x))/h) = 0 * d = 0

so that f is continuous at x.

Differentiation is important in physics, e.g., in Newton's second law.


The Dirac delta function is not a function but a distribution. This is just the product of sloppy notation by the physicists.


No, it is just that physicists mean something different than mathematicians when they use the word "function". Just assume that they are always distributions that are evaluated against Dirac deltas and everything makes sense.


> If f is differentiable at a point x0, then f must also be continuous at x0.

https://en.wikipedia.org/wiki/Differentiable_function#Differ...


They most likely said "the wave function is differentiable and its derivative is continuous", which isn't obvious. If you thought it sounded strange, you should have asked during the lecture and they would have explained. Instead you just assumed that you knew better and made a fool of yourself...

Also, this isn't advanced mathematics; it's something most who are interested learn in high school, and I can assure you that most physics professors are aware of this fact.


> They most likely said

No, they said what I wrote they said. It was at YouTube, from MIT, from a course in quantum mechanics.

Uh, I do understand this stuff: e.g., given a positive integer n, the set of real numbers R, the usual topology for R^n, and a set C that is a subset of R^n and closed in that topology, there exists a function

f: R^n --> R

so that for x in R^n, f(x) = 0 for x in C, f(x) > 0 for x not in C, and f is infinitely differentiable. E.g., for n = 1, the result applies for C a Cantor set or a Cantor set of positive measure. For n = 2, the result applies for C a sample path of Brownian motion or the Mandelbrot set.

When I was a grad student, I discovered and proved this result. Later I published the result in JOTA.


A bigger problem is the difference between Hermitian and self-adjoint; a lot of physics professors don't understand that.

Or for example, this famous dialogue between Geoffrey Chew and Arthur Wightman from the 1960s bootstrap fad:

Wightman asked Chew: why assume from the start that the S-matrix was analytic? Why not try to derive it from simpler principles? Chew replied that "everything in physics is smooth". Wightman asked about smooth functions that aren't analytic. Chew thought a moment and replied that there weren't any.


By the way, the reason they said that was no doubt to motivate the students to choose correct boundary conditions for some Sturm-Liouville problem (as the previous poster sort of alluded to).


Wow - this article made me think I identify with and act like a mathematician much more than I would've thought. I only started programming six years ago, but the way the author talks about breaking down problems, building frameworks and tools, using multiple methods of attack from an arsenal of knowledge that builds up over time, and finding specific problems less interesting than insights into the general case -- these are all exactly the directions I find myself moving in as a programmer, and they just feel 'natural'.


Programming is undoubtedly the largest branch of applied mathematics—although for some reason we as a field don’t usually think of it as such.


> your brain can quickly decide if a question is answerable by one of a few powerful general purpose "machines" (e.g., continuity arguments, the correspondences between geometric and algebraic objects, linear algebra, ways to reduce the infinite to the finite through various forms of compactness)

To me these tricks all feel like the same trick: "Can I find a program* that gives a finite representation of a generator or a transformation from something I know to my problem?"

* Not as in php but as in lambda calculus.


That's like saying each book ever written is based on the same trick, generating a finite number of words. True, but trivial.


Thanks. Great read. Among other things, this particularly resonated:

> You are easily annoyed by imprecision in talking about the quantitative or logical. This is mostly because you are trained to quickly think about counterexamples that make an imprecise claim seem obviously false. On the other hand, you are very comfortable with intentional imprecision or "hand-waving" in areas you know, because you know how to fill in the details.


This is a great answer. It brings into relief many of the thought processes and experiences which change in tackling questions after studying advanced mathematics deeply.

However, the answer seems mostly to cover what it feels like to study tractable problems. Things like reading and understanding others' research or work, or studying new questions which seem "within reach, or nearly so." This is representative because this kind of work forms the majority for most mathematicians, and nearly all of the work for non-professional mathematicians who have studied math deeply and keep abreast of their fields, but don't research full-time.

One aspect of the experience of understanding advanced mathematics which doesn't seem to be covered thoroughly here is what it feels like to study truly intractable questions, let alone those questions which you fear may actually be inscrutable. The simultaneous awe, respect and consternation one feels when confronting truly difficult questions which you intuitively feel you just don't have the tools for. The problems which you think you'll need to discover new tools to even begin breaking down. This answer generally projects confidence and fluidity in tackling problems. And that's fair, in that this is the biggest change one undergoes when tackling mathematical questions after studying advanced mathematics deeply.

For deeper questions-- those which you feel you are very far from being able to answer-- the experience isn't quite opposite of what is described here, but it is different. When you feel that all of your fluid mappings and transformations, all of the most powerful tools at the ready in your toolbox, and all of your simpler analogues are not only not going to solve the problem, but are unlikely even to lead to a truly deeper understanding of it... when you're not even sure whether breaking the problem down in a particular way will be productive or counterproductive... when you've wrestled with a problem for days, weeks, and months and you're still not sure if the "foundations and frameworks" you've built are even of the right type or in the right vicinity to solve the problem... then the sense of confidence projected in this answer falls away. The sense of assured solution-- even assured understanding-- and fluidity of movement in problem solving is no more. You're still confident in your mathematical knowledge and ability, and certainly you feel differently than you did as a beginner. But it is a humbling experience.

The process of tackling these questions is not quite so structured as the impression that might be given by reading this answer. In these situations, Wiles' analogy of stumbling in the dark through a great mansion for months and years (which the author also quotes) is closer than the analogy given by the author of building a house. What even Wiles' analogy doesn't quite capture, at least in the section quoted here, is the uncertainty of the process of stumbling through that mansion. At various points you ask, "does this room even have a light switch? At least one that I can reach?" There is tremendous backtracking as well, rather than the sense of steady progression in discovering and understanding room after room. Imagine stumbling through the dark for weeks or months, finally discovering the lightswitch in a room... moving to the next room and doing the same... then on again to the next room to do the same again for yet more weeks and months... then only to discover, based on what you've seen in these rooms, that you must in fact be in the wrong hallway. Or the wrong wing. Or even the wrong mansion entirely.

In any case, I think this is a wonderfully insightful answer overall and its author deserves great credit for being so thorough, so accurate regarding the great majority of work, and so descriptive and relatable. I just wanted to enhance the picture painted here, or expand one little corner of it, as it relates to other types of experiences one is almost sure to have in understanding advanced mathematics and trying to apply that understanding to work in the field.


Not a mathematician, but I have tilted at the P vs. NP windmill enough to have a sense of what you're talking about. It is indeed an awe-inspiring experience. I think I learned something by trying it (beyond the obvious lesson in humility, heh), but I'd be very hard put to say what that was.


>When trying to understand a new thing, you automatically focus on very simple examples that are easy to think about, and then you leverage intuition about the examples into more impressive insights.

Sounds like something Feynman would say.


You get to explain the rules of the game Set in one line:

"Pick 3 cards, such that for none of the properties exactly two are the same."

Not very advanced though.
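A minimal sketch of that one-line rule (encoding each card as a 4-tuple of property values in {0, 1, 2} is my own choice, not part of the comment above):

    # A triple is a valid Set iff, for every property, the three values
    # are all equal or all distinct -- i.e. never "exactly two the same".
    def is_set(c1, c2, c3):
        return all(len({a, b, c}) != 2 for a, b, c in zip(c1, c2, c3))

    print(is_set((0, 0, 0, 0), (1, 1, 1, 1), (2, 2, 2, 2)))   # True
    print(is_set((0, 0, 0, 0), (0, 1, 1, 1), (1, 2, 2, 2)))   # False: first property has exactly two alike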


Thank you for posting this example! I've used something like this myself to describe Set, but it didn't occur to me when I wrote my comment on the OP. The notion that one can quantify over properties (not just concrete objects) is one that is not necessarily obvious to someone without mathematical training.


Just as well, "Pick 3 points comprising a line (in the four dimensional space over the three element field)"
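Continuing the sketch above (same hypothetical encoding), the two phrasings agree: three distinct cards form a Set exactly when they sum to zero componentwise over the three-element field, i.e. when they are collinear in F_3^4.

    def is_line(c1, c2, c3):
        # Over F_3, three distinct points lie on a common line iff a + b + c = 0.
        return all((a + b + c) % 3 == 0 for a, b, c in zip(c1, c2, c3))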


NB. There is an amazing discussion in the comments following one of the linked blogposts.

https://rjlipton.wordpress.com/2009/12/26/mathematical-embar...


It devolved into arguing with an FLT truther; I wouldn't call that "amazing".


Disclaimer: Contrary opinion that might not be yours

I've been building AIs and intelligent botnets for over 13 years now. One thing I've learned is that I now think humans can't understand mathematics.

No matter how good you think you are, a simple AI algorithm will prove you wrong. From physics, to algorithms, to simulations, to automation, to analysis - and in evolutionary AIs, also the generation of solutions for a problem.

Every single scientific mathematical model that has occurred in my life was in an "imperfect" state, where humans think that our current "model of things" in the form of mathematics is correct while it's totally incomplete, and a few days of AI simulation figured out a better way to do it.

I somehow see mathematics not as a knowledge pool, but more like a serialization format for transferring knowledge. And I think that serialization format is often too complicated for other humans to understand, and therefore invites biased interpretation and the resulting conclusion errors.

And I personally think that this is a bad thing and a problem that needs to be fixed desperately.


> Every single scientific mathematical model that has occurred in my life was in an "imperfect" state, where humans think that our current "model of things" in the form of mathematics is correct while it's totally incomplete, and a few days of AI simulation figured out a better way to do it.

I don't think I understand your claim and I'm rather skeptical of what you seem to be saying, so can you give three examples?


An algorithm has limits that humans don't, as Gödel's incompleteness theorems prove.

The study of mathematics is about communicating some deep logic, but it doesn't end there, as you seem to trivialize. It's not just some poetry that we memorize; it's a new insight into the nature of everything.


Gödel's incompleteness theorem says that no formal system can prove or contain all true theorems. It says nothing about humans being immune to this effect, which your comment seems to imply.


"Number, sets and categories or what do mathematicians actually do?" - https://youtu.be/Lrp0M-p5pMU?t=4m59s


This is a rather populist answer, but still pretty cool :)


first, i recoil at "advanced" and would prefer "pure". that aside,

> You move easily among ... different ways of representing a problem

is like 3 answers in one. representation has specific meaning.

for me, however, it's not an ideal phrasing: i prefer not to think in terms of "problems". it can be productive to do so but also messy.

(as if i knew a/b math :p)


I don't know anything about architecture, but I know I dislike houses where it seems like the architect had just recently discovered the "extrude" tool and was picking all sorts of random things to pull out an extra three feet.


The McMansion thread is over here: https://news.ycombinator.com/item?id=12286724



