ionfish's comments


Is there a point to the piano name? I know that Bösendorfer is a high-end piano maker, but it looks like Danziger has altered it to Boshendorfer, or Buchendorfer, or something like that. Intentional, or just a mistake?

EDIT: I'm an idiot. It's "Bushendorfer".


I wouldn't confuse modules with modularity. Think of it like this: types are the primary API for the program. In fact, I tend to call the modules that perform this function in my code something like 'Core'. Since they're a shared language that allows different parts of the code to communicate without knowing any implementation details, these types actually make it possible to write code that is more modular, not less. Different parts of the program can go about their business and perform specialised tasks, while still being interoperable (and composable) with other parts of the program (and, hopefully, with code not yet written—although this very much depends on how well your types model your problem domain).

I'm not sure the author's explained this point quite as clearly as it could have been. One doesn't want to expose every type used in the program across all modules: there are many that are just used in internal representations within particular modules.

Here's an example from a project of mine, which is a command line program to produce truth tables. There's a Core module which contains the types shared across the whole program, and the most fundamental operations on them.

https://github.com/beastaugh/hatt/blob/1.5.0.3/src/Data/Logi...

Then there are other modules like Parser which just exposes a single function, but of course makes use of the core types.

https://github.com/beastaugh/hatt/blob/1.5.0.3/src/Data/Logi...

The command line program itself is defined as a Main module, which imports the pure library modules and does all the I/O. It has a few types of its own but these aren't shared across the program, because the library doesn't need to know about all this stuff.

https://github.com/beastaugh/hatt/blob/1.5.0.3/src/hatt.hs
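To give a rough sense of the shape, here's a much simplified sketch; the type and function names below are illustrative, not the actual hatt API:

    -- Core.hs: the shared vocabulary that every other module imports.
    module Core (Expr (..), evaluate) where

    import Data.Map (Map)
    import qualified Data.Map as Map

    -- | The propositional expressions the rest of the program talks about.
    data Expr
      = Var Char
      | Not Expr
      | And Expr Expr
      | Or  Expr Expr
      deriving (Eq, Show)

    -- | Evaluate an expression under an assignment of truth values to variables.
    evaluate :: Map Char Bool -> Expr -> Bool
    evaluate env expr = case expr of
      Var c   -> Map.findWithDefault False c env
      Not e   -> not (evaluate env e)
      And a b -> evaluate env a && evaluate env b
      Or  a b -> evaluate env a || evaluate env b

    -- A Parser module would then expose a single function in terms of the
    -- shared types, keeping its implementation entirely hidden:
    --
    --   module Parser (parseExpr) where
    --   import Core (Expr (..))
    --   parseExpr :: String -> Either String Expr

The point is just that Core defines the shared vocabulary and everything else imports it; the internals of Parser, Main and friends never need to leak.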


Ah, thanks!


It's pretty common in the UK these days.


Some people still give you weird glances and awkward pauses when you use it though.

In situations where it doesn't matter (and that's most of them) I've stopped correcting people who assume we're married and that I've taken my partner's surname.

It actually makes things easier. If they assume you're married, they generally have no problem talking to you about whatever they called to talk to your partner about. When they find out you're not married, sometimes suddenly you can't be trusted.


But then it reinforces the belief that there aren't other options. Or rather, it does not show the alternatives as being more normal than one would assume. I guess that was the point of the article.


I can understand that, but I also don't like having to fight with a company rep because suddenly I'm not good enough to talk to.

If they want to assume I'm Mrs so-so, rather than Ms so, partner of Mr so-so, and that gets me better service, then I don't correct them.

If they ask directly, for whatever reason, I don't lie.


No, the default backend is the native code generator. You need to use the -fllvm flag to enable the LLVM backend.

http://www.haskell.org/ghc/docs/7.6.1/html/users_guide/code-...

"[The LLVM backend] is an alternative backend that uses the LLVM compiler to produce executable code. It generally produces code as with performance as good as the native code generator but for some cases can produce much faster code. This is especially true for numeric, array heavy code using packages like vector. The penalty is a significant increase in compilation times."


Lots. For example, that every set has a cardinality (is bijective with some aleph). That's pretty intuitive, too, as (I think) are the following.

* Let X and Y be sets. Then either they have the same cardinality, or one is smaller than the other.

* Let X be an infinite set. Then there is a bijection between X and the Cartesian product of X with itself, X × X.

* Tychonoff's theorem: every product of compact topological spaces is compact.


Your third example is often cited as an unintuitive result, so I wouldn't mind getting rid of it. The first and second are easy enough to consider collateral damage, which we already have plenty of in basic math. But the fourth one is harder to give up. What would it look like without the axiom?


Coming from a set-theoretic perspective, I suppose I've got so used to Tarski's theorem that I consider it intuitive.

As far as Tychonoff's theorem goes, you might find this paper interesting:

http://matwbn.icm.edu.pl/ksiazki/fm/fm113/fm11313.pdf


Having written the following, I now wonder whether you meant something more specific by computation than I did, so I'm not certain whether my point is really a response. Could you spell out the details of your comment a bit more, and perhaps touch on the approach which Wolfram argues for?

* * *

Offloading computation only works when you understand what the computations are and why we do them. That's something that must be learned; it's not knowledge that springs fully formed into our minds as soon as we step into a classroom.

Carrying out computations thus gives us explicit and implicit knowledge of how the things we may eventually automate actually work. But it's also valuable because it trains us to compute in a precise and effective manner—a capability that remains useful later on. For instance, in logic it's often important to be able to carry out syntactic manipulations (e.g. into normal forms) in one's head, or even tacitly.

I'm sure there are plenty of examples from other areas of mathematics where computation is important, it's just that we do it so automatically that we don't think about it. Often I've found that students have trouble following proofs that take logically and computationally innocent steps without saying what's going on. Here I don't mean things like applying AC in the background, but just simple tricks like de Morgan's laws or taking the contrapositive. They have difficulty because they haven't taken those steps often enough themselves to have internalised them.
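For concreteness, the kind of steps I mean are standard identities like these:

    % De Morgan's laws:
    \neg(P \land Q) \equiv \neg P \lor \neg Q
    \qquad
    \neg(P \lor Q) \equiv \neg P \land \neg Q

    % Contraposition:
    (P \rightarrow Q) \equiv (\neg Q \rightarrow \neg P)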


I only vaguely recall the Wolfram article, but I seem to think it was pretty hand-wavy as to what gets offloaded and how (to Mathematica specifically, of course). But I will say that at least half the homework of my Calc 1-3 courses was spent well past the "understanding" stage and more on "getting fast enough to do it in an artificial, time-limited test situation", and on basically memorizing pages of identities that I quickly forgot because they so rarely came up in my physics courses. This was pretty much the case with almost every math class since about Algebra 1 in middle school.

And in particular I would like to hold up Electricity and Magnetism 2. Calculating the momentum of a magnetic field, in all but the most trivial case, takes a full sheet of paper: rows and rows of eight-inch-long equations as you carry out the tedious work of canceling terms, moving things in and out of square roots, and multiplying large polynomials together. It's all basic algebra you learn in high school, but it's such a slog to work through, and so time-consuming, that you actually lose track of the big picture and end up understanding very little more at the end.

As far as I know, that's why things like tensor and bra-ket notation had to be invented in the first place. Without a compressed notation, the ability to get a correct answer to any interesting problem became less a question of knowledge and more a question of the probability of transcription and sign-flip errors.

Not that anybody teaches sophomores tensor notation.


Unless you were truly exceptional, the "understanding" phase tends to get skipped in the first three calculus courses in favor of computation. Before you disagree, read the following bullet points:

- What is the tangent line? How does it connect with the derivative?

- What is a limit? How is it used to make the above rigorous?

- What is the Fundamental Theorem of Calculus? Why, non-rigorously, would you expect it to be true?

That is not a random list. That's a list of the most important concepts taught in the first Calculus course or two. If you couldn't give a quick impromptu explanation of ALL of them, then you failed to master the key concepts. (Don't worry, most can't.)
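For reference, stated loosely and with the usual hypotheses suppressed, the standard formulations behind the first and third questions are:

    % The derivative as a limit; the tangent line at a has slope f'(a):
    f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}

    % Fundamental Theorem of Calculus (the "evaluation" half):
    \int_a^b f'(x)\,dx = f(b) - f(a)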

To get to Terry Tao's formal math stage, you'd need to take proof-heavy courses such as real analysis.


I can, even 10 years later, not because I'm gifted but because I had good calc teachers who consistently covered and circled back on those points. I know what you mean though.

But what I mean is that the 25th time you're doing an integral to ram home some trigonometric identity, or working out a Fourier series for PDEs, it's not because anybody hopes that this is the time you get the epiphany; it's because the teachers need something for the grade books and you need to be able to do it during a midterm.

Assuming Wolfram wasn't just engaged in an attempt to sell more Mathematica licenses, I would assume that was kind of his point. If you dump most of the endless repetition onto Maxima/Maple/Mathematica, you could actually spend the semester on the concepts and on proving them, instead of focusing so heavily on the student's facility at algebraic manipulation.

Now, having had to do everything by hand, I have the sort of knee-jerk reaction that "well, I had to do it, so they should do it too", but then I also remember that it sucked giant balls. As I see it, students definitely need pretty solid facility at doing this sort of shit, and so we get the classic "where do we draw the line" problem, which means I should probably not be counted as a proponent of Wolfram so much as maybe a sympathizer (in this regard; fuck NKS).

*Also, while I didn't take real analysis, I did get a minor in math, which included Basic Concepts of Mathematics, or as I tend to remember it, "that semester of not being able to divide because it's not defined over the integers". It was certainly a purely proof-oriented course, and my Numerical Methods 1 & 2 were at least 50% proof-based, so I've done the formal rigor thing.


To this day I remember how outraged I was that on my final for Calculus 101 I derived an answer to an integration problem from scratch, then took the derivative and proved my answer was right. Then the grader, upon seeing an answer different from the expected one, marked it wrong.

I understand the grader was in a hurry, and the trig identity demonstrating that my answer was, in fact, equivalent to the standard one is not easy. But I had the right answer! And proved it was right, right there on the test!

I still remember the outrage. Over a question that did not matter then (I got an A+ in the course either way) or now.


I don't care that nobody cares. I care that they pretend that they do. Fake friendliness is annoying, and good service is not formulaic.


Come to Spain, then: most people don't care, and there's no effort to hide that. Stuff like this happens often:

- having to chase a store clerk while he flees and tries to hide, because chatting with his friend on his cell is obviously more important;

- "Oh, could you also please bring ..." answered with "why didn't you say that before?" at, say, a restaurant.

I used to be annoyed by fake attention in North America, but after a couple of years back here, gawd how I miss fake. They may not really care about me, but they care about trying to make the experience pleasant for the customer.


I live in the US Midwest. Yes, it is true that if someone asks you casually "How are you?" they are not expecting your life story, and the waiter's question about "how's the food so far?" is generally not an invitation to give a three paragraph review of the food. But it is legitimately a time to ask for refills, and a clerk's question is legitimately a time to tell them about the bathroom being dirty or something. It may not be a close personal connection and you may not invite them to your son's graduation party, but it isn't quite the null, empty question this essay portrays, either. It's better than being ignored.

Fun hack: "How are you?", "What's up?", "Hey", etc., are all fungible. It is perfectly acceptable to answer "What's up?" with "Hey" or a question in return. Fun.


Taken to extremes by the colloquial English greeting "alright?", to which the answer is "alright", and the formal introduction "how do you do?", which is responded to with "how do you do?" with no change of inflection.


As someone from Eastern Europe, I'm always happy visiting countries and places where people are nice to customers by default, and I don't care if it's fake. I don't like being an accidental victim of PMS when coming in to buy a pack of cigs.


The history of the proof is a little messy; the Wikipedia page has a decent summary.

http://en.wikipedia.org/wiki/Cantor–Bernstein–Schroeder_theo...


Yes, it does rely implicitly on Cantor–Schröder–Bernstein. That might be a downside, but I think when working informally (that is to say, when not teaching a set theory course) one can simply assert that if there exist injective functions from sets A and B into one another then they are equinumerous.

That being said, although important in the theory of the order of the cardinal numbers, Cantor–Schröder–Bernstein doesn't show that the cardinals are totally ordered. That statement is actually equivalent to the Axiom of Choice, whereas as far as I'm aware Cantor–Schröder–Bernstein holds in ZF.
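Spelled out loosely, the two statements being contrasted are:

    % Cantor–Schröder–Bernstein: provable in ZF, no choice required.
    (\exists f : A \hookrightarrow B) \land (\exists g : B \hookrightarrow A)
      \implies |A| = |B|

    % Comparability (trichotomy) of cardinals: equivalent to AC over ZF.
    \forall A \,\forall B :\ |A| \le |B| \ \lor\ |B| \le |A|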


That's absolutely correct -- trichotomy for arbitrary cardinals (any two cardinals are cardinal-size-comparable) requires AC, but SB doesn't require AC. Trichotomy for the cardinal numbers of well-ordered sets (e.g. ordinals) doesn't require AC.

It's a little irrelevant to this thread... but as long as I'm quoting non-proofs that require lots of extra machinery, I'll give my favorite appeal-to-intuition equivalent of choice: the product of non-empty sets is non-empty (any point in the product of a collection of non-empty sets is a choice function on those sets).
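In symbols, that formulation reads:

    % A family of non-empty sets has a non-empty product; an element of the
    % product is exactly a choice function for the family.
    (\forall i \in I .\ X_i \neq \emptyset) \implies \prod_{i \in I} X_i \neq \emptyset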


AC is equivalent to a lot of things. There's a collection of them on the Wikipedia page.

http://en.wikipedia.org/wiki/Axiom_of_choice#Equivalents

Something I find pretty interesting is that some of these equivalences break down in weak systems.

http://www.math.uchicago.edu/~antonio/RM11/RM%20talks/mummer...


Yup, I did a few projects on equivalents of AC back in the day. That's just my favorite "appeal to intuition" one (my favorite "appeal to intuition" against AC is: the identity function is the sum of two periodic functions (though this is a consequence and not equivalent)).

Equivalence breakdown in alternate systems is a wonderful topic. I've been trying for a couple years now to figure out how to get back into set theory now that I'm out of academia. Maybe later this summer...


> That being said, although important in the theory of the order of the cardinal numbers, Cantor–Schröder–Bernstein doesn't show that the cardinals are totally ordered.

Good point, thanks; I've corrected my post accordingly.


I think this becomes much more intuitive once one understands that the cardinality of any nondegenerate closed interval is the same as the cardinality of the continuum.


It becomes perhaps even more obvious when you realize that any non-empty open interval (a,b) with a<b is homeomorphic to the reals. Once that really sinks in, much of this becomes easier.
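One explicit such homeomorphism, just as a sketch (any strictly increasing continuous bijection with continuous inverse does the job):

    % Maps (a, b) onto the reals: affinely onto (-pi/2, pi/2), then tan.
    x \mapsto \tan\left(\pi \cdot \frac{x - a}{b - a} - \frac{\pi}{2}\right)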


> the cardinality of any closed interval is the same as the cardinality of the continuum.

In English: there are just as many real numbers (rational and irrational combined) between any two distinct real numbers as there are real numbers in total. (That's a somewhat simplified statement, as I didn't really define 'closed interval', but it serves to give a taste of the concept.)

If nothing else, statements like those completely break your intuitive notions regarding size in this context and convince you that you can only proceed based on logic and proven theorems. Which is, of course, accurate, at least until you develop a new intuition.

Measure theory is the branch of mathematics concerned with developing new notions of size appropriate to different contexts.

http://en.wikipedia.org/wiki/Measure_(mathematics)


> Measure theory is the branch of mathematics concerned with developing new notions of size appropriate to different contexts.

While this is probably among the most interesting definitions I've ever seen, I'm not sure it is correct. Cardinality, as we've discussed it here, is usually considered to be in the domain of set theory; measure theory assigns measures in a multitude of interesting ways, but, at least in all of it with which I'm familiar, those measures come only from [0, ∞], not from some more exotic domain of values. (A lot of the measure-theory proofs that I know do rely in this being the set of values of a measure—for example, on being able to subtract most of the time, and on having only one kind of infinite measure—so just saying "let's allow different values" doesn't immediately rescue it.)

