At schools with separate "pure" and "applied" math departments, there's sort of a running joke in the pure-math departments that they feel offended if someone has accidentally managed to find their work useful, because it means they've, quite to their chagrin, advanced the agenda of the rival applied-math department (or at least, that they weren't "pure" enough). But of course that's a self-deprecating joke more than a statement of what theoreticians really think.
I have said that quote a lot in jest. :) But I always remind myself that it is based on the problem that there is actually very little theory going on -- that the developer is doing a lot of random muddling around with little understanding, little theory, until some piece of code produces a desired behavior. The developer may decorate his muddling with tests and wordy comments, but it is still muddling around. If the developer understood his system, the little mysteries that break theory would not be so magical anymore, and he could move on to keeping his theory on par with his practice.
Countless times over the years I have seen areas where Academia is decades ahead of us so-called industry experts. What is "new", "shiny", and posted all over the place online today is built on theory produced 20+ years ago. In practice, we have gotten our tasks done without some of this knowledge, but in our ignorance we have, over time, cost our employers millions of dollars in hours dedicated to bug-fixing and features that could not meet deadlines. It is a hard pill to swallow. The silver lining is that we can recognize this and work to grow, to learn that missed theory, rather than fall into the trap of complacency.
Today's fad is functional programming. In the coming years, we will relearn the weaknesses of functional programming, and why OOP was the next stage of evolution. Hopefully we will have learned not to fall into extremist camps again (the notion of a "functional vs OOP" conflict still strikes me as a hilarious misunderstanding).
Out of curiosity, could you share these weaknesses of functional programming and how OOP solves them? Or at least point me to some material that discusses them? I would be very interested. I have programmed in OOP for over a decade and just picked up functional programming in the last 2 years so do not have enough experience with FP to make any definite conclusions yet (although I am really liking what I see so far). Thanks.
I wish I had a good handle on that. There is much to functional programming I need to digest yet. But the hints are there (I noticed this in the timeline of when these various concepts were played with and integrated into standard specs), and in my own programming, I have learned to use the hybrid effectively. What follows is purely my current opinion, and it should include some healthy ignorance. I am but an egg.
If I were to put a finger on it, I would say that the reason OOP arose was that there was general recognition of the need for a rich type system. I got a hint of this in the first couple parts of SICP. Haskell incorporates a kind of type system that is, itself, pretty interesting. CLOS/Moose makes C++ and Java feel antiquated.
In my experience, a lot of us who learned OOP via C++, Pascal, Java, Ada, and the like generally missed the point: type systems. We tend to treat classes like collections of the language primitives as opposed to combinations of data structures. Specifically, we make do with primitive data types and neglect to constrain them as a type. And "everything must be an object" is an extremist mantra with little value; given that your process image must have an entry point, purely a code-flow issue, was there really a point to the illusion of wrapping the entrypoint main() function in a class (I am looking at you, Java)?
Here is an OOP type example I see in the wild: an ID made of four characters, a dash, and three digits tends to be stored as a plain string. This is incorrect; that string is not like other strings, and leaving it bare is blind faith that somebody is not going to use it incorrectly. Wrap it in a type that specifically checks that the ID fits the pattern. It is easy to fall into this kind of mistake in the database management world, too, where an ORM pulls in a VARCHAR2 and does not translate values of that column into an appropriate type in the system, perhaps a type with a restricted domain.
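To make that concrete, here is a minimal sketch in Java. The `PartId` name and the exact four-letters-dash-three-digits pattern are my own illustration of the idea above, not any real system's API:

```java
import java.util.regex.Pattern;

// Hypothetical wrapper for an ID of the form "ABCD-123": four letters,
// a dash, three digits. The pattern is validated once, at construction,
// so the rest of the system can trust any PartId it is handed.
final class PartId {
    private static final Pattern FORMAT = Pattern.compile("[A-Za-z]{4}-[0-9]{3}");
    private final String value;

    PartId(String value) {
        if (value == null || !FORMAT.matcher(value).matches()) {
            throw new IllegalArgumentException("not a valid part ID: " + value);
        }
        this.value = value;
    }

    String asString() {
        return value;
    }

    public static void main(String[] args) {
        PartId ok = new PartId("ABCD-123");
        System.out.println(ok.asString());
        try {
            new PartId("AB-12345");  // wrong shape: rejected at the boundary
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

A method that takes a `PartId` instead of a `String` can no longer be handed a customer name or a file path by accident; the compiler enforces the distinction that the bare string left to faith.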
Another example: I have learned to always question whether my method really needs to accept an "int". This is a smell. Could my value be negative? What is the real min/max range of the value? Could the value have a magic value, such as "Number Not Given" (or NULL)?
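The same discipline applied to a number might look like the sketch below (again my own illustration; `Percentage` is a made-up type, not from any library): give the value a type that carries its real range, and model "Number Not Given" explicitly rather than with a magic value.

```java
import java.util.Optional;

// Hypothetical value type: a percentage whose real domain is [0, 100].
final class Percentage {
    private final int value;

    private Percentage(int value) {
        this.value = value;
    }

    // The range check lives in one place, at construction.
    static Percentage of(int value) {
        if (value < 0 || value > 100) {
            throw new IllegalArgumentException("percentage out of range: " + value);
        }
        return new Percentage(value);
    }

    // "Number Not Given" becomes an explicit Optional, not a magic -1 or NULL.
    static Optional<Percentage> fromNullable(Integer value) {
        return value == null ? Optional.empty() : Optional.of(of(value));
    }

    int asInt() {
        return value;
    }

    public static void main(String[] args) {
        System.out.println(Percentage.of(42).asInt());
        System.out.println(Percentage.fromNullable(null).isPresent());
    }
}
```

Now a signature like `void applyDiscount(Percentage p)` answers all three questions (negative? range? magic value?) by construction, instead of leaving them to every caller.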
Functional programming helps us with code flow. OOP helps us carefully classify the data.
Most mainstream OO languages with a type system to speak of actually get in the way of correctly classifying data by confusing the separate issues of reusing implementation artefacts (aka subclassing) and classifying data into a hierarchy of concepts (aka subtyping). The only widely used OO language (for sufficiently narrow values of wide and wide values of OO) to get that right used to be Objective Caml, and recently its stepchildren F# and Scala.
So it is actually FP that helps you with the classification.
This is a very interesting point and should be highlighted. You said implementation artifacts (especially in reference to reducing code duplication), and for clarity, I think you are referring to the definition of operators on data (class methods, friend methods, and so on). I agree with you that subclassing (for the purpose of reusing behavior), traits (for adding behavior), and the like can be confused with classification to such an extent that modern designs tend to depart from type systems and be used for mere code organization.
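The JDK itself contains a well-known instance of this conflation: `java.util.Stack` subclasses `Vector` to reuse its storage, so every `Vector` operation leaks into the stack's public interface. A short demonstration:

```java
import java.util.Stack;

// java.util.Stack extends Vector for implementation reuse, which means
// the "stack" classification inherits operations that break the stack
// discipline -- subclassing and subtyping pulling in opposite directions.
class StackLeak {
    public static void main(String[] args) {
        Stack<String> s = new Stack<>();
        s.push("bottom");
        s.push("top");
        // Inherited from Vector: perfectly legal, yet no stack should allow it.
        s.insertElementAt("middle?", 1);
        System.out.println(s);  // prints [bottom, middle?, top]
    }
}
```

A language that separates the two concerns lets `Stack` reuse `Vector`'s code via composition while exposing only the push/pop classification.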
"was there really a point to the illusion of wrapping the entrypoint main() function in a class (I am looking at you, Java)?"
Far be it from me to defend Java (I hate the damn thing), but: main is just a function in a class. The class is the entry point, as specified on the command line; main is just what the JVM launcher looks for, by convention. You could have a "main" in each class, but only the one in the specified class will be the entry point.
The way of the theorist is to tell any non-theorist that the non-theorist is wrong, then leave without any explanation. Or, simply hand-wave the explanation away, claiming it as "too complex" to fully understand without years of rigorous training.
Of course I jest. :)
You have a point, but it's not all as one-sided as you say. If you never get your hands grubby with the practical, you won't have practical limitations and insights to help you on your way.
I remember a long time ago I was watching an online forum on riddles. One of the 'riddles' was: how many cigarettes (read: finite regular cylinders) can you arrange such that every one is touching every other one? I watched the thread go for three days as various theorists claimed a maximum number according to mathematical theorem A or B or C. It started out with a max number of 3 and took three days of impassioned debate to work up to 6, with theories floating in and out of favour - and each one claiming to be 'the absolute ceiling limit'.
I then took a matchbox down the pub and asked the same task of my drunk friends (telling them to ignore the bulbous heads/square cross-section). Within five minutes, all of them, even the utterly non-scientific ones, had found a solution for six. Most of them found it within 1-2 minutes. Drunk, untrained, undisciplined folks that actually physically played with the items beat out impassioned, educated enthusiasts.
It highlighted the issue for me that theory is all well and good, but it's not useful in a vacuum. Theory and practice need each other to be efficient.
Similarly I went back to visit my old university. I saw a guy there 12 years into his PhD... and still clueless about the practicalities of what he was recording. He'd spent his whole career in the theory of it and had no idea of the realities of recording his subject, something that 3-6 months in industry would have given him in spades.
So no, it's not as clear cut as 'theory done cleanly in practice would win', as the real world is grubbier and has more edge cases. Theory and practice need each other for efficiency. Thinking of it another way - you don't need to test your backups... in theory... :)
Incidentally, on the cigarette question, I've seen a solution for 7, but I don't know of any higher. The guy who showed the solution for 7 was also the first one to say 'this is a solution, but there may be a higher ceiling out there'.
On the one hand, playing like that can give you insight. On the other hand, it can make you lose lots of time chasing an almost solution. Are you sure that arrangement of 6 was really a solution and did not require some minute bending or deformation?
"re you sure that arrangement of 6 was really a solution and did not require some minute bending or deformation?"
It's a good question, but then, it is also a perfect example of why playing around with the real object is sometimes better than applying a theoretical analysis. The question was about cigarettes, and cigarettes do bend / deform. That means that any analysis will need to take this characteristic into account. The problem is, we don't necessarily know all of the factors that need to be modelled to do a correct analysis, which is why empirical testing is still an important technique for problem solving.
Absolutely - which is why I say they need each other. They need to be merged intelligently, and purists of either camp should be taken with a grain of salt.
Cue one highly ranked HN news item with no news, nothing about startups, nothing specific to hacking, no real information, and nothing that gratifies one's intellectual curiosity.
Yup - I'm "that guy" and today I'm a grumpy old man. This item is just noise.
ADDED IN EDIT: And yes, I expected to be downvoted, because anything that smacks of criticism of the community gets downvoted. People don't like to be told that they're doing something wrong, especially people who are so damned sure they're right, or clever, or talented, or hardworking.
Yes, the very nature of an entrepreneur is to follow things through in the face of adversity, but when you're criticised you need to stop and see if there's some truth - to see if there's something you should learn.
Before you downvote this comment, ask yourself why. Does this submission as a whole really add value to the site? If it does, then my criticism doesn't, and rightly should be downvoted.
But if this item adds no real value to the site then my criticism is valid and should at least make you pause before looking for an arrow to click.
And if you do downvote, then a reasoned response would add value. But maybe you really are just looking for cheap entertainment. Replying - adding value to the site - would be, you know, like, work.
That was an especially fine job of grumpy old man. Kudos and congratulations.
If you haven't noticed, the site is going to hell in a handbasket. Looking at the front page, we have more quadracopter porn (love it!), three stories along the lines of "guess who's quit/joining X" story, more Color trashing (can we say who cares?), a story that promises to teach me to accomplish anything, and, yet again, "Is C++ dead?"
I say if C++ has jumped the shark, it is only a shark iterator, and the sequence is infinite.
That's not even mentioning the new jokey/snide-remark level in commenting...
Sorry. Didn't mean to horn in on your grumpy old guy glory.
According to the official FAQ (http://ycombinator.com/newsguidelines.html) on-topic is defined as: "Anything that good hackers would find interesting. That includes more than hacking and startups. If you had to reduce it to a sentence, the answer might be: anything that gratifies one's intellectual curiosity."
Now I personally am a hacker (we'll leave the good for others to argue) and I find this interesting, if not rather humorous. This post currently has 51 points, meaning at least 50 other people also considered it interesting. If that's not strong evidence that this link belongs here please explain what is.
Do you really think it "gratifies one's intellectual curiosity"?
Honestly??
I think it's a cheap joke with no intellectual value. Some people find it funny, and they will upvote it. I'll happily believe that 50, 100, possibly 1000 people here find it funny to think about how to insult computer scientists.
After you've read it - what have you really learned?
Upvotes do not of themselves mean an item meets the guidelines.
You are, of course, correct that upvotes do not "define" what belongs in this community. However if you will not accept that evidence, look at the other comments in this thread.
I see people having fun and satisfying their creative curiosity by making and building upon jokes inspired by this link. Links that engage both intellect and curiosity are quite often posted to HN.
To answer your question on what I have really learned... well, that's not a question you want to start asking. I can point out quite a few articles that don't belong on HN when held against those criteria. Take for example almost everything related to the NYT. I usually don't click on those and just wait for the blog post that explains the same thing with more information and less filler.
As for another argument that you will find quite easy to strike down: this is not somewhere I go to do market research, it is somewhere I go to entertain myself and find things, preferably about hacking, that I may read. This link does a brilliant job of being related to hacking while also being entertaining.
> this is ... somewhere where
> I go to entertain myself
And there's the difference between my purpose for being here and yours. I entertain myself in other ways on other sites. I come here for information about hacking, startups, and things that satisfy my intellectual curiosity.
This submission does none of those. I don't think it's even especially funny or clever, but then, I've seen it all before.
Thank you for your reply - it has taught me something. There was a time when I would've upvoted you to show that you added value, but these days votes seem to be used mostly to express agreement or disagreement, so I'm disinclined. But I value your comment - thanks.
Well, it's not "my" community, and never has been - I've just hung around and tried to add value in accordance with the guidelines. It's been that way since PG suggested I submit something. I'm just continuing to see how much value there is, how much I can add, and gauging when it's time to move on ...
After you've read it - what have you really learned?
I've not come across the "Theorists Favor Sophistication, Experimentalists Favor Simplicity" idiom before, and thinking about those people in my life that exhibit one or the other, I see that there's a kernel of truth there.
Though the article is intended to be humourous, I've added a little bit to my understanding of the human condition by reading it. And I got a smile or two while doing so.
I think this post is timely. My brother is presently working and studying at a university (The University of Cincinnati) where the computer science department has been axed in its entirety because of politics.
They still, however, have the more applied computer engineering department. I think this post speaks volumes about the attitudes of faculty at Purdue, both in CS and outside. There's a lot this post doesn't explore because of the format, but I'm glad it showed up. Maybe you're removed from academia, or maybe the academic environment in the UK doesn't have these sorts of issues separating applied and theoretical CS, let alone applications-focused universities killing CS entirely.
In short, this is not fluff, it succinctly illustrates a significant divide in the field.
Well, I was just hoping to make people stop, think, and not contribute to this content-free crap. I used to come to HN and read pretty much everything. Now there's less and less that's got actual content, and more and more "entertaining" articles, or items of purely general interest with no focus. Some hard-of-thinking wannabe hitman, how to insult computer scientists, completely obvious comments about outsourcing, letters from Fukushima workers: any items of real value have to compete with all of this for space.
Even the really good stuff feels familiar. I wonder if I've just seen all there is to see and everything's looking familiar, in the same way that as you get older, everyone reminds you of someone else.
Maybe it's time to move on. Maybe this is a time to get the sense of the community and use that to help guide my decision. They do say - listen to your users. My contributions here on HN mean that you are, in a sense, my users.
No, I really mean it. Some noise (e.g. in the form of this meta-discussion started by RiderOfGiraffes) is OK if it makes people think twice before submitting more noise.
Just curious, are you upvoting stuff on the "new" page? I see a lot of stuff that I think is interesting get few if any votes, and then slides off to obscurity. As you noted, this is a community, and so all have to play their part. If you and a few others upvote interesting stuff, it probably finds its way to the front page to fend for itself.
Yes, regularly. I use the "New" page as my main landing, before visiting the "Front" page. Every time I visit I upvote everything that I think is of value.
Well, I was just hoping to make people stop, think, and not contribute to this content-free crap. I used to come to HN and read pretty much everything.
You're probably being an ass-hat. And I probably like you for it. (It's one of my oldest son's finest qualities.)
The insult-backfiring section reminds me of some interesting experiences as an AI researcher, with a slight secret penchant (as many of us have) for robot domination. When a humanities or social-science researcher starts talking about "technological determinism", "machinic tyranny", "sapping of human agency by machine", the "instrumental society", and similar, the AI researcher sits, trying hard to suppress a smile, thinking, "yes, yes, PLEASE do go on!".
When I worked in CS research, the surefire way of making the Prof I worked for incandescent with rage was for someone from industry to start a comment with "In the real world...".
I once got, from a finance guy, in all seriousness: "So, you're the CTO...I guess you must have a lot of gadgets at home...?"
My response was along the lines of "Actually, when I was in grad school I spent most of my time on theoretical computer science and artificial intelligence." A little obnoxious, sure, but he deserved it.
It means MS Word screwed you over again, this time by switching from A4 to Letter while you weren't looking. Damn Americans and their Imperialist defaults! :D
For some context (odd that it's not included in the essay itself), this is an essay by Douglas Comer from January 2009 -- according to the HTTP header, at least, which I'm inclined to trust given the system. If that name's familiar but you're not sure why, he wrote Internetworking with TCP/IP.
I know it's older than this, since it was up when I was at Purdue almost 10 years ago.
As a side note, Comer is the type that, once he's decided you aren't worth talking to, will do whatever is necessary to ensure conversations with you are short, including applying the contents of this essay in spectacular fashion.
I prefer Spider Robinson's advice. No one who you totally and completely ignore will keep pestering you for long. Some people take insults as validation, or are like kids who just want attention, but no one will keep bugging you if you don't even recognize their existence.
They almost feel younger, somehow, less color possibly? Interesting comparison. You can tell it was a smaller, closer community then. I enjoyed it then and I still do, likely because of its evolution.
Tried a shot at both simultaneously, actually: saying his theory is oversimplified (theorist) and that his insult system is missing a critical element (the discrete approach is invalid -- experimental failure).
Anyone have interesting ideas for creating the simplest possible insult that would work for both properties? (Yeah, OK, I'm no theorist.)
> You can try to attack both parts simultaneously:
> I note that the systems aspect of this project seems quite complex. Do you think the cause of the convoluted implementation can be attributed to the more-or-less "simplistic" mathematical analysis you used?
Purdue's CS Department is legendary in its archaic theoreticalness (if there's such a word, but it would be a nice insult, I think). So much so, it seems, that it produces a lot of funny commentary. For another popular example from proggit: http://www.reddit.com/r/programming/comments/gcgv8/all_hope_...
It seems that the author is a theorist, not a practitioner, as his theorist insults ("Despite all the equations, it seems to me that your work didn't require any real mathematical sophistication. Did I miss something?") are much better than his experimentalist insults ("Wasn't all this done years ago at Xerox PARC?")
If you want to insult an experimentalist, there's a much more cutting one: "Didn't Microsoft do something similar when they were working on CORBA?"
Didn't Microsoft do something similar when they were working on CORBA?
To which you'd get a puzzled look while they condescendingly informed you that Microsoft never worked on CORBA - that was the OMG, and Microsoft had COM/DCOM.
Xerox PARC is good, because they probably did work on whatever is being discussed.
Other options include "Didn't Engelbart include that in the Mother of All Demos?" and "Didn't Taligent have that running?"
If I write code that needs to be performant, and I do it by using an efficient algorithm, analyzing its complexity, and using Big-O notation, am I automatically a theorist?
I just finished a class with Prof. Comer. He is hands down the best instructor I've ever had. He has a very down to earth method of teaching & personality for that matter.
My boyfriend just graduated with a PhD in computer science, and his mother sent a congratulatory email to him (with me in CC) saying "Congratulations on your PhD in IT!"
I can see that your work contains much that is new and good; however, the new is not good and the good not new.
A paper on this work would be illuminating if ignited.
Congratulations: You have now once again successfully solved one of the one star exercises in Knuth's TACP.
In this work, is your coauthor a professor, post-doc, Ph.D. student, Master's student, undergraduate, or a visitor from a middle school?
So, your new programming language is Turing Machine equivalent. Okay. Beyond that its main feature is that it 'encourages a programming style' you prefer. Just why is that 'style' more significant than, say, style in the rag trade?
Your work has used some theorems from math but has it added any theorems to math?
> Congratulations: You have now once again successfully solved one of the one star exercises in Knuth's TACP.
My CS professor has published papers that are solutions to exercises in The Art of Computer Programming. It's sort of a fact of life for computer scientists that whatever you're doing has probably already been done by Knuth.
Knuth has a large range of problems in his text. He scales them all by difficulty on a log scale from 0 to 50. Things rated 00-05 might be basic facts or arithmetic (what is 6!?).
A set of problems in the 20s would be a good chunk of work, each taking about 20-30 minutes. By the time you get to the 40s, you're dealing with problems that you could write a paper on. In the introduction, he offers proving Fermat's Last Theorem as an example of a 50.
"Interesting, I'm from industry and this appears very applicable to some of the work we're doing right now."