MIT-related work that has transformed computer science (csail.mit.edu)
123 points by kp25 on May 18, 2014 | 52 comments



I think many of these claims of "MIT's contributions transforming Computer Science" are unjustified. In particular, wasn't Bob Kahn already out of MIT when he and Vint Cerf created TCP/IP? What was MIT's role here? Just that they paid Bob Kahn's salary in the distant past?

On the other hand, when I think of MIT, the most important contribution to Computer Science that comes to my mind is Seth Gilbert and Nancy Lynch's proof of Brewer's Conjecture, famously known as the CAP theorem: a distributed system cannot simultaneously guarantee consistency, availability, and partition tolerance. This proof is so profound, so important to Computer Science and to the way we build large-scale computer systems today. And, duh, it has been left out.

This looks like pure marketing to me. And in some sense misleading.


Yeah, and if I remember correctly, RMS's work on GNU was not affiliated with MIT and he did it on top of his actual responsibilities in the AI Lab. It's like saying your high school is a great place to be educated because you became famous down the road.


The World Wide Web was INVENTED at CERN in Europe, I thought.

http://en.wikipedia.org/wiki/History_of_the_World_Wide_Web

After that, Tim Berners-Lee went to MIT and founded the W3C.


You mean the botched SGML clone, sent over RPC, clicked together in NeXTSTEP?


Dear downvoter: I am happy to counter each and every argument you may have, in case you should have any.

Otherwise forget the PR and dig deeper than usual.

PS: Actually, this MIT piece is like the CERN web pieces: run by the PR departments of said institutions.

Such accomplishments are rarely if ever obtained by a single lab, or institution for that matter. They are rather embodiments of the spirit of the times, the effort of many of a given era implemented with the technology then available (see Otlet). The only question is, who gets to win the (often propaganda) fight for the claim and who gets to write history.


You're most likely being downvoted for tone and for lack of any support for your argument. HTML was based on SGMLguid... with one exception.

The only radical change was the addition of the all-important anchor (<a>) link, without which the WWW wouldn't have taken off. [1]

Minor detail.

[1] http://infomesh.net/html/history/early/


A prewar networked knowledge base, predating Vannevar Bush:

https://archive.org/details/paulotlet

or search YouTube for Paul Otlet envisioning a web in 1934.

A biased, incomplete list of link implementations pre-HTML:

https://en.wikipedia.org/wiki/Hypertext#Implementations

The reason the WWW took off is very well described in:

High Stakes, No Prisoners: A Winner's Tale of Greed and Glory in the Internet Wars.

and it has very little, if anything, to do with the <a> incarnation, which, as you say, is a minor detail.


Most of this work is publicly funded. The whole VC industry is built on it. It should return more of its profits to the public, just as it would to early-stage investors.


If you're interested in this, I highly recommend work by the economist Mariana Mazzucato. In particular, her book "The Entrepreneurial State – Debunking Public vs. Private Sector Myths." http://www.amazon.com/Entrepreneurial-State-Debunking-Privat...


That's what public funding of research work is for.

Basically, there's a continuum ranging from public research (i.e., high-risk, high-impact) to industrial research and startups (medium risk, medium-to-high impact) to common adoption (low risk, low impact).


The industries have returned value to the public in the form of taxes, jobs, and improved quality of life. It's already a two-way street.


OK, I'll bite. Next time you invest in a startup, or if you ever were to, you agree to forgo equity. Instead, take a cut of the sales tax, a job, and the opportunity to purchase the product. Fair deal?


That's basically what I do as a taxpayer anyway... especially when it comes to investing in basic research. I can live with that.


The question is, would an angel investor take that deal? The answer is obvious: no way.

An analogy would be if we all belonged to a VC fund, and the fund manager took a deal where instead of equity they take a cut of future sales tax. Imagine that. No way you'd consider that a fair deal, even if the startup produced some nice products and job opportunities.

You'd want equity if it was your personal investment. Why should it be any different for taxpayer funds?


Individuals are generally selfish and don't think very well about the collective good of society, but ever since the Neolithic revolution we've been pretty good about collective investment and risk.

Asking for direct obvious returns from research funds means much useful research won't get done, and researchers will focus on more incremental surer bets (like industrial research or startups). We'd just completely kill our society's competitiveness and might as well start learning Chinese to better welcome our Han overlords.


The entire U.S. system of public high tech investment is mostly done under the pretext of military spending. The military has specific objectives and is looking for clear ROI, but that doesn't impair their ability to invest in core science. They just take a longer term view.

Licensing core tech (instead of giving it away) is not the same as asking for direct returns. It's just giving the taxpayers their fair share of the eventual returns on their investment, instead of letting VCs collect it all.


The whole space program was about developing ICBMs. WWII led to huge technological advances (computers, for one), etc.

I'm not sure what the mix between military-oriented and socially oriented spending is today. The feds obviously give lots of money to lots of research universities, with medically oriented research with no obvious military applications coming up at the top of the list.


The title is misleading. It makes you think all this research happened at MIT.


For example: #12, The PC (1973), is Butler Lampson's work on the Alto while at PARC. MIT (the university) had nothing to do with it. Also, #17, TCP/IP (1977): Bob Kahn was at MIT for two years or so, a decade before working on TCP/IP.

Including such things takes attention away from "pure" MIT inventions, e.g., RSA (all of that work was done by Rivest, Shamir, and Adleman while at MIT).


That's a fair point, and we edited the title accordingly.


My favorite is the GUI. While many credit Xerox with the GUI, most of the ideas came out of the quite amazing Sketchpad by Ivan Sutherland. As an MIT student, I find this list makes for awesome reading!

Here is the relevant snippet:

Nearly 50 years before the iPad, an MIT PhD student had already come up with the idea of directly interfacing with a computer screen. Ivan Sutherland’s “Sketchpad” allowed users to draw geometric shapes with a touch-pen, pioneering the practice of “computer-assisted drafting” that has proven vital for architects, planners, and now even toddlers.


The work demonstrated by Engelbart in his "Mother of All Demos" was actually done at SRI, which then inspired Xerox.

When Alan Kay started his PhD at the University of Utah (where Sutherland was a professor), he was given Sutherland's dissertation to start off with. That eventually led to his groundbreaking Dynabook concept.

Anyways, back then, everyone knew everyone.


I believe that SICP should also be counted as one.


"50 ways that MIT has transformed computer science"

This is really misleading and makes MIT look bad. It's just spin.

MIT has done plenty of great work; there is no need to try to take credit for things that they really can't lay claim to.

Many examples elsewhere in the comments, but I'll throw in that Ethernet was not invented at MIT. I know the main title now says "MIT-related", but the article is not so modest.


#24 caught my attention with the poem/song about the algorithm behind the Spanning Tree Protocol: https://www.youtube.com/watch?v=iE_AbM8ZykI

Somewhere in between music school and programming for a living, I had the superficially nonsensical epiphany that algorithms and data structures were essentially music... just another language for expressing ideas about combinations and time. It's interesting to see this other perspective... an algorithm expressed as a song for the purpose of clearly communicating how it works.
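
For the curious, here's a minimal sketch of the graph-theory core that the song describes: elect the lowest-ID bridge as root, then keep only each bridge's least-cost path toward it. The real protocol computes this distributively by exchanging BPDUs; the Python below is centralized, and the function and variable names are illustrative, not the protocol's.

    import heapq

    def spanning_tree(links, bridges):
        """links: {(a, b): cost}; bridges: iterable of bridge IDs."""
        root = min(bridges)              # lowest bridge ID wins the election
        # Dijkstra from the root; the predecessor map *is* the tree.
        dist, parent = {root: 0}, {}
        frontier, visited = [(0, root)], set()
        while frontier:
            d, u = heapq.heappop(frontier)
            if u in visited:
                continue
            visited.add(u)
            for (a, b), cost in links.items():
                if u in (a, b):
                    v = b if u == a else a
                    if d + cost < dist.get(v, float("inf")):
                        dist[v] = d + cost
                        parent[v] = u
                        heapq.heappush(frontier, (d + cost, v))
        return root, parent              # edges not in the tree get blocked

    # On a triangle of bridges, the costlier redundant link is pruned:
    root, tree = spanning_tree({(1, 2): 1, (2, 3): 1, (1, 3): 5}, [1, 2, 3])
    # root == 1, tree == {2: 1, 3: 2}; the (1, 3) port blocks, breaking the loop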

The interview with Radia Perlman is a good read too: http://www.theatlantic.com/technology/archive/2014/03/radia-...


The title should be "50 ways that MIT has transformed the computer industry". Not many of them are about "computer science". To say that MIT has transformed computer science, it should include someone like Leslie Lamport.


So, Lisp machines get a mention, but Lisp itself does not.

Who writes these things?


Well, Lisp itself is a Stanford invention. Scheme is an MIT invention. I agree that Scheme deserves a mention, though; even though it isn't widely popular, it certainly transformed how people thought about education and things like garbage collection.


Surely not.

'Lisp was invented by John McCarthy in 1958 while he was at the Massachusetts Institute of Technology (MIT). '



"While a student at Harvard Business School, Bricklin co-developed VisiCalc in 1979, making it the first electronic spreadsheet readily available for home and office use."


Also because spelling is apparently not one of MIT's 50 achievments...


And not mine either :-)


Some PR?

When I think of MIT, I think of Aaron Swartz.


Because inaction in response to federal charges being brought against an unconnected individual outweighs more than 100 years of academic contribution.


It should also be noted that this was inaction by the administrators, not necessarily the students, researchers, or faculty.


That isn't how it works. You don't get to bank being not-immoral so you can use it later.


Even if it did, MIT would likely still be working off the morality debt incurred during its involvement with various wars.

(along with every other famous tech school)

http://web.mit.edu/physics/about/history/1940-1945.html


I think this actually played into the Swartz case: possibly MIT didn't want to awaken a moral/ethical discussion around technology, which is exactly what Aaron was trying to do. His message wasn't dangerous, but its conclusion means that the army of engineers who work on MIT's weapons programs could disrupt a huge cash cow.


Hi TT. This may help you: http://lmgtfy.com/?q=aaron+swartz+mit


Clearly one of them is not writing clickbait headlines:

50? Ain't nobody got time for that...


Is there a school that's done more for computer science than MIT? I can think of a few contenders, and obviously this is largely subjective, but when I think of computer science in the U.S., I think of MIT, Stanford, and Cal Poly, in that order.


...you think Cal-Poly over Carnegie Mellon and Berkeley? Really?


probably went to cal-poly lol.


I'm not familiar with the lol campus.


:shrug: like I said, all subjective.


I'm curious what contributions you have in mind with regard to Cal Poly (also, Pomona or SLO?).


This. I agree that there is an element of subjectivity to it (Is Cambridge better than Oxford?), but I think it is simply impossible to successfully defend the claim that (as an example) The University of Phoenix has made more contributions to computer science than, say, Berkeley.

One way (not the only way, but a reasonable way) to measure it is by the number of Turing Award laureates [0]. Cal Poly isn't even on the list. (Neither is it on the list of Nobel Prize winners, for what it's worth.)


Do you happen to have lists of the colleges with the most Turing or Nobel laureates?


Yes, and I actually referenced both when I was writing that comment. I intended to cite them (note the [0]!), but I forgot.

http://en.wikipedia.org/wiki/List_of_Turing_Award_laureates_...

http://en.wikipedia.org/wiki/List_of_Nobel_laureates_by_univ...


Perhaps you meant Caltech?


It's interesting that they include the FSF, the Internet Archive, Creative Commons, and OpenCourseWare among their "computer science" contributions. These are more political than scientific, although some of them did end up contributing massively to the development of computer science.

Now let's talk about some ways in which MIT hindered the development of computer science, including political hindrances.

1. Complicity in the persecution of Aaron Swartz (2011)

2. Anyone else?



