I loved the ending:
"He has never been tempted to buy a television, but was persuaded to watch The Big Bang Theory last year, and said he wasn't impressed."
The show is, despite what some think, not a celebration of science but a ridicule of it, and it takes the stereotypes to new levels.
Edit: I also watch the show, but as the article that GuiA links to mentions, the show wants you to laugh at the nerds, not at the actual jokes they make.
You could argue it's a Star Trek vs. Star Wars kind of thing, but I personally liked Community far more, because it really captured the fun in being nerdy and didn't put nerds in one big collective box.
PS. I don't dress up, play fantasy card games, go to cosplay events, read comics, etc. I'm an engineering student, I get drunk on weekends, I code in .NET and Python, I love coffee, wine, cars, motorcycles, etc. Not anywhere near the stereotypical Hollywood nerd.
I haven't watched Community, but that is a long-form version of what I tell others (when asked) about why I don't watch The Big Bang Theory. I gave it a few episodes, and caught more when others watched it. It never clicked for me why I would be considered part of the target audience when you often don't laugh with the characters, but at them. Thanks for the link.
The purpose of TBBT is not to entertain nerds, it's to acclimatize everyone else to the new normal where nerds hold a form of status. In 2013 nerds can be very rich, their fields can be very influential, and their intelligence can be useful to normals.
Normals, however, don't understand how to relate to nerds in a post-high school environment. Their previous social training has been to simply ostracize and bully. TBBT teaches normals a new way of relating to nerds: tolerate them, make friends with them, turn a blind eye to their social awkwardness and weirdness, and trade your cultural capital for the nerd's intellectual and financial capital.
Nerds can be useful to normals. Nerds can be good friends if you understand how to handle and bracket their awkwardness. They can even be good boyfriends if you understand how to handle them--Leonard can benefit Penny with his intelligence, devotion, and wealth even if he isn't cool.
TBBT isn't aimed at nerds, it's a primetime mainstream TV show.
The author is right that the public is laughing at the nerds, not with them. However, TBBT still represents a sea change in the public representation of nerds--now nerds are not to be ostracized; rather, they are to be tolerated.
The word "nerd" doesn't mean what it meant 20 or 30 years ago. It's been appropriated by business sorts as a label so they can pass themselves off as creators of genuine stuff as part of their sales routine. Look at most of the nerds nowadays and ask whether these people are really "persons of above average intelligence who don't care about their appearance". Real nerds tend to use the aspie label nowadays.
Aspie is short for Asperger's, which is only a very mild form of autism; not that people calling themselves aspies necessarily have Asperger's syndrome, though some do. Many aspies have appropriated the term, however, because they know the business sorts calling themselves nerds will be repulsed by it, and won't want to steal it from real aspies anytime soon.
The nerds on TBBT really are real nerds. All of them. They are different flavors of nerd. Sheldon is the aspie pervert creep. Raj and Howard are creepy manchildren. Leonard is the low self-esteem self-conscious self-hating type.
>They can even be good boyfriends if you understand how to handle them--Leonard can benefit Penny with his intelligence, devotion, and wealth even if he isn't cool.
You seem to be implying there are many fewer nerd-couples than I've actually seen.
I remember when hacker news had a culture that was embarrassed about making contentless unproductive comments like this.
The article is about Peter Higgs and the current academic environment. The Hacker News thread contains a prolonged debate about the merits of some TV show.
I guess you failed to read the comment I replied to then? Comments sometimes spin off into subjects which are not relevant to the original topic. Welcome to the internet, lighten up.
I did read it.
My point is, you contributed nothing. And you are still contributing nothing. And I shouldn't have to point this out. You should be self conscious enough to know that your comment contained nothing.
Seems like you have a double standard here. You've posted more than one comment which adds nothing to the value (per your secret definition) of a given thread. You should at least be consistent if you're going to call other people out.
> Seems like you have a double standard here. (...) You should at least be consistent if you're going to call other people out.
Ad hominem tu quoque [1]. Suppose I'm 70 years old with emphysema due to smoking cigarettes decades ago. If I advise you "smoking is harmful, don't do it", does my hypocrisy invalidate the assertion that smoking is harmful? Would the fact that I, as an addict, still smoke a pack a day magically make smoking any less harmful?
> Welcome to the internet, lighten up.
I have to agree with TheZenPsycho. Yes, this is the internet. But no, this is not /b. This is a domain of the internet called Hacker News, which (at least in theory [2]) has certain standards. pg has even expressed worry about the dilution of HN [3].
> Ad hominem tu quoque [1]. Suppose I'm 70 years old with emphysema due to smoking cigarettes decades ago. If I advise you "smoking is harmful, don't do it", does my hypocrisy invalidate the assertion that smoking is harmful?
Thanks, but trotting out logical fallacies doesn't make you look smart and I already know that hypocrisy is not mutually exclusive with being correct. The point is that the parent seemed oh so saddened by the lack of constructive discourse here, yet he has at least two similar comments. So yes, ad hominem, that was the point.
> I already know that hypocrisy is not mutually exclusive with being correct
Then you already know that TheZenPsycho's hypocrisy was irrelevant? Not merely "not mutually exclusive with being correct", but basically a non sequitur. So tell me, why bother bringing it up? At this point, I really am curious what you were thinking.
> trotting out logical fallacies doesn't make you look smart
But fallacies are useful for stating my position succinctly and accurately. If you're familiar, then the chance you'll misunderstand me is negligible. If you're unfamiliar, then the chance you'll misunderstand me (upon reading wikipedia and my example) is at least reduced. The alternative is to concoct an explanation which leads us to speak past each other. I give you Example A:
> The point is that the parent seemed oh so saddened by the lack of constructive discourse here, yet he has at least two similar comments.
No, that's not your point. That's one of your supporting arguments. Your actual point is "Therefore, TheZenPsycho should not have criticized me." I agree, fallacies don't make me look any smarter. But I imagine they avoid making me look as dumb as someone who can't defend their own position properly.
Furthermore, my comment was a Refutation of the Central Point. As a fellow HN user, surely you recognize this from How to Disagree [1]. Your point was predicated solely upon double standards (and vitriol?). And then I stated not only that double standards are noise, but also why I actually disagree with you by referencing HN's guidelines.
> No, that's not your point. That's one of your supporting arguments. Your actual point is "Therefore, TheZenPsycho should not have criticized me."
No, that's not my point. My point is dealing with humorless gits like you and Zen is just wearisome. I should be "ashamed" for replying to a comment to say that I like a TV show? Yeesh, get over yourselves.
Seriously, it's just exhausting talking to people like you. I'm sure both of you are a blast to be around in real life (that was sarcasm in case it was lost on you).
Taking your point at face value invites more problems than the point I originally inferred. For example:
> dealing with you and Zen is just wearisome.
Maybe your judgement indicates more about you than us? Perhaps you feel that dealing with us is wearisome? It's not as if I have an html tag floating next to me that says <wearisome>. It's like that joke about how Bob says "all my ex's are crazy", yet the most obvious common thread in Bob's failed relationships is Bob himself. Or like when a rookie whines, and the members say "If you can't take the heat, get out of the street."
But given your point (verbatim: "dealing with humorless gits like you and Zen is just wearisome.") was to insult others (rather than offer constructive criticism, even if it's "get over yourself"), I can't think of any reason why you would reply to Zen to begin with other than to satisfy your own ego. I mean, really, what else in the world did you expect to have changed via insult?
> I'm sure both of you are a blast to be around in real life
Sarcasm duly noted. But I admit, I'm having issues with the scope of this discussion (see what I did there? ',:D ). First, we're playing by the rules of the internet. Next, we're playing by the rules of real life. I've noticed a pattern. Rather than reify to HN, you abstract away from HN. What's next, the rules of the cosmos? The 11 commandments decreed by the flying spaghetti monster himself?
I think you're in denial of where we are. Like, have you ever even skimmed the guidelines? Do you say "You must be a blast in real life" when people shush you in a library? Do you derive satisfaction from rickrolling facebook, and comment "we're on the web. street rulez"? On one hand, it probably really is no big deal. But on the other hand, the librarian is probably justified in moderating the library.
So... you pick out one joke you don't like and presume that no one else should like the show? Right... so you don't like it. Who cares? I certainly don't. I like it. There, I cancelled out your dislike.
No, I just don't think making references for their own sake is particularly clever.
But if you like that kind of joke and you like the show, that's great. It's not something I would judge you as a person for. Heck, I'd be happy to suggest Mr. Penumbra's 24 Hour Bookstore, if you haven't read it already.
I don't watch the show, personally, but it's irritating that everyone has to come down on things like this. People spend their time doing all kinds of crazy things. If it's fun and you enjoy it, then I think that's great. There are worse ways to spend your time.
Apparently, that's not good enough, though. Everyone has to critique how other people enjoy themselves.
I've read books on algorithm analysis and I just excitedly purchased a book on discrete mathematics through Haskell. Know what else I've read? The entire Sookie Stackhouse series. So friggin what?
My life was so much easier to live when I decided that it was okay to peacefully allow other people to enjoy things that I don't.
(oh yeah, I don't think the parent post was saying bad things; it just seemed like a decent place to say it :) )
It's public culture, and popular enough to be held representative of a large subgroup of people. There's no reason why people shouldn't give their honest opinion of it.
It's a network sitcom... seriously, a network sitcom... I know I don't care for the opinions of people who base their beliefs on pop culture. You should give it a try.
And the parent wasn't trying to deny anyone their right to free speech, yeesh. He is just annoyed by those who feel the need to tell others what they should and should not like.
Honestly, I'm surprised many here have even seen the show. A lot of the echo chamber prattle around here seems to be like that Onion article about "local man won't stop saying he doesn't watch TV" or something like that.
Right, that's a zinger response, but the main thing I'm driving at isn't that I want my life to be as easy as it possibly can be. I'm driving at the value of not being judgemental and condescending, especially if the topic is really arbitrary. I think the case could be made that in some circumstances judgement and criticism are perfectly valid and useful, but I don't see vitriolic condemnations of someone's sitcom preference as ever being particularly useful to anyone. Moreover, I think that it's actively socially counter-productive.
If you take actually funny shows that have a laugh track, such as Seinfeld, and remove the track, they're still funny. Idk, I just find BBT humor very bland; the punch line to a joke may be just someone screaming out a nerd reference for the sake of saying "Hey, we're nerds!"
One goes to conventions while wearing a cosplay. At the convention, one might also play a fantasy RPG or CCG/TCG, read comics, watch shows, or go to a rave. One will most likely partake of all this while really, really drunk.
Higgs isn't alone in remarking that he would not have had the time today to conduct the kind of work he did in the 1960s. Brian Greene has remarked that today’s grant-driven academia would not have allowed Einstein the luxury of a decade in which to develop his General Theory of Relativity.
Eric Weinstein wrote that, "We have spent the last decades inhibiting such socially marginal individuals or chasing them to drop out of our research enterprise and into startups and hedge funds. As a result our universities are increasingly populated by the over-vetted specialist to become the dreaded centers of excellence that infantilize and uniformize the promising minds of greatest agency." [1]
Weinstein's "deviants and delinquents" include "von Neumann [skirt chasing], Gamow [hard drinking], Shockley [bigoted], Watson [misogynistic], Einstein [childish], Curie [slutty], Smale [lazy], Oppenheimer [politically treacherous], Crick [incompetent], Ehrenfest [murderous], Lang [meddlesome], Teller [monstrous] and Grothendieck [mentally unstable]."
Higgs's observation suggests that the systematic stamping out of non-conscientious non-conformists is a byproduct of over-scheduling them. (Scheduling has to be considered independently of effort, ability and experience. [2]) Weinstein's deviants benefited from a historical period during which they had sufficient time and freedom to pursue their greatest work (von Neumann, who didn't need any time, is an exception).
But I imagine that time is a luxury even for the deviants and delinquents who find themselves displaced from academia into startups. So what is really needed is a startup culture that provides for years of uninterrupted development and focus. (Not to mention a hiring process that actively selects for displaced deviants, but that's another post.)
[2] Don't Spread Yourself Too Thin: The Impact of Task Juggling on Workers' Speed of Job Completion. Decio Coviello, Andrea Ichino, Nicola Persico. NBER Working Paper No. 16502. http://www.nber.org/papers/w16502
> So what is really needed is a startup culture that provides for years of uninterrupted development and focus.
Not just startups, but any worthwhile project really. From an interview with Rich Hickey:
"I started [Clojure] while on a sabbatical I had given myself. Not a break from work, but a break to work, as a completely free person. I gave myself leave to do whatever I thought was right, with no regard for what others might think, nor any motivation to profit. In releasing it, I had the normal expectations for a new language—that ten to a hundred people might use it. Maybe I would get some help or code contributions. It has taken off, and subsequently demanded far more time than the sabbatical I planned. So, I’m trying to recoup some of the investment I’ve made. Had it been a financially motivated undertaking, I’m sure Clojure would not exist, but I don’t regret having invested in it."
That's an interesting link. I always assumed that Rich Hickey came from academia, given that he made a successful programming language. Are there any biographical articles talking about his background?
He was a consultant. FYI: It's extremely rare for anyone from academia to make a commercially successful language. Their incentives are different. They want publishable novelty, and they don't care about building a useful suite of libraries and a vibrant community. Jones (Haskell), Syme (F#), and Odersky (Scala) are the only designers w/ academic backgrounds I can think of for marginally popular languages.
Also Ousterhout with Tcl/TK. Interestingly, he was an OS researcher not a language academic, and he made the language to solve problems he had, not to be academically interesting.
And note that among the major reasons that Haskell didn't get widely popular is that it is full of publishable novelty and lacked a useful suite of libraries* and vibrant community.
Until the past few years when Galois folks and bos did a lot of library work.
Robin Milner of ML is the most prominent example I know. One of the two Turing Award winners without a Ph.D. (the other being Floyd). He introduced polymorphic type inference to a full-blown programming language - this is the most jaw-dropping language feature I have seen to date:
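(Purely as an illustration of that feature, and not necessarily the example the commenter had in mind: here is a minimal sketch in OCaml, a descendant of Milner's ML. No type annotations are written anywhere; the compiler infers the most general polymorphic types on its own.)

    (* Illustrative sketch only: Hindley-Milner style inference in OCaml. *)
    let id x = x                  (* inferred: 'a -> 'a *)
    let compose f g x = f (g x)   (* inferred: ('a -> 'b) -> ('c -> 'a) -> 'c -> 'b *)
    let rec length = function     (* inferred: 'a list -> int *)
      | [] -> 0
      | _ :: rest -> 1 + length rest

    let () =
      (* The same polymorphic functions are reused at different types. *)
      Printf.printf "%d %s %d\n" (id 42) (id "hello") (length [1; 2; 3])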
C#, Python, PHP, JavaScript, Perl, and Ruby. Obj-C too, because the guy had a doctorate in biology. They are all very smart people.
As the other comments said, Tcl and Lua were designed by PhD guys. As were C++, Java and Erlang. I guess we've got a pretty good track record. I should use my PhD in programming languages to do something useful. ;-) Maybe later.
> So what is really needed is a startup culture that provides for years of uninterrupted development and focus.
Just a few wild ideas, what do you think about these alternatives:
1. A more socially-responsible business environment, so that startups aren't the only icon to pray to?
2. Different funding schemes for research on high-arcing topics?
3. Reducing the "everyone NEEDS to get a degree" hype?
A lot of the instant gratification behind phenomena like the startup culture is responsible for much of today's bullshit in academia (I saw a ton of it before I left). IMHO, the issues run a lot deeper than how academia is (disastrously) managed.
Largely a focus on maximizing short-term and near-term "productivity" - as we understand it here and now - rather than allowing the leisure for both basic sanity and for larger thoughts.
I was about to write a longer post, but this is so incredibly complicated, and I am so biased, that I changed my mind halfway through. I'm honestly too biased about some of its points to provide anything marginally better than flamebait if I start going into details, but I can offer some hints.
1. The funding for research is increasingly scarce, and the people who decide how big the grants are and to whom they should go are increasingly clueless. This means that ideas are funded less and less based on their scientific merit, and that more and more funds are handed over based on personal relations rather than on true selection.
This is an incredibly important problem. The retards running it are (like most retards) too arrogant to realize that we, as a race, suck at making predictions about what we can do. Both in terms of being too optimistic (remember thirty years ago when people thought we'd write these posts from a base on the Moon?) and in terms of missing obvious things, like electricity, which was intensely studied for several decades before it got any reasonable application.
There are many ways to avoid handing money over to crackpots. The focus should be on those, not on trying to judge the merit an idea might have in practice. We're too stupid to do that yet.
2. The bulk of the work is carried out by underpaid people, in bad working conditions. This makes smart people bitter, kills their productivity (in the short term) and their lives (in the long term). I distinctly remember my first workplace after I dropped out: it was a start-up where we were routinely pulling 60-hour weeks before we managed to hire enough people. This felt like a lot of free time to me. 60 hours is a free week for a PhD working on something worth a fuck. 120 hour stints are rare because of the physical strain they put on you, but you end up doing one every two months or so. Don't get me started on what that does to your life.
This may work if you have a bunch of Steve Jobs wannabes pretending they're scientists, researching how to sell things. Turns out, it's disastrous when trying to do real work. A lot of the jokes we made in the office revolved around that. They stopped being funny when we pondered how many drowsy PhDs were simply too tired to realize they could do <this thing> and bring us five years closer to treating cancer.
3. A lot of the undergraduate classes are becoming less and less fundamentals-focused, because academia is increasingly becoming the place where you're trained to work in the industry. Consequently, as people finish their senior years, they are increasingly less adept at research -- though, sadly, increasingly proud about their can-do attitude and so happy they have a Computer Science degree to prove that they can design websites, as if you actually had to fucking go to college for that.
Turns out these people are good enough for the wonderful world of startups. They're just as clueless about the real world as the people who pay them, their self-esteem is bad enough, and they are so utterly inept at learning anything mildly complex that they are easily sucked into the industry vortex. This isn't true only about CS and CompEng, it's happening in fields like EE and Mechanical Engineering, but the proportion of mission-critical applications in those fields is slightly higher, so most of the hipsters get cured after their first months on the job.
Turns out this system, while being almost satisfactory for the industry, breeds very bad researchers. Turns out it's hard to do research into web communication if 90% of your graduates have mad CSS skillz but still have trouble explaining what O(log n) means.
#1 and #3 above may be subjective, and there are a lot more points that I didn't want to bring in so as to maximize the chances of keeping the discussion civil. #2, on the other hand, while also being something I am quite subjective about (actually, it's the one I'm rather heartbroken about) is the one that escapes the scrutiny of people who are otherwise quick to cry out against the likes of Walmart.
Part of my dropping out of academia (see below for disclosure) is that I'm really not that smart. I'm good, but not scholarly material; I'm good at bringing together various technologies and finding atypical solutions to practical problems, but I suck at pounding on the same important issue for years and years at a time. I also suck at math. My brain isn't wired correctly for that. So, even if the working conditions hadn't been the way they were, I'd have left academia at some point, simply due to my sheer incompetence.
However, I had a lot of colleagues who were incredibly smart. People who could think about a problem in ways I could never have possibly thought of, and who were genuinely better at what they studied (electromagnetism) than I'd ever have been (I ended up doing research in EE by sheer chance anyway; I got into EE because I thought it would make me a better programmer -- which it did).
You wouldn't believe the way they changed in five or six years of working 10-12 hours a day, not only weekdays, but weekends too. I've heard more stories of wrecked relationships than I can think about, and too many of them are struggling with depression.
Most of them don't regret it at all. They're happy with what they discovered and genuinely feel they made the world a better place, but having been there, I honestly can't help wondering if it was worth it. I never bring it up with them, for obvious reasons, but it's very sad, especially since I can relate to it.
(Almost) full disclosure: I am an academia dropout. I dropped out during my MSc studies, despite being on a fairly good track (several articles with my name on them, in several important journals -- important as in "people who aren't scientists have heard of them" -- prior to earning my BSc).
I'm taking a bunch less coursework this semester, and then I find out I need 30 points of coursework to finish my MSc instead of 18. I would have been done with 18 this semester. I have to publish a thesis either way; it's a matter of whether they accept a non-Technion four-year degree as a four-year degree or a three-year degree. I hate these anal, bureaucratic requirements; I just want to concentrate on research and in-depth issues rather than continually taking courses!
I just want to add one more issue:
4) In both teaching/coursework and research, academia is extremely detail-obsessed, constantly burrowing away from larger, important questions towards small, easily-answerable ones. For a good example, look at how many different kinds of differential equations the average engineering major at a really good university is required to learn to solve, and then check how often they actually solve those equations in either original research or in their jobs. Sometimes, yes, they do, but enough to justify having two or three distinct courses in just differential equations versus, say, a single full course in fundamental statistics? Oh, but there are a thousand different approaches to statistics!
The result is a system that, seen from the outside, appears to be trying to actively avoid tackling truly major scientific problems. Sure, it can give you a seminar on the latest approach to convex optimization problems or pure subtyping theories, but ask us what problems these solve and we academics will look at you sort of blankly.
> In both teaching/coursework and research, academia is extremely detail-obsessed, constantly burrowing away from larger, important questions towards small, easily-answerable ones.
In the research circles, this is simply because smaller, easily-answerable questions are the ones that fit the short-term grant applications. It's very unfortunate, indeed.
In the teaching circles, there is a slightly related case, which manifests through this:
> For a good example, look at how many different kinds of differential equations the average engineering major at a really good university is required to learn to solve, and then check how often they actually solve those equations in either original research or in their jobs.
This is because the mathematics courses need to strike a compromise between teaching enough fundamentals to be meaningful as math courses, and enough "practical" applications to warrant their presence in an engineering curriculum.
I was also annoyed by taking math courses for three. Fucking. Semesters. But looking back on it, while I have forgotten much of the detail they taught me, the type of reasoning they taught me stuck with me, and that's ok. I'm not sure if there's a better way to teach that.
For #3, I am starting to think we should split off a portion of computer science and call it software engineering. People who are interested in the theory of algorithms, etc can do computer science. Those that want to learn how to build fault-tolerant, scalable systems can do that in software engineering. It is similar to how chemistry and chemical engineering are split.
Of course you still need to get a basic understanding of how computers work with a software engineering degree, but knowing how to mathematically prove that some algorithm is O(log n) is pretty pointless for most (not all) jobs in industry.
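(To illustrate the kind of claim being referred to: binary search halves the remaining range on every step, so on a sorted array of length n it finishes in at most roughly log2(n) steps -- that halving argument is essentially the whole O(log n) proof. A minimal OCaml sketch, for illustration only:)

    (* Illustrative only: binary search over a sorted int array.
       Each recursive call halves [hi - lo], so the recursion depth is
       bounded by log2 of the array length -- hence O(log n). *)
    let binary_search (arr : int array) (target : int) : int option =
      let rec go lo hi =
        if lo > hi then None
        else
          let mid = lo + (hi - lo) / 2 in
          if arr.(mid) = target then Some mid
          else if arr.(mid) < target then go (mid + 1) hi
          else go lo (mid - 1)
      in
      go 0 (Array.length arr - 1)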
One of the things that is not taught well across the board in computer science programs is how to actually write code that is readable and maintainable and how to work with a team using source control, bug trackers, etc. Just to be clear, I am sure there are programs that do this well, but based on my experience they are not the norm.
> Of course you still need to get a basic understanding of how computers work with a software engineering degree, but knowing how to mathematically prove that some algorithm is O(log n) is pretty pointless for most (not all) jobs in industry.
I agree, but IMHO most jobs in the industry that don't require you to prove that an algorithm is O(log n) shouldn't need a degree at all.
In the area of the world where I live, if you want to be an electrician, you can do that after you finish high school. You have to take a course and get a certificate for it as a legal requirement (which is true of any profession where you can get people killed) but it's pretty straightforward, and the course only involves pretty basic stuff.
And don't look down on electricians. My degree is in EE and when I need some work done on the installation in my house, I call an electrician. I could do what they do, but sloppily and with far more dangerous results.
No, you need a degree for most CS jobs. CS is too complicated. You need to know how the computer works on a theoretical level which requires a degree.
The CS equivalent to an electrician would be people who do basic IT support (i.e. tell someone to reboot their Windows machine) and easy programming like creating a blog.
Do you? Half of my colleagues who do web development would be utterly unable to tell you what a TLB or virtual memory is, probably don't remember Ohm's law, and while they would probably be able to write a sorting function on their own, chances are they'd fail even the most basic exam on data structures and algorithms.
And they can do their job just fine. This isn't 1995, when writing dynamic websites was a pioneer's job. What they do is enough of a commodity that it can be outsourced to college freshmen at the end of the world. People don't need a degree to do that now, just like they didn't need one back then, only for different reasons.
Also, no, an electrician's work is far more complex than IT support. For one thing, the chances of dying from doing it improperly are quite disproportionate, and the amount of theoretical knowledge you need in order to be an electrician is not negligible at all. Your average hipster who's proud of the amazingly cool things he's hacking on his Arduino knows a lot less about it than your average electrician, even if the electrician isn't so obnoxious about it.
The software industry does not revolve around web development. Yes it is a large percentage but it is not the only type of software, and many web development jobs require understanding data structures and algorithms. As always, it depends.
I don't see how this contradicts what I said. There are tons of jobs in the software industry that don't require anything so advanced that you'd need a degree for it. The JS hipsterisms are a prime example, but they aren't necessarily the only ones.
> You seem to, maybe fairly, denigrate startups and industry in general as well as academia.
Yeah, this is exactly why I changed my mind halfway through that long post :-).
I honestly love the startup environment. I'd much rather work in a start-up than in a large company. What I do denigrate is starting up for the sake of starting up: finding an (often otherwise legitimate) need, pounding at it for six months while it's still hot, and coming up with a technologically half-assed product to sell for a reasonable price.
This is, in my opinion, destructive both intellectually and technologically. It teaches bad habits and gives programmers little time to learn either adequate technologies or the fundamentals of their trade. I'd be a very rich man if I had a penny for every time I told a colleague who was enthusiastic about a new technology he'd discovered on Kickstarter or here on HN that <this operating system from the 70s/80s/90s> or <this thing from the 70s/80s/90s> had it already. You know something is wrong when so many new things are so similar to old things -- so similar that they repeat the mistakes.
As for academia, I still keep in touch with some of the people I worked with there. It's the most intellectually stimulating place I've ever worked in, and contrary to popular belief, one of the most refreshing feelings a professional can have is to walk into a roomful of people and realize you're the dumbest person in there. I miss being the dumbest man in the room.
> What should an intelligent person do?
I honestly have no idea.
> What did you settle on?
I haven't settled yet, but if I were to look back, I'd say what I'm doing now is better than what I had back then.
Now I'm working for a large-ish company. Their business is entirely software, but they want to start doing hardware and they brought me in to help. They pay me well and I can come into the office at 10 AM, which is good (I have some sleep issues). The work itself is shit; there's a long ladder of managers who are increasingly clueless about what embedded development means, but each of them has to deliver results (no matter how irrelevant) because they made promises. Consequently, most of what I do is pointless, but not entirely uninteresting. In the last month or so, I dabbled in a USB driver in the Linux kernel, hacked on an HTTP proxy, helped a colleague build a PCB... it's useless, but not entirely devoid of fun. I intend to leave as soon as someplace shows up where they actually need me to build stuff, not massage egos, but until then, I can bear it.
That being said, it also leaves me enough free time to do some hacking of my own, enough time for my hobbies, enough time for family and friends. I have enough time to brew beer with my girlfriend, read whatever books I want, learn Go and post crap on Hacker News. I'm far more unhappy with what I do than I was in university, but overall, I'm a happy person.
"Brian Greene has remarked that today’s grant-driven academia would not have allowed Einstein the luxury of a decade in which to develop his General Theory of Relativity."
Academia has been a force for stagnation since well before the 1900s. E.g., Einstein wasn't working in Academia when he published his 1905 special relativity paper. He spent 1901-1908 working in the patent office while releasing papers outside of Academia. Only in 1908 was he appointed lecturer, in acknowledgment of his existing work.
If you want to figure out what's wrong with Academia I suggest asking economists who deal with incentives and market forces as a subject of research, as opposed to the anecdotes of a few well-meaning physicists. Yes, the physicists are clearly smarter. But with few exceptions they're just not trained or equipped to assess these problems.
It seems he had most of the concept nailed down by 1907 but was struggling with the math, so he worked the problem step by step, only completing the treatment of gravity in 1915 after learning about Riemannian manifolds.
As a student in a Ph.D. program, I can't help but think I'm caught in the grant-driven world and am being steered away from deeper consideration of big ideas. It's an easy track to get stuck on, mostly because grant-proposing is a game, identifying a minor unexplored area is relatively simple, and there are short-term rewards for doing so.
What's really getting at me is the arrogance of established "experts." It's kind of baffling (because it should be about curiosity, not certainty), and when you come across someone who has made it through the academic system and still retained curiosity, independent-mindedness, etc., it's a really refreshing sight.
That being said, a lot of what I do is unfulfilling and I've discussed several times leaving the academic world for some kind of tech startup life, although I'm not sure if that is a pipe dream or grounded in reality.
> So what is really needed is a startup culture that provides for years of uninterrupted development and focus. (Not to mention a hiring process that actively selects for displaced deviants, but that's another post.)
Years? A lot of great research is done as the culmination of baby steps, and papers can be published at midpoints. Even Einstein, from his patent office, published at least 5 papers in 7 years. That's a pretty decent rate considering it wasn't even his full-time job, and a PhD student working at that pace would probably be considered proficient (then again, I'm not aware of the typical paper rate of physicists).
I've met a lot more people in research who would just do nothing without deadlines than hidden geniuses that are hindered by their deadlines. Granted, I've met few geniuses too.
I'm really curious as to what task really requires years of uninterrupted work that cannot be segmented and presented for feedback.
Of Andrew Wiles (of the proof of Fermat's Last Theorem fame), Wikipedia says:
"He dedicated all of his research time to this problem for over 6 years in near-total secrecy, covering up his efforts by releasing prior work in small segments as separate papers."
I take that as an indication that I am correct. There are the examples of General Relativity and Grigory Perelman's solution of the Poincare Conjecture, not to mention recent advances on the twin prime conjecture.
Look, you have a point, but we've swung to the other extreme. Nowadays, you're supposed to start building up your publication record while in undergrad if you want to have a really competitive background for an academic job.
I'm paraphrasing Weinstein [insensitive!] from his answer to the Edge 2013 question. The wording of Weinstein's original requires more cognitive load (to associate each individual with the adjective he assigns) than necessary. My paraphrase closely follows Weinstein and attempts to reduce the reader's cognitive load, without endorsing Weinstein's choice of adjectives.
"In the past, many scientists lived on or even over the edge of respectability with reputations as .... " is what I wrote on Edge.
Actually I don't think von Neumann or Curie had anything wrong with them in chasing the embrace of another. And yes, there is a major double standard in reputations. Curie after all chased love while von Neumann, it appears, chased sexual opportunity. Thanks for pointing it out. But the post was not insensitive. Most of these people are my heroes and I'm proud of all their accomplishments.
Pierre was killed in an accident in 1906. In 1910-11 she had an affair with one of Pierre's former students, Paul Langevin, who was estranged from his wife but still technically married. The Wikipedia entry links to this page:
No. A historical figure in mathematical logic and complexity theory recently suggested that I earn a second Ph.D. in economics. It probably should have been the first, instead of mathematics. I work on agent-based simulations as a hobby. (No jobs in that area.) I make an effort to read whatever I can, but it isn't nearly enough.
I feel like the inability to focus for long periods of time is holding back progress in many areas. If we can't even stay away from email for a few days, how can we clear our minds of conventional thinking?
A few days? People get upset if you haven't replied in an hour.
Seriously though, long periods of solitude are not what we need. I'd not be productive if I became a hermit, but I'd be a lot more productive if I had periods of quiet time, punctuated by moments of discussion. This is how thinkers and artists (and hackers) worked for a long time, before we wrongly convinced ourselves that being interruptible improves productivity.
It's not about solitude so much as the ability to control the contents and direction of our thoughts. Being responsive on email often requires giving up that control, fragmenting our attention and preventing the formation of larger thoughts. It provides short term gains, but we don't see what we're losing.
My problem is that then I'll get too many phone calls. And for me, a phone call is more disruptive to my workflow than an email. Then again, that's probably just because I dislike talking on the phone in general.
I solved this problem too. People know that I hardly ever answer my phone either, unless you arranged to call me. If it's important, leave a voicemail with a reason why you called not just some "call me back" contextless shit. By "arranged to call me" I mean sending me an email and asking if you can call and at what time. I check mail about 3 times a day.
We really need to get over this instant communication fad. It's counterproductive.
Talking to people is "instant communication" and we've been doing that for a while.
There are things that can get resolved literally 100 times faster by talking than by e-mail or text or whatever. Not everything requires that urgency, but back-and-forth in e-mail can end up being a huge time sink for both parties.
And, relevant to the topic: he is quite productive, but I don't think he would make it in academia at his current output rate. It does help that he splits his work into separate papers, though, and that he publishes 'dull' papers about his tooling, too (IIRC, the original plan was to have two volumes of TAOCP. Now, we have 4 plus fascicles plus some books about the software he wrote to lay out his books and create the fonts therein)
I don't think it's about "quiet" - if you meant it literally - the right kind of distraction can be a sort of mental noise cancellation mechanism. Though this tends to work better in solitude, for obvious control reasons.
It's a problem, but I think if you really value time, you can make it.
I've found that the best way to spend less time on e-mail is to send it very infrequently. I used to send tons of e-mail at work, and that was valuable to me at the time. Now I choose very carefully what to reply to, because every message I send means I will get something in return. It's pretty astoundingly effective.
Another strategy is to keep an odd schedule. I generally get to my desk around 2pm, so I'm sufficiently out of sync with people who get to their desk at 9 or 10am. It's also good because there's so little (bike) traffic that I can space out and think on the commute.
Sure, but don't you have control over your role? Maybe not in the short term, but in the long term you can take steps to move into a role or job where you have bigger free blocks of time. You just have to decide if that's what you value over, say, money, prestige, etc.
Paul Graham wrote a nice essay where he mentioned that "prestige" is basically a way to get people to do shitty work.
What I'm saying doesn't really work for managers or leaders, because your job boils down to talking to people, and e-mail is a big part of that. But it works really well for programmers.
The brain is remarkably adaptable. I would not go as far as labeling the new type of academia deficient. Sure, it is different; does that make it better or worse? Toss a coin.
Somewhat related story: I work with someone who recently got a PhD. That person can concentrate and work on a hard problem even when there is a conversation going on. Only when their name is called do they interrupt their work and start paying attention.
What is being discussed here is precisely the scenario when their name is being called, specifically the frequency with which it's called (if I may stretch the illustration). That your colleague is one of those who can concentrate with other conversations happening is wholly irrelevant; AFAIK, whether one's concentration is helped or hindered by ambient noise is genetic.
Here's an experiment for you: when they're doing that, call out their name every five minutes or so. See how long it takes before they try to kill you.
I'm trying to figure out where your non sequitur came from. The brain being adaptable does not mean its ability to adapt is infinite, and no one but you suggested that. Did they waterboard you in your academic program? Is that supposed to be an analogy to some other behavior? I can't even conceive of what they might do that would be close to that hard on a person's mind.
It's a crude, sarcastic form of reductio ad absurdum. TrainedMonkey said, in effect, "making it difficult to focus is fine because the brain is adaptable". 001sky showed that "the brain is adaptable" is not a strong argument. This is true precisely because the brain's ability to adapt is not infinite. Energy you spend adapting to something is energy you're not using to do directly useful things, and that energy has to be justified with more than a statement of "well, you can do it and there might be some benefits" (which is almost exactly what TrainedMonkey said, just phrased less flatteringly).
He didn't show it wasn't a strong argument; he showed that his tool for arguing is a fallacious statement. Reductio ad absurdum [1] is a weak argument. It involves non sequiturs, consequents which do not (by any sane logic) follow from the initial statement or assertion.
There are certainly issues in academia. "Publish or perish" is a terrible model for retaining people and measuring success. In the enterprise world the endless emails and meetings seriously detract from the work effort in many cases. That doesn't mean things don't get done, it means the balance is off. Be proactive and instead of making up absurd arguments about the impossible consequences of a thing, fix it. Develop tools that manage communication better. Show that in your office the daily stand ups are not necessary and ask forgiveness along with submitting the case study that demonstrates you improved productivity with your alternate approaches. Avoid making weak, sarcastic, slippery slope arguments like a religious fundamentalist that believes condoms lead to sheep fucking.
EDIT: [1] I should say, in this case, since neither waterboarding nor anything near it in terms of affecting the mind is being applied, or is at risk of being applied, in academia or in enterprises where deep thought and creativity are necessary to complete the task.
I attended grad school at one of the most prestigious institutions that exists. After I finished my PhD, I mentioned to somebody I had attended there and it had taught me to do science well.
The person I was speaking to looked at me funny and said, "Oh no. You went there to learn how to write grants that get funded, so that you can manage a lab that carries out the next set of experiments, so you can write grants that get funded." They were right: the value of my training program was in learning how to get funded, so I could do research.
Some of us saw the writing on the wall (grant funding for biomedical sciences doubled in a short period of time in the early to mid 90s, which greatly increased the rate of minted PhDs, which dramatically increased competition for slowly growing faculty positions).
I looked around, evaluated my options, and left for industry. I never regretted it. I still collaborate with academia, and write papers and do research, but my compensation structure and the options I have vastly outweigh those I would have had had I stayed in academia and become a soft-funded researcher.
Ultimately if your goal is to do blue-sky theoretical research you're going to have to work very hard to succeed. Maybe it's possible to do good theory and also be good at grantsmanship. I honestly don't know. But I decided not to play that game: there are LOTS of options out there.
Don't be stubborn and insist on your blue-sky theory job. Instead, be realistic and figure out what you want to do, and how to obtain the time to do it.
He was a professor of physics at Edinburgh University, and he certainly showed up on the campus there. I saw him around, but I don't recall witnessing him doing any work since I didn't study much physics.
I'm guessing what he did was teach classes and supervise graduate students, put out the occasional paper now and again, just like any other professor. Also, his name would allow Edinburgh University to say 'We have that famous Professor Higgs working for OUR Physics department' to any passing Science Funding Authority or Corporate R&D department thinking of handing out a research grant in physics, or to any rich foreign parents looking for somewhere to send their physics-inclined offspring.
I'm sure he was pretty good value for money just for his name alone...
aha! just as I suspected... during 1951-1964, he published 16 papers, better than one per year.
But, after his breakthrough, he eased back to 9, over 1965-2013... and only the first 3 look to be research papers (the remaining 6 seem to be historical recountings).
Now, I don't blame him - do something incredible, why not slack off? He's way more accomplished than I could ever hope to be.
But if you want an exemplar of doing great work, the pace before his breakthrough, leading up to it, seems the better guide. i.e. one per year.
Why is it slacking off? Is it instead not normal for scientists to publish 3 papers over ~40 years? How do we know whether or not this is normal or abnormal?
I'm inclined to believe we don't. And that we only assume so because of the incentive structures in place in our grant-first, publish-always system.
Nowadays his 'prime year' 1 paper per year would be considered an unacceptably low rate and result in him being thrown off long before his breakthrough.
…published fewer than 10 papers after his groundbreaking work, which identified the mechanism by which subatomic material acquires mass, was published in 1964.
Not a physicist, but writing one good paper every four years sounds like a very good thing. Would you rather read 40 papers, each of which is a lowest-publishable-unit, or 10 which noticeably move science ahead?
Because of his fame, Higgs had the luxury few do--- he could contemplate quietly, and people wouldn't dare say he was doing nothing. If you do that as a mere tenure-track professor, I'd figure the university would think you're too high risk and sack you.
That would depend on whether the atom of scientific knowledge in one of those many published papers was what was needed to progress another's/one's own work or not.
I'm not suggesting quantity and regularity are better - I don't think you can force scientific progress to adopt a given release schedule - but neither am I happy to say that publicly funded results should be sat on pending a larger breakthrough.
Presumably thinking about things that never resulted in papers because they didn't work out, or else working on those 10. Working for 4 years on a paper isn't out of the question if it's totally original work and one is being super careful. Not that doing that is rewarded these days, which I assume was exactly his point.
Even for those of us just learning, far more "productivity" is required. If I wrote a paper every 4 years, my PhD program would be a 10-14 year program.
Honestly, what is "a paper"? A publication is an artifact of an idea. It says nothing about the quality of that idea. Today's system is just geared toward printing out a lot of garbage. One way to understand this is that all subjects have similarly increased their publication counts over the years. Does that mean each subject is progressing as a function of its publications? Of course not. They are being told to "jump" for money, so those in the academy respond: "how high?" Academics are truly just rats in a maze at this stage of the game. They are guaranteed a salary by tenure, but not the "funding" to actually do their work. Tenure offers no job security in the broader sense of having your costs covered.
Critical listening to others, building and being part of a community of practice, developing and challenging research students. Possibly informal communication, although I'm not sure of that. Teaching.
Basically dialogue rather than written papers.
Downvoters: please specify your knowledge and experience in academic research. I know what mine is.
"didn't"… "any" implies none. And wouldn't one be better off being concerned with what is taking up their own time? Higgs probably won't have the same question for you or me…
Generally yes, but if someone, Higgs in this case, is saying there is something broken with academia's definition of productivity, then it raises the question: what should that definition be? What was Higgs doing that was more important or more productive than publishing papers? It is a totally fair question given the circumstances.
Had Higgs never said anything then I wouldn't be interested in how he spent his time in the office. Nor would it be appropriate to care.
So you're saying that, because of his experiences with academia and the status he has from his hard work, he needs to help set new goalposts for the masses to pigeonhole themselves into, which could come to resemble the very situation he says we have today? It is a fair question if you don't expect any answers from him, but from the question itself, it doesn't seem that way. It seems like a question that would be more apt for the self.
I think the question was simply, "what was the standard he was measuring himself by, by which he considers himself to be successful?" It doesn't have to be a yardstick applied to anyone else--but presuming that Higgs doesn't believe himself to have wasted his life after the publication of his famous works, I wonder what he thinks makes that true.
Asking for definitions of what people should do, when ultimately the individual will be the one who will have a choice to make regardless of the circumstances, seems like yardsticking to me (that's what definitions are used for, right?), though I suppose I could wonder about that too.
I also suppose I could wonder about how many aspiring/(non)tenured PhDs spend their entire lives producing publication after publication without an inkling of fame or any major contributions relative to those of the likes of Higgs (or, in general, relative to anything any individual decides to do in the spectrum of their peers), and who still think that they haven't wasted their lives.
Yes but you don't necessarily lose tenure if you don't publish, no? So realistically the promise isn't that important if you don't really suffer from failing to keep it.
Promise as in showing the potential for future research, not for your promising to do future research.
But yes, it is a gamble, and one that often fails. But people aren't given tenure just because of what they've done in the past. Yes, it relies on what they've done in the past, but the intent is that past behavior is a predictor of future behavior.
I feel slightly validated for leaving academia after reading this article. When I was in grad school I was working with an adviser who was in the final phase of applying for tenure. She was under immense pressure to publish, and she totally passed all that pressure on to me and her other grad student at the time. It was the worst two years of my life. Ultimately we both left her charge, and she was denied tenure.
The "publish or perish" atmosphere at most institutions is toxic. I remember running some in-vivo experiments using our system and looking for data that validated our hypothesis, but when I found none I was ordered to "find anything" and rewrite the hypothesis to match it. This was so that we had something positive to submit to journals, regardless of what it was. The papers were utter trash, and I was ashamed to put my name on them. The absolute focus on publishing ends up creating what is essentially "journal spam."
I don't want to say that research is dead, but it's definitely in the doldrums at this point.
"How should we make it attractive for them [young people] to spend 5,6,7 years in our field, be satisfied, learn about excitement, but finally be qualified to find other possibilities?" -- H. Schopper
The numbers make the problem clear. In 2007, the year before CERN first powered up the LHC, the lab produced 142 master's and Ph.D. theses, according to the lab's document server. Last year it produced 327. (Fermilab chipped in 54.) That abundance seems unlikely to vanish anytime soon, as last year ATLAS had 1000 grad students and CMS had 900.
In contrast, the INSPIRE Web site, a database for particle physics, currently lists 124 postdocs worldwide in experimental high-energy physics, the sort of work LHC grads have trained for.
The situation is equally difficult for postdocs trying to make the jump to a junior faculty position or a permanent job at a national lab. The Snowmass Young Physicists survey received responses from 956 early-career researchers, including 343 postdocs. But INSPIRE currently lists just 152 "junior" positions, including 61 in North America. And the supply of jobs isn't likely to increase, says John Finley, an astrophysicist at Purdue University in West Lafayette, Indiana, who is leading a search to replace two senior particle physicists. "For the most part, I don't think departments are looking to grow their particle physics programs," he says.
A warning to non-western members about values at CERN:
"The cost [...] has been evaluated, taking into account realistic labor prices in different countries. The total cost is X (with a western equivalent value of Y)" [where Y>X]
I think this sort of problem starts at the top, with our time-limited democracies. Democratic governments find it hard to implement long-term thinking because it's unlikely the current government will ever get credit for it. This results in policies that are short-term enough to promote re-election, and it bleeds down to the rest of society. Where long-term planning does happen, it tends to be compromised, because one side has to make sure the other side carries the project on. This leads to committee-style compromises.
So, in general, from what I can see, long-term thinking in Western democracies is rare, and where it exists, compromised.
The implication being that China's strength lies in its ability to plan for the very long term, which it does. I think the West needs to find a way to plan long term just as effectively and flexibly.
Can we please stop promoting the "democracy sucks" meme? It's not as if the Chinese are actually planning in that long a term; they're using markets now like everyone else.
The problem is not with doing the planning; the problem is with implementing the tradeoffs afterward. In a democracy, you can't take actions that hurt this year but will pay off 10 years later unless the benefit is completely obvious to uninformed laymen. If you do, they get reversed after the next election anyway.
Thanks for submitting this. This was a good introduction to Higgs. It was reading the comments here after reading the article that reminded me of a paper by Leif D. Nelson, Joseph P. Simmons, and Uri Simonsohn, "Let's Publish Fewer Papers,"
suggesting that psychology, and maybe other disciplines, could be improved if scholars published less often than they now do. "We agree that it is impractical, but it is just a thought experiment. Still, we stand behind the notion that the ideal is much closer to 'a paper a year' than to 'publish as many papers as you can.'"
The issue is volume versus quality. The conservatives (not political but professional) administering our research infrastructure find it much easier to greenlight an incremental (or inconsequential) project than anything truly revolutionary. We need more Higgses and fewer incremental academic researchers - industry can do incremental work better.
I think the reality is that more and more bright people avoid the Academy these days. I'm not saying I'm particularly bright, but I started out thinking I was going to be an academic; when I saw that I would have to deal with the same BS, politics, and hoop-jumping that people in business deal with, I went with the option that was otherwise the same but offered more money.
I think that's what the majority of the Academy has become for people now. If two jobs are pretty much equal, except in pay, which one will you take?
Stagnation is not only a feature of the economy but of every other field, including science. We saw rapid growth in the past century, and forces such as policy, or the expectation of and pressure for ever-faster growth, will ultimately contribute to a period that looks stagnant compared with what came before.
School is an obedience/memorization test, the exact opposite of the real-world applied thinking skills that are needed.
Private schooling is bad; public schooling is even worse. That's where ideology substitutes for academics as well.
Memorizing 20 ways to do something, and then finding a way to cram in each one on a test in an "applied" situation, leads to indiscriminate use of technique and poor debugging skills.
I think you are trying to make some kind of point about education or something ... I mean it's cute but teaching kids about arithmetic, or memorization as you call it, is a very different thing than what is being discussed here, namely the state of academic research and university expectations, etc.
I'm actually not sure what the point is here ... it reads like the typical anti-academia stuff but seems more like anti-education ... I dunno ... What's better, watching a bunch of videos on YouTube about how to build another blog in Node.js? Who knows.
> Private schooling is bad; public schooling is even worse. That's where ideology substitutes for academics as well.
Not picking on you, but this idea of public education as a tool for brainwashing comes up a lot. Often the implication is that it's the work of those "liberal teachers." I must say, based on the voting record in the US, those teachers are incredibly ineffective brainwashers.
Speaking as someone who tends toward analytical, rational, and critical thinking, and who comes from a very diverse background of independent study, the "brainwashing" aspect of institutional education was blatantly obvious to me.
A good example is the approach towards history, which tends towards typical "revisionism" in attempts towards being "corrective" against narratives which were historically dominant, but in isolation such revisionism becomes indistinguishable from bias. In the '90s when I was in college, for example, we read a lot about things like the horrors of the nuclear bombings of Japan and of the internment of citizens of Japanese ancestry during WWII. We didn't read much about the Rape of Nanking or the practice of "comfort women" though. This biased narrative had a predictable outcome. In class discussions it became obvious that a lot of students held the view that Japan was a victim during WWII and America was the "bad guy".
Similar phenomena exist in many other forms throughout the educational establishment. People who have gone through the educational system have a very strong tendency towards a very particular set of view points. Ironically many of them believe that this is because those viewpoints are objectively superior to the alternatives rather than that they've simply been indoctrinated in them over a period of years.
My kids come home brainwashed from school all of the time. I have to sit them down and have them think about what is being taught to them so they can look at it objectively and make their own decisions.
Anything that is not history, math, science, biology, etc.
In the 6-7-8th grade (at least here in California) there are a number of assigned books and lessons that teach (as best as I can describe it) "empathy". Which is great... but some of the books had a very strong viewpoint one way or the other that was persuasive but not always well rounded.
Personally, the "empathy" lessons were much better when I was that age in middle school. They lasted a semester or two. We watched a movie called "The Lottery," in which a village holds a lottery to pick one person to stone (I can't remember why; a good harvest, I'm not sure). A mother was chosen, and the stoning was so ingrained in the children that her own children looked forward to it, while the mother tried to get them to think, "hey, this is my own mother." Another lesson was the poem about how, if you let bad things happen to others, there will be no one left to speak up on your behalf when bad things happen to you (I can't remember the title). The third was on cultural understanding: if someone hundreds of years from now examines your house and gets what's inside it "wrong" (like assuming the toilet is a communication device you yell into, or that toothbrushes, with hooks, were earrings), don't think badly of those people, and remember that you may make the same mistakes when looking at the lives of people you're not familiar with.
Well, first you need a sorting hat.
Then you have different houses that cater to particular styles of learning.
If all goes well, one of the students will figure out how to vanquish evil.
I anticipate that a lot of it will come through other means. Check out http://www.youtube.com/crashcourse, it's doing a fantastic job of making knowledge fun, interesting, engaging and cool.
Here's what I always thought would work best: free education up to 4th grade (reading, writing, arithmetic, AKA the basics), then free libraries everywhere. The sad thing is that we already have a lot of public libraries, but few people visit them voluntarily.
No, but I would think that it is often useful to indicate when one does not know of a better option than the one one is criticizing.
Because "this is a bad choice" has a connotation that that is not the one that should be made, but that a different known one should be made instead, I think.
Of course, the indication can be implicit, or from context, etc.
And there might be exceptions, but I think this generally holds?
We're not genetically different from early Man, who got nothing done except for basic survival and raising a family. Being unproductive is the default.
> He doubts a similar breakthrough could be achieved in today's academic culture, because of the expectations on academics to collaborate and keep churning out papers.
The opposite of an Einstein or Higgs. Academia no longer tolerates deep, unconventional thinking. Physics used to be a hobby; now it's an industry to be protected from major change. For example, if you have basic knowledge of relativity you can go to http://finbot.wordpress.com/2008/03/05/dark-energy-obviated/ and use the steps there in conjunction with generally accepted equations (from the Usenet Physics FAQ) to see that free objects launched upward from Earth at close to the speed of light must accelerate away from us, solving the big mystery of dark energy. But even assuming this idea is valid, it couldn't make it into a respected peer-reviewed journal today. Ideas for hunting dark energy particles, however, would be welcomed, since those could lead to lucrative grants.
Reading your link, it appears that the author just flatly assumes that space is not expanding, and everything else stems from that assumption.
The author provides another article to attempt to support that assumption, but their argument is incomplete, and while their alternative appears to offer explanations for some observations, it ignores other things (like cosmological redshift). Add in the fact that the article appears targeted at laypeople and that there's no scientific paper to be found, and it suggests this is more pseudoscience and thought experiments than actual scientific investigation.
It seems unreasonable to claim that academia would reject the idea out of hand when no-one's even written a paper to attempt to submit to a peer-reviewed journal. I suspect the reason academia ignores the supposition is not commercial interest like you claim.
I've been a student of that blog. It doesn't ignore cosmological redshift. Rather, the dark energy solution notes that space measurably expands, but in a relative (it depends on the observer) way as opposed to the absolute (observed by all observers) way that is generally accepted today. When space measurably expands (whether relative or absolute) you have cosmological redshift.
Space itself not expanding is the absence of an assumption. Today it's generally accepted that space itself expands, an assumption.
> Add in the fact that the article appears targeted at laypeople and that there's no scientific paper to be found, and it suggests this is more pseudoscience and thought experiments than actual scientific investigation.
Thought experiments are scientific investigation. Nothing in the rest of your point actually suggests pseudoscience.
> It seems unreasonable to claim that academia would reject the idea out of hand when no-one's even written a paper to attempt to submit to a peer-reviewed journal.
If you showed me scientific papers that had been rejected by peer reviewers, then you might have a point. But if it's entirely articles targeted at laypeople and no scientific papers, it's pseudoscience.
It's not pseudoscience for that reason alone. You're using the definition incorrectly, even if yours is the definition commonly improperly used. For example Einstein's Relativity of Simultaneity thought experiment is targeted at laypeople and is both scientific and a logical proof. Had he put the idea into a blog and nothing more it still would've been an advance of physics, regardless whether anyone else paid it mind.
(I probably won't say anything more because I get tired of this type of discussion.)
You're right, of course. It's not pseudoscience because it's a blog targeted at a non-scientific audience. It's pseudoscience because it's complete nonsense.
I've seen many people say that, but none who proved it scientifically. After all, it simply uses generally accepted equations to make its point. As the author suggests, I've plugged the equations into Excel to get the same charts.
My favorite part is this:
- The Relativistic Rocket site reports that a rocket accelerating / decelerating at 1 Earth gravity can travel from Earth to the Andromeda galaxy, 2 million light years away and arriving at low speed, in 28 years on the crew’s clock. Then the rocket’s crew would observe a beacon floating at the midpoint between the galaxies recede 1 million light years in the 14 years after they pass it.
That's an unassailable conclusion, and it follows that the beacon would accelerate away from the crew as they observe, since the beacon moved away from them at an average rate greater than the rate at which they passed it. That's the explanation for dark energy in a nutshell.
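For what it's worth, the 28-year figure itself checks out numerically. Here's a quick sanity check (my own sketch, not taken from the blog) using the standard accelerate-then-decelerate formula from the Usenet Physics FAQ's Relativistic Rocket page, tau = (2c/a) * acosh(1 + a*d/(2c^2)):

    import math

    c = 299792458.0              # speed of light, m/s
    g = 9.81                     # 1 Earth gravity, m/s^2
    ly = 9.4607e15               # one light year, m
    d = 2.0e6 * ly               # Earth -> Andromeda distance, rounded

    tau = (2 * c / g) * math.acosh(1 + g * d / (2 * c**2))
    print(tau / (3600 * 24 * 365.25))   # ~28 years of crew proper time

Whether the conclusion drawn from that figure holds is a separate question.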
Since 1 million light years in 14 years is way faster than the speed of light, and relativity doesn't allow you to exceed the speed of light by acceleration, clearly this is something other than regular old F=ma "acceleration".
Additionally, the fact that we don't see things 14 light years away accelerating to a million light years away very quickly, despite being under a constant 1 gee acceleration, would seem to indicate that this is not what actually happens.
Putting numbers into equations and getting other numbers out doesn't mean anything by itself. I can use the standard d = 1/2at^2 equation to "demonstrate" that combining one apple and three tangerines produces 4.5 dandelions, and you would get the same result putting those numbers into the equation, but it doesn't mean the exercise makes any sense.
I will be honest: I don't understand enough about physics to point out exactly what is wrong with the proposed theory. But I understand enough to realize that it is very wrong, and point out some obvious flaws.
Finally, since the equations are well understood, if this really does explain "dark energy" then it should be possible to put in real-world numbers for things like the gravitational field of the Earth and get numbers out for the accelerating expansion of the universe which match real-world observations? Has the author actually done this and compared the results with observations? I can see no indication of this, even though it should be an easy exercise. This is another major indication that this is all nonsense, if it is in fact the case that this analysis has not been done.
> Since 1 million light years in 14 years is way faster than the speed of light, and relativity doesn't allow you to exceed the speed of light by acceleration, clearly this is something other than regular old F=ma "acceleration".
As the blog notes, the speed of light limit in relativity theory applies only to observers in free fall or observers measuring things passing right by them. The rocket's crew isn't in free fall and the thing they're measuring is distant.
> Additionally, the fact that we don't see things 14 light years away accelerating to a million light years away very quickly, despite being under a constant 1 gee acceleration, would seem to indicate that this is not what actually happens.
That's because our 1g field drops off quickly with distance. The rocket's crew has a uniform 1g field all the way to the beacon.
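The fall-off itself is easy to check with the inverse-square law (a quick sketch of my own, not the blog's calculation):

    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M = 5.972e24         # Earth's mass, kg
    R = 6.371e6          # Earth's radius, m

    for r in (R, 10 * R, 3.844e8):       # surface, 10 Earth radii, the Moon's distance
        print(r / R, G * M / r**2)       # ~9.8, ~0.098, ~0.0027 m/s^2

Whether a uniform 1g field all the way to the beacon is the right way to model the rocket scenario is, of course, the actual point of contention.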
> I can use the standard d = 1/2at^2 equation to "demonstrate" that combining one apple and three tangerines produces 4.5 dandelions, and you would get the same result putting those numbers into the equation, but it doesn't mean the exercise makes any sense.
That's because you're using the equation illogically, by mixing labels. The blog uses the equations properly.
> But I understand enough to realize that it is very wrong, and point out some obvious flaws.
OK, I've been a student of the blog for years and can likely always show the flaw in your arguments.
> I can see no indication of this, even though it should be an easy exercise.
It wouldn't be easy. As the blog notes, it's not just the Earth's gravitational field, it's also the Milky Way's and more. When observing supernovae in distant galaxies we're talking about a much larger g-field than the Earth's.
> This is another major indication that this is all nonsense, if it is in fact the case that this analysis has not been done.
That's unscientific as hell, like saying Newton's Principia is crap because he didn't include a calculation that used his equations to weigh the Earth (which Cavendish "easily" did much later).
Nonsense. The mass of galaxies is reasonably well known. Put the numbers in and see how well they match reality. The numbers are uncertain? Then put on appropriate error bars and see if reality falls within them.
We know the mass of each of millions of galaxies? I doubt the galaxies involved have even been counted. But never mind, because ideas aren't proven invalid when they aren't applied in some way you desired.
No, because special relativity had plenty of quantitative corroborating evidence for it, gathered both before and after the theory was actually proposed.
Seriously, this is insane. You have a theory that makes quantitative predictions but when I ask whether those predictions line up with reality you protest that it's just too hard to actually come up with any numbers. You loved equations and graphs before, but now, perish the thought of actually looking at quantities, let's just wave hands and pretend that this theory is definitely right even though we can't be bothered to check.
You have a few blog posts with no quantitative reasoning whatsoever even though all of the math is present and straightforward, no actual papers, no evidence, no nothing, and you're sure that this is proof that academia is suppressing new ideas, not, say, proof that the author is a complete crackpot.
Special relativity as published included no experimental confirmation. None of its quantitative predictions in the original publication were "lined up with reality" by Einstein. What came after the theory was published, experiments done by others, is obviously irrelevant to the point you're making.
The blog proves that general relativity predicts that sufficiently high-redshift supernovae accelerate away from us, the 1998 observation of which is currently a mystery. The blog offers the same level of corroborating evidence as special relativity did when it was published, both offering solutions to observational mysteries. Neither Einstein nor the blog author lined up anything with reality numerically in their original publications.
Yet you insist (at a minimum) the blog author add up the masses of all the millions (trillions?) of galaxies within a sphere centered on the Earth and whose radius extends to those supernovae, billions of light years from the Earth, to make a numerical prediction of the rate those supernovae are accelerating away from us. You suggest that not doing such a calculation means the blog is hand-waving. Well I say again that's unscientific as hell. Yes by your standard, special relativity as published would be junk and Einstein would be a crackpot.
(I probably won't say anything more because I get tired of this type of discussion. You seem to be done with scientific attempts to discredit the blog.)
In reading that link, here are just a few things that seem wrong with it at first glance.
(In the link attempting to explain the flatness problem,) one cannot just assume all galaxies are moving towards each other without first positing a center (of those galaxies). That in itself invalidates the rest of the flatness argument, which the article at your link depends on.
This was based on a cursory read, so it may not be entirely right.
There's no need for a center. Take the infinite universe as it is, and steadily decrease all the distances between the galaxies. There you have it: all galaxies moving towards each other with no cosmic center. It's definitely possible in principle.
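A toy illustration of the "no center needed" point (my own sketch, nothing from the article): place a few galaxies at arbitrary positions, shrink all separations uniformly, and every galaxy, whichever one you pick as the observer, sees all the others getting closer:

    import random

    galaxies = [(random.uniform(-100, 100), random.uniform(-100, 100)) for _ in range(6)]

    def dists(home, others, scale):
        hx, hy = home
        return [((scale * (x - hx))**2 + (scale * (y - hy))**2) ** 0.5 for x, y in others]

    home = galaxies[0]                        # pick any galaxy as the observer
    before = dists(home, galaxies[1:], 1.0)
    after = dists(home, galaxies[1:], 0.9)    # shrink every separation by 10%
    print(all(a < b for a, b in zip(after, before)))   # True for any choice of 'home'

The same holds in an infinite universe; no point is privileged.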
Something I noticed 'An atheist since the age of 10, he fears the nickname "reinforces confused thinking in the heads of people who are already thinking in a confused way. If they believe that story about creation in seven days, are they being intelligent?"'
How could someone [intelligent enough to get the Nobel prize], who knows the relativistic equations better than average, argue that creating the universe in 7 days is stupid?
Haven't we known since Einstein discovered it that time can expand and compress depending on the environment? And who can say that Earth's local time is the norm? Better to go back to pre-Galileo times, then, when everyone thought Earth was the center of the Universe.