Hacker News
You and Your Research (1986) (virginia.edu)
186 points by ftxbro on May 1, 2023 | 91 comments



There's a part of this that's never ever commented on

>I don't like to say it in front of my wife, but I did sort of neglect her sometimes; I needed to study. You have to neglect things if you intend to get what you want done. There's no question about this.

As someone that got married during their PhD and is now finishing up and reflecting: it isn't worth it. Neglecting the people in your life that love you in the pursuit of "science" is no different than neglecting them because you have a drug habit or because you're an inveterate gambler or because you're chasing fame and fortune. The science will get done regardless and the only thing you accomplish is ensuring your own interests. Suffice it to say I'm taking steps to make sure my next gig has more room.


It really isn't talked about enough. I am contemplating leaving my PhD despite being ABD with the majority of the interesting work done simply because the neglect it inflicts on my family is the cause of the overwhelming majority of conflicts in my life.

I truly love working on the problem, investing time in research papers, and experimenting, but I have no desire to stay in academia. Plus, I have a well-paying full-time job (that I took during the pandemic to save my wife from having to shoulder our living expenses basically single-handedly while growing our child inside of her).

> Suffice it to say I'm taking steps to make sure my next gig has more room.

I hear you. It's crazy how mentally abusive a lot of people's relationship with their PhD is. I'm convinced it's one of the reasons so many people with PhDs marry other PhDs. They're the only ones that "understand" the level of single-minded devotion you have to have to be an early career researcher.


As a PhD, I now believe that pursuing a career in science is an enormously selfish and entitled life choice for anyone who doesn't have a trust fund.

It's not just neglect during the PhD. Even a non-neglectful academic is asking a lot of their partner. Stipends are low. Post-docs require chasing term-limited positions around the country, often with little to no savings, for up to half a decade. Building wealth is impossible, having a family is just barely possible, and the process takes you well into mid-life, depriving your partner of a career and a real relationship.

I have seen more divorces during post-docs than during PhDs.

Everyone I know who made it through PhDs and post-docs without scars fell into one of two categories: unmarried or wealthy. I think academia's biggest open secret is that a HUGE number of academics -- especially in and around large metros or in nice climates -- are chasing a prestigious and comfortable job because their trust funds allow them to not care about the money and their upbringing makes it difficult for them to deal with having a manager.


> As a PhD, I now believe that pursuing a career in science is an enormously selfish and entitled life choice for anyone who doesn't have a trust fund.

This is because the career was originally set up for those who were independently wealthy, or at least rich enough to have a spouse who didn't work. All of this stuff happened within the last 150 years, and it was made worse in recent years following the massive increase of skilled people in the discipline.

A career in science also differs based on whether you're an academic or in industry. On whether you're out there seeking grants, or out making products to sell.

These are all decisions being made based on what people demonstrate that they will tolerate. No different than the Japanese Karoshi-culture. If your Ph.D. or Post-Doc supervisor won't allow you to have a life outside of science, then dump them. Leave them up the creek without a paddle. They are not your only option.

At its absolute worst, the hours involved in an academic Ph.D./Post-Doc aren't significantly worse than those of a chronic precariat worker. I had two jobs during a particular semester when finishing up my A.S., and averaged 3.5 hours of sleep per night (outside of the weekends). I've read about a guy who spent years walking and taking transit for 8 hours each day from his house to his 8-hour-a-day job. 16 hours per day just dedicated to work. Ultimately his job and/or co-workers pitched in to buy him a car.

The harsh hours in scientific academia are ultimately a choice. And the choice is not between science and a life. The choice is between science with this particular supervisor and a life.


I wonder if it would be better to go back to the time when PhDs were only for the wealthy? Would there be less incentive to do bullshit studies and cave to political pressures and bribes if the people doing the studies didn't need the job to survive?


It was bad even for some of them. (https://www.science20.com/quantum_gravity/blog/phd_octopus_w... ) A one-size-fits-all approach to education and credentialing is the original mistake.

A secondary mistake for Ph.D.s in particular is requiring an original contribution to science instead of a demonstration of appropriate levels of mastery, because this requirement really screws over students who realize, too late, that their thesis is wrong or intractable. And, even worse, it incentivizes cheating above and beyond the already existing incentives.

> Would there be less incentive to do bullshit studies and cave to political pressures and bribes if the people doing the studies didn't need the job to survive?

People who genuinely need the job to survive quit and get a job that allows them to survive, sometimes alternating this with academia (https://en.wikipedia.org/wiki/Theodore_Streleski). Or, like the graduate students and Post-docs at the University of California, they band together and strike.

Social pressures and sunk-cost fallacies affect enough people of all socioeconomic classes, especially when the added grief is spun as a status symbol.


> As a PhD, I now believe that pursuing a career in science is an enormously selfish and entitled life choice for anyone who doesn't have a trust fund.

Focusing specifically on the word "selfish"--isn't academia about dedicating yourself to scientific discovery? That dedication may require a sacrifice from your spouse, but it should benefit humanity at large.

That may or may not be a sacrifice your family is willing to make--and it's a shame that anyone is required to make it--but it's not selfish!


Eh plenty of people pursuing science are doing it because they want to be the one to make the discovery, not because of some pure dedication to science. Sure, that's not everyone, but it's a big enough chunk of the people that stick around.

Regardless of personal intentions, it's also a fact that we have a massive oversupply of people trying to make it in academic science. Functionally the only way it is selfless then is if you are meaningfully better at science than replacement level (and again potential replacements run deep these days).

This makes most people either knowingly selfish (if they are sacrificing others' wellbeing for their research anyway) or very arrogant about their intelligence level. Or naive about the system, but I don't think you'll meet many postdocs that don't know these facts.

The people that stick around and grind away at research work even once any chance at a tenure track position has passed are a lot more defensible as selfless of course, but that's not really the most common situation to find in a lab. The majority are gunning for a career in academic science, even down to the poorly treated pre-PhD RA labor.

Now obviously if some rich kid or someone with no personal attachments wants to fuck around in academia they should totally go for it. I just don't think selfless is the right word for that, and it's certainly not the right word for your typical academic ladder climber.


But I would hope academics are not doing literally the same research? In other words, are you saying that if we had fewer people working in academia, the same quantity of research would get done in the world?

It's okay even if different researchers are working on similar problems, because replicability is important.


Enough people drop out of the race, or are filtered out, that it becomes effectively zero sum.

A common control in sociological studies of educational systems is to compare people who are just above a particular cutoff (say a 2.00 GPA), and just below that cutoff (say a 1.99 GPA). These people are often effectively equivalent except for one good or bad day, or one harsh or forgiving instructor.

If an academic department makes 15 positions available for graduate students, it will have 15 graduate students. Regardless. Is the 16th applicant who just didn't make the cut worse than the 15th who did? Probably not. And they might have been better, or at least have been a better fit.
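To make that "just above vs. just below the cutoff" comparison concrete, here is a minimal simulation sketch in Python; the ability and GPA numbers are entirely made up, and the 2.00 cutoff and 0.05-point bandwidth are just the illustrative values from above, not from any real study:

    import random

    random.seed(0)

    CUTOFF = 2.00      # e.g. the GPA threshold from the example above
    BANDWIDTH = 0.05   # how close to the cutoff counts as "one good or bad day"

    # Made-up students: underlying ability plus a bit of term-to-term noise
    # (one harsh or forgiving instructor) determines the observed GPA.
    students = []
    for _ in range(100_000):
        ability = random.gauss(2.0, 0.5)        # what we actually care about
        gpa = ability + random.gauss(0.0, 0.1)  # noisy measurement of it
        students.append((gpa, ability))

    just_above = [a for g, a in students if CUTOFF <= g < CUTOFF + BANDWIDTH]
    just_below = [a for g, a in students if CUTOFF - BANDWIDTH <= g < CUTOFF]

    mean = lambda xs: sum(xs) / len(xs)
    print(f"mean ability just above cutoff: {mean(just_above):.3f}")
    print(f"mean ability just below cutoff: {mean(just_below):.3f}")
    # The two group means differ by far less than the overall spread in ability:
    # the cutoff mostly separates people who are effectively equivalent.

The gap between the two printed means is a small fraction of the population spread, which is exactly why the just-above/just-below groups make a useful comparison.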


> Functionally the only way it is selfless then is if you are meaningfully better at science than replacement level (and again potential replacements run deep these days).

I'm better at science than I am at the replacement job. Ergo the maximum I can contribute to the world is through science.


If you're single and not responsible to anyone else, I agree. I think the commenter was commenting with the assumption that the self-insert has a partner who isn't also dedicating themselves to science.


I agree to some degree. I am finishing up my PhD at the moment and have had this below-surface feeling that following this path is inherently selfish for a while.

Choosing to go into research means your career choice is entirely determined by what you are most interested in, what you are passionate about, what you want to spend your day thinking about. I feel like the benefit to society is often secondary in that choice. It's nice that often science benefits humanity as a whole, but often it also doesn't and is just obscure niche research.

And indeed, the relational sacrifices that come with a (high ambition) career in science are IMO not worth it. I would no longer recommend that a young, ambitious person pursue some abstract, highbrow principle like "the pursuit of knowledge" over deep, loving, healthy, sustainable relationships with people. People are more real than principles.

Ideally you can combine it of course. But the academic job market is not easy and rarely allows this without significant friction.

I am 99% sure I will leave academia after my PhD. Not for this reason per se, but it appears in the equation. The relational aspect is a big part, though.


I am a reasonably successful researcher, and I admit to neglecting my wife sometimes, especially in my early career.

However, I would venture that being successful in any career --- be it research, business, politics, arts --- requires a certain amount of focus and neglect of your family and friends.

But later, if you succeed, you can make it up to them and make it worth it for them as well.


> However, I would venture that being successful in any career --- be it research, business, politics, arts --- requires a certain amount of focus and neglect of your family and friends.

A warning for the younger readers in our midst: this is NOT how most relationships work. You normally don't get to mismanage a relationship and then "make up for it" later. When you do, there are almost always lasting scars.

That said: this might be true for business and politics. But academics? LOL. Becoming a professor, even at a top university, is a pretty pathetic definition of "success". A professor is a mid-level manager position that pays about the same as an entry-level position at a top tech or finance firm. Most people involved in allocating budget / selecting projects understand that the work being managed is mostly not valuable; that's why they don't mind telling you that you need to pay your subordinates about what they'd make at McDonald's.

It's a first line management job where pay is not enough to "make up" for the lost years and the work almost always literally doesn't matter.

> But later, if you succeed, you can make it up to them and make it worth it for them as well.

This might be true in business and politics, and to some extent in arts, but it's not true in academia. And to the extent that it is true, it's enabled by pushing shit down the hill.

Which is why I'm a lapsed academic. I turned down my TT offers because I realized that I couldn't, in good conscience, build a career out of abusing junior labor. And universities put hard constraints on how you pay and manage PhD students, so avoiding at least financial abuse is mostly impossible.


Have you considered that some people might not conflate "success" with "net worth"? Wealth = success is a very tired trope; I thought everybody realised what a sham that way of thinking is.

Who cares if they make more or less than a techbro? If they're happy with their job and they earn enough to pay for things they want (house, vacations, whatever), then they should chase the rat-race of the "ladder of success" because...?


Where do I make such a conflation?

We aren't talking about nurses or school teachers. We are talking about professors at large research universities.

Your sibling comment speaks of working "nights and weekends" with frequent travel. That puts an enormous amount of work and stress on their partner, and faculty usually don't make enough money to offset those contributions.

Deciding not to optimize for wealth is perfectly fine. Choosing to do so while working nights and weekends with frequent travel isn't. Optimizing for "prestige" is infinitely worse than optimizing for "wealth", because at least the latter can be shared and has utility beyond pure ego.


Again, it's not about prestige, it's about love of science and research. But yes, you do have a point about working nights and weekends. It's not as if "the grind" is not something which is glorified in the tech industry, though :)


I don't think there is any problem with loving science and research.

Deciding to sacrifice your nights, weekends, and financial life to work on science and research is okay. But it's also enormously selfish. Other people who spend time doing "what they love" -- ski bums, for example -- at least recognize their selfishness as such.

Being selfish can be okay. But it's probably not great to be selfish and try to build a life-long partnership. Especially if you don't realize you are being selfish.

I won't tell anyone not to ski bum or not to do a PhD. But I will gut check people when they get confused about the difference between selfish and selfless dedication to a craft. An academic career -- the type where you spend nights and weekends without at least contributing a modicum of financial comfort to those around you -- is selfish.

At the end of the day, most grant-funded projects are born useless. There isn't as much of a difference between ski bumming and PhDing as professors like to pretend.


Maybe not for you, but it's pretty clearly about prestige for a good proportion of the "rising stars" that will actually get tenure track positions at large research universities. I suppose I can only speak directly for my own field, but I have friends in a few others that do not paint a rosy picture either.

If you think the majority of people in your current field are not optimizing for prestige (and you're past the mid-point of a PhD), I would love to know what field that is - seriously.


I am sorry that you feel that way. I know this is a widespread opinion. My own experience is different.

I am at now a point as a researcher where I am financially secure, work on interesting problems, and have time for my wife and children. My colleagues, junior as well as senior, seem to be in similar situations.

To be clear, in my case "focus and neglect" meant working weekends and evenings and lots of travel for some years before we had children. I see successful people in other careers doing the same.

My current situation does not involve anything remotely like "pushing shit" or "abusing junior labor". I have no "hard constraints" and I see no "financial abuse" at my university.


If your university has a PhD program, we simply have different definitions of financial abuse.

I couldn't accept the job and look myself in the mirror while knowing that not only do my direct reports struggle to get by and can't save tax-deferred for retirement, but that I'm one of the only employers in the country who doesn't even pay FICA taxes.


> I couldn't accept the job and look myself in the mirror while knowing that not only do my direct reports struggle to get by and can't save tax-deferred for retirement, but that I'm one of the only employers in the country who doesn't even pay FICA taxes.

With finance or ad-tech, like you are recommending, the whole output of the job is often zero or negative sum. That can happen in academia as well, but it seems less likely.


Finance isn't zero sum, trades can be mutually beneficial and efficiently allocating capital is an important problem (that the NIH utterly fails at incidentally). There are a ton of shady finance people because that's where the money is, sure. But it is not inherently zero sum!

If you moved money from a sinkhole garbage establishment project to some other promising scientific endeavors, that would almost certainly be a net positive for the world even if the total spend is the same and you "just pushed money around". A similar principle applies when considering many industry investment decisions.

Regardless, "finance" is broad as hell and there are a huge number of potentially well-paying careers that are not finance nor ad-tech. Not to mention that a FIRE research scientist who is less beholden to the current system might very well contribute more scientific progress over their lifetime than the latest career academic.


Consider latency arbitrage. Let's say the lowest latency between NY and Chicago is 22.6ms and a trading firm gets it down to 22.5ms with a huge investment: big benefit to society, right?

The reward for that investment is the same as the reward for the next guy who gets it down to 22.45ms, despite the first guy saving 0.1ms on the state of the art and the second guy only saving 0.05ms. Surely 0.05ms is worth a lot less than 0.1ms to society, and it shows this whole thing is almost totally detached from any value to society.

It's just lowest number wins, and the industry will consume any number of resources (running CPUs in spinlock loops instead of more efficient ones, having human labor climb microwave towers at some risk of life) up to the reward amount to claim it, regardless of any marginal value to society of the improvement.
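As a toy illustration of that decoupling, here is a minimal sketch in Python; the prize value is hypothetical, and the latency figures are just the ones reused from above:

    # Toy model of winner-take-all latency arbitrage: the prize for holding
    # the lowest latency is fixed, no matter how large the improvement is.
    PRIZE_PER_YEAR = 100_000_000.0  # hypothetical value of capturing the flow

    def rational_spend(current_best_ms: float, your_latency_ms: float) -> float:
        """Upper bound on what a profit-seeking firm would spend to take the lead.

        Note that the answer does not depend on how much faster you are,
        only on whether you end up fastest.
        """
        return PRIZE_PER_YEAR if your_latency_ms < current_best_ms else 0.0

    # A 0.10 ms improvement and a 0.05 ms improvement earn the same reward,
    # so investment tracks the prize, not the marginal value to society.
    print(rational_spend(22.6, 22.5))    # 100000000.0
    print(rational_spend(22.5, 22.45))   # 100000000.0

Both calls return the full prize, which is why spending scales with the reward amount rather than with the 0.1ms or 0.05ms of actual improvement.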


I think financial abuse is an exaggeration, especially for PhD students who, if they are doing things correctly, should also be receiving an education as part of their compensation (something a good advisor has agency to control).

That said, I find it hard to believe you don't have constraints on what you can pay labor? Maybe it is field dependent, but the NIH sets some real lowball salary caps on how much postdocs (and other titles) can be paid with their grant money. So if you rely on this sort of government funding it is difficult to pay postdocs a fair wage for their experience and stage of life -- forget about it if you live in a particularly high CoL area, which the NIH does not adequately account for.

So yeah, postdocs are taken advantage of in a lot of fields in a way that is (to some extent) beyond the control of the advisor. In a cheaper area it's not necessarily dire straits, but it is often a huge underpayment nonetheless. And if you talk to postdocs a lot of them are doing it because they're still chasing tenure track dreams that are not super likely.

At what point is it wrong to facilitate minor league baseball? I don't think there's an objective answer here, and obviously I don't blame people for trying to do their best as a PI. But I also don't think the OC's reaction to the situation is outside the realm of reasonable for an informed person.

As an aside, you're (unsurprisingly) kinda fucked if you wanted software to be part of your NIH-funded project, because they do not acknowledge the cost of hiring anyone with a remotely valuable skill set.


>>But later, if you succeed, you can make it up to them and make it worth it for them as well.

Just in case anybody is unaware - this is really not how interpersonal relationships work. Especially with romantic interests. You may one day find yourself with a lot of money/power but nobody who truly loves you (or enjoys your company) for who you are - or with not much to show for your years of tunnel-visioned neuroticism beyond symptoms of complete burnout, and still nobody who loves you.

I’m not trying to say there’s something wrong with dedicating a large chunk of your life to a pursuit like they’ve mentioned - I’m just saying don’t be surprised when people you’ve neglected have moved on to greener pastures in that time.


Higher education isn't made for people with a family life. There's no real reason for this, but there's no pressure to change it because there's always someone else in line ready to take your place.


The reason for the observation in the first sentence (which I agree with) is the second sentence.

Lack of pressure to change something is absolutely a reason in and of itself. How could the system be changed to avoid this?


Good question. The proven model so far is reducing funding. Some universities found new ways to raise money by offering online and async course modalities. This benefits students who need the extra flexibility, and it's probably the most significant change that can be made. Wider adoption would require grants to cover the initial investment, plus guidance from the already successful institutions that paved the way.

Another change I'd like to see is shorter undergraduate programs. Most can stand to lose a few classes off the beginning that duplicate high school and a few off the end that duplicate grad school. That might have to be a political decision in response to the high cost of education. Shaving off a year could help you avoid the family issue entirely, especially if you worked for a year or two before returning to grad school.

Similarly, another proven model is that senior year of high school can also be used to reduce the duration of college. You can already get college credit, but it's a very fragmented and uneven system. State governments could fix that with policy, again to address the high cost of education.


I think the system is naturally starting to feel the pressure, though it took too long (and may have been expedited by COVID at that). Just look at the number of articles about a postdoc shortage that are cropping up in major journals within the last couple of years. The NIH recently had a "request for information" from late-stage PhD students and early postdocs about what could improve their opinions about doing/staying in a postdoc. Anecdotally, many in biology are feeling the heat, and I've heard similar rumors in some other departments.

That said, I don't really trust the funding agencies and department heads to do the right things that will actually improve the long-term health of the system. I think we'll end up with a shitty band-aid solution instead like giving postdocs mediocre raises. It could take a really long time for the system to truly crumble enough to force change.

We'll see though, I hope things take a turn for the better in the coming years because certainly I know many PIs that are frustrated for one reason or another. If you get enough profs on the same page it could have some actual power.


What did PIs do before post-docs existed, and why can't they do it again?


Work.

They could, but why?


> As someone that got married during their PhD and is now finishing up and reflecting: it isn't worth it. Neglecting the people in your life that love you in the pursuit of "science" is no different than neglecting them because you have a drug habit or because you're an inveterate gambler or because you're chasing fame and fortune. The science will get done regardless and the only thing you accomplish is ensuring your own interests. Suffice it to say I'm taking steps to make sure my next gig has more room.

The answer is simple and obvious: be alone. It won't work for everyone but it works for me. Valuing your time means having to say no to intrusions and impositions.


It's been mentioned several times here, in addition to discussions outside of HN.

https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...


I strongly encourage you to take a closer look at each of the algolia results.


Being commented on isn't the same as being criticized. If you meant that only that one sentence must be quoted to count, though, then yes, you are right that this is an innovation on Hacker News.


Every single algolia response quotes the enclosing paragraph and not a single one speaks to the sentence I pulled out.

Speaking of innovation on hn, this is definitely innovative levels of pedantry.


Thanks! That's hard to accomplish here.


What if every scientist had said this? We’d be living in mud huts and dying at 35. Progress is important. Nobody personally has to commit to it but as a society we need some people to.


Look up the percentage of current postdocs that will receive a tenure track position at a major research university. Then cut that number in half after 5 more years of ridiculous grind as an assistant prof. You're looking at less than 20% chance even if you're willing to move to bumfuck. If you have location restrictions or higher standards on university (which impacts e.g. the amount of funding you can realistically get), it's approaching low single digits in most fields.

Modern academic science is really its own special breed, and unfortunately a lot of that grinding these days goes to fighting the crazy career ladder, not so much directly to obsessive, curious science.


Not only was there neglect, but I also noticed they had no kids (not sure by choice or whatever).

Hamming definitely made lots of sacrifices to become famous. Although, if you look at his achievements, they were actually all done early in his career rather than late.


And, honestly, the science many of us do isn't impressive enough to be worth it.


> it isn't worth it. Neglecting the people in your life that love you in the pursuit of "science" is no different than neglecting them because you have a drug habit or because you're an inveterate gambler or because you're chasing fame and fortune.

It depends; the neglect during studies is a temporary thing! All those other things you mention are permanent things!

There's very little similarity between "temporarily neglecting family to provide better opportunities to my kids"[1] and "permanently neglecting family to get a high".

[1] Make no mistake, you can introduce your kids to better opportunities in a social circle of PhDs (or MBAs, etc) than in a social circle of whoever your current colleagues are.


I agree that this issue has never been talked about enough. I am doing a PhD now and recently went through a breakup which mostly happened because I could not give enough time to the relationship.


How are your future prospects looking? How have your goals shifted? I'm curious more specifically what kind of job titles you'll be looking for, if you have any in mind.


In contrast, grad school allowed me enough flexibility on how and where I worked that my wife was able to pursue her career ambitions without me having to compromise my own.


I think there is an enormous survivorship bias here. Research advances come in alternating periods of torrents and droughts, not in a steady trickle. Hamming, Shannon, and their contemporaries worked at a time of revolution in computer science.

Let's put deep learning aside for a second. Can anyone here name a single academic in any other subfield of computer science who had a string of groundbreaking research works where every one of those works is from 1980 or after? I can't. And that's almost half a century that we're working with. Many people who defended their PhDs in the 1980s are already retired.

This applies to plenty of other fields as well. Medicine, for instance: There were surgeons in the 1960s and 1970s who have half a dozen new surgical techniques to their name. (If you want some examples, look up Michael DeBakey, Mark Coventry, and Thomas Starzl.) You will not find a single academic surgeon today who comes even close to that, and at least in my opinion it's not because modern surgeons lack drive or creativity or the ambition to tackle big problems.

Hamming is probably right that the role of luck in the sense of serendipity is overblown, but he ignores the role of luck in the sense of structural factors that are out of your control, because he and his colleagues largely operated in a context where all those structural factors were completely optimized.

If you decide to start a PhD in a field 10 years before that field is destined to hit a multi-decade rut, if your advisor fails to make tenure partway through your PhD and your lab is shut down, if you get your first academic appointment in an environment where research funding is trending down and regulatory burdens are trending up, these are all factors that are by and large out of your control that can significantly alter the trajectory of your career.


Related: I’m annoyed by programmers who implore others to study “worthwhile things.”

I’m guessing most of us were obsessed with computers in our teens, and the job market just happened to align with one of our hobbies.

Kids who are fascinated with classical piano or performance art don’t stumble into lucrative careers unless they’re in the top 0.1%.


It's also something that can be turned on programmers. One of the things I'm told the most by computational students wanting to join my group is that they want to do "something worthwhile" instead of going to work for Facebook.


> Let's put deep learning aside for a second. Can anyone here name a single academic in any other subfield of computer science who had a string of groundbreaking research works where every one of those works is from 1980 or after?

Shafi Goldwasser (Probabilistic encryption, zero-knowledge proofs, etc.) got her PhD in 1984.

Leslie Lamport is just a bit before your deadline, his 'Time, Clocks' paper came out in 1978, but the bulk of his work was after 1980, including paxos.

While there's definitely some truth to your Kuhnian view of 'times of revolution' in a field, I think it's hard to apply that to recent progress because it may just be that it's not clear which research works were groundbreaking without the benefit of hindsight. To me, the revolutionary period of CS research is still ongoing.


Almost every breakthrough in distributed systems is more recent. LZW is 1980s. Most data storage systems used today had their guts invented in the last 40 years, with only the top layer being older than that.

If you look at "Computer science" as the narrowly-defined field of data structure and algorithm design in a vacuum, maybe things slowed down after 1980, but that's because problems with different constraints just became more interesting.


Indeed. The fact of the matter is that today’s environment for researchers is completely different from what it was like in 1986 when this talk was given, though I mention it not to take anything away from this talk. The days of places like Xerox PARC and Bell Labs where researchers had considerable freedom and autonomy are over and have been for quite some time now.

Many industry labs promote business-driven research instead of purely curiosity-driven work and expect their researchers to produce a regular flow of research results that can be productized, lest they be shown the door. Academia these days isn’t exactly a bastion of freedom, either, with its “publish-or-perish” pressures needed to secure tenure and remain in good standing, as well as the need to raise grant money, both of which require pleasing external reviewers.

Thus, successful researchers under this environment must find a way to manage the inevitable uncertainties of research while also producing enough output to make their evaluators happy. I find these “productivity” metrics stifling, but if I want to continue as a researcher, it’s either play the game successfully or find another way to make a living.

I’ve been thinking long and hard about this for years. Perhaps researchers who value freedom of inquiry and freedom from “publish-or-perish” pressures could work independently, perhaps being funded by fellowships (like the MacArthur Grant), from part-time work, or from a side business.


i agree! hamming even talks about this a bit in his talk

your chances of doing great research are slim if you're born in kenya in the 01940s, no matter how much drive and creativity you have, but that's not because 'research advances' have 'droughts', it's because you're not being allowed to do great research, even if plenty of research advances are happening out of your reach

(unless you're richard leakey)

that's what's happened to modern surgeons, nuclear engineers, aeronautical engineers, chemists, etc. the usa and eu today have mostly become the kenya of the 01940s, where only the occasional richard leakey is allowed to excel, and thus we have entered the so-called great stagnation

but restrictions are much looser in computer science than in most fields. we still have:

- oleg kiselyov (sxml, probabilistic programming, relational programming and thence the reasoned schemer, type checking as small-step abstract evaluation, tagless staged interpreters, stream processing without runtime overhead)

- dan bernstein (twisted edwards curves including curve25519, breaking aes with cache timing attacks, nacl, tweetnacl, qmail, salsa, poly1305, striking down usa export controls, organizing the pqc conferences)

- fabrice bellard (qemu, ffmpeg, tcc, lzexe, and bellard's formula, aside from nncp, which is deep learning)

- jeff dean and sanjay ghemawat (epi info, leveldb, tensorflow, bigtable, mapreduce, spanner, 'the google cluster architecture', lots of internal google stuff, and again a bunch of deep learning stuff)

- raph levien (io, advogato, the gnome canvas, libart)

- graydon hoare (monotone and thence git, rust), and

- rob pike (blit, utf-8, much of plan9, sawzall, and golang)

- wouter van oortmerssen (cube/sauerbraten, flatbuffers, amiga e, false and thus more or less the field of esolangs, lobster, bla with first-class environments, aardappel, fisheye quake)

i've left out the people i personally know well


I disagree with this sentiment.

Although I'm not familiar with today's groundbreaking research, I think dismissing an entire branch of research simply because it goes against one's argument is a little funny. However, Hamming specifically states in his lecture that this could be applied to engineers starting companies, and there are numerous examples of founders who have started multiple successful businesses in this area.

In the lecture series, Hamming explicitly emphasizes the importance of dedicating time to predicting what the next big step in one's field would be. He dedicated half of every Friday to trying to predict the future of computer science. One of the major points he drives home is that the next breakthrough in any field will not be what he or anyone else has worked on in the past. That is why it is worth striving to work on the right things.


> I think dismissing an entire branch of research simply because it goes against one's argument is a little funny

I don't think it goes against my argument at all. Deep learning is a place where many structural factors are currently optimized, so it has a glut of prominent researchers. The big names in deep learning like Hinton, LeCun, and Bengio took their first post-PhD appointments 30 to 40 years ago. All three put out some good work between 1985 and 2010, but all three became an order of magnitude more productive after 2010. They went from being members of a mostly irrelevant and overlooked research niche (neural networks were considered a dead end in ML) to being heads of research at multi-billion dollar companies. If it is really all about the individual, why weren't they that productive from the start? What changed? Well, the number of GPU cores hit an inflection point. Technologies for GP-GPU programming like CUDA reached maturity.

As I said, there is truth to what Hamming said too. The 1920s were primed for a revolution in physics, but there were thousands of physicists contemporary to Einstein, de Broglie, Heisenberg, and Bohr who didn't do anything of note. So what set the ones who won Nobel prizes apart from the rest? That's where Hamming's advice comes in.

In other words, I can be convinced that Einstein or de Broglie would have reached the tops of their fields in any time or place and in nearly any field. But you cannot convince me that they would have made the history books if they were marine biologists in the 1990s. It doesn't matter how hard you think about where the next revolution in your field will be if there is no revolution to be had.


The survivorship bias here reflects the positive-results bias in research itself. One neither wants to nor can publish negative results, despite their potential to be more valuable than massaged and hacked positive results.

Not only is this unfair to unlucky research paths, but it also breeds a social media-like positive feedback loop wherein the individual researcher feels alone in failure.


> wherein the individual researcher feels alone in failure.

When, thanks to them not publishing their failure, there will probably be a few others who fail the exact same way on the exact same research idea.


> Let's put deep learning aside for a second. Can anyone here name a single academic in any other subfield of computer science who had a string of groundbreaking research works where every one of those works is from 1980 or after?

That's because of the nature of the field. CS researchers are kind of like NPCs or bureaucrats. The technical stuff we do is not interesting to the general public, but it enables other people to do more interesting stuff.

You mention deep learning as an exception. I, as a researcher in another CS subfield, cannot name a single person who has done fundamental work in it.

The "tragedy" of CS is that it's too relevant in the short term. If someone makes a breakthrough, other people will probably commercialize it in a decade or two. Afterwards, history books won't remember the person who discovered the thing but the company that commercialized it or the product launched by the company.


> The technical stuff we do is not interesting to the general public, but it enables other people to do more interesting stuff.

People upthread said a lot of scientific research is useless, but my understanding was that research eventually builds on top of other research in a lot of cases?


Deep in the trenches a lot of research these days is actually just pretty fucking useless. There are many, many results that unfortunately do not replicate in even slightly different situations. It's a very inefficient way to be searching the scientific space so to speak, and it's a more modern phenomenon. We've scaled (or tried to) science in IMO a really bad way. At least in neurobiology I'm very convinced of that.


"Can anyone here name a single academic in any other subfield of computer science who had a string of groundbreaking research works where every one of those works is from 1980"

Hennessy and Patterson for their work on RISC. Patterson for RAID. Dean and Ghemawat for MapReduce.

That's without thinking. There have been a lot of very important works since the 1980s.


Lemire. SPJ. John Hughes. Neil Mitchell. And I did not even try hard, these just instantly came to mind...


Here's a recording from '95 entitled the same: https://www.youtube.com/watch?v=a1zDuOPkMSw

I also highly recommend reading the Stripe Press book also with the same name: https://www.amazon.com/dp/1732265178


I came to also recommend the book. The last half is about specific engineering topics, but the first half is packed with great ideas and thoughts that anyone will find valuable.

It's also a beautiful and well-made book.


Better video quality, same source material: https://www.youtube.com/watch?v=e3msMuwqp-o&list=PLctkxgWNSR...


This (deservedly) famous article by a legend is a must-read but it is also like having Michael Jordan or Michael Phelps teaching how to be successful in basketball or swimming. What worked for them won't work for almost anyone else!

FWIW, I was fortunate to do a postdoc at Bell Labs in the early 90s. The established and successful researchers, mostly men, would all talk about the sacrifices their wives made so that they could focus on meeting the insanely high expectations of the place in its glory days. It was the custom back when they made their reputations. The younger permanent staff members were leery of this because social mores and the nature of the research environment were both changing. The freewheeling pre-divestiture days were gone by that time and there was a pressure to be relevant to the needs of the corporation. The management still wanted publications in prestige journals and an international reputation by 35 but couldn't say how to do all that while satisfying the relevancy mandate. The stress on the younger permanent staff was palpable but they and everybody else made life delightful for us short-term appointees - it was all the things Universities ought to be.

At the end of each Science Magazine issue is a "Working Life" feature where a (typically) young person writes about problems s/he is facing in launching a career. Work/family life balance is a consistent theme and more importantly an /expectation/ even for those on the tenure-track at high powered places. The times they have a-changed.


Do you have any idea of where a young person might find that sort of environment today? Did you by chance work with Odlyzko there?


I've been immersed in a very applied corner of industry for nearly 20 years now and have lost touch with the world of basic research. My sense (and I have no proof to offer) is that the old Bell/IBM/DuPont/GE/... era of industrial labs with a healthy exploration culture is gone and won't come back in my lifetime.

- Seemed that Google had that spirit at one time with its X Lab and also the Google Labs that put out fun products, but all I read here and elsewhere is that it is in the past

- The Janelia Research Campus may be carrying on the spirit of small, collaborative groups led by a hands-on scientist (there were no empires at Bell Labs.) Could be a very good place to be a young student that has an interest in biology, instrumentation, and/or computation. See what they expect from the lab heads here:

https://www.janelia.org/our-research/our-labs/group-leaders

/excerpt "Group Leaders have time to focus their energy and creativity on research with the freedom to pursue long-term difficult projects through collaborative, interdisciplinary work. They are not evaluated solely on conventional academic measures, such as publishing. Rather, Group Leaders will be expected to make progress in their research areas and make contributions to the work of other Janelia scientists through collaboration, constructive criticism, or mentoring." /excerpt

One of the guys I did know personally wound up at Janelia and eventually at Stockholm.

I was on the other side of the very long Murray Hill building in the physical sciences area. I went to one of Odlyzko's presentations at the Journal Club circa 1992 or 1993 - I think it was on an important new result in prime numbers/factorization/encryption. I never met him, let alone worked with him, though. I do remember one of my senior colleagues (physical scientist) wondering why any of that stuff mattered. My response was that it was probably much more important than anything he or I were doing, had done, or would ever do. I had read something in Byte or a similar popular magazine about primes and encryption, and it seemed like the field had serious practical implications on top of being incredibly interesting. One of the few correct predictions in my career!

Addendum: On re-reading this, I realized that I am taking a very US-centric perspective. The spirit/culture I was talking about could exist in other countries.


One of the points of the talk is its emphasis on working on "important" problems. It does make sense not to work on incremental things merely for publications, so it is definitely good advice. However, once you decide to do so, what is "important" becomes a difficult question.

I like this article from Daniel Lemire which explores this further

https://lemire.me/blog/2010/03/22/so-you-know-whats-importan...


The way I’ve always interpreted this is that if even you don’t think it’s important, why are you doing it?

Sure you can’t accurately predict what will and won’t be important long term, but you should think your work is important. Whether you’re right or wrong time will tell.


Yes, my interpretation is that this is more advice on what not to do - frivolous stuff you don't believe in


> It does make sense not to work on incremental things merely for publications, so it is definitely good advice.

True, but unfortunately this is how funding works, based on what I saw during my time working at various labs.

You need to provide enough evidence that what you are after is going to "work". Most of the brand new ideas get resources by repurposing data from existing funded projects. If you don't have what you need, you finagle the funded project to produce the data the new idea needs.

I'm all for not sinking resources into crazy ideas that will never work, but the current state of affairs (it's getting worse) is too conservative.


It is important to point out that in his speech (and the book) he tells people to work on important problems (always in the plural), and that he never says not to work on non-important ones.

In fact, I remember the book having a very clear assumption that you can't work on important things all the time anyway. But well, it has been some time since I've read it. Still, it is very sensible and nuanced advice for highly ambitious people.


It's not surprising this is from the early 1980s, right around the time Bayh-Dole was passed and before rampant corporatization of science really took hold, before the generation of profitable patents became the most important concern of leading public research institutions like MIT and the University of California.

That, I suppose, is why the concept of 'academic fraud' doesn't appear once in the talk, which is otherwise full of good general advice. However, fraudulent science is now a major problem in academia, and one of the worst cases took place some 15 years after this publication, at Bell Labs itself:

https://en.wikipedia.org/wiki/Sch%C3%B6n_scandal

If you're working in a lab and find the PI is fabricating data on a regular basis in order to get publications into journals, or so that they can be the first to claim a patent for a particular process, well, what's to be done? Hamming's comment doesn't really work in that case:

> "Another fault is anger. Often a scientist becomes angry, and this is no way to handle things. Amusement, yes, anger, no. Anger is misdirected. You should follow and cooperate rather than struggle against the system all the time... Another thing you should look for is the positive side of things instead of the negative."

Perhaps there was a Golden Age of Science in the past, when academic fraud wasn't fairly common and scientists weren't also entrepreneurs looking to market their dubious patents to large corporations for a percentage of the profits?


It's funny, I was just thinking about this article this morning. When I first read it over 10 years ago, this quote really stuck with me:

> Another personality defect is ego assertion and I'll speak in this case of my own experience. I came from Los Alamos and in the early days I was using a machine in New York at 590 Madison Avenue where we merely rented time. I was still dressing in western clothes, big slash pockets, a bolo and all those things. I vaguely noticed that I was not getting as good service as other people. So I set out to measure. You came in and you waited for your turn; I felt I was not getting a fair deal. I said to myself, ``Why? No Vice President at IBM said, `Give Hamming a bad time'. It is the secretaries at the bottom who are doing this. When a slot appears, they'll rush to find someone to slip in, but they go out and find somebody else. Now, why? I haven't mistreated them.'' Answer, I wasn't dressing the way they felt somebody in that situation should. It came down to just that - I wasn't dressing properly. I had to make the decision - was I going to assert my ego and dress the way I wanted to and have it steadily drain my effort from my professional life, or was I going to appear to conform better? I decided I would make an effort to appear to conform properly. The moment I did, I got much better service. And now, as an old colorful character, I get better service than other people.

> Many a second-rate fellow gets caught up in some little twitting of the system, and carries it through to warfare. He expends his energy in a foolish project. Now you are going to tell me that somebody has to change the system. I agree; somebody has to. Which do you want to be? The person who changes the system or the person who does first-class science? Which person is it that you want to be? Be clear, when you fight the system and struggle with it, what you are doing, how far to go out of amusement, and how much to waste your effort fighting the system. My advice is to let somebody else do it and you get on with becoming a first-class scientist. Very few of you have the ability to both reform the system and become a first-class scientist.

I tried to live that way for a couple of years. Frankly, I think this way of living is unnecessarily restrictive. So what if you get slightly worse service? The clothes you wear and your expressions highlight your history and your culture. By self-censoring, you end up just perpetuating the censorship of other views in the workplace. This goes double for scientists, as we are rather public-facing and have room for wearing nontraditional clothes within our jobs.


I’ve always had trouble balancing between working on things that I consider to be important vs things that I am excited about. In my experience I’ve always gotten good results from the things I’m excited about and not what I consider to be important.

Should I consider what I feel excited about to be important? Personally, sure, but objectively speaking the world might not think that way.


What makes you excited about something in the case when it's not important?


> I have now come down to a topic which is very distasteful; it is not sufficient to do a job, you have to sell it. `Selling' to a scientist is an awkward thing to do. It's very ugly; you shouldn't have to do it. The world is supposed to be waiting, and when you do something great, they should rush out and welcome it. But the fact is everyone is busy with their own work. You must present it so well that they will set aside what they are doing, look at what you've done, read it, and come back and say, ``Yes, that was good.'' I suggest that when you open a journal, as you turn the pages, you ask why you read some articles and not others. You had better write your report so when it is published in the Physical Review, or wherever else you want it, as the readers are turning the pages they won't just turn your pages but they will stop and read yours. If they don't stop and read it, you won't get credit.

> There are three things you have to do in selling. You have to learn to write clearly and well so that people will read it, you must learn to give reasonably formal talks, and you also must learn to give informal talks. We had a lot of so-called `back room scientists.' In a conference, they would keep quiet. Three weeks later after a decision was made they filed a report saying why you should do so and so. Well, it was too late. They would not stand up right in the middle of a hot conference, in the middle of activity, and say, ``We should do this for these reasons.'' You need to master that form of communication as well as prepared speeches.

I ran across this speech about 5 years into my software development career. These two paragraphs were a MAJOR game changer. Until then I believed that no more than a minimum of soft skills was necessary as a developer and that my code would speak for itself.

After reading Hamming's words I started putting much more effort into things like presentations and status emails. The results were not immediate and I am not exactly "next VP" but I am definitely in a stable and comfortable position in my current company.

More importantly, I am not frustrated when I write some cool-new-tool and nobody takes an interest. It almost certainly means there's nothing wrong with cool-new-tool and that I need to better communicate what it is and why it's useful. (Related: Hamming's point about working on an important problem... your cool-new-tool may not actually be solving anything important).



Richard Hamming used to travel around and give that talk at every university he visited. I was lucky enough to hear him speak during my PhD studies around 1980.


A perennial. Here are the threads with comments (if anyone finds others, please let me know!):

You and Your Research (1986) - https://news.ycombinator.com/item?id=31796353 - June 2022 (33 comments)

You and Your Research – Richard Hamming - https://news.ycombinator.com/item?id=27451360 - June 2021 (1 comment)

You and Your Research - https://news.ycombinator.com/item?id=25242617 - Nov 2020 (1 comment)

Richard Hamming: You and Your Research (1986) - https://news.ycombinator.com/item?id=24171820 - Aug 2020 (1 comment)

You and Your Research – A talk by Richard W. Hamming [pdf] - https://news.ycombinator.com/item?id=23558974 - June 2020 (1 comment)

You and Your Research by Richard Hamming (1995) [video] - https://news.ycombinator.com/item?id=18505884 - Nov 2018 (10 comments)

You and Your Research (1986) - https://news.ycombinator.com/item?id=18014209 - Sept 2018 (10 comments)

You and your research - https://news.ycombinator.com/item?id=14179317 - April 2017 (1 comment)

You and Your Research, by Richard Hamming - https://news.ycombinator.com/item?id=10280198 - Sept 2015 (1 comment)

You and Your Research - https://news.ycombinator.com/item?id=9279585 - March 2015 (1 comment)

Hamming, "You and Your Research" (1995) [video] - https://news.ycombinator.com/item?id=7683711 - May 2014 (25 comments)

Video of Hamming's "You and Your Research" (1995) - https://news.ycombinator.com/item?id=5567448 - April 2013 (1 comment)

Richard Hamming: You and Your Research (1986) - https://news.ycombinator.com/item?id=4626349 - Oct 2012 (27 comments)

Richard Hamming: You and Your Research - https://news.ycombinator.com/item?id=3142978 - Oct 2011 (7 comments)

You and Your Research - https://news.ycombinator.com/item?id=915515 - Nov 2009 (5 comments)

Richard Hamming - You and your research - https://news.ycombinator.com/item?id=852405 - Sept 2009 (1 comment)

You and Your Research - https://news.ycombinator.com/item?id=625857 - May 2009 (13 comments)

You and Your Research (1986) - https://news.ycombinator.com/item?id=542023 - April 2009 (4 comments)

You and Your Research - https://news.ycombinator.com/item?id=524856 - March 2009 (1 comment)

Richard Hamming: You and Your Research - https://news.ycombinator.com/item?id=229067 - June 2008 (7 comments)

Why do so few scientists make significant contributions and so many are forgotten in the long run? - "You and Your Research" - https://news.ycombinator.com/item?id=52337 - Sept 2007 (11 comments)

You and Your (Great) Research - https://news.ycombinator.com/item?id=13218 - April 2007 (6 comments)

---

Note for anyone wondering: reposts are ok after a year or so (https://news.ycombinator.com/newsfaq.html). I'll play the "or so" card and call this one ok. In addition to it being good for curiosity to revisit perennials sometimes (just not too often), HN is also a place for junior cohorts to have the pleasure of encountering the classics for the first time—an important function of the site!


Also perennial are the comments linking the videos/books... They have to, because the page hasn't been updated in >21 years! (It's also increasingly ugly, and would benefit from explanations of the increasingly obscure references that may have made plenty of sense to Bell Labs people in 1986 but don't now - did you know he meant Unix by 'the computer in the attic'? I sure didn't.)

That is part of why I've made my own cleaned-up & annotated version: http://gwern.net/doc/science/1986-hamming


Thanks for sharing this. I came across the same material in a different format (either slides or maybe a video a couple of years back, can't exactly remember). What stuck with me is "If you are not working on an important problem, your work is not important". I tried to take a positive spin on this statement by considering whatever work I am currently doing important enough to solve an important problem, but it's really hard to convince yourself after some time when it is clearly not. Needless to say, I am still looking for that important problem.


A favorite of mine! My friend Oz (of teachyourselfcs.com) and I chatted about this talk in the first episode of our show - focusing on how Oz has seen its influence and impact on the early/mid-career software engineers he's taught: https://show.csprimer.com/episodes/e1-doing-meaningful-work


I heard him give this talk at Ames back in the 80s. The advice and perspective are worth following even if you don’t plan to be a research scientist.


Get the book! It's packed with great content.


Any recommendations on similar materials? I.e. advice on how to manage research.


"How to Take Smart Notes" by Sanke Ahrens.

Even if you decide not to use the Zettelkasten method, this book is still a great read.


Also, any audiobooks about similar topics that are comfortable to listen to?


Currently reading The Sciences of the Artificial by Herbert A. Simon



