
I wish there were some way to express metadata like 'best paper' from trusted sources in a form web crawlers could pick up. This is an excellent compilation, and I've added a half dozen papers to my tablet for later reading, but it's a human compilation. My thought is whether there might be an opportunity to flag papers such that a web crawler could automatically compile this sort of list.

Challenges I see to that would be spam injection and author spoofing.
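For what it's worth, one possible shape for that kind of flag already exists in structured-data vocabularies: schema.org has a ScholarlyArticle type with an "award" property that a page could embed as JSON-LD. The sketch below is only an illustration, not a proposal anyone has adopted: the embedded page and paper name are made up, and nothing in it addresses the trust problem (anyone can claim an award in their own markup). It just shows how a crawler might harvest such annotations using only the Python standard library.

```python
# Minimal sketch: extract hypothetical "best paper" annotations from
# JSON-LD blocks in a page. The PAGE string and its contents are
# invented for illustration; real crawlers would still need to verify
# the claim against a trusted source (the conference site, say) to
# resist the spoofing problem mentioned above.
import json
from html.parser import HTMLParser

PAGE = """
<html><head>
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ScholarlyArticle",
  "name": "Some Systems Paper",
  "award": "Best Paper Award (hypothetical example)",
  "publisher": {"@type": "Organization", "name": "Example Conference"}
}
</script>
</head><body>...</body></html>
"""

class JsonLdExtractor(HTMLParser):
    """Collects the contents of application/ld+json <script> blocks."""
    def __init__(self):
        super().__init__()
        self._in_ld = False
        self._buffer = []
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_ld = True

    def handle_endtag(self, tag):
        if tag == "script" and self._in_ld:
            self.blocks.append(json.loads("".join(self._buffer)))
            self._in_ld = False
            self._buffer = []

    def handle_data(self, data):
        if self._in_ld:
            self._buffer.append(data)

parser = JsonLdExtractor()
parser.feed(PAGE)
awarded = [b for b in parser.blocks
           if b.get("@type") == "ScholarlyArticle" and "award" in b]
for article in awarded:
    print(article["name"], "-", article["award"])
```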




Actually, when you "search" for papers on a particular topic, you need more information than which one won best paper; the best paper on your topic probably isn't the one that got distinguished at a conference that covers multiple topics!

Citation rank is one way to evaluate a bunch of papers on a particular topic, and it's definitely one of the best ways to find related papers in the first place. However, citations are fairly easy to game (cite yourself, get your colleagues to cite your paper) and therefore are not a great indicator of quality/influence/impact. Next, the venue of the conference is important; an OSDI paper will probably be pretty good given the low acceptance rate. But you can find lots of noise even at the best first-tier venues; it's not that hard to get published (in CS), and many rising academics will flood the system with papers to make their tenure case stronger (where the tenure committee is not composed of peers in the field, sheer numbers + conference rank are very important).
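To make "citation rank" concrete, here is a minimal sketch of PageRank run over a toy citation graph. The graph and scores are made up; the self-citing node is there to show how easily the score can be nudged by exactly the kind of gaming mentioned above.

```python
# Minimal sketch: PageRank over a citation graph. "cites" maps each
# paper to the papers it cites; rank flows from citing papers to cited
# ones. The example graph is invented for illustration.
def citation_rank(cites, damping=0.85, iterations=50):
    papers = set(cites) | {p for refs in cites.values() for p in refs}
    rank = {p: 1.0 / len(papers) for p in papers}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(papers) for p in papers}
        for paper, refs in cites.items():
            if not refs:
                continue  # papers that cite nothing handled below
            share = damping * rank[paper] / len(refs)
            for cited in refs:
                new_rank[cited] += share
        # rank "leaked" by papers with no outgoing citations is
        # redistributed uniformly so the scores keep summing to 1
        leaked = 1.0 - sum(new_rank.values())
        for p in papers:
            new_rank[p] += leaked / len(papers)
        rank = new_rank
    return rank

graph = {
    "A": ["B", "C"],   # A cites B and C
    "B": ["C"],
    "C": [],           # C cites nothing
    "D": ["D"],        # self-citation: ends up ranked above A
}
for paper, score in sorted(citation_rank(graph).items(),
                           key=lambda kv: -kv[1]):
    print(paper, round(score, 3))
```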

Sometimes you'll also find a gem in a dissertation from 10 years back by someone who didn't really focus on publishing: if you don't pursue an academic career, you don't have much motivation to publish broadly. Or a good paper that was never cited at all, on a topic that was then just emerging or way before its time. The quality of my own papers (as judged personally and subjectively) is inversely related to their citation counts.



