
HN's article submission/promotion algorithm is shit. If I had to guess, it gives priority to articles submitted by certain users (either based on fake internet points or ??).



Well, I submitted it, and I have not noticed any favouritism in the past :-)

I suspect it's some mix of re-submitting affecting timestamps, but I am not sure, which is why I asked. It's not terribly important.

I think the problem to be solved here is hard, though, and I don't know of any obvious solutions, so perhaps "shit" is too harsh.


It is shit. Duplicate stories (the exact same article and URL) are sometimes allowed through, while at other times the check actually works and blocks them.

Oftentimes you see a submission for an interesting article languish in the 'new' section, then another user re-submits the exact same thing ('duplicate detection' algo at work, lol) and it's on the front page in minutes. Whether that user is a 'power user', besties with dang, lucky, or ???..
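
For what it's worth, the HN FAQ says reposts are allowed if a story hasn't had significant attention in the last year or so, so one purely speculative reading of this behaviour is a time-windowed check rather than favouritism. A minimal sketch of that idea — the function name, window length, and "significant attention" threshold are all assumptions, not HN's actual implementation:

    # Speculative sketch of a time-windowed duplicate check.
    from datetime import datetime, timedelta

    REPOST_WINDOW = timedelta(days=365)   # hypothetical window
    SIGNIFICANT_POINTS = 20               # hypothetical attention cutoff

    def is_duplicate(url, previous_submissions, now=None):
        """Return True if `url` should be rejected as a duplicate.

        `previous_submissions` is a list of (submitted_at, points)
        tuples for earlier submissions of the same URL.
        """
        now = now or datetime.utcnow()
        for submitted_at, points in previous_submissions:
            recent = now - submitted_at < REPOST_WINDOW
            got_attention = points >= SIGNIFICANT_POINTS
            # Block only if a recent submission already got attention;
            # a story that languished is allowed another shot.
            if recent and got_attention:
                return True
        return False

Under that model, a resubmission of a story that languished in 'new' sails straight through, which matches what people observe.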


I think writing out your vision of the submission process would take more effort than whatever is probably there.

The algo is presumably a straight-up string compare with some URL decoding, but any number of idiotic GET key-value pairs can screw that up.
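
To illustrate the failure mode (a toy sketch, not HN's code): a naive compare treats the same article with and without tracking parameters as two different URLs, and the obvious fix of dropping the query string over-merges pages that legitimately differ only by query.

    # Toy illustration of why a plain string compare is fragile.
    from urllib.parse import unquote, urlsplit

    def naive_key(url):
        # "string compare with some url decoding"
        return unquote(url.strip().lower())

    def stricter_key(url):
        # Drop the query string and fragment entirely — one possible fix,
        # but it wrongly merges pages that differ only by query params.
        parts = urlsplit(unquote(url.strip().lower()))
        return (parts.scheme, parts.netloc, parts.path.rstrip("/"))

    a = "https://example.com/story?id=42"
    b = "https://example.com/story?id=42&utm_source=twitter"

    print(naive_key(a) == naive_key(b))        # False: tracking param defeats the compare
    print(stricter_key(a) == stricter_key(b))  # True, but ?id=42 and ?id=43 would also merge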

I find Occam's razor the best explanation for these kinds of things, so I suggest not taking it to heart too much.


> The algo is presumably a straight-up string compare with some URL decoding

Well, except that in some cases it fails to match two strings that look identical, so there must be some other 'secret sauce' in the algorithm that makes it unnecessarily complicated and error-prone.
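
Or the strings only look identical. A quick illustration (the URLs are made up) of pairs that read the same to a human but fail a byte-for-byte compare:

    # Pairs that look like the same URL but are not equal as strings.
    pairs = [
        ("https://example.com/post",  "https://example.com/post/"),   # trailing slash
        ("https://Example.com/post",  "https://example.com/post"),    # host case
        ("https://example.com/a%2Fb", "https://example.com/a%2fb"),   # percent-encoding case
    ]

    for a, b in pairs:
        print(a == b)  # False every time, despite looking "identical"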


> string compare with some URL decoding

Perhaps page body edit-distance would work better?
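
A hedged sketch of that idea (again, nothing to do with HN's actual code): fetch both pages, strip markup, and compare the normalized text. True Levenshtein distance on full page bodies is quadratic, so difflib's similarity ratio (or shingling/MinHash) is the more practical stand-in; the 0.9 threshold is an arbitrary choice.

    # Sketch of duplicate detection by page-body similarity rather than URL.
    import difflib
    import re
    from urllib.request import urlopen

    def page_text(url):
        # Crude text extraction: fetch the page, strip tags, collapse whitespace.
        html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
        text = re.sub(r"<[^>]+>", " ", html)
        return re.sub(r"\s+", " ", text).strip().lower()

    def looks_like_duplicate(url_a, url_b, threshold=0.9):
        # SequenceMatcher.ratio() is a cheap proxy for edit distance.
        ratio = difflib.SequenceMatcher(None, page_text(url_a), page_text(url_b)).ratio()
        return ratio >= threshold

The obvious cost is that every submission now requires fetching and comparing page bodies instead of a single string lookup.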




