Hacker News
Craigslist's Increasingly Complicated Battle Against Spammers (techdirt.com)
16 points by chaostheory on May 24, 2008 | 11 comments



Garrett Hardin's Tragedy of the Commons.

http://en.wikipedia.org/wiki/Tragedy_of_the_commons

Or my short restatement: Everything that's free is abused.


Hrm. I'm inclined not to see it that way, since Craigslist is most certainly not a 'commons' in the sense of a resource owned by no one, which is the whole point of "the tragedy of the commons".

CL are doing their damnedest to stop these people; it's just that there are a lot more spammers than there are CL staff.


It doesn't matter if it's really a commons, or just something that looks like a commons. If someone owns an empty lot that everyone in the neighborhood treats as a commons, then the same rules apply. Without a constant effort to clean them, those places get just as crappy as public parks in exactly the same way. In practice, there's little difference between everyone owning something, and everyone getting to use it for free.


This affects Massively Multiplayer Online games as well as Web 2.0 sites of all kinds.


Pretty impressive how hard some people will work to fuck things up for everyone just to make a couple of dollars (and some even do it for free!).


It shouldn't surprise you. The search for ways to make $$$ is a part of seeking $$$. (The meta-level, if you will.)

As for the people who ^%&#$& things up for free, they are seeking to profit in non-monetary ways.


Craigslist seems like it's willing to do a lot of work to avoid the simple solution of just charging a little for postings.


It doesn't seem like it would be very hard for Craigslist to win the fight against bot posting. Some of the more sophisticated captchas are pretty hard to crack, and they could make people perform different random tasks instead of the same captcha every time.


The more sophisticated and less automatable techniques have higher overhead and degrade the user experience. You don't want to ask users to perform 10 special actions just to prove they're human. And if you pick those 10 actions at random from a bank of 100 action types, the spammers will reverse-engineer each type of action individually and then build a function that maps whatever the user interface displays to the corresponding action.
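That reverse-engineering attack can be sketched in a few lines. This is a hypothetical illustration (the challenge types and solver names are invented): once each type in the bank has been cracked once, a dispatcher makes the random selection irrelevant.

```python
import operator

OPS = {"+": operator.add, "-": operator.sub}

def solve_reverse(payload):
    # Invented challenge type: "type this word backwards".
    return payload[::-1]

def solve_arithmetic(payload):
    # Invented challenge type: "what is 3+4?" with payload like "3+4".
    a, op, b = payload[0], payload[1], payload[2]
    return str(OPS[op](int(a), int(b)))

# Each challenge type only has to be cracked once; after that, random
# selection from the bank adds no security, only a table lookup.
SOLVERS = {
    "reverse": solve_reverse,
    "arithmetic": solve_arithmetic,
}

def bot_answer(challenge_type, payload):
    """Map what the UI displays to the per-type solver."""
    return SOLVERS[challenge_type](payload)
```

The cost to the attacker is linear in the number of challenge types, paid once, while the cost to every legitimate user is paid on every post.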

The problem is that the more automatable the process of creating CAPTCHA-like artifacts is, the more automatable their cracking tends to be. And if you truly find a way to cheaply generate CAPTCHA-likes that are impossible for machines to crack, then the job will be outsourced to the third world, or the spammers will set up sites where you have to do the solving for them, perhaps without even knowing it, to get to the free stuff.

I don't expect to see the Tragedy of the Commons problem resolved this century.


It took me ten seconds to find this:

http://sfbay.craigslist.org/sfc/w4m/693380090.html

...and this:

http://www.google.com/search?q=%22Simultaneously+with+it+I+d...

(27 re-submissions of the same thing all over the place.)

Whatever craigslist is doing, they aren't doing it very well. Google should sell an API that returns a count(*) over text.
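No such Google API exists, but the idea is easy to sketch locally. A hedged, minimal version (assuming exact-duplicate detection after whitespace and case normalization, over a corpus you already hold) is just fingerprinting plus counting:

```python
import hashlib
import re
from collections import Counter

def fingerprint(text):
    # Normalize whitespace and case so trivial edits don't hide duplicates.
    normalized = re.sub(r"\s+", " ", text.strip().lower())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def duplicate_counts(posts):
    """Count posts per fingerprint -- a local 'count(*) on text'."""
    return Counter(fingerprint(p) for p in posts)

# Toy corpus: two trivially varied copies of the same spam, one real ad.
posts = ["Buy now!!", "buy   NOW!!", "genuine ad"]
counts = duplicate_counts(posts)
```

Here the two spam variants collapse to a single fingerprint with count 2, so anything posted dozens of times stands out immediately. Catching paraphrased re-submissions would take fuzzier techniques such as shingling, which this sketch deliberately omits.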


It's sort of like slash-and-burn agriculture on the web: spammers permanently destroy a resource for a small short-term financial gain.



