Hacker News
Photos That Violated Facebook's Policies (techdirt.com)
37 points by fogus on Dec 7, 2012 | 19 comments



A very understandable misfire by FB's porn detection algorithms, considering the unintentionally saucy text submitted with those cute photos of Nut the cat: "...Here you can see in more detail how Nut presses her face as hard as she can into mine. She does this all night, by the way. If I move my face away, she rearranges herself to grip the back of my head as tightly as possible. If I'm face-down on the pillow..."

--

Edit: changed "probable" to "very understandable," and quoted some of the text submitted with those photos.


I hadn't thought about the language. I noticed there's an awful lot of skin tone with the extreme close-up angle there. Even if they do have face detection, the shot may be too close for it to catch anything.


[deleted]


You know they run facial recognition on every photo, right? That makes doing any statistical analysis of "face", "nut", and "hard" look very, very cheap.
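Something like this toy sketch is all the "statistical analysis" would take; the term list and threshold are invented for illustration, not anything Facebook has published:

```python
import re

# Hypothetical list of terms that, co-occurring in a caption, might make a
# naive text filter nervous. Purely illustrative.
SUSPECT_TERMS = ["face", "hard", "all night", "nut"]

def term_hits(text, terms=SUSPECT_TERMS):
    """Return the set of suspect terms appearing as whole words/phrases."""
    lowered = text.lower()
    return {t for t in terms
            if re.search(r"\b" + re.escape(t) + r"\b", lowered)}

def looks_suspicious(text, threshold=3):
    """Flag a caption when several suspect terms co-occur."""
    return len(term_hits(text)) >= threshold

caption = ("Here you can see in more detail how Nut presses her face "
           "as hard as she can into mine. She does this all night.")
print(looks_suspicious(caption))  # True: all four terms co-occur
```

Against the cat caption from the article, every term in the toy list fires, which is exactly how an innocent photo description ends up looking like something else entirely.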

Also, while it's true that image recognition software chews up computational resources, FB (and Google, and the other web-scale companies) have computational resources up the wazoo. When you've got 100,000 servers and can sic an Ivy League's worth of vision-research PhDs on the problem, this issue (like many others) gets slightly easier. Besides, it scales better than people, and the alternative is tasking half your CS team (Facebook's has several hundred people IIRC; it might be larger now) with what is, approximately, the worst job you could imagine.


I remember watching Google testify regarding SOPA and how difficult it would be to find and block copyrighted content.

During their testimony they alluded to algorithms which can find pornographic material based on skin tone and other criteria. It wouldn't surprise me if Facebook implemented something similar.
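The classic version of that criterion is a per-pixel skin-tone test plus a coverage threshold. Here's a minimal sketch; the RGB thresholds are a commonly cited rule of thumb, and nothing here is what Google or Facebook actually runs:

```python
def is_skin(r, g, b):
    """Crude RGB skin classifier (notoriously unreliable across
    lighting conditions and skin tones)."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_ratio(pixels):
    """Fraction of pixels classified as skin; pixels is a list of RGB tuples."""
    if not pixels:
        return 0.0
    return sum(is_skin(*p) for p in pixels) / len(pixels)

def flag_image(pixels, threshold=0.4):
    """Flag when a large fraction of the frame is skin-toned -- exactly the
    failure mode an extreme close-up of someone's face would trip."""
    return skin_ratio(pixels) > threshold

# A mock extreme close-up: 80% skin-toned pixels trips the flag.
closeup = [(220, 170, 140)] * 80 + [(30, 30, 200)] * 20
print(flag_image(closeup))  # True
```

Note how a tight close-up of a face pressed against a cat would score high on this metric for entirely innocent reasons, which fits the misfire in the article.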


You severely underestimate the amount of processing power Facebook has at its disposal. By orders of magnitude. They have kept, and continue to analyze, every action anyone has ever taken on Facebook or on any site with a Like button.



A few years ago, Tenable's Nessus scanning tool had a "porn detection" feature that was... not terrible. I imagine time + facebook == something even better. Or maybe not.


It's not that hard; Geocities was doing it back in the '90s!


Just call customer serv... oh, never mind. You pay Facebook by surrendering privacy and personal information, but that's not enough for them to consider you a customer or offer support. The same server that automatically flagged your harmless photo will gladly issue you an equally automatic and thoughtless rejection when you appeal.

The faceless Facebook machine will march on, paying no regard to the innocent consumers it accidentally crushes in pursuit of more ad views.


Obvious: The cat was completely naked.


I saw three people on my own Facebook feed complaining that their photos had been flagged as inappropriate. My take is that this is a new feature or a regression.


A false positive on a facebook algorithm warrants an entire "my liberties are being trampled" article? All follow-up notifications are obviously automated dominoes falling from the initial warning; why the feigned surprise?


It's not just the false positive. If the algorithms had misclassified her photos but there had been a straightforward grievance process to reverse the decision and restore the obviously non-pornographic photos, I doubt it would have been article-worthy. The presumption of guilt, the unintuitive process, and the requirement that users signify agreement to things they don't actually agree to before they can keep using the service are what's problematic.


I hate how companies want to decide how one can exercise their freedom of speech. If I wanted to post nude photos of myself on Facebook, why shouldn't I be allowed to? Maybe they could require such photos to be friends-only rather than public. Then people could unfriend anyone they found offensive.

Same with Apple not allowing nude pictures on their iBook store.


Well, you're using their servers, so it's not really a freedom of speech issue. When you're on their servers, it's their rules; and if you don't like those rules, they'll politely inform you (at least for policies they won't change) that you can go elsewhere.

And once you start providing that sort of adult content, I imagine (in the US at least) different rules come into play. Just to be friends with someone, you'd need to prove you're 18+, because, as you've defined it, some friends-only photos could be 18+. Then you've got other issues, like what it does to Facebook's image. When a website can host pornography, that changes the demographic willing to visit it. It certainly wouldn't be a large percentage of the world, like it is now.

Showing pornography also affects the kind of people that are willing to advertise on your site, and so you may lose money that way. Mature images open a biiiig can of worms.


True, Zuckerberg thinks we should all share everything. Apparently "everything" doesn't include things that might scare off advertisers.


It's quite possible that the photo was flagged as inappropriate by someone else, or that their filter picked up on the description. I can see how an algorithm could easily find that text dirty. Should be interesting to see how this turns out.


This is exactly the kind of case where it would be helpful (and good for Facebook's PR) to have some customer support.

Or even a basic "let 10% of my friends review this and then come back" option would be better than the status quo.


Given the suggestive text, "cat" was clearly a polite euphemism.



