I've thought about something like applying a threshold filter to thumbnails to distort them enough that they're not potentially "offensive", while still making it apparent whether they might contain bad content. It's too difficult a problem for me to actually implement, though, and I suspect it could be easily hacked somehow.
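(For what it's worth, the filtering step itself is the easy part. Here's a quick sketch using the Pillow library - the filenames and the brightness cutoff are just placeholders:)

```python
from PIL import Image

CUTOFF = 128  # brightness cutoff, 0-255; placeholder value, tune to taste

def threshold_thumbnail(src, dst, size=(128, 128)):
    """Shrink an image down to a harsh black-and-white blob: rough shapes
    survive (enough to flag suspect content), most detail doesn't."""
    img = Image.open(src).convert("L")  # grayscale first
    img.thumbnail(size)                 # downscale in place
    img = img.point(lambda p: 255 if p > CUTOFF else 0)  # hard threshold
    img.save(dst)

threshold_thumbnail("upload.jpg", "upload_thumb.png")  # placeholder names
```

Of course that only automates the distortion - the "easily hacked" worry stands, since anyone could pre-distort an image so the thresholded blob looks innocent.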
I don't know if there is a service to rate images; Google seems to have the best technology for it. It seems like it wouldn't be that hard to programmatically upload an image through an API and get a rating back. This might be a way to curtail teen sexting, which is against the law. A parent could, for example, turn on an admin setting that changes nudie selfies into a Sesame Street character on their way out. Or, better yet, process every photograph the moment the phone takes it and turn all nudie images into Sesame Street characters ;-)
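As a sketch of what that upload-and-rate round trip could look like: Google's Cloud Vision API does expose this kind of rating through SafeSearch detection. This assumes the google-cloud-vision client library is installed and credentials are already configured; the filename and the swap step are placeholders:

```python
from google.cloud import vision

def rate_image(path):
    """Send an image to the SafeSearch endpoint and get back likelihoods
    (VERY_UNLIKELY through VERY_LIKELY) for adult, racy, violent, etc."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    return client.safe_search_detection(image=image).safe_search_annotation

rating = rate_image("selfie.jpg")  # placeholder filename
if rating.adult >= vision.Likelihood.LIKELY:
    print("swap in the Sesame Street character here")  # placeholder action
```

The on-device version - swapping the image at capture time - would be the same check wired into the camera pipeline, which is where it actually gets hard.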
I don't know if anyone's actually doing it either. I could see Google having the power to pull it off. Though I'm not entirely sure, given the current climate of paranoia, that I'd want Google keeping track of exactly what and how much inappropriate material can be (even tangentially or erroneously) associated with me. A third-party application would be nice, but I think it would need to be anonymous - a concept Google seems dead set against.