This presentation briefly mentions "Elsagate" -- which it calls the "Peppa Pig scandal" -- but then seems to mostly forget about it.
James Bridle argues convincingly that the genre of bizarre YouTube videos which appeals to the toddler reptilian brain ( https://medium.com/@jamesbridle/something-is-wrong-on-the-in... ) is not created by hostile or evil actors but instead has evolved organically based on what toddlers want to click on. Kids' click patterns reward video themes like "Elsa tied up on train tracks kissing Spiderman", so the content industry crams more of that stuff into its new content.
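To make the dynamic concrete, here's a toy sketch of that loop (the theme names, "appeal" numbers, and update rule are all invented; no real pipeline works like this, it just shows how click-driven selection with no editor in the loop converges on whatever mash-up gets tapped):

    import random

    # Invented theme pool; stands in for whatever keywords kids' content mines.
    themes = ["Elsa", "Spiderman", "Peppa Pig", "surprise eggs",
              "train tracks", "learn colors", "dentist", "injection"]

    # Hidden per-theme chance that a toddler taps a thumbnail containing it.
    appeal = {t: random.uniform(0.1, 0.9) for t in themes}

    # Producers' belief about "what works", updated only from observed clicks.
    weight = {t: 1.0 for t in themes}

    def make_video():
        # New uploads recombine themes in proportion to past click success.
        return random.choices(themes, weights=[weight[t] for t in themes], k=3)

    for generation in range(8):
        for _ in range(200):  # one "generation" of uploads
            video = make_video()
            # A click happens if any theme in the title appeals to the toddler.
            if random.random() < max(appeal[t] for t in video):
                for t in video:
                    weight[t] += 1.0  # double down on whatever got clicked
        top = sorted(weight, key=weight.get, reverse=True)[:3]
        print(f"generation {generation}: producers now favour {top}")

Run it a few times: whichever weird combination happens to get early clicks is what the "industry" keeps producing.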
The result, after a few iterations, would not have passed editorial controls at 1990s Nickelodeon (!), which would normally have halted the feedback loop, but with no one at the helm -- to "censor" or otherwise exert editorial control -- YouTube's kid-targeted videos are just a whole forest of weird.
Does YouTube want to allow their platform to become a laboratory for rapidly discovering local maxima in very young children's fantasy worlds? Do they have any choice? Should they step in and publish rules for what children's content is allowed? Should they hire some kind of human curator or editor to enforce those rules for child-focused videos? Should Web platforms act in loco parentis?
In this case, the "Peppa Pig scandal" style of situation, the producers are machine-generating content that gets clicks and the consumers are children.
When the issue is the viral proliferation of "fake news" and hate speech, the content producers are people or state propaganda apparatuses, and the consumers & re-sharers are grown adults.
It seems like it's a different topic with maybe different guiding principles to decide how & whether to censor these different groups of consumers & producers.
Kids' videos on YouTube Kids and censoring the internet are very different things.
It's like saying, we don't wanna sell alcohol, because if we did, we would have to sell alcohol to kids.
On a platform that caters to everyone and does not have age restrictions, you would have to sell alcohol to everyone, and a subset of everyone is indeed children.
By the way, is there any evidence of this? Or are such claims perhaps similar to how old people never understood new things and claimed that "punk rock and heavy metal could seriously tamper with kids' mental health"?
This document shows the hole in that position -- an outright attack (Elsagate) aimed at children. A cursory inspection by the parent sees the child watching a harmless Peppa Pig video, while in fact she's watching snuff.
This is a problem that Google has to address somehow (because that is what is demanded of them), while not censoring things aimed at adults. That's why the conclusion is a call for consistency and openness.
That isn't a hole, it's an iterative process. Parents didn't know there was snuff on YouTube Kids. As soon as they find out, their kids are not allowed to use it anymore and with declining viewership, whatever garbage feedback loop allowed that to happen is destroyed.
As a parent, I don't have time for this shit, so I did not ban those videos. I banned YouTube, all of it.
People put way too much trust in free market competition. You know, if consumers were actually conscious of their choices and free market competition actually worked for pruning the weeds, we wouldn't have diabetes or obesity or pollution or global warming.
> Kids' click patterns reward video themes like "Elsa tied up on train tracks kissing Spiderman", so the content industry crams more of that stuff into its new content.
To be fair, many of the Tex Avery and Tom and Jerry cartoons that almost everyone grew up with were a lot wilder than that; thankfully they weren’t censored back when we were kids.
> Did you watch them? Some of them are literally snuff, with tons of gore. The stuff of nightmares
I don't have kids, so I only watched what I could quickly find in a simple YT search, and I remember watching the Spider-Man scene the OP mentions (hence my comment), which I didn't find that scary (even though it was quite tasteless). The gore stuff (presumably meaning visible blood and the like) should probably be restricted; on that I agree.
I’m on the phone and too lazy to copy-paste video references, but as I remember there was lots and lots of violence that I don’t think would pass many of today’s censors. I also remember a couple of episodes involving a “lady cat” (Tom’s love interest) presented as a “femme fatale” which would push some hot buttons in today’s world (sexism, kids being subjected to watch sexual innuendo scenes etc). Ah, and there were also those early 1940s episodes where Tom’s owner is this black servant lady whose face is really never shown, that would generate some really heated conversations were it to be released in today’s political and social context.
The problem is that the whole YouTube UI is machine learned to maximize engagement, so that they can show a lot of ads. The algorithm will do whatever it can to get people to watch more YouTube. We notice the weird results when it comes to videos for toddlers, but the same thing is happening to adults, we just don't see it in quite the same way -- it's always easier to see self-destructive behavior and make attributions from the outside.
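For what it's worth, here's a minimal sketch of what "ranked purely for engagement" means (the candidate titles, the predicted-watch-time numbers, and the function names are made up; the point is only that the sort key is expected time-on-site, not whether the video serves the viewer):

    # Candidate list, watch-time predictions, and names are all invented.
    candidates = [
        {"title": "calm bedtime story",         "predicted_watch_minutes": 4.0},
        {"title": "Elsa dentist injection #37", "predicted_watch_minutes": 11.5},
        {"title": "educational counting song",  "predicted_watch_minutes": 6.2},
    ]

    def rank_for_engagement(videos):
        # The only objective is expected time-on-site, nothing else.
        return sorted(videos, key=lambda v: v["predicted_watch_minutes"], reverse=True)

    for v in rank_for_engagement(candidates):
        print(v["title"], "->", v["predicted_watch_minutes"], "min predicted")

Whatever maximizes that number floats to the top, for toddlers and adults alike.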
Ultimately, interacting with software that has been machine learned for a metric that doesn't serve you or your kids' interests amounts to deliberately swallowing a parasite.
I realize it's an often misused excuse for passing various bogus regulations. But your dismissive one-liner disparages the comment while completely ignoring the massive context and reasoning provided; this is not the kind of conversation we expect here.
"Think of the children" as a justification for censorship is still "think of the children" as a justification for censorship, regardless of the context. Dressing it up in an attempt to make it more palatable and reasonable-sounding doesn't change what is at its core. And that's a nice touch, painting me as an outsider by pointing at the sign with the rules.
The document is a showcase of trends, as well as pressures being applied to Google from various groups. Like it or not, Elsagate is a well-documented phenomenon, and this document is treating it like the problem that it is.
This is not a generic "think of the children" reasoning to ban things that adults enjoy -- Elsagate videos are targeted at children, they exploit various mechanisms to make children watch them (some children psychology, but mostly YouTube's recommendations algorithm.)
But the actions of the tech giants tell a different story. Alex Jones, crazy as he may be and scamming people with his "man-pills", wasn't targeting children. Others who were less crazy have been excluded as well; I'm just pointing out a prominent example.
As a consequence, I doubt the intentions are that clear-cut or restricted to just these cases.
Children are easily distracted and are an easy target for clickbait, that is true. They are also more inclined to seek out information their parents want to restrict. I think that is true even for people here. And you did that too.
Alex Jones was not banned for targeting children, and "Think of the Children"-style arguments were not used in his case.
We're talking about Elsagate here, right? "Peppa Pig" snuff videos? They're aimed at toddlers. They game the algorithm because toddlers select videos from YouTube's suggested videos basically at random, so all someone who wants to monetize a video has to do is make sure it hits as many categories as possible. And they make it snuff because... well, I'm not sure why, but they do.
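A toy illustration of why that category-stuffing pays off when the viewer taps suggestions basically at random (the categories, tags, and suggestion logic here are invented; real recommenders are far more complicated):

    import random

    # Invented categories and tags; stand-ins for whatever the real system uses.
    categories = ["nursery rhymes", "superheroes", "Peppa Pig", "surprise eggs",
                  "learn colors", "dentist", "finger family", "toys"]

    honest_video  = {"title": "counting song",          "tags": {"nursery rhymes"}}
    stuffed_video = {"title": "Elsa Spiderman dentist", "tags": set(categories)}

    def suggestion_rate(video, trials=10_000):
        # Each trial: the player queues a suggestion from one random category,
        # and the toddler taps it without reading anything.
        hits = sum(random.choice(categories) in video["tags"] for _ in range(trials))
        return hits / trials

    for v in (honest_video, stuffed_video):
        print(v["title"], f"gets suggested in ~{suggestion_rate(v):.0%} of trials")

The video tagged with everything shows up nearly every time; the honest one almost never does.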
This isn't about teenagers or even pre-teens going behind their parents' backs; this is about toddlers vegging out in front of YouTube on a tablet. Basically this generation's TV babysitter.
Do you think adults are watching these kinds of videos? This isn't content for everyone being censored because of the needs of children; this is content explicitly for children being censored because of the needs of children.