Not just TikTok, I checked out SnapChat because my kid is the last one in his class not to have it (according to him). The first video I see is two people falling off an e-bike, which looked pretty painful, then someone making fun of someone with Down syndrome, then some weirdly squirming middle-aged women with duck faces, and then some very young ones (pretending to?) * off someone off screen while staring into the camera.
Also, I denied all access, but it still suggested all my son's friends? How? Oh, and it won't even start without access to the camera.
I was pretty shocked. Still, a friend of mine, a teacher, tells me: you can't let your kid not have SnapChat, it's very important to them.
The Chinese apparently say: just regulate! TikTok in their country is fun, even educational, with safeguards against addiction, because the government mandates it. Somehow we don't want that here? We see it as overreach? Well, I'm ready for some overreach (not ChatControl overreach, but you get what I mean). We leave it all up to the parents here, and all parents say: "Well, my kid can't be the only one not to have it."
Meanwhile, the kids I speak to tell me they regularly have vape shops popping up on SnapChat, where some dudes sell vapes with candy flavors (outlawed here) until the cops show up.
Yeah, we also did stupid things, I know: we grew up finding porn books in the park (pretty gross in retrospect), drank alcohol as young as 15, etc. I still feel this is different. We're just handing it to them.
Edit: I don't know if you've ever tried SnapChat, but it's basically TikTok plus chat, weird AI filters, and something called "stories", which for me features a barely dressed girl in a sauna.
>I was pretty shocked. Still, a friend of mine, a teacher, tells me: you can't let your kid not have SnapChat, it's very important to them.
Yeah, it's OK to say no.
If the kid wants a phone and Snapchat, there's nothing wrong with saying you simply won't be supplying that, and if they want it they'd best figure out how to mow lawns. If you're old enough to "need" a phone, you're old enough to hustle some yard work and walk to the T-Mobile store yourself.
It's an unfortunate situation where they will be ostracized for lack of participation in social media like Snapchat or TikTok. Children ostracizing those who don't fit in has been a thing forever, but has been thrown into overdrive by ubiquitous social media usage by children.
I don't think making a kid work for the phone is the solution here. The problem is intentionally addictive algorithms being given to children, not a lack of work ethic regarding purchasing a phone.
> I still feel this is different. We're just handing it to them.
I think you are right to be worried, and I think you are correct that it is different:
IIRC, there were some Kremlin leaks some years ago indicating they knew how to "infect" a population with certain propaganda and have the disinformation live on or linger. Add to that Meta's/Facebook's (illegal?) study in which they experimented on people, trying to make them sad by showing them certain types of posts.
So I think it stands to reason that whoever controls what you consume is in control of what you think; in other words: we are what we watch.
We know there are feedback loops occurring, but I think the pressure to fit in makes it easier to get desensitized and grow accustomed to very extreme content; and once one has participated, it might be even harder to be deprogrammed, since that requires facing the fact that one behaved wrongly towards others.
There's also the fact that being a good person takes a lot of willpower and dedication, is inconvenient, and is notoriously difficult to market as "fun".
It is more palatable for an impressionable kid to watch cheap foreign-state-backed radicalizing propaganda than it is to learn about injustices being perpetrated on our behalf by the state apparatus.
We have developed the habit of being wary of what we consume in order to police our emotions (i.e. minding our minds so no desensitization happens on our watch).
We have seen what the "baddies" can do: the indifference to the suffering they cause, and the cruelty and pettiness they are capable of.
But I digress... I think you are right to be worried, but I am unsure how to train kids not to fall into these pipelines.
Or it's our state apparatus that doesn't want teenagers seeing the injustices it's perpetrating, and the "think of the children" argument is being pushed right now to hide videos of what's going on in Chicago with ICE, and elsewhere.
"Safety" is how it was originally billed: your kids can call you if they get in trouble. They also created apps that let parents spy on where there kids were.
So true, and all the normal chat apps get more and more "social" sht incorporated; even WhatsApp has "stories" (Signal too, btw). SnapChat is just completely shameless about it, integrating all the most addictive stuff found in all the other apps.
Why wouldn't kids just immediately switch to something else? This reads like a parent saying video games should only be educational, as if the kid only cares about it being a video game and not about the content.
It works in China because they have chat control to the extreme.
And shockingly, the company actually listens to the Chinese government. Because the company knows that if they resist, their executives will at best be quickly "re-educated", or straight up disappeared overnight.
Here, the company would simply bribe the lawmakers, who would in turn spout off some mealy-mouthed gibberish on their party's favorite propaganda network, and business would continue as normal.
> After a “small number of clicks” the researchers encountered pornographic content ranging from women flashing to penetrative sex.
I (40m) don't think I've ever seen literal flashing or literal porn on TikTok, and my algorithm does like to throw in thirst content between my usual hobby stuff.
Are they making the claim that showing porn is a normal behavior for TikTok's algorithm overall, or are they saying that this is something that is specifically pervasive with child accounts?
Does TikTok direct what you see based on what other accounts you interact with are interested in? I would expect teenagers to have a different interest profile than your average 40 year old. I would expect algorithms to more or less unwittingly direct you to the kind of stuff your peers were interested in.
TikTok’s recommendations are based off as much info as it can get, really.
Approximate location, age, mobile OS/browser, your contacts, which TikTok links you open, who generated the links you open, TikTok search history, how long it takes you to swipe to the next video on the for you page, etc.
I don't think it's really possible to say what TikTok's algorithm does "naturally". There are so many factors influencing it (beyond the promoted posts and ads which people pay TikTok to put in your face).
If you sign up to TikTok with an Android and tell it you’re 16, you’re gonna get recommended what the other 16 year olds with Androids in your nearby area (based on IP address) are watching.
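To make that concrete, here is a purely illustrative toy sketch of how signals like those might be folded into a ranking score. Every name and weight here is invented for illustration; this is not TikTok's actual algorithm:

    from dataclasses import dataclass

    @dataclass
    class EngagementSignals:
        # Hypothetical stand-ins for the kinds of signals listed above.
        watch_fraction: float       # how much of the video was watched before swiping
        rewatched: bool             # looped / watched again
        same_age_cohort_ctr: float  # engagement rate among nearby users of the same age
        followed_creator: bool

    def toy_score(s: EngagementSignals) -> float:
        """Invented weighting; real systems learn weights from data."""
        score = 2.0 * s.watch_fraction + s.same_age_cohort_ctr
        if s.rewatched:
            score += 1.0
        if s.followed_creator:
            score += 0.5
        return score

    print(toy_score(EngagementSignals(0.9, True, 0.4, False)))  # 3.2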
You think thirst traps are okay for kids? If we rewind time, the Girls Gone Wild commercials weren't supposed to be even remotely possible on certain channels.
We’re a derelict society that has become numb, “it’s just a thirst trap”.
We’re in the later innings of a hyper-sexualized society.
Why it’s bad:
1) You shift male puberty into overdrive
2) You continue warping young female concepts of lewdness and body image, effectively “undefining” it (lewdness? What is lewdness?).
3) You also continue warping male concepts of body image
No, I don't think thirst traps are necessarily OK, but it's a fine line, and given current gym/athletic wear it's not always easy to discern what is actually a genuine (say) workout video vs a trap.
Because parasocial relationships with ewhores aren't healthy, particularly at a stage in their life when they should be forming real relationships with their peers.
Scrolling through attractive women (generally the thirst-traps are women) doesn't imply forming a parasocial relationship. I agree that parasocial relationships are bad, but this is independent of them being thirst-traps. Internet thirst-traps are just the modern equivalent of sneaking a look at a playboy mag or a lingerie catalogue. Nothing inherently damaging about it. The scale of modern social media can make otherwise innocuous stimuli damaging, but this is also independent of it being content of sexy women.
You are the one claiming there's a problem, and you are the one (presumably) demanding legal and other action to deal with that "problem". That means that any burden of proof is 1000 percent on you.
... and before you haul out crap from BYU or whatever, be aware that some of us have actually read that work and know how poor it is.
Parasocial relationships are a different topic than pornography.
Are you saying that the intersection is uniquely bad? In either case limits to content made in an effort to minimize parasocial relationships cut across very different lines than if the goal is minimizing access to porn.
These people come out of the woodwork, when it comes to defending porn. It’s their whole identity. And unfortunately the tech scene is infested with these types.
It's goalpost shifting. If the concern is parasocial relationships with content creators, formed with pornography as the hook, then pornographic content where the actors aren't cultivating or interacting with a social media follower base should be better, right?
Then support them. Too often you show up to scream "think of the children" without actually citing any research or empirical damage. If you refuse to argue in good faith and don't want to be told you're wrong, voting is the only thing you're capable of doing. Don't tell us about it, vote.
Everyone knows those laws do nothing, though; go look at the countries that pass them. Kids share pornography P2P, they burn porno to CDs and VHS tapes and bring in pornographic magazines to school. They AirDrop pornographic contents to their friends and visit websites with pornography on them too. Worst-case scenario, they create a secondary market for illegal pornography because they'll be punished regardless - which quickly becomes a vehicle for creating CSAM and other truly reprehensible materials.
They don't do it because they're misogynistic, mentally vulnerable or lack perspective - they do it because they're horny. Every kid who aspires to be an adult inherently exists on a collision course with sexual autonomy, most people realize it during puberty. If you frustrate their process of interacting with adulthood, you get socially stunted creeps who can't respond to adult concepts.
You can email hn@ycombinator.com and request a name change if you don't like the connotations of your current name. Dan and Tom will rename accounts for people.
This content isn't as overt as it may seem; maybe you did come across it and just didn't notice the flashing. Those "in the know", generally younger people whose friends told them about flashtok, know what to look for.
Also: kids click on links adults ignore without thinking. Our brains have built-in filters for avoiding content we don't want; for kids everything is novel.
I wonder when this study happened? FWIW, there was some pretty intense bombing of full-on nudity content on TikTok a month or two ago (it all looked like very automated bot accounts that were suddenly posting fully nude scenes cut out of movies) that I saw a number of people surprised were showing up in their feeds. It felt... weaponized? (And it did not last long at all, FWIW: TikTok figured it out. But it was intense and... confusing?)
>Are they making the claim that showing porn is a normal behavior for TikTok's algorithm overall, or are they saying that this is something that is specifically pervasive with child accounts?
The latter is what they tested, but they didn't say it's specifically pervasive.
You quote the article, so it seems like you looked at it, but the questions you're curious/skeptical about are things they address in the opening paragraphs. It's fine to be skeptical, but they explain their methodology, and it is different from the experience you are relying on:
>Global Witness set up fake accounts using a 13-year-old’s birth date and turned on the video app’s “restricted mode”, which limits exposure to “sexually suggestive” content.
>Researchers found TikTok suggested sexualised and explicit search terms to seven test accounts that were created on clean phones with no search history.
>The terms suggested under the “you may like” feature included “very very rude skimpy outfits” and “very rude babes” – and then escalated to terms such as “hardcore pawn [sic] clips”. For three of the accounts the sexualised searches were suggested immediately.*
>After a “small number of clicks” the researchers encountered pornographic content ranging from women flashing to penetrative sex. Global Witness said the content attempted to evade moderation, usually by showing the clip within an innocuous picture or video. For one account the process took two clicks after logging on: one click on the search bar and then one on the suggested search.
Yeah, I (50m) have never encountered literal porn on TikTok. Suggestive stuff, thirst traps, sex ed, sex jokes, yes, but no literal porn or even nudity.
You could make that more complicated: moderators tag the content, and then you apply filters based on what children are allowed to view in each jurisdiction. Or you could be conservative and only allow non-controversial stuff for kids, to avoid that.
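A minimal sketch of that tag-and-filter idea, with purely hypothetical names (the moderatorApprovedForChildren flag, the per-jurisdiction allowlists) invented for illustration:

    from dataclasses import dataclass, field

    # Hypothetical per-jurisdiction allowlists of moderator-assigned tags.
    ALLOWED_TAGS_FOR_MINORS = {
        "UK": {"education", "sports", "music"},
        "DE": {"education", "music"},
    }

    @dataclass
    class Video:
        id: str
        tags: set = field(default_factory=set)      # assigned by human moderators
        moderatorApprovedForChildren: bool = False  # hypothetical approval flag

    def visible_to_minor(video: Video, jurisdiction: str) -> bool:
        """Conservative filter: show only approved videos whose every tag
        is on the jurisdiction's allowlist."""
        allowed = ALLOWED_TAGS_FOR_MINORS.get(jurisdiction, set())
        return video.moderatorApprovedForChildren and video.tags <= allowed

    clip = Video("abc123", {"education"}, moderatorApprovedForChildren=True)
    print(visible_to_minor(clip, "UK"))  # True
    print(visible_to_minor(clip, "FR"))  # False: unknown jurisdiction, allow nothing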
Obviously different jurisdictions are increasingly disagreeing with it being a non-problem.
I regret to inform you that there's a bug in your code.
Specifically, it relies on the "moderatorApprovedForChildren" flag, which is sometimes sent incorrectly because of glitches in the system that sets that flag. Apparently the number of such glitches increases sharply with the number of possible values of "j", but is significant even with only one value.
Also, flag-setting behavior is probabilistic in edge cases, with a surprisingly broad distribution.
You are therefore not meeting your "zero porn" spec, while at the same time blocking a nonzero amount of non-porn.
Don't bother to fix the bug, though; given the very large cost of the flag-setting system, the company has gone out of business and cancelled your project.
> Obviously different jurisdictions are increasingly disagreeing with it being a non-problem.
Different jurisdictions are doing a lot of stupid things. You get that in a moral panic. Doesn't make them less stupid.
Weirdly enough, other companies manage to not accidentally sell/give porn to kids just fine. I see no issue with holding large media companies like TikTok, Meta, Google, etc. to account just like we would if someone put hardcore porn on the Disney channel. This is only a problem when you want to be a massive company that operates in every market while not taking any responsibility for what you do/not hiring the necessary staff to manage it.
Similarly, if your alcohol/weed store sells to children and you get caught, you can be criminally prosecuted. This is well-trodden ground. Companies worth trillions can be expected to do what everyone else manages to do.
Same deal with malicious ads. These companies absolutely have the resources to check who they're doing business with. They choose not to.
Banks also don't get to just not bother with reconciling accounts because it's hard to check if the numbers add up, and yeah bugs can result in government action.
Uh-huh. User-generated content is exactly like the Disney channel.
Let's keep using the TikTok example. According to https://arxiv.org/abs/2504.13279 , TikTok receives about 176 years of video per day. That's 64,240 days per day, or 1,541,760 hours per day. To even roughly approximate "zero porn" using your "simple" moderation approach, you will have to verify every video in its entirety. Otherwise people will put porn after or in amongst decoy content.
If each moderator worked 8 hours per day, reviewing videos end-to-end without breaks (only at 1x speed, but managing to do all the markup, categorization, exception processes, quality checks, appeals, and whatever else within the video runtime), that means that TikTok would need 192,720 full-time moderators to do what you want. That's probably giving you a factor of 2 or 3 advantage over the number they'd really need, especially if you didn't want a truly enormous number of mistakes.
The moderators in this sweatshop are skilled laborers. To achieve what you casually demand, they'd have to be fluent in the local languages and cultures of the videos they're moderating (actually, since you talk about "jurisdictions", maybe they have to also be what amounts to lawyers). This means you can't just pay what amounts to slave wages in lowest-bidder countries; you're going to have to pay roughly the wage profile of the end user countries, and you're also going to have to pay roughly the taxes in those countries. Still, suppose you somehow manage to get away with paying $10/hour for moderation, with a 25 percent burden for a net of $12.50/hour.
Since you live in fantasyland, I'll make you feel at home by pretending you need no management, support staff, or infrastructure at all for the fifth-of-a-million people in this army.
You now have TikTok paying $19,272,000 to moderate each day's 1,541,760 hours of video. TikTok operates 365 days a year, and anyway the 1,541,760 is an average. So the annual wage cost is $7,034,280,000.
TikTok financials aren't reported separately from the rest of ByteDance, but for whatever it's worth, [some random analyst](https://www.businessofapps.com/data/tik-tok-statistics/) estimates revenue at about $23B per year, so you're asking for about 30 percent of gross revenue. It's not plausible that TikTok makes 30 percent profit on that gross, so, even under these extremely, unrealistically charitable assumptions, you have made TikTok unprofitable and caused it to (a) shut down completely, or (b) try to exclude all minors (presumably to whatever crazy draconian standard of perfection any random Thinker Of The Children feels like demanding that day).
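Laid out explicitly, the arithmetic above (same assumptions as in the paragraphs before, nothing new added) is:

    # Back-of-the-envelope check of the numbers above.
    video_years_per_day = 176                            # per the cited paper
    hours_per_day = video_years_per_day * 365 * 24       # 1,541,760 hours of new video daily

    moderator_hours_per_day = 8
    moderators = hours_per_day / moderator_hours_per_day  # 192,720 full-time moderators

    hourly_cost = 12.50                                   # $10/hour wage + 25% burden
    daily_wages = moderators * moderator_hours_per_day * hourly_cost  # $19,272,000 per day
    annual_wages = daily_wages * 365                      # $7,034,280,000 per year

    estimated_revenue = 23e9                              # rough annual revenue estimate
    print(f"{annual_wages / estimated_revenue:.0%} of gross revenue")  # ~31%, the "about 30 percent" above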
No, TikTok can't just raise advertising rates or whatever. If it could get more, it would already be charging more.
That's all probably about typical for any UGC platform. What you are actually demanding is to shut down all such platforms, or possibly just to exclude all minors from ever using any of them. You probably already knew that, but now you really can't pretend you don't know.
Totally shutting down those platforms would, of course, achieve "zero porn". But sane people don't think that "zero porn" is worth that cost, or even close to worth that cost. Not if you assign any positive value to the rest of what those platforms do. And if you do not assign any positive value, why aren't you just being honest and saying you want them shut down?
If they want to centralize and provide recommendations for public video clips posted by anyone in the entire world but can't actually economically do that in a responsible way, then sure I don't have a problem with them being fined into oblivion. I don't see much need for businesses with hundreds of millions of customers to exist (and see plenty of downsides to allowing one company/platform to be that large. Especially a centralized communications platform), and if they can't actually handle that scale, then okay. Maybe their whole premise was a stupid idea. Or maybe they'll need to charge users to cover costs. Or ban children.
Well, I'd be happy to see them replaced by decentralized systems, too, and while I'm capable of recognizing that many people value the recommendation services and rendezvous points that those platforms provide, I'd really rather see that done in a way that didn't require big players.
But I don't know why you think that'd be an improvement.
Do you actually think that a fully decentralized, zero profit, no-big-players system for posting and discovering short media (or any kind of media) would put less "sexualized content" in front of teenagers (or anybody else)?
Moderation in such systems is usually opt-in, both because it fits better with the obvious architectures, and because the people who tend to build software like that tend to be pretty fanatical about user choice. So, if they choose to, kids are definitely going to be able to see pretty much anything that the system allows to exist at all... which will probably include tons of stuff that's really hard to find on, say, TikTok.
As for "recommending", I suspect any system that succeeded in putting the content users actually wanted in front of them would give teenagers, and indeed actual children, more "sexualized" content. The companies you're railing against are, in fact, trying to tamp that down, whether or not you believe it, and whether or not you think they're doing enough. A decentralized protocol does not care and will do exactly nothing to disadvantage that content.
Nobody really knows how to do decentralized recommendations (without them being gamed into uselessness), but if somebody did figure out a good way to do it, I'd expect it to be worse, from your point of view, than the platforms. So would a "pull-based" system that relied on search or graph following or communities of interest or whatever.
For a person with the priorities you seem to have, I can't see how decentralized systems would be anything but "out of the frying pan, and into the fire".
Decentralized systems like the web already have a solution: lots of jurisdictions are making it illegal to provide adult content without age gating it. The point is for people to assume the same set of liabilities they would in person, instead of the status quo where the web magically means you can do whatever. Then you just set up filters at home (or have ISPs offer filtering) to block the other jurisdictions, e.g. I lose nothing from simply blocking Russia altogether on my router.
> Researchers found TikTok suggested sexualised and explicit search terms to seven test accounts that were created on clean phones with no search history.
I hate to direct traffic to people like that, but, you know, how about their actual "study"? I realize that the "journalists" at the Guardian aren't willing to provide the actual source link, but it's not hard to find.
Their methodology involves searching for suggested terms. They find the most outrage-inducing or outrage-adjacent terms offered to them at each step, and then iterate. They thereby discover, and search for, obfuscated terms being used by "the community" to describe the content they are desperately seeking.
They also find a lot of bullshit like the names of non-porn TV shows that they're too out of touch to recognize and too lazy to look up, and use those names to gin up more outrage, but that's a different matter.
This is, of course, all in the service of whipping up a moral panic over something that doesn't fucking matter to begin with.
Thank you for linking the source material; unfortunately, it badly contradicts you. It clearly shows that the _very first_ list of ten suggested search terms contained (pretty heavily) sexualised suggestions.
I suppose some of that stuff could reasonably be called "sexualized". Pornographic? No. A problem? Not unless you have really weird hangups.
Here's a unified list of all the "very first list" suggestions they say they got. I took these from their appendix, alphabetized them, and coalesced duplicates. Readers can make their own decisions about whether these justify hauling out the fainting couch.
+ Adults
+ Adults on TikTok (2x)
+ Airfryer recipes
+ Bikini Pics (2x)
+ Buffalo chicken recipe
+ Chloe Kelly leg up before penalty
+ cost of living payments
+ Dejon getting dumped
+ DWP confirm £1,350
+ Easy sweet potato recipes
+ Eminem tribute to ozzy
+ Fiji Passed Away
+ Gabriela Dance Trend
+ Hannah Hampton shines at women’s eu [truncated]
+ Hardcore pawn clips (2x)
+ Has Ozzy really died
+ Here We Go Series 3 Premieres on BBC
+ HOW TO GET FOOTBALL BLOSSOM IN…
+ ID verification on X
+ Information on July 28,2.,,,
+ Jet2 holiday meme
+ Kelly Osbourne shared last video with [truncated]
+ Lamboughini
+ luxury girl
+ Nicki Minaj pose gone wrong
+ outfits
+ Ozzy Funeral in Birmingham
+ pakistani lesbian couple in bradford
+ revenge love ep 13 underwater
+ Rude pics models (2x)
+ Stock Market
+ Sydney Sweeney allegations
+ TikTok Late Night For
+ TIKTOK SHOP
+ TikTok Shop in UK
+ TIKTOK SHOP UK
+ Tornado in UK 2025
+ Tsunami wave footage 2025
+ Unshaven girl (3x)
+ Very rude babes (3x)
+ very very rude skimpy
+ woman kissing her man while washing his [truncated] (2x)
A soccer mom I know shared that she once tried TikTok. Within seconds of installing the app, the algorithm was showing nsfw content. She uninstalled it.
I assume that the offending content was popular but hadn’t been flagged yet and that the algorithm was just measuring her interest in a trending theme; it seems like it would be bad for business to intentionally run off mainstream users like that.
After reading some of the article, it seems to me that they're saying that on a restricted account with the birthday of a 13-year-old, using the search terms TikTok suggests, you can see actual porn within a few clicks.
Really? I've signed up to both Bluesky and TikTok, and on both I have seen literal porn extremely early without engaging directly (no liking or responding; scrolling speed could be a factor, though).
All of these apps are 100% using your scroll speed/how long you spend engaging with the content as a data point. After all, "time spent engaging with the content" is the revenue driver.
Looks like basic thirst-trap content; the actual porn probably wasn't included, to maximize their rage baiting. I don't understand why these puritans don't just move to the Middle East. They instead ruin the internet for the rest of us.
They already censored the not-porn (but still NSFW) photos. I don't think it would've made as much of a difference censoring the porn photos as well, especially when trying to convince people that they're not just creating click-bait.
A lot of folks use TikTok on a regular basis. This article is the one making the claim that's far and away different from what most folks experience on the platform.
Since I'm not about to go on there, pretend to be a 13-year old boy, and start seeking out the porn myself, I really need to see some evidence that this is a thing that is actually possible before I start picking out a pitchfork.
I guess I'll just trust them, bro? Have you tried finding these videos on TikTok? Because even when I deliberately tried, I couldn't find any. Yet they claim that children accidentally stumble into these videos.
If you consider "skimpy outfits" pornographic, then both Facebook and X are worse than TikTok for me. I've seen a few pieces of content I had to report before, but not many.
X, on the other hand, has literal advertisements for adult products in my feed, and I get followed by "adult" bot accounts several times a week that, when I click through to block them, often show me literal porn. Same with spam Facebook friend requests.
I think it boils down to a simple fact that trying to police user-generated content is always going to be an up-hill battle and it doesn't necessarily reflect on the company itself.
> Global Witness claimed TikTok was in breach of the OSA, which requires tech companies to prevent children from encountering harmful content...
Ok, that is a noble goal, but I feel that the gap between "reasonable measures" and "prevent" is vast.
> I think it boils down to a simple fact that trying to police user-generated content is always going to be an up-hill battle and it doesn't necessarily reflect on the company itself.
I think it boils down to the simple fact that policing user-generated content is completely possible, it just requires identity verification, which is a very unpopular but completely effective idea. Almost like we rediscovered, for the internet, the same problems that need identity in other areas of life.
I think you will also see a push for it in the years ahead. Not necessarily because of some crazy new secret scheme, but because robots will be smart enough to beat most CAPTCHAs or other techniques, and AI will be too convincing, causing websites to be overrun. Reddit is already estimated to be somewhere between 20% and 40% robots. Reddit was also caught with its pants down by a study recently, in which an AI bot on r/changemyview racked up ridiculous amounts of karma undetected.
I'm not convinced that will fix the problem. Even in situations where identity is well known such as work or school, we commonly have bad actors.
It's also pretty unpopular for a good reason.
There is a chilling effect that would go along with it. Like it or not, a lot of people use these social platforms to be their true selves when they can't in their real life for safety reasons. Unfortunately for some people their "true self" is pretty trashy. But it's a slippery slope to put restrictions (like ID verification) on everyone just because of a few bad actors.
Granted I'm sure there's some way we could do that while maintaining moderate privacy but it's technologically challenging and I'm not alone in wanting tech companies to have less of my personal information not more.
Heh, on Facebook you don't even need any clicks. I logged in after a few years, and the first video among the Facebook shorts (or whatever it's called) was a woman removing her underwear.
You weren't ostracized for not engaging on social media because it didn't exist.
In 10-15 years Gen Z will be complaining about how their generation didn't need to have the most expensive AI boyfriend/girlfriend to avoid getting bullied, or something ridiculous like that.
The resolution of the "national threat" chapter of Tiktok was pretty much a defeat for the other social media giants, so I guess they'll pursue this angle now.
Was it? The US version of TikTok is going to be run by One Rich A-hole Called L. E. pretty soon. It seems like maybe this research is just being published a little too late? Or maybe it's to manufacture consent for actually using the Oracle version.
I'm old enough, and except for places like 4chan, and ads on torrent sites (before adblockers were a thing), I pretty much never saw porn online by accident. The closest to porn were censored boobs on social networks when someone shared some Daily Mail article or other mainstream media stuff.
On the other hand... There is "WikiHitler", a game where people click on a "random article" on wikipedia and try to reach the "Adolf Hitler" page in the least amount of clicks... so yeah, technically, on wikipedia, you're always a few clicks away from Hitler too, but not by accident.
These are all versions of Six Degrees of Kevin Bacon [1] (or the Erdos variant if you like) and thus no surprise at all. And that includes this "clicks needed to get to porn" variant.
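For what it's worth, "clicks needed to reach X" is just shortest-path distance in the link graph, which a breadth-first search computes; here is a tiny sketch over a made-up graph (the graph contents are invented for illustration):

    from collections import deque

    def click_distance(graph: dict, start: str, target: str) -> int:
        """Breadth-first search: minimum number of clicks from start to target."""
        seen, queue = {start}, deque([(start, 0)])
        while queue:
            page, dist = queue.popleft()
            if page == target:
                return dist
            for nxt in graph.get(page, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, dist + 1))
        return -1  # unreachable

    # Toy link graph: almost everything is only a few hops from anything else.
    toy_graph = {
        "Kevin Bacon": ["Hollywood", "Footloose"],
        "Hollywood": ["World War II"],
        "World War II": ["Adolf Hitler"],
    }
    print(click_distance(toy_graph, "Kevin Bacon", "Adolf Hitler"))  # 3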
I bought a new phone with TikTok built in. I've never used it. I went on it as a guest, and within 60 seconds I was watching a child's gooner content. Grim.
I firmly believe algorithmic social media will go down in history as one of the most harmful inventions of the capitalist world, on par with the massification of cigarettes or leaded gasoline, or worse. I'm not being hyperbolic.
Here's a link to the wiki for an actual reality television show that exists in real life, Hardcore Pawn (https://en.wikipedia.org/wiki/Hardcore_Pawn). That isn't a misspelling. Welcome to the phase of the TrumpTok Takeover where We Need To Do Something To Protect The Children. I wish you luck in the Telescreen portion. Remember, if you make woke facial expressions at the camera during any of the daily loyalty oaths, you will be declared Antifa and reeducated.
>After a “small number of clicks” the researchers encountered pornographic content ranging from women flashing to penetrative sex.
Where? I'm a grown ass adult who likes sex and has had a tiktok account for years now and I can't find any of this. I can find people dancing, dressed in a way that would be perfectly acceptable in public, but where are the women flashing and penetrative sex? Can anyone confirm that they've seen any of these things at all on TikTok, not to mention after a "small number of clicks"?
Granted, I got rid of TikTok a while back. But never have I ever seen actual porn, or even topless women, on TikTok.
Of course, the news rag can't publish the pictures/videos and the accounts as proof. But we're supposed to take their word for it? Hard pass on that.
Now, I have seen advertisements that used sexism of various sorts. And this is common wherever advertising and capitalism take hold; it's a quick and dirty hack to help sell garbage. https://en.wikipedia.org/wiki/Sex_in_advertising
In the first picture, there are the nude->rude replacement games, bikini (which is not nudity), models (which is not nudity). "Unshaven girl" could mean legs, armpits, and/or pubic area.
The second picture also has no nude people in it. The closest "skin" picture is another bikini, which again, is not nudity.
The 3rd picture is too blocked out, but likely more bikini pictures. Again, yes, you can see the labia bunching up (cameltoe). But this again is completely legal and normally seen at water parks and beaches.
I also note they said "We have deliberately not included examples of the hardcore pornography that was shown to us.". So yes, I do doubt they exist for any length of time here.
And the text-search autocomplete is also hard, because many words are banned. So nude becomes rude. ICE protest and similar become party. Drones are "dior bags". And banning the stringing together of replacement words is basically whack-a-mole.
But no, this whole operation smells like "let us do anything in the name of ThE ChiLDrEn", including that scourge Chat Control.
And to be fair, I'd rather adolescents look at titties. Sexuality is completely natural. And given this article is targeting 13-year-olds, remember: they're already going through, or have gone through, puberty. Sexuality, and wanting to see what the other sex looks like, is completely natural. This weird quasi-religious shaming is just terrible for everyone.
The conversation around tiktok is so politicized and biased to the point I just can't take such results seriously.
If this was Instagram nobody would care.
> Global Witness, a climate organisation whose remit includes investigating big tech’s impact on human rights, said it conducted two batches of tests, with one set before the implementation of child protection rules under the UK’s Online Safety Act (OSA) on 25 July and another after.
Also why the hell is a human rights / climate org doing research on tiktok?
Yes, people would care. People who have been ringing the alarm bell regarding kids on social media for literally decades now.
Congress wouldn't care. Mark would make $ure of that...$omehow, but I can't quite figure out exactly what he would do. $omething $omething "campaign donation$".