
The post doesn’t just discuss Instagram to the exclusion of the fashion ecosystem: it’s an argument that truncates research which does find that the drop in self-esteem is directly tied to seeing posts either from fashion professionals or from peers trying to emulate them. You can’t read those studies and think that detail isn’t part of the key findings.

Those are not two similar but separate problems.

This is like saying that mail bombs are the post office’s problem rather than a problem of people having access to explosives: the problem isn’t envelopes. It’s people with access to explosives, and FedEx would likely have the same issue, unless federal rules allowed them to scan packages and refuse to deliver explosives.

I’m not being crass with a bad metaphor: focusing on the medium has been a key flaw in this argument for a while. Fashion magazines used to be decried as “glossy paper,” but no one thought the difference between magazines pushing a problematic self-image and serious news was that newspapers came on broad sheets of matte paper. Still, that’s how the problem is now presented: without a clear separation between the medium and the content.

I did work at Meta on the team looking at teenagers, and I did raise that point internally: should we look at how we handle gambling and alcohol and extend the same rules to fashion? Should we boost peers over brands to avoid problematic ideation?

Those conversations, without the shadow of fashion partners, were generally productive. It wasn’t perfect: the goal remained “engagement,” but there were no sacred cows to avoid.

As soon as the findings about teenage body dysmorphia were put in the context of fashion (and presented to one particular executive who cared more about advertising than about algorithmic boosts), that question was buried. Several friends of mine got blackballed hard, not for suggesting Instagram-specific treatment but, for instance, for asking Anna Wintour about the quip where she fat-shamed her best friend in _The September Issue_ (it’s a niche fashion reference, but it’s widely considered a smoking gun in the industry).

There are inherent biases to how Instagram and Facebook work: you post when you achieve something, so there’s a bias towards success, but internal research rarely found that those biases were crippling. Thinness, on the other hand, leads to clear, widespread medical issues. I remember asking whether the issue became more prevalent simply because teenagers didn’t have access to that many magazines before, while the dose effect per exposure remained comparable. I don’t think anyone has looked into that.

I’m not saying that to absolve Meta of responsibility: I had argued for explicit filters, like giving teenage boys who know they can’t resist and will behave in a way they think is unhealthy the ability to exclude scantily clad women from their feed. I used a different example, naturally (one more typical of internal debate). That idea could have had legs ten years ago; now, with the debate being a lot less constructive, I’m not sure.




the post office did actually implement stricter screening, including x-rays, chemical detection, and GPS tracking, due to mail bombs... in a way mail bombs sped up the process of implementing package tracking systems for everyone — they took a very serious approach to safety despite not being the direct cause of the problem

from my perspective, this is the kind of regulation the FTC should seek out — so for different reasons I agree with your analogy

magazines are an entirely different beast and it's strange to not address that — I have to physically go out of my way to purchase a magazine or have one delivered... that's something I opt in to

people opt-in to instagram, but for teenagers there are much larger stakes involved... instagram has become an integral part of many teens' social life (by design) and they're constantly getting targeted and personalized ads fed to them directly

the fact that the goal remains engagement makes it pretty clear where meta's priorities lie, and it's not on the side of safety



