Here's an anecdote about what happens when you design a product in the opposite way.
I own https://www.executeprogram.com, which has interactive in-browser courses on various software development technologies. Currently they all cover languages, more or less: TypeScript, SQL, various JS topics, regexes. (Disclosure: it costs money after you finish your 16th lesson.)
Almost all (maybe literally all) of our competitors are amenable to binging. It's true of books, video learning platforms, and most/all other interactive learning platforms.
Execute Program is very intentionally non-bingeable. When you start a course, you get 5 lessons on the first day, then it stops you and tells you to come back tomorrow. On the next day, you get some brief reviews of yesterday's lessons, then a few new lessons, then it stops you again until the next day. That cadence repeats until you finish the course. You can't binge/cram even if you want to.
(A bit more technical detail: it's a spaced repetition system with exponential review intervals, similar to those used for language learning in e.g. WaniKani and Anki. But it also has a lot of fine-grained knowledge of its own course structure, so it can use reviews to intelligently unlock different lessons depending on how the user performed on their reviews.)
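To make the exponential-interval idea concrete, here's a minimal sketch of that kind of scheduler. This is an illustration of exponential spacing in general, not Execute Program's actual algorithm; the doubling factor and the reset-on-failure rule are assumptions for the example:

```python
from datetime import date, timedelta

def next_interval(interval_days: int, passed: bool) -> int:
    """Double the interval after a successful review (exponential spacing);
    reset to daily review after a failed one."""
    return interval_days * 2 if passed else 1

# Schedule four consecutive successful reviews of one item.
interval = 1
day = date(2024, 1, 1)  # day the lesson was first completed
schedule = []
for _ in range(4):
    day += timedelta(days=interval)
    schedule.append((day.isoformat(), interval))
    interval = next_interval(interval, passed=True)

print([i for _, i in schedule])  # [1, 2, 4, 8]
```

The point of the exponential growth is that each successful review pushes the next one further out, so total review time per item shrinks even as retention holds.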
Occasionally, we get support email from new users who don't like this. They want to cram a whole course in a day. But cramming is a very time-inefficient way to learn, so this is self-defeating! Since launch, we've had good success adjusting the app's behavior and internal explanations to reduce these complaints.
On the other hand, we also get emails from long-term users who appreciate the time limitations. Generally these fall into two categories:
1. Users like that an enforced break before the reviews provides tangible evidence that "yes, I genuinely understood yesterday's lessons". If we allowed cramming, that reassurance wouldn't exist; it's too easy to succeed at a review when you just finished the lesson 30 seconds ago.
2. Users like that the usage limits remove a source of anxiety and worry. You do your reviews and lessons, you finish, and then you wait until tomorrow. There's no temptation to think "I really should've done 10 lessons today instead of 5; I'm so lazy".
It's still possible for a very dedicated user to do all of our courses in parallel within their first monthly billing cycle. (Median course start-to-finish time is 8-18 days depending on the course.) So this scheme doesn't make users pay us more than they would otherwise. And they're spending the same amount of wall-clock time that they'd spend if they crammed all of the lessons in one day. That makes it pure win: they memorize the topics more deeply, they worry less, and they get those benefits for no extra time expenditure. The only exception I can think of would be people who think "I must get exposure to all TypeScript syntax and semantics before tomorrow morning, even if that significantly reduces my ability to remember what I learned."
Obviously I'm very biased here, and the goals that we're optimizing for don't even exist in most other product spaces. But I thought it would be nice to have a counterexample to "engagement at all costs".
Just because there are gaps between sessions doesn't mean something isn't "addictive." I was addicted to WaniKani when I first started it, even though I wasn't binging it. If you optimize your product so that users come back every day (and hence keep paying for a subscription every month), you care more that users engage every day than about how much they engage each day. Advertisement monetization streams care about how much time users spend on the platform, whereas subscriptions just care that users continue to use the platform.
I think your argument here could be made about pretty much any activity that humans do repeatedly. The critical difference is that both WaniKani and Execute Program quickly tell you "go home, you're done", so it's difficult to dump hours of low-quality time into them. Whereas advertisement-driven platforms are incentivized to retain your eyeballs for as long as possible, as you pointed out.
The thing is, if your product is a subscription then maximizing your engagement in a way that the users will stick longer and pay for longer is not a counterexample, it’s just adapting the engagement to fit your business model.
It could also be seen as adapting the business model to fit the process.
Essentially, the goal was to sell a product that guided users to learn in the most effective way possible. The learning model is spaced repetition, which requires users to use the product over a long period - therefore the best way to sell it is as a long term subscription.
If the best way to learn was to cram as much as possible in as short a time as possible, then maybe the product would be sold as a bunch of individually priced courses paid in full up front.
As long as the product is fairly priced and provides a valuable service, there's nothing wrong with making money as effectively as possible with it.
Uh, it's interesting that you see the design of your product as a counterexample, because to me it looks exactly like what mobile games have been doing as a mechanism for maximizing engagement.
You want to stop them from playing through too much of the game's content and burning themselves out on it. So you lock them out with a timer, forcing them to come back later. Then you reward them for coming back every day. This encourages them to turn the game into a habit and integrate it into their routine.
Looks to me like you've accidentally stumbled onto one of the very tactics games use to turn people into addicts.
Habits can be good or bad. In this case, one would want the user to be rewarded for coming back every day to learn programming. Habit acclimation is a neutral process, just because the parent uses the same tactics as those who use them for building bad habits does not mean that those processes are bad in and of themselves.
Of course! I'm not ascribing any moral weight to it, just pointing out the similarity in tactics. Engagement can be a good thing if you're using it to improve someone's life rather than simply trying to extract as much time and money from them as possible.
>Engagement can be a good thing if you're using it to improve someone's life
Can we agree that entertainment and being entertained can improve someone's life? If we can agree to that, then we can also agree that games designed to entertain an individual are good for said individual.
It's not a leap from there to say that a game that is engaging and keeps someone playing is good for them, because it's entertaining them, and that, as we've established, is good.
Once you take the step to make a product addicting, as a formal part of your business model, everything after that is just shades of grey, I believe. Morality and justification of your own truth is a fascinating process. Or maybe I'm overthinking this.
I wouldn't say you're overthinking it at all! You make an excellent point, and as a game developer it's something I have to consider. Our mission is to craft joy for our players, but I certainly cannot afford to rule out any potential in terms of designing our games to be more engaging.
On the other hand, I've witnessed many instances of gamers who have fallen victim to habit-forming mechanics: they continue sinking hours and dollars into a game daily but say that they no longer truly enjoy the time they spend in it. There are many aspects that play into that sort of behavior which games can be designed to exploit, not least of which being the sunk cost fallacy and fear of missing out: a player who has reached the endgame might have run out of interesting content to play through but feel obliged to continuously "defend their title". They're encouraged to keep coming back long after it stops being fun by being constantly presented with what appears to be an existential threat to the supremacy they have labored towards.
At that point I think we can safely say that such a game is no longer a positive impact on that person's life.
My way of attempting to ameliorate these sorts of conflicting interests is to structure our organization as a multi-stakeholder cooperative, in a way that gives players meaningful influence over business, design and development decisions.
The issue is these games stop being entertaining. Pulling the lever on a slot machine might be fun the first few pulls, but after that it’s simply waiting for a dopamine rush.
An MMO's basic gameplay, stripped of leveling, item drops, or any form of progression, has some fun aspects. But they need to tell players to collect 50 rat tails, because otherwise players wouldn't.
Mobile games have distilled this down even further, with the minimum amount of actual gameplay possible.
But again, splitting hairs into dopamine rush versus actual fun is simply semantics. Fun = dopamine rush. Does it matter if you are cognizant of the fun, or do the chemicals matter?
To be clear, I am 100% in agreement with you. Most mobile games (and honestly, most PC games at this point with their item and resource gathering mechanics; and I am absolutely talking about MMOs - there's a reason many many many franchises are working toward a multi-player experience instead of focusing on the single player game) aren't supposed to be 'fun'. They're supposed to be addictive, and I believe it is a real problem, especially for kids growing up learning that a dopamine hit is just one iPhone game away.
Anyway, though, I guess my point is - it's all semantics. When you say, it's not 'fun', it's just a dopamine rush, developers and sales people can say that's just the same thing. They can argue that people wouldn't play unless they received some kind of value out of it.
To be honest, I've forgotten where I started with all this, other than to say - when you have to split hairs on the definition, it leaves room for people to interpret their own meaning and ignore nuance. Therefore, this is an argument that people don't want to hear, engage with, or consider, I believe.
I don’t think you can say fun = dopamine rush. A dopamine rush works when the periods between rushes aren’t fun. However, when drunkenly singing drinking songs with your friends it’s overall a pleasant experience rather than having moments of happiness and long segments of boredom.
So what I am saying is gamers have mostly forgotten what it is to have fun in games. Playing around with cheese wheels in Skyrim is different from grinding a character to game breaking power and killing everything in one hit. Challenges based in getting better at the mechanics are different than challenges based on pure time investment.
IMO the three pillars of a great game are entertainment, fun/joy aka playing, and dopamine rushes. Portal 1 was a standout for having all three, but it’s hard to pull off.
> You want to stop them from playing through too much of the game's content and burning themselves out on it. So you lock them out with a timer, forcing them to come back later. Then you reward them for coming back every day.
Simultaneously offering a pay-to-play option that enables the user to bypass the timeout with money.
This is the key difference. These mobile games don't want you to wait and come back tomorrow. They want you to pay to remove the blocks and just keep playing, today, tomorrow, every day.
This is not _entirely_ true, not that it changes anything significant about the predatory nature of f2p developers.
They absolutely want you to come back tomorrow. Apparently (inferring from what e.g. Mihoyo says and does about "total lifetime income") a whale that spends a bit less at once, but keeps doing this for years is far more valuable than one that blasts through the game and burns out immediately. You also need some dolphins and free-to-play people, both to fill out the multiplayer lobbies and appear less predatory.
Yes... I build iOS apps for learning Japanese ( https://reader.manabi.io , https://manabi.io ) and recognized how the same kind of SRS system I was building is both optimal for learning and habit-forming in the same way as game dark patterns. It felt economically and morally fortunate but I recognize it’s not an easy or light responsibility to customers to get it right.
> Uh, it's interesting that you see the design of your product as a counterexample, because to me it looks exactly like what mobile games have been doing as a mechanism for maximizing engagement.
Addiction is a habit that interferes with other areas of your life. Smoking isn't bad because you're having something to do with your hands during a break, it's bad because it's expensive and impacts your health. Playing games for 8 hours is fine if you have time for that, but bad if it stops you from keeping your bathroom clean.
So yeah, intentional habit forming is using the same techniques as addiction building. The difference is, essentially, in the informed consent involved in the former.
Unrelated to the original thread - I find it interesting the way you approach this comment. You mention pricing almost as if you're apologizing. Your competitors, even if they are binge-focused, don't apologize for charging customers. They would never have an advert that says "we've got this great course, but before you go check it out, I'll warn you, it costs money".
I like your thinking of non-binge learning, and think you could really use that as a differentiator, in your marketing.
Your site looks great, and I really like the way you approach it, or describe it here, but that isn't coming through in your branding. Think Salesforce's "no software", they showed who the enemy was, and put them squarely against it, and if you really look at it, they were selling a CRM, not selling "no software", you're even closer to your product.
If you haven't yet, you may want to check out the book Play Bigger and category creation.
I appreciate it. We've struggled with communicating the non-binge aspect, and usually approach it from the other side: by talking about reviews, spaced repetition, remembering, etc. But I think you're right that approaching it from the "non-binge" side is a good idea. Coincidentally, we're about to do a major landing page revision, so the timing is right to give that a try!
As for apologizing for charging money... you're right there too. A lot of people will complain when someone dares to charge money for learning resources, even those that take multiple full-time staff to maintain. I think that's worn me down a fair bit!
I'd look at it from another perspective. Committing to learning an entirely new language is a big undertaking, so ideally you want to make sure that the resources you're using are of high quality. In that sense, it costing money is actually a good thing, since it signals that this is premium quality content and worth spending your time on. Of course this needs to be clearly communicated as such, like Brilliant.org for example (at least their marketing communication focuses on the quality).
Cool, I'm not a marketing expert, but I've had a bit of experience in the past. I'd be happy to give you my thoughts if you want to run it by me. Details are in my profile.
I don't know if I'd like that. I grabbed some of Maximilian Schwarzmüller's courses off Udemy a couple years ago and, by far, my favorite thing was being able to blast through the simple concepts that I already understood and to slow down on the new stuff.
I think a recommended pace that's easy to achieve would make a nice goal, but I would balk at the idea of it being enforced if I were paying money for it.
Ah, that looks really neat! I have a couple minor pieces of feedback after looking at the homepage. The graphic for "Review Exponentially" [1] was a little difficult to grok initially. My first thought was that time was flowing downwards. I think a labeled arrow indicating the flow of time would be helpful, perhaps also a label to indicate that the numbers at the bottom represent days (instead of relying on me to understand that implicitly because one of them is labeled "Day 4: Lesson").
Also in the "Course" section I think you should include a link to "All Courses" because I almost bounced because none of those three courses were interesting to me (but I eventually found the list at the bottom, and SQL is a topic I'm interested in solidifying further).
Lastly I think it would be great to have a sign-up list to be notified of new courses (I do realize that perhaps if you sign up for a free account you _might_ also get information about new courses, but that doesn't seem fully certain).
Your product has a feature that curtails 'addiction' for the benefit of the user - ie to improve knowledge via spaced repetitions. A knowledgeable customer will recommend you and come back for more. Great!
Social media requires time from those users - they want to know you, crack you open psychologically, so they can then be better at selling you stuff (and, incidentally, pass all that info on to 3 letter agencies for their population modelling etc). That is a different model. They want you to be deeply engaged for a long time. The longer the better.
Interesting. I was recently reminded about the "Pomodoro" method ("Pomodoro" = 25 minutes on, 5 minutes off, 1 "set" is 4 Pomodoros followed by an additional 30 minute break) - it might be interesting to have a learning platform with a Pomodoro timer baked right into the site.
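The numbers above translate directly into a schedule. A minimal sketch of one Pomodoro set (the function name and structure are just for illustration):

```python
def pomodoro_set(work=25, short_break=5, long_break=30, pomodoros=4):
    """Build one Pomodoro set as (label, minutes) pairs:
    work blocks separated by short breaks, then a long break."""
    schedule = []
    for i in range(pomodoros):
        schedule.append(("work", work))
        # Short break after each pomodoro except the last one,
        # which is followed by the long break instead.
        if i < pomodoros - 1:
            schedule.append(("short break", short_break))
    schedule.append(("long break", long_break))
    return schedule

sched = pomodoro_set()
total = sum(minutes for _, minutes in sched)
print(total)  # 145 minutes per full set: 4*25 + 3*5 + 30
```

A site-embedded timer would just walk through that list, locking lesson content during the break entries.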
I don't think he is judging the way in which you learn.
He's simply referring to the current state of research like:
Putnam, et al. (2016). Optimizing Learning in College: Tips From Cognitive Psychology
There has been lots written on learning strategies but one thing we are rather certain of is that cramming is usually the worst way to retain information (if you want to learn it well). You can even read about funky neuronal reasonings for that argument if you are interested.
Expanding your thought a bit: for example, for people with ADHD-PI, cramming may in fact be the only way in which they can learn the stuff they're not currently being obsessive about.
That said, you can't optimize for everyone simultaneously. There's plenty of cramming-friendly resources available.
I don't like the current trend of social media apps burying users under push notifications, calls to action, and other engagement hooks.
However, there's a second, parallel problem adding fuel to the fire: The more we talk about overindulgence in social media (or Netflix, or video games, or fast food) as an act perpetrated by evil corporations on us helpless individuals, the less sense of individual agency we give ourselves. I'm not suggesting that we let social media companies off the hook, but battling this problem is going to require more than simply shaming them in Medium posts. We have to start reminding people that they are in control of their decisions, and that they can take steps to reduce their social media usage to healthy levels.
I know the common refrain is "Delete Facebook!" but that's the equivalent of abstinence-only education. We need to start talking about how to configure Screen Time on iOS, or how to use Facebook's built-in tools to hide content you don't want to see. We also need to encourage people to take control of their feeds, muting users and topics who draw them into unproductive discussions.
> I know the common refrain is "Delete Facebook!" but that's the equivalent of abstinence-only education. We need to start talking about how to configure Screen Time on iOS, or how to use Facebook's built-in tools to hide content you don't want to see.
I see it more akin to the opioid crisis. Just like drug manufacturers shouldn't be pushing opioids as a way to deal with minor pain and depression because they know it hooks users, maybe social media companies shouldn't be pushing hateful and outrageous content to hook their users.
You can sing about personal responsibility all you want. But these companies pay scientists millions of dollars a year to come up with ways to keep you hooked. The only way to win is not to play the game. Normal people are seriously outgunned here.
This is quite right. PragmaticPulp's phrase "an act perpetrated by evil corporations on us helpless individuals" is a bit of rhetorical jujitsu, creating a strawman to set up a rallying cry "I believe people have agency!" You don't have to assume that individuals are "helpless", as if they are generally helpless and lacking agency, to agree that these corporations are exploiting them.
That said, people's agency is limited (bounded rationality). Awareness of how one is being manipulated is not evenly distributed among the population (asymmetric information). And even when there is awareness, people are unevenly affected by it and unevenly empowered to deal with it.
It’s not just normal people who are outgunned — I am tech-savvy and have had an iPhone since the day they were released. I know how to modify notification settings, but there’s nothing I can do preemptively when Twitter can simply add a new category of notifications, as they did the other day. I had to go into the notification settings within the app and disable this new category, and then do it again for each of the three accounts on my phone.
Similarly, Lyft makes it nearly impossible to figure out where to turn off the coupon notifications. And don’t get me started on LinkedIn...
This isn’t just an issue for normal people. It’s also time-consuming and troublesome for anyone short of the dev who built the dark pattern.
Why would you have notifications turned on for Twitter? Just turn them off. The trick is to intentionally go on social media and waste time, instead of being pulled in by a notification.
I use Twitter for my startup and want to get mention and retweet notifications. I turn off all others, and as a result I get only a handful of notifications per week.
What annoys me is new categories of notifications that Twitter turns on by default, even for users who have disabled all of these novel/generic/impersonal notification types.
> This is quite right. PragmaticPulp's phrase "an act perpetrated by evil corporations on us helpless individuals" is a bit of rhetorical jujitsu, creating a strawman...
How is this a strawman? The article is absolutely this inflammatory, if not more so.
> Reddit, for example, has always obfuscated the true Karma score (“to prevent vote brigading”)
There is no reason to believe otherwise. Everyone from YouTube to cheat-bot detection services uses the same strategy.
> To the contrary, the Ministries of Truth, Peace, Love, and Plenty will all be private (or publicly traded) entities.
Basically, the conclusion is that we're inevitably going to turn into 1984. There isn't even a suggestion of how to change the situation.
And, there's only so much time in the day. Nobody can afford to keep up to date with every possible threat the market throws at us, and to stay vigilant and proactive. You end up picking your battles.
these companies pay scientists millions of dollars a year to come up with ways to keep you hooked
Exactly like tobacco companies had scientists on the payroll to boost addiction. That is the best and most accurate analogue for Facebook I believe, it’s like smoking, it’s addictive and it externalises its harmful effects.
Can you provide some reading material about the scientists that get paid millions of dollars to keep you hooked? This is the first time I’ve heard of scientists involved.
> He was hired to work at Facebook as a quantitative social psychologist around November 2015, roughly two months after leaving GSR, which had by then acquired data on millions of Facebook users.
> Corey has been working as a quantitative researcher at Facebook since last summer. His growth research team has “two sociologists and a manager trained in communications with a sociologist as an advisor,” according to an article he published early this year. The team helps expand Facebook to developing countries. Corey uses R-based software stack, collects data via Hive and uses a few other coding languages to do his job.
> We now know that’s exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.
> This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
I feel the downvotes were unwarranted. (Not accusing you of downvoting.) Perhaps the user isn't familiar with the industry. I know I'm out of my league with most of the content posted here.
> the common refrain is "Delete Facebook!" but that's the equivalent of abstinence-only education. We need to start talking about how to configure Screen Time on iOS, or how to use Facebook's built-in tools to hide content you don't want to see
To this list I'd add the concept of epistemic hygiene. Just as crowding together in metropolises re-wired our culture to value hygiene, these informational metropolises of social media will eventually cause us to greatly value epistemic hygiene. Seeing a rage-inducing headline would then evoke a kind of "ew..." response. At least, this is what I hope will happen. Maybe it will require a generational turnover.
It's less about screen-time or self-control and more about toxic, malicious software used for communication.
A lot of the calls-to-action used to generate "engagement" are the same calls to action used for legitimate communications, and the only way to tell is to "engage" with the product.
There's nothing wrong with people opening their social media app if they receive a real message from a friend. The problem is when the platform is incentivized to "manufacture" messages even when there aren't any.
>I know the common refrain is "Delete Facebook!" but that's the equivalent of abstinence-only education
Not at all; it's more like breaking up with a toxic, manipulative, dishonest partner. There are many more fish in the sea and many more ways to healthily socialize than just Facebook. Though I suppose people in abusive relationships often have a sort of Stockholm syndrome where they see no alternatives.
Not only do people in abusive relationships often have Stockholm syndrome; many people are actually materially dependent on their partner (for example, women who have stayed home a long time, historically but also still today) and thus literally cannot leave. Or they have children, or leaving might put others at risk.
Point of working through the analogy being, even on an individual basis framing leaving a relationship as some sort of arbitrary choice is kind of nonsensical, in particular if there is a power imbalance between the people in the relationship, to the point where staying in an abusive relationship might be a 'rational choice'.
Which is actually why we've created very elaborate laws and customs surrounding marriage rather than telling everyone "well if you don't like it just leave".
This is why we need to promote the existence of more and better alternatives. Because "if you don't like it, leave" doesn't work if there is nowhere else to go. But if there is, it does.
>they literally cannot leave. Or they have children, or leaving might put others at risk.
>Point of working through the analogy being, even on an individual basis framing leaving a relationship as some sort of arbitrary choice is kind of nonsensical
Could you give some common examples where an individual "literally cannot leave" FB, or where leaving FB would put others at risk?
The best example I could think of is a small business owner losing potential money from taking their business page off FB.
I can give you a few direct examples. I live in a condo. Our entire building management runs through a facebook group. My girlfriend works with kids whose schooling right now is managed through whatsapp. I know quite a lot of people who are required by their employer to maintain a facebook presence because that's how they get into contact with a lot of clients. For a lot of folks FB and affiliates have become part of the basic digital infrastructure.
>I know quite a lot of people who are required by their employer to maintain a facebook presence because that's how they get into contact with a lot of clients.
Being compelled to mix business with personal life like this seems quite objectionable indeed.
> We need to start talking about how to configure Screen Time on iOS, or how to use Facebook's built-in tools to hide content you don't want to see. We also need to encourage people to take control of their feeds, muting users and topics who draw them into unproductive discussions.
This is going to make a world of difference the first time people try it. I started monitoring screen time and configuring notifications about a year ago, and it made me feel a lot better about the time I spent on my phone.
But, all of this assumes that social media should (and will) stay the same as it currently is with addictive never ending feeds and endless notifications.
At the end of last year I sat down thought of what social media could look like if it was designed to be used less, and in a way that would add value to people.
For me, that looked like intentional written reflections, shared with a small circle of close friends and family. To keep me from scrolling, there would be no feed. To keep me from checking my phone, content would rarely be surfaced.
This ended up becoming Sundayy, a mindful social network that you can only check once a week (on Sunday): https://www.sundayy.app
Each day you're prompted to slow down and reflect, but there is no feed of reflections. They're all kept secret, for now. At the end of the week, on Sunday, reflections are revealed. It's a more intimate insight into how people close to you lived their week - day by day and in their own words.
We should definitely bring more awareness to monitoring screen time and curating feeds, but we should also question whether we should have to do any of that at all :)
Yeah look, they probably will be, at the start. It's tough to move away from FB, especially when it's designed to keep you there.
When I deleted FB from my phone, I looked around, and there really wasn't any alternative social media platform that didn't have the same setup - ads, feeds, notifications.
I really just wanted to put an alternative out there, so people could find something else if they're in my situation.
Sundayy isn't just challenging FB, but it's challenging the unconscious association between "social media" and "endless feeds", "ad revenue model", and "hijacking notifications".
I long for a day when social media isn't rooted in any of those things.
> The more we talk about overindulgence in social media (or Netflix, or video games, or fast food) as an act perpetrated by evil corporations on us helpless individuals, the less sense of individual agency we give ourselves.
If it's done well, it should have the opposite effect. Describing all the ways that companies are trying to get you addicted will help inoculate people against their tricks. In order to psychologically defend yourself, you first need to understand exactly what you're defending yourself against.
>We have to start reminding people that they are in control of their decisions, and that they can take steps to reduce their social media usage to healthy levels. I know the common refrain is "Delete Facebook!" but that's the equivalent of abstinence-only education. We need to start talking about how to configure Screen Time on iOS, or how to use Facebook's built-in tools to hide content you don't want to see.
I couldn't agree more. To add to that: if you're trying to convince people that your point is worth listening to, it's usually foolish to treat them as hapless automatons without any agency. Look at two of the worst recent political failures in the UK, the Remain campaign in the 2016 Brexit referendum and Labour's 2019 election campaign. What they have in common is that they essentially told people "you're a downtrodden proletariat buffeted about by forces well outside your control, but we can make things better for you". Whether or not it's actually true, the average person likes to think they're in control of their own destiny, so blaming everything on Facebook being manipulative bastards will never work if your aim is to change the public's relationship with social media.
Yeah it might be true, but it's extremely counterproductive to point this out when you're trying to convince someone of something. People like to feel like they have agency, even if it's mostly an illusion.
I agree! The economic effects of Brexit are difficult or impossible to measure, so they're not worth worrying about. On the whole, Brexit was good for British people because it gave them a feeling of action and momentum, like they had a real "hand in history." Britain has never been as energized and optimistic as it has been post-Brexit.
In the same way, I bristle at the suggestion that I'm not rational enough to resist the "addiction" of push messaging. We all know what the word "addiction" actually means. This is not addiction, this is just hokey phooey using fake-medical language to push a liberal agenda of extra regulation.
Every time I get a notification from Twitter or Facebook, my day gets a little brighter. When LinkedIn tells me that someone is looking at my profile, that means someone cares, and that's a wonderful thing to know.
Your experience is far from universal. Twitter has made my life measurably worse: I'm less happy, have less free time, react less charitably to people I disagree with. I can look through my comment history and identify the periods where I was most active on Twitter, because I'm constantly flying off the handle at people for no good reason.
I would. Every morning, I wake up saying I'm going to log off Twitter for the day as soon as my coffee's done, and most days I end up logging multiple hours of Twitter time.
What helped me was actually deleting my account and after that, not feeling like creating a new one.
I still peruse the feeds of some people I used to follow, but now, instead of doing it compulsively every hour or whenever I need my dopamine, I do it once a month or so, if I remember at all. I don't have them bookmarked, so I have to type out the full URLs.
And if Twitter says I need to log in to read more of a thread, or whatever, too bad. I don’t have that thingy that you use to log in to Twitter.
Now, Facebook is a different story, since it's a very walled garden: by default you can't even read most of the content unless you log in. Trouble is, there are people on it I interact with, and for some of them Facebook is the only way to reach them.
And another thing is Hacker News, of course.
See, those are places where stuff happens. You go there, scroll to what interests you, engage in a discussion, and it almost feels like meeting people again, especially in a pandemic world.
Or I just want to feed my brain with new stuff to get that sweet dopamine.
I think what could change this addiction is training the brain to release dopamine as a reward for engaging in more immersive, time-consuming activities: reading more long-form, in-depth articles or books; watching a two-hour movie instead of the usual 15-minute YouTube fodder; getting that side project to a usable state (starts crying).
This is all just personal anecdote. It might not apply to everyone, but it has worked for me...
The options seem to be:
1) Delete your social media account
2) Set up timers (OS-level, account-level, etc)
3) Filter your feed to only the people who matter
The first two options didn't work for me. I created a new account and deleted the timers. The first option left me feeling excluded. The second option just turned SM into a drip-feed, making me check whenever the timer was up. If they work for you, great!
The option that worked was to filter Facebook and Snapchat to only show people who personally mattered to me in the physical world. I only see things about people I come into contact with and care about. I know them well enough to know the whole story, rather than taking what they post at face value. Their posts can encourage conversations rather than make me feel bad about some cool thing I'll never do or know more about.
Social media is a tool: used effectively it can benefit you; used ineffectively it can harm you. I hope schools of the future, or some person or institution, will teach effective social media use. Or perhaps we can require social media companies to follow regulations regarding user wellbeing.
We’re just witnessing advertising’s race to the bottom. Eventually everything published is questioned because all forms of media are incentivized to push drivel for eyeballs.
The optimist in me sees how ridiculous and blatant the drivel is becoming and suspects we'll achieve some sort of collective enlightenment before we drive ourselves extinct. A smarter media then emerges (timing is everything here), eschewing advertising dollars in favour of a more consumer-friendly model.
Paid services to align interests with the user (best user experience wins) are the way to go and I really hope they win out in the long run. I also think individual driven filters and algorithms are the way to go. In my experience, heavily AI/ML driven algorithms for user experience (think spotify suggestions) lead to overfitting super quickly. ML assistance can be amazing (obviously), but I know I want some degree of manual control on my algorithms.
Another plug[0], but I'm hoping to be a part of this solution. It's just a personal project right now, but I'm building a search website whose end goal is to be ad-free (paid) and to let users create filters that remove any of the countless trash websites that SEO their way to the top of Google.
Currently I've got it working for web search, with a few filters I've created for myself (they remove a ton of websites I never find value in: some blogs, news sites, Pinterest, etc.).
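As a sketch of what such a user-created filter might look like (the function and blocklist names here are hypothetical, not the actual project's code), a minimal post-processing step could drop results whose host falls under a blocklisted domain:

```python
from urllib.parse import urlparse

# Hypothetical user-defined blocklist; the domains are illustrative.
BLOCKLIST = {"pinterest.com", "spamblog.example"}

def apply_filters(results, blocklist=BLOCKLIST):
    """Drop any search result whose host matches a blocklisted domain."""
    kept = []
    for result in results:
        host = urlparse(result["url"]).hostname or ""
        # Match the domain itself or any subdomain (e.g. www.pinterest.com).
        if any(host == d or host.endswith("." + d) for d in blocklist):
            continue
        kept.append(result)
    return kept

results = [
    {"title": "Useful write-up", "url": "https://example.org/post"},
    {"title": "Listicle #47", "url": "https://www.pinterest.com/pin/123"},
]
print(apply_filters(results))  # only the example.org result survives
```

Matching on parent domains as well as exact hosts catches `www.pinterest.com` along with `pinterest.com`; a real implementation would also need per-user filter storage and probably pattern rules.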
> Paid services to align interests with the user (best user experience wins) are the way to go
I agree; aligning interests is important.
> I really hope they win out in the long run
It's hard for me to see how they can. Advertising supported services are free to the user, which improves take-up which means that network effects are in their favour.
Hope they can and believe they will are definitely not the same in this case. I do hope we get some great competition for paid alternatives in all sorts of categories, but I agree that it's a long shot for these types of services to win in most markets.
Ad supported free services will (unfortunately) always dominate network effect driven spaces.
Abstinence-only education fails for sex, because a) sex is a built-in human drive b) sex is beneficial in lots of ways, and c) there are lots of ways to have sex safely.
Abstinence as a strategy is not in and of itself flawed. The way lots of people cope with the dangers of heroin is simply by never trying it in their whole lives. Lots of addicts adopt a strategy of permanent abstinence.
Some things are very harmful. I don't think we all need to try them all, just to make sure.
> We have to start reminding people that they are in control of their decisions
Exactly. I'm not a big FB user, but last year I was on a lot more than normal because I was home. Then one day I realized I was just either arguing with family or reading things that left me disappointed, and wondered: why am I subjecting myself to this? I didn't delete my account, because I still use Messenger to communicate with a few people, but I haven't been on FB proper for months.
> I know the common refrain is "Delete Facebook!" but that's the equivalent of abstinence-only education. We need to start talking about how to configure Screen Time on iOS, or how to use Facebook's built-in tools to hide content you don't want to see.
Isn't this also denying people their individual agency, though? Using Facebook is not like sex is for teens, who are faced with the rather unavoidable biological realities of puberty. There are many ways to achieve the things people seek from Facebook, some technological and some not.
We don't need to start from the assumption that people won't be able to commit to healthier media consumption habits. These tools offered by Apple, Facebook and Google that you mention are not things we should be encouraging: if these companies had users' best interests at heart, their products wouldn't need these kinds of sub-features in the first place. When you study the gambling and nicotine industries to figure out how to better hook users on your mobile apps, you didn't start from the right place. So I reckon the answer is to abstain from the product entirely.
I think our built environment and lack of engaging community are the main reasons self-control is at a disadvantage. We need the trust and support of others, we need it daily, and we don't get enough of it to make significant progress against many addictions. Changing our built environment is a start toward improving our connections to the local community, and with them the continued accountability we need when facing addictive influences.
It's shockingly easy to delete Facebook. I haven't used it in years, and it almost never comes up. They have done a masterful job of making you feel like you cannot live without them through the careful application of dark patterns, but I assure you that not only do you not need it, but once gone, you also will not miss it.
> I know the common refrain is "Delete Facebook!" but that's the equivalent of abstinence-only education. We need to start talking about how to configure Screen Time on iOS, or how to use Facebook's built-in tools to hide content you don't want to see. We also need to encourage people to take control of their feeds, muting users and topics who draw them into unproductive discussions.
Those tools are provided by exactly the same companies that manufactured the addiction in the first place. They can be revoked or changed the second they show signs of actually being effective.
Sorry, but this is like asking your drug dealer to help you with becoming clean.
I'm not sure empowering users in this way would be all that effective. It would be like trying to empower people to avoid opioid addiction. It's hard to set and stick to limits on something that is designed to addict you.
> the less sense of individual agency we give ourselves
In aggregate, the population is at the mercy of material forces. Turning a social problem into an individual moral failing has never managed to solve anything at scale.
Apps and websites often don't honor their own configuration choices. They use loose definitions to slide in advertisements or other unwanted information. They don't give you the granularity to decide what types of information you actually want, and they change the options after you've taken the time to get them sort of working for you.
You’re not going to give people agency by having them press a button provided by someone else. The button doesn’t really do anything. “Cold turkey” is really the only solution to these mind games.
This is a good point. Feeds should be available in well-defined programmatic formats (such as RSS or ActivityPub) so that they can be processed outside the domain of the app whose data carries them; if necessary this should be enforced by law. This would make it a lot easier for users to consume those feeds in ways that they control, and not that Big Tech controls.
Perhaps a solution is that we separate the delivery of social media posts from the company that produces them. (Just like email can be received in a client that is not run by e.g. companies that send spam).
This means that we can teach our Social Media Inbox (abbreviated here SMI) that we don't like certain messages, and the SMI will remove them from the feed.
The trick here is that the SMI has its incentives aligned with the user, not the social media companies. So there is no incentive to make us addicted.
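A minimal sketch of the idea, assuming posts arrive as an RSS-style feed (the feed content and the `inbox` function are invented for illustration; a real SMI would learn the disliked phrases from user feedback rather than hard-coding them):

```python
import xml.etree.ElementTree as ET

# Toy RSS feed standing in for a social-media outbox; the items are made up.
RSS = """<rss version="2.0"><channel>
  <item><title>Photos from the family trip</title></item>
  <item><title>You won't BELIEVE this one weird trick</title></item>
</channel></rss>"""

# "Learned" dislikes; in a real SMI these would come from user feedback.
disliked = {"weird trick", "sponsored"}

def inbox(feed_xml, disliked_phrases):
    """Return only item titles that contain no disliked phrase."""
    titles = [item.findtext("title") or ""
              for item in ET.fromstring(feed_xml).iter("item")]
    return [t for t in titles
            if not any(p in t.lower() for p in disliked_phrases)]

print(inbox(RSS, disliked))  # ['Photos from the family trip']
```

Because the filtering runs in the user's client rather than on the platform's servers, the platform has no way to override it, which is exactly the incentive alignment described above.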
> battling this problem is going to require more than simply shaming them in Medium posts. We have to start reminding people that they are in control of their decisions, and that they can take steps to reduce their social media usage to healthy levels.
Yes, but that's only going to have a small effect, at best. Just like you can't solve the obesity crisis by telling people to eat less.
Things are getting more addictive (http://www.paulgraham.com/addiction.html) because companies, under technology and capitalism, form weak superintelligences capable of building things that are ever more addictive. So asking people to use willpower to overcome that is going to fail more and more often.
I think the only thing that could succeed is government intervention.
> I know the common refrain is "Delete Facebook!"
If FB and other social media providers were required to federate using ActivityPub, then people would be able to delete FB and still have access to their friends and contacts on it.
> We also need to encourage people to take control of their feeds, muting users and topics who draw them into unproductive discussions.
With ActivityPub it would be a lot easier to build applications that allow them to control and curate their feeds for themselves. The user is back in control.
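As a sketch of what client-side curation could look like (the outbox data below is a made-up, local stand-in; only the field names `orderedItems`, `actor`, and `OrderedCollectionPage` follow the ActivityStreams vocabulary), a client could simply drop activities from muted actors before rendering the feed:

```python
# A tiny local stand-in for one page of an ActivityPub outbox.
outbox = {
    "type": "OrderedCollectionPage",
    "orderedItems": [
        {"type": "Create", "actor": "https://example.social/users/alice",
         "object": {"content": "New garden photos"}},
        {"type": "Create", "actor": "https://example.social/users/troll",
         "object": {"content": "Hot take engineered to make you angry"}},
    ],
}

# The user's own mute list, stored client-side.
muted_actors = {"https://example.social/users/troll"}

def curate(page, muted):
    """Keep only activities from actors the user hasn't muted."""
    return [a for a in page["orderedItems"] if a.get("actor") not in muted]

for activity in curate(outbox, muted_actors):
    print(activity["object"]["content"])  # prints only Alice's post
```

The mute list lives with the user, not the platform, so no single provider can quietly un-mute things to drive engagement.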
Adam Curtis's The Century of the Self is a good (but long) documentary on the advances in psychology and its merger with advertising and politics. Worth a watch IMO.
A tangential point about the horrifying power of euphemism: I've done some consulting work with some large biopharmaceutical companies in the past and found myself consistently shocked at how effective euphemism was at making people comfortable talking about, and doing, very questionable things. E.g. "Maximizing treatment" = "extend how long a patient requires our medication". This was at a senior level and these were, on the face of it, warm, caring people having a very comfortable and open conversation. I had always assumed decisions that directly disadvantaged the consumer would look different, with some sense of secrecy or at least awareness. Nope, one level of language abstraction is apparently all it takes.
There are countless other examples out there. I just finished "Cruel Britannia" regarding the torture practices of the British over the last century. They were particularly adept at it. Euphemism is such a powerful tool for doublethink and systemic abuses of power.
I feel like there's basically two business models in the world - you can be a baker, in which you try to create a product that customers will want on its merits, and work to make the best possible customer experience, or you can be a crack dealer. I think a lot of people think they're bakers, but you gotta realize, the moment you start sprinkling crack in the cookies, you're not selling cookies anymore.
There's a family-run bakery near my house that specializes in cheesecakes. They make the best cheesecakes I've ever tasted. They only make a certain number (~200, say, I dunno really) per day, they always sell out, then they close for the day. Every year they take a two-month vacation. That's it. That's the whole business.
Anthony Mangieri was downtown for a few years here in SF before he went back East. https://www.unapizza.com/about I got to eat there one time. Sublime. Unbelievable.
All he does all day every day is pizza.
Something like that (and a good accountant) and you're set for life, eh?
The small things preventing humanity from murdering, raping, enslaving, and torturing one another. It's easy to forget that these exist and that everyone must always be on guard, defending the world against one's own baser instincts and defending oneself against humanity at large's baser instincts. As we collectively forget, or consciously ignore, our duty to one another, we mortgage humanity's future with collective suffering today. Eventually the bill for these ills comes due; one way or another, the sociological debt is always repaid.
Indeed. That's why I'm worried when I see interpersonal relations being replaced by market rules. Markets have no morals, principles and ethics by default, and generally consider these factors to be inefficiencies.
>Markets have no morals, principles and ethics by default
That's not exactly true. "Markets" are made of people, and the morals, principles and ethics of each participant imbue any given market with certain characteristics. The market doesn't reflect the exact morals of any one particular actor, but it is nevertheless a product of them. Think matrix multiplication.
For examples, look at Roman civilization, Jewish civilization, and the meeting of the two. Or look at The Wire.
I agree that the "morals, principles and ethics of each participant imbue any given market with certain characteristics". I was highlighting that these are incidental, not structural, in the market. And my main point is: these characteristics are not stable. More than that, they are actively being diminished, ground down by the market - and that is a structural aspect.
This happens through competitive pressure. If, as a market player, I do something slightly shady to get an edge, I'll get ahead and my competitors will either have to do the same or risk losing. When enough market players go along, the slightly shady thing becomes a baseline, "standard business practice", and morality in the market is that much eroded. Then someone else starts getting ahead by being even more shady, and the process repeats.
(Ironically, we often call this chipping away at ethics/morality "innovation".)
> The small things preventing humanity from murdering, raping, enslaving, and torturing one another
nothing, except for the threat of punishment from the law - which mostly works.
Why do you think a country where law enforcement has failed (such as a failed state) breaks down into chaos so easily? Where are the "Principles, Morals, Ethics" then?
> nothing, except for the threat of punishment from the law
You’ll be happy to know (I hope) that plenty of us (perhaps even the majority!) don’t have any intention to murder and rape, and it’s neither law nor religion which compels us to be minimally decent towards others: https://youtube.com/watch?v=AwebTX3rk3E
As far as I know, landlords are not known for possessing any of the qualities you list. Indeed, what is special and unusual about this situation might not be the bakery itself, but the landlord of that particular bakery. There might be thousands of such potential bakeries all over, but for the fact that the profits are extracted by the landlords, and this might be the normal state of things. We would have no way to tell.
Market rate depends on what the buyers will bear, on aggregate. Given that rents are sticky - it costs a lot to move yourself, and even more to move a business - the landlord could eat into a lot of their margins, if the landlord was so inclined.
And it also costs a landlord when a business moves. How likely would they be to find new tenants if the previous tenants mention that they were forced out by an above-market rent increase? You're tilting at windmills.
I like analogies, but I really don't think this one illuminates anything.
The US bread industry has been slowly jacking up the amount of sugar in their products for decades. That's because almost all large-scale food producers also want their product to be as addictive as crack. Maybe there are "bakers" who are more honest than that, but as a category I don't think there are enough to base a useful analogy on.
You could just work with the analogy, if you wanted.
"Food science" is all about engineering food to be addictive. Which, sure, it's what 'consumers want', but it isn't what people need.
The line between sprinkling MSG on corn chips fried in cheap seed oils, and sprinkling crack on cookies, is a fine one.
When I hear "baker" I picture a millennia-old activity carried out on a human scale. I don't picture an enormous factory churning out least-common-denominator starches, jacked up with sugar, seed oils, and dough conditioners.
Sure, they don't bake most of the bread. But that is clearly what OP was referring to, and you don't have to refuse to get their point.
It's not, though - that's the point. You're either a baker or you're a crack dealer - you don't get to be a baker that just sells a little crack, that's not how it works.
I work for a very reputable company who produces high-quality content serving vital public interest. We still use "user engagement" as a metric in a mostly positive way. But we also aim to have content "go viral" and breed daily habits and we promote our content heavily on social media. We still have to pay the bills.
Can you give a few examples of "baker" products which have scaled really large? I always see it as: everybody starts as a baker, but over time it becomes a sliding scale.
Apple hardware/software, Microsoft Windows, Intel hardware, Netflix (buffet TV shows), Amazon web store (logistics), Etsy store, Steam / GOG Games - all of these (except Netflix) are selling* either hardware or software, to you, once.
Netflix is a subscription to access and stream a lot of shows, but it is "fee for subscription service", not all that different from an alarm company subscription, OnStar, or a subscription lawn maintenance company.
* Ok, Steam / GOG Games or Microsoft Windows is a licensed use with limited conditions, it's not like you can use it on just any hardware
I would not put Netflix in this category as there has been extensive coverage on how they try to get as much time spent as possible. Autoplaying content at the end of one show is a prime example, and how that time was optimized down to be incredibly short.
Short version: Subscription retention and time spent correlate, so they still have the "crack dealer" model incentives.
Most large e-commerce sites including Etsy aren't bakers. They use recommendation engines, personalized site experiences, email marketing campaigns, discount coupons for repeat purchases, dark patterns, offsite re-targeting ads and so on to get you to buy and buy repeatedly.
edit: And the ever fun one of free shipping with a minimum order of X, great way to make you think you're getting a deal by giving them more money.
I'd consider the auto industry to be a "baker" industry (for the most part). Ikea's a "baker", GAP is a "baker". A lot of the pre-digital world made pure play baked goods.
>A lot of the pre-digital world made pure play baked goods.
Coupon codes on your receipt, planned obsolescence, strategic item placement in store to encourage buying additional or more expensive items, loyalty programs and points, physical marketing mailings, targeted pre-internet advertising, etc.
Careful with fashion. Fashion trends are manipulated to keep the dopamine flowing if and only if you update your wardrobe every season. All the magazines, shows, etc... it's not entirely innocent of the problems we're discussing.
To piggyback on this: is it possible to continue being a baker at scale without pricing most people out of your product?
i.e. is it possible to provide something like (internet search) or (social networking) that relies on a paid model? Not everyone who needs/uses google or facebook can afford an iPhone.
Social media is a perfect example of where this is used. I know countless people that are literally addicted to it. They want to see every new update continuously and the social media apps are designed to do this through notifications.
I saw this clear as day when I was at jury duty. We were waiting for the court to get back into session, and a girl sitting next to me on the bench would open her phone about every 5 minutes and go through her routine: pull up Facebook, scroll through it, close the app, then pull up Instagram and scroll through. She did this consistently for the entire 2 hours we were sitting there (the court was delayed). If that isn't addiction, I don't know what is.
Then look at all the mental health problems young people are having these days - bullying, depression, suicide etc. and I would say a big part of that is influenced by social media. People see other people living the "good life" and they get depressed because they can't have the same. Young women see "pretty" women on Instagram and they know they can never compete with that, so their self esteem drops to nothing. Then you have all the bullying that goes on as well. Completely toxic environment.
A few years ago, I realized that one of the reasons I felt tense and anxious and overwhelmed all the time was because I had a huge backlog of mental processing. It was like I had all of this sensory and information input coming in on a conveyor belt and not enough hands on the factory floor to go through, box up the useful stuff, and throw out the rest. So I just felt it all piling up in my head as this vague sense of "too much to do".
It finally clicked for me that the reason was because I was constantly running that conveyor belt. Every second I had some downtime, I got my phone out and read a bit more social media. Not only did I keep stuffing more in, but I never set any time aside to process.
So I decided to break my social media phone habit. I moved Reddit, Facebook, and Twitter off my home screen, and trained myself that when I had a few minutes of boredom, I would just let myself be bored.
Oh my God, I can't tell you what a difference it made. I finally felt like I could manage my own mental load. Instead of constantly having all of this news that I didn't know what to conclude about, I could just work through my thoughts, sort it out in my head, and put it away.
Being bored is amazing. We should all do a lot more of it.
As someone who has ADD, you just described my brain. For me, it's slightly better if I don't spend time on my phone, my brain does still get overloaded quickly though :)
A girl was sitting next to me on a train. For the whole one hour journey she scrolled and switched between apps in a spasmodic manner. At times she would lay the phone down for a few secs then resume at once.
It looked sick. It looked like someone on amphetamines.
It’s not exactly the cigarette; it’s smoking right behind the fence where all the bitches and bullies hang out. Just walk further and smoke in nicer places, like HN (coughs and spits on JS)
> Then look at all the mental health problems young people are having these days - bullying, depression, suicide etc. and I would say a big part of that is influenced by social media.
Guess who has suffered through most of those, without any kind of social media, outside of HN :)
Edit: And suffered before getting on HN. Seems relevant to the situation.
Well, you know, it could also be because the youth rack up student debt, can't afford to buy their own houses, are expected to slave away their lives in sub-minimum-wage jobs, etc.
Which is why I don't get why smartphones and social media are almost solely blamed in these kinds of discussion.
We've had COVID for a year now, wealth gaps are getting bigger, etc.
Couldn't "seeming more fragile these days" be partly copycatting? If person X got offended by something, then these addicted-to-social-media types copy the behaviour... and then some.
If a service is monetized with ads, then the user is the product. We say that a lot but I think a lot of people still don't understand it, especially outside the tech sphere. "Engagement" isn't about building a better service; it's about serving more ads. "Driving user engagement" should be seen as synonymous with "psychologically manipulating users to use the service so they see ads".
Why not “providing value to users so they use the service so they see ads”? A lot of providing value is marketing and engagement. It can be manipulative and it can also be beneficial to the user - both should be accounted for.
Ads pay for a service that allows you to video chat on a reliable and private connection with anyone in the world, for free. They also pay for google search, which has immense value to you.
They also connect consumers with products they might want. Most advertising is simply making sure people know about a product if/when they need it, no deception required.
Because people tend to drop you and forget about you once they realize you provide nothing of value but clickbait and doomscroll fodder.
I'm not so sure about that. What value does Facebook offer that isn't done better by a dozen other platforms? Most people I talk to openly admit it provides them with no value, but they still idly scroll through their feeds day after day and can never bring themselves to ditch the platform. Simply put, they're addicted. Facebook is widely hated yet wildly successful because they have mastered psychological manipulation.
Speaking for myself as a light FB user in the US, I get a ton of value from seeing my relatives' updates. Otherwise, I would have no connection with the next generation of babies in my family.
I generally find this line of thinking problematic: conversations with a homogenous and small group -> generalization based on an interpretation of their interpretation of their experience.
Also objectionable but on the other side: people's revealed preferences for how they spend their time are better indicators of what they find valuable.
Neither explanation is particularly compelling except as confirmation bias IMO.
I was providing a quick and simple example, not making a generalization about all Facebook users. Just as my anecdotal evidence does not necessarily demonstrate an addiction problem across all Facebook users, your anecdotal evidence does not demonstrate a lack thereof.
The point is that there are users whose use of a service is driven by psychological manipulation, not by a value proposition. The relationship between many people I know and Facebook is merely an example.
There are also users whose use of a service is driven by direct value received (connecting with families, for free). Both can be asserted as true, both carry very little informative value as statements. The strongest statement that this leads to is something like "some users are manipulated, some receive value."
> "some users are manipulated, some receive value."
Sure. That brings us back to my original question. Why invest in "providing value to users" when psychological manipulation is so much cheaper? So long as psychological manipulation has a greater ROI, there's no incentive to invest in providing value. Why pay $200 for a fishing net when a $20 one is easier to use and catches just as many fish?
Because the people in charge of those companies understand the value of users returning to their service, and that short-term trickery isn't really a good route to building value over decades, which is what these companies are eyeing.
Can you give a concrete example of short term thinking that has surfaced in product? I see a lot of hand waving (in general, not from you) about BJ Fogg’s work, but I’d like to know what you specifically think was a decision made to favor “short term trickery” instead of investing in a long term quality product.
The mobile games market is an excellent, concrete example of service providers which not only favor "short term trickery" over "long term quality", but actively reduce the long term quality of their services to engage in that trickery and are wildly successful for doing so.
Modern mobile games are chock-full of time gates, grinds, fake currencies, and many other dark patterns designed to psychologically manipulate their users. They usually give users just enough value for them to become invested. After that, the sunk cost fallacy keeps them "engaged" despite the terrible value proposition. Some users tough it out and consume a torrent of ads to keep playing for "free", while others fork over money for microtransactions to keep up with the game's demands without wasting so much time. Either way, it's a terrible experience for users that makes mobile game developers tons of money.
Ahh, that is a great example. Zynga (when I was a game player) did this a lot. But they also strike me as materially different in behavior than the massive tech companies that often are labeled as manipulative.
No disagreement from me that games, like gambling, use this technique a ton. I’m just not sure this mental model applies to the decisions made by big tech.
Different metrics. Providing value could be measured in different ways, with metrics built around that. Engagement doesn't even have to provide value: a user going through more pages to get to the thing they need, and spending more time in the app, still counts as engagement.
It depends on how you define engagement, I suppose. I've never heard people discuss time spent as an engagement metric, which is (IME) usually actions taken by a user that the business wants to measure.
Engagement just measures how engaged your users are. Time spent measures that. These are all proxies for value, because it is very hard to measure value received.
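The gap between these proxies is easy to show with a toy event log. This is purely illustrative: the field names, the numbers, and the set of "valued" actions are all invented for this sketch.

```python
# Two common "engagement" proxies computed over the same hypothetical event log.
events = [
    {"action": "scroll",  "minutes": 30},
    {"action": "scroll",  "minutes": 25},
    {"action": "message", "minutes": 2},
]

def time_spent(events):
    """Time-based proxy: total minutes in the app, valuable or not."""
    return sum(e["minutes"] for e in events)

def actions_taken(events, valued=frozenset({"message", "post", "purchase"})):
    """Action-based proxy: count only the events the business decided matter."""
    return sum(1 for e in events if e["action"] in valued)
```

Here `time_spent(events)` is 57 minutes while `actions_taken(events)` is 1: the same session looks great by one proxy and poor by the other, which is exactly the ambiguity being discussed.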
> If a service is monetized with ads, then the user is the product
False. The derived data and access to eyeballs are the product. The user is a resource for creating the product.
Some analogies:
- Cows are not the product, milk is the product. Cows are a resource/asset.
- Prostitutes are not the product, sex is the product. The women are resources.
I feel this is more accurately dehumanizing than simply stating that users are the product.
Spending money is the assumption, but that is unworkable as a metric for a third party. It's far easier to say you did your part with X views than to defend whatever janky curve on attributed sales would result.
This is true, but it's not exclusively limited to ad-driven models. It can also be true of surveillance or in-app purchase models. The latter has become big in gaming, where games addict the user and then steer them toward purchasing special items, expansion packs, "loot boxes," etc.
Basically any app or service where there is a direct link from the amount of time the user spends on it to revenue incentivizes shady "Skinner box" addictive designs and other dark patterns.
If you water down the meaning of "addiction" enough then you can say that anything is "addictive" and with that, you can carry over all the connotations of the word.
Then people start saying "I'm addicted to coffee" and "I'm addicted to bread" and before you know it people saying "I'm addicted to heroin" are met with "well, why don't you just stop, like I did with coffee for a week that time".
Since the dawn of time people have tried to make things that people want to use. Making "addiction" a synonym for "success" is just stupid.
Addiction has levels to it, and any one addiction isn't going to be identical to others. Caffeine addiction is a very real thing, as is heroin addiction. Just because the latter can be much stronger and more harmful doesn't mean the former can't exist.
To be honest, I still don't understand how people use the word. Am I addicted to water? I have nasty physiological reactions if I stop using it. What about sleep? Is that a bad habit I should get rid of?
A reasonable definition that I have applied in the past is "an activity that has negative consequences, but which one feels compelled to participate in anyway."
Withdrawal isn't actually required for someone to be addicted to something; they just have to want to keep doing it, even though it's obviously bad for them.
This definition still has flaws; am I "addicted" to eating unhealthy food? Etc.
Not a bad definition, the problem being that most negative consequences from drug abuse are due to the fact that they are illegal, and thus 1) way more expensive than they could be and 2) of bad quality.
This is profoundly ignorant, or to quote you, just stupid.
Turns out, most people who use heroin don't get addicted to it. Most people who smoke tobacco don't get addicted to it.
Easily 90% of people prescribed a course of opiates don't get addicted. I've been handed a scrip for Vicodin three times, and never finished a bottle in less than a year's time.
Coffee dependence is the most common chemical dependence in the Western world, and stopping will outright ruin your entire week. Is withdrawal from a hefty heroin habit even less fun? It is, but so what? I eat eggs every day, what happens if I stop? Nothing, that is not at all true of coffee.
Oh, and bread? Compulsive overeating has a similar etiology to other addictions, it leads to morbid obesity and the various dire consequences that come with it, it ruins lives. Do you think someone rounding 150 kilos on their way to 200 feels like they're in control of what's happening to them? They would stop if they could.
Addiction to gambling is also both real and tragic. And now we're getting into the home stretch: so is addiction to social media.
Remember how most heroin users don't get addicted? That's real by the way, look it up. Most users of social media don't develop an addiction, just like most players of pay-to-play games don't blow their whole paycheck on them or sneak off to play at work until they get fired, but: some do.
Ever quit smoking? It's not that bad, really. Maybe a headache, some brain fog, only problem is that you're constantly thinking about getting some nicotine for, like, weeks.
Which is why it's just about the hardest drug to quit: heroin addicts have a higher success rate than cigarette smokers (again: look it up). Addiction along the dopamine pathway is insidious. Opiates hook users through a simple mechanism: if they stop, their pain receptors go into overdrive.
Dopaminergic stimuli, which include chemicals but also endogenous pathways, produce repetitive compulsions: they hijack the part of the brain that wants you to do that thing again.
Social media isn't nicotine. Even cocaine isn't nicotine, it's uniquely pernicious. But it absolutely creeps up into some people's lives, to the point where their social lives and work lives suffer for it, where they feel out of control, break promises to themselves to use it less, and quit with the intention of walking away, only to break that intention too.
"User Engagement" is now indeed code for "Addiction," but not because user engagement has changed in any meaningful way. It's because society has been transforming terms that used to have medical/clinical definitions to mean something completely different. PTSD and OCD are two more examples.
Of course that process of "mainstreaming" clinical terms and shifting their meaning hasn't meaningfully changed either. See as an example the endless treadmill of clinical terms for intellectual disabilities re-purposed as insults by the general populace, causing the medical community to shift to new ones, and so on.
I personally suffer from C-PTSD (C=Complex) stemming from childhood trauma. C-PTSD (generalized as PTSD caused over months/years vs. a single or cluster of traumatic events, e.g. warfare) is still relatively new compared to the traditional understanding of PTSD.
So there is some legitimate expansion of the definition of PTSD.
But one thing I've noticed since I received the diagnosis is just how many people around me use the term "PTSD" casually.
- "That project gave me PTSD"
- "I have PTSD from my last boss"
- "I have PTSD from the last four years of political upheaval"
Now, I should be clear, I do think it's possible to be impacted by truly traumatic circumstances that don't rise to the level of what we traditionally think about when we hear the term.
But I also see the term used far too casually, far too often.
Posts like this completely misunderstand how companies like Facebook think and operate. As a result, they cause people to fight boogeymen instead of working toward positive change.
Yes, using Facebook instead of doing something like talking in person or reading a book is probably worse for you in the long run.
But most people don't do those things instead of Facebook. Instead they use Tiktok. Or watch TV. Or read Teen Vogue. Or get drunk and watch reality TV. Or sit alone in their nursing home with no real connection to any other human.
Facebook doesn't want you to be addicted, addiction is bad for user retention in the long term. Facebook wants you to be a happy, healthy Facebook Family of Apps™ user. I know this because I oversee ML launches on some of the highly controversial/addictive surfaces on a certain Facebook property.
Facebook employees want you to be a happy, healthy Facebook Family of Apps™ user. But Facebook as an emergent entity of its own has "wants" which can be hard to see from the inside. Facebook employees don't actually know what happens between each individual user and their Facebook account. You can do user studies, or gather aggregate metrics, but any technique you might use will obscure what's really happening in one way or another. And the whole internal idea of what is happening will naturally be bent toward what helps Facebook survive. In particular, it's very important that what Facebook employees are encouraged to imagine as positive change is not damaging to Facebook itself, or the company will eventually die.
It's true that Facebook the company is an emergent entity, and that the company's behavior and "wants" don't necessarily match those of its employees.
I disagree with the claim that "any technique you might use will obscure what's really happening in one way or another."
RCTs that measure self-reported wellbeing and other engagement-independent measures of mental health do not obscure what's really happening. Techniques like this could be used to actively improve user health, even at a cost to engagement.
Given that the only thing like evidence available to the public regarding how Facebook thinks and operates is the outcomes, and any insider knowledge is, by nature, rife with conflict of interest, I don’t think you could expect anyone to assume benevolent intentions.
Not to mention I don’t know what would be effective work toward positive change from an outsider when the decisions made for and by the company are (completely reasonably) made internal to the company.
I kind of agree that the Facebook boogeyman stuff is played out, but it’s not like there’s much of an alternative to discuss.
So at what point does the metric used for user engagement cross a threshold for 'addiction'? Wouldn't incentives to drive this metric up across the board to increase revenue outweigh the pressure to maintain a healthy relationship with the Facebook Family of Apps™? Regardless, I still can't see the motivation for Facebook to act in a way to ensure the user is healthy, only to ensure the user is engaged at an optimal level for Facebook and not the user.
I understand your POV and can empathize. I tell myself similar affirmations. "Our users value our features and content. They connect to the world for the better using us." But, I can't shake the truth of our business model – ad revenue. It drives the entire organization down a strict path.
Take an extreme example: A manufacturer of sugar. You started your company because of close proximity to sugar cane but over time the sugar industry grew. Then science revealed how bad sugar is for the human body in large amounts. Can you shift your business from selling sugar to an alternative? You are in the business of selling sugar and everything is centered around one goal "sell sugar". There is a subset of buyers that buy and consume in large quantities. Do you tell your consumers to stop eating sugar?
FB and similar are in the business of selling available ad inventory. Thanks to technology, the availability and "sweetness" of it are unlimited. Can we quantify the potential individual or societal impacts? Science claims to think so, and it's not looking great.
Very much so. It's something we're trying to push back on at https://sendmemo.app
But how do you measure success? We're a messaging app.
- Total messages sent
- Number of conversations
Measuring any of this will mean we optimise keeping users on the platform.
We're trying active conversations, where "active" means at least one message per week. We'd love to find a metric which went up as each individual user spent less time on Memo: a "time to solution" metric.
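An "active conversations" metric along those lines could be computed from message timestamps roughly like this. This is a hypothetical sketch: I have no knowledge of Memo's actual data model, and the pair-of-tuples shape, function name, and 7-day window are all assumptions.

```python
from datetime import datetime, timedelta

def active_conversations(messages, now):
    """Count conversations with at least one message in the past 7 days.

    `messages` is an iterable of (conversation_id, sent_at) pairs --
    an assumed shape, not any real app's schema.
    """
    cutoff = now - timedelta(days=7)
    return len({conv for conv, sent_at in messages if sent_at >= cutoff})
```

Notably, this number can hold steady or rise even while total time in the app falls, which is the property the parent comment is looking for in a metric.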
I quite like the idea of measuring "time to solution" somehow, particularly for more business- or productivity-oriented software. I guess that doesn't make sense if you're making a video game or a social network, though.
The "time to solution" metric sounds a lot like the SRE mantra: automate yourself out of a job. This is often measured in how many hours of human labor can be saved for better use elsewhere.
The optimal app requires zero interaction time from the user but still provides tangible benefits. Aside from entertainment, most people don't really want to interact with software.
The word “addiction” is overused in my opinion. Bad habits are not addictions. An addiction is something which actively harms you, but you can’t stop doing it.
Yes, you can argue social media is harmful, but it generally does not cause people to lose their jobs or spend all their money.
I agree that "addiction" can be overused (irony unintentional), but I think it can be objectively defined by whether or not it causes a lasting biochemical change in the user (eventually), which causes discomfort or pain upon withdrawal. Food, for example, does not; we enjoy it (sometimes), but having more does not make us eat ever-larger amounts (if it's, say, a green salad; high-fructose corn syrup I admit the jury is still out).
So, the assertion that social media is "addictive" would translate to, "it conditions the user to require a dopamine (or whatever) hit that they will return to the social media to acquire, and will feel bad (worse than before they used it) if they stop using." Whether or not that's true of social media is debatable, but I think it is raising at least a valid question.
On the other hand, even though Stack Overflow has a lot of the same software features, it has no such problem, so I think the important question is what does Facebook do differently than Stack Overflow to cause people to spend ever-larger amounts of time on it, to no real purpose?
I'm addicted to caffeine. It's relatively harmless, might even help me work better. No one ever lost their job or spent all their money on caffeine. But it's unquestionably an addiction.
If we're going into very technical definitions, I believe what you're talking about is physical dependence. In a mental health context, persisting use even after multiple serious negative consequences is part of the definition of addiction.
Though even there, I think they are moving to phrases like "substance use disorder", partly because of usage mismatches like this.
Can someone have a gambling addiction? I've often heard that gambling addiction is a real problem, so why can't social media be addictive? Social media and mobile games like to use similar "engagement" techniques as slot machines, for example.
Yes, but just like how the vast majority of gamblers aren't addicts, the same is probably true of social media.
There's also the major difference in consequences. If you're truly addicted to social media, you're way less likely to destroy your life (although it could still be possible.)
COVID has been more harmful than a deadlier disease would be because it spreads more effectively.
If social media caused people to lose their jobs or spend all their money, far fewer people would use it, and public engagement with the problem would be greater.
I suggest that social media's harm to society is greater because it's less harmful to the individual than gambling, etc.
> The word “addiction” is overused in my opinion. Bad habits are not addictions. An addiction is something which actively harms you, but you can’t stop doing it.
I suggest parasitism as a more accurate description of what seeking "user engagement" is.
Their definition generally hinged on the word "harmful", for which smoking clearly qualifies. Losing one's job was just one example of how something can be harmful.
The definition of psychiatric disorders includes this kind of requirement. E.g. if you can't sleep more than a couple of hours but suffer no apparent negative effects, it's not a sleep disorder. On the other hand, if the situation is unbearable for you and you feel you need to sleep more but can't, then you have a sleep disorder.
But yes, I'd think this is a pretty loose definition that would include things like social media. Personally I think more is needed. Is that social media addict really a depressed person who would just escape from life via some alternative means, like watching TV? If so, we should call their problem depression, and call social media simply an "escape", not an addictive behavior.
Yes, a small number of apps have very addictive properties. But those addictive properties occur almost entirely by luck.
This idea of app makers as dealers of crack is incredibly self-aggrandizing. It's like someone saying every time they move to new city, they have to register their hands as deadly weapons. Let's take it down a notch, buddy, you're not that powerful.
99% of the time, makers of one addictive app find themselves completely unable to make a new one (exhibit A: Facebook) and just use their profits to buy up the tiny 1% of apps that happened to get lucky and strike oil.
Apps can be addictive. Everyone is trying to make addictive apps. But let's also be clear that nearly everyone who tries, fails.
Now, amp that all up with even more colorful light shows and sound effects, and you have loot boxes. Also, amusingly enough, referred to as "user engagement" by game studio heads (also known as "recurring revenue", as if it's a reasonable subscription and not a fucking slot machine).
The other side of this is how these companies build engaging products that thousands struggle to put down. Maybe a solo developer could learn some lessons. Can anyone shed light on how to build engaging products and make users come back? What methods are these companies using that a small dev could learn from and apply?
Imagine how different social media would look if it were subscription based and required much, much less investment and revenue could grow linearly with the userbase.
The incentive of an ad-supported, vc-funded social media is to addict you.
The incentive of a subscription service is to be useful enough that you stay subscribed. If you log in once a week but never cancel, that's the ideal situation for a subscription service.
What I wrestle with is whether consumers will ever accept a small subscription fee (and I mean VERY small) after they've been given everything for free, even if it meant less psychological manipulation, no ads and strong privacy.
This hits the nail on the head while also missing one of the biggest pieces of the puzzle. Marketing and sales people are playing the system just as hard, to addict people not only to the platform but to purchasing/using their products off the central platforms.
It's shocking how much material on social networks is really just marketing disguised as grassroots. Or worse, marketing specifically designed to tickle our other addictions, especially sensual addictions.
So now you get a double hit and these companies love it because the marketers pay the bills.
I don't doubt that in many cases developers are actively optimizing for addiction, but I wonder if "User engagement" is also sometimes code for "we made our site hard to navigate, and it takes twice as long to find what you're looking for". I mean, how do you distinguish between users spending more time on your site because they find it fun or useful or whatever, versus users spending more time on your site because it's just badly designed?
That’s a timely article and it has some good points.
I myself would say instead of “you should quit social media”, you should quit particular platforms that have too many toxic users (trolls and just rude people) and you should quit platforms that are run by companies with whose vision you do not agree. I am not saying we should quit Deviant Art or even Twitter, for example.
But yeah, quitting Facebook, Instagram and Whatsapp seems like a good idea at the moment. And I did so myself some time ago.
Habit forming patterns can be used for good purposes too, though. I personally wish that Anki was more addictive so that I'd be more likely to keep doing my reviews.
For Android users, a really good FOSS app to lock apps, control usage, etc. is 'Open Timelimit', available on F-Droid [1]. It doesn't need internet access permission.
It's good to see more write-ups like this. In my experience see-sawing between different ways of managing social media use, all of the suggestions out there are really just mitigation (turn on iOS Screen Time), or full blown abstinence (get off the apps).
These suggestions are targeted at social media as it currently is, with never ending feeds and attention sucking notifications. It's time to start talking about, and building, social media as it should be.
At the end of last year I spent most of my time thinking about what social media would look like if it was designed to be used less, and in a way that would add value to people.
For me, that meant intentional written reflections, shared with a small circle of close friends and family. To keep me from scrolling, there would be no feed. To keep me from checking my phone, content would rarely be surfaced.
This ended up becoming Sundayy, a mindful social network that you can only check once a week (on Sunday): https://www.sundayy.app
Each day you're prompted to slow down and reflect, but there is no feed of reflections. They're all kept secret, for now. At the end of the week, on Sunday, reflections are revealed. It's a more intimate insight into how people close to you lived their week - day by day and in their own words.
We should definitely bring more awareness to monitoring screen time and curating feeds, but we should also question whether we should have to do any of that at all.
Can you imagine pharmaceutical companies trying to improve "User engagement" with their products?
I think optimising for user addiction should be illegal. It causes great damage to society as people are drawn into the product and forget about important things, while having their wallets drained by unscrupulous companies.
You could also say it is staying competitive.
User engagement is important, the user should be in charge of setting limits.
Sure, anything can become addictive if the user has the "right" personality, but using an app is still a far cry from using recreational drugs.
However, porn and some gaming are coming close.
I think there should be regulations on this aspect of programming/marketing. Sadly it's hard to monitor, and it's the best way to be successful in today's market.
Engagement, when done properly, is tied back to a meaningful and measurable life (or business) output or outcome. The underlying assumption here is that we should be collectively helping each other.
Clubhouse's inability to turn off alerts in the app is an example of this. The best you can do is minimise alerts and pause them for a week. There are other dark patterns in there to get you hooked. I’ve set my account to counteract these.
My prediction is, like all popular new things, it’ll turn shitty in 3 years’ time. Sad, but that’s what I expect from SV. Facebook, Google, Amazon, Atlassian, etc. have prepared me. I’m vaccinated!
I couldn't even finish reading the article - it was just back to back vapid sinisterizing cliches with no actual point. That conceited paranoid douchebag would call a Chinese takeout menu "a plot by communist China to render him utterly dependent upon them for sustenance, monopolize his income and jeopardize his health with MSG".
I own https://www.executeprogram.com, which has interactive in-browser courses on various software development technologies. Currently they all cover languages, more or less: TypeScript, SQL, various JS topics, regexes. (Disclosure: it costs money after you finish your 16th lesson.)
Almost all (maybe literally all) of our competitors are amenable to binging. It's true of books, video learning platforms, and most/all other interactive learning platforms.
Execute Program is very intentionally non-bingeable. When you start a course, you get 5 lessons on the first day, then it stops you and tells you to come back tomorrow. On the next day, you get some brief reviews of yesterday's lessons, then a few new lessons, then it stops you again until the next day. That cadence repeats until you finish the course. You can't binge/cram even if you want to.
(A bit more technical detail: it's a spaced repetition system with exponential review intervals, similar to those used for language learning in e.g. WaniKani and Anki. But it also has a lot of fine-grained knowledge of its own course structure, so it can use reviews to intelligently unlock different lessons depending on how the user performed on their reviews.)
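For readers curious what "exponential review intervals" looks like concretely, here is a minimal sketch in Python. It is a hypothetical simplification, not Execute Program's actual scheduler: the doubling factor, the reset-to-one-day behavior on a miss, and the function names are all assumptions.

```python
from datetime import date, timedelta

def next_interval(current_days: int, passed: bool) -> int:
    """Exponential spacing: each successful review doubles the wait;
    a failed review resets the item to a short interval (assumed 1 day)."""
    return current_days * 2 if passed else 1

def review_dates(start: date, n: int) -> list:
    """Dates of the first n reviews for a lesson learned on `start`,
    assuming the user passes every review."""
    interval, when, out = 1, start, []
    for _ in range(n):
        when = when + timedelta(days=interval)
        out.append(when)
        interval = next_interval(interval, passed=True)
    return out
```

Under this toy schedule, a lesson learned on January 1 comes up for review on January 2, 4, 8, and 16: the reviews rapidly thin out once the material sticks, which is why cramming everything into one day defeats the point.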
Occasionally, we get support email from new users who don't like this. They want to cram a whole course in a day. But cramming is a very time-inefficient way to learn, so this is self-defeating! Since launch, we've had good success adjusting the app's behavior and internal explanations to reduce these complaints.
However, we still get emails from long-term users who appreciate the time limitations. Generally these fall into two categories:
1. Users like that an enforced break before the reviews provides tangible evidence that "yes, I genuinely understood yesterday's lessons". If we allowed cramming, that reassurance wouldn't exist; it's too easy to succeed at a review when you just finished the lesson 30 seconds ago.
2. Users like that the usage limits remove a source of anxiety and worry. You do your reviews and lessons, you finish, and then you wait until tomorrow. There's no temptation to think "I really should've done 10 lessons today instead of 5; I'm so lazy".
It's still possible for a very dedicated user to do all of our courses in parallel within their first monthly billing cycle. (Median course start-to-finish time is 8-18 days depending on the course.) So this scheme doesn't make users pay us more than they would otherwise. And they're spending the same amount of wall-clock time that they'd spend if they crammed all of the lessons in one day. That makes it pure win: they memorize the topics more deeply, they worry less, and they get those benefits for no extra time expenditure. The only exception I can think of would be people who think "I must get exposure to all TypeScript syntax and semantics before tomorrow morning, even if that significantly reduces my ability to remember what I learned."
Obviously I'm very biased here, and the goals that we're optimizing for don't even exist in most other product spaces. But I thought it would be nice to have a counterexample to "engagement at all costs".