Rory

October 3, 2022

The algorithm wants to kill you.

[tw: suicide]

For the first time, a court has found two social networks directly responsible for the death of one of their users. A coroner ruled that Instagram and Pinterest contributed to her death "in a more than minimal way."

The defense these companies offered was that they are not responsible for the content on their sites. They are mediums, which means they are neutral. Sure, people can post and consume terrible things while they're there, but those people are directly responsible. The medium isn't responsible for what it's used for, right?

The thing is, the mediums aren't neutral. Social networks aren't just "mediums"—they're companies. Their mechanics aren't inert or unintentional or inevitable—they're designed for a specific purpose. They're technologies—and they're technologies that have been evolved to specific psychological and financial ends. They want us to feel something. And the thing they want us to feel is literally bad for us.

Let's go past the obvious. Yes, they want to addict us. Yes, they are designed to continually grab our attention. Yes, they revolve around "likes" and "notifications," and are designed to make us crave both—and to "create content" that will generate both, in between listlessly scrolling to find more.

All that is true. But why do social networks exacerbate anxiety and depression so much? Why are they all so consistently good at radicalizing their users? It goes beyond red numbers and infinitely scrolling feeds. At the heart of all these mechanics lies a simple but non-obvious truth: social networks are designed to put us in competition with one another—and on some level, the game board is your very sense of self.

Your "character" is you, no matter what the particular game. Are you interesting? Are you beautiful? Are you funny? Do you have enough friends? Are you enjoying your life enough? Are you rich? Are you famous? Are you right? Are your feelings and opinions the correct ones to have? Are the things you enjoy, objectively speaking, the most correct things you can enjoy?

The more invested you get—and these games are expertly designed to make us feel invested—the less it suffices to play completely honestly, "as yourself." There is an increasing pressure to perform, to outdo, to win at all costs. But your "reward" isn't really a reward, and there are two reasons why that is:

  1. First, any chemical reaction in your brain to "being liked" is extremely fleeting, unless it's connected to some genuine intrinsic benefit (like "meaningful connection"). Social networks aren't designed to offer you that.

  2. Second, and more perniciously, the actual "reward" in the schema of the game is that you make other people feel worse, driving them to try to outdo you. This is the Möbius-strip sequence that defines social networking: you doing "well" is what causes other people to "lose," usually in ways designed to make them feel specific forms of inadequacy. And they respond to those feelings by doubling down, pushing those feelings onto you instead.

It's a game without an actual winner, because nobody's accurately depicting anything. They're just lying to make other people feel bad. Projecting images that are designed, somewhat consciously but mostly due to social-network manipulation, to make other people feel more alienated and alone. Which leads to the phenomenon that defines the social-media age: literal billions of people are unified in these feelings of resentful isolation, but that unification is sundered by the fact that they've all been taught to see each other as the enemy.

The "algorithm" is designed to blindly push people towards things that generate the most engagement—in other words, the things that make people feel the most compelled to commit to something. Perversely, that often means the things that make people feel the worst, because that's what leads them to want to lash out or overcompensate in some way. "Lowest common denominator" obviously holds some advantage—hence cute animals popping up everywhere—but ultimately that's not quite enough. The most successful things are also irritating or aggravating on some level, because that's what sets the feedback loop into motion.
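To make the mechanism concrete, here's a toy sketch of what a pure engagement objective looks like. Everything here is invented for illustration—the posts, the metrics, the weights—and real ranking systems are vastly more complex, but the shape of the incentive is the same: a single score, and nothing in it for well-being.

```python
# A toy feed ranker that optimizes one objective -- predicted
# engagement -- and nothing else. Note what's absent: no term for
# accuracy, well-being, or harm. All data and weights are invented.

posts = [
    {"id": "cute-animal",   "clicks": 0.30, "comments": 0.02, "shares": 0.05},
    {"id": "vacation-brag", "clicks": 0.25, "comments": 0.08, "shares": 0.04},
    {"id": "rage-bait",     "clicks": 0.28, "comments": 0.20, "shares": 0.15},
]

# Comments and shares weigh heavily because they pull other users
# into the loop -- the feedback loop described above.
WEIGHTS = {"clicks": 1.0, "comments": 5.0, "shares": 3.0}

def engagement_score(post):
    """Sum of weighted engagement signals. The only metric that exists."""
    return sum(WEIGHTS[k] * post[k] for k in WEIGHTS)

feed = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in feed])
```

Run it, and the aggravating post tops the feed despite having fewer raw clicks than the cute animal—not because anyone chose rage, but because rage happens to maximize the only number the system can see.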

The algorithm doesn't know what it's pushing—but there's plenty of data on what it pushes and why. It helps users encourage one another to despair, because the more you despair, the more you need an outlet for venting that despair. Hence communities that revolve around suicidal ideation or eating disorders or explaining why slight variations in the shape of a male forehead determine whether women feel a biological need to screw a man over. The unifying trait of these communities is that they consist of people encouraging each other to keep going, confirming for one another that their worldview is right while constantly upping the ante, escalating the bleakness, ratcheting up the sense of urgency. And these communities proliferate because they do the algorithm's work for it—so the algorithm keeps recommending them to new people. New users, if you will.

You know the popular story about the "paperclips AI?" The machine that gets told to make paperclips as optimally as possible, and ends up destroying the world by turning it all into paperclips? I've seen it suggested before that publicly-owned corporations are a version of this AI: companies like Chevron can't help setting the world on fire, because they're an algorithm tooled towards maximizing profits at the expense of literally anything else. Social media algorithms work the same way: they will destroy communities and human connection and they will do their damnedest to destroy your soul too, because they don't care about human beings.

The "optimal user," to these algorithms, is the most horrific image of a person that you can imagine—the kind of person who'd annihilate a part of your mental well-being if you so much as glimpsed them. That person might be a vapid influencer or a political radical or someone who isn't just suicidally depressed but actively addicted, in a deeply disturbing way, to suicide as a concept. Or all three! But make no mistake: this is what the algorithms are designed to produce, regardless of whether their creators are smart enough or willing enough to anticipate the end results. The monstrosity isn't an unintentional byproduct—it's the precise thing these sites want to generate, even if they'd rather limit it just enough to maintain plausible deniability or even a good night's sleep.

As the economy crumbles and the social safety net falls away, and as younger, increasingly poor generations are taught that the only way out is to hustle, their livelihood and their future start to look exactly like social media. Whether it's driving for Uber or posting on OnlyFans or living in a TikTok influencer house or, hell, trying to invent the next big social network, you're just trying to be the one person who does well enough to destroy all the others, as you cross your fingers and pray that the algorithm doesn't abruptly change and pull the rug out from under your feet.

Because the algorithm wants engagement. It wants you obsessed with other people and obsessed with yourself—and it wants you to hate both them and yourself. That's the algorithm's version of perpetual motion, and it leads to Bored Apes and murders and suicides and not much else. 

About Rory

rarely a blog about horses