When Bots Write Your Love Story
"Is the Internet Real Life?"
You may remember a story a few years ago about the “affair dating” site Ashley Madison: specifically, the revelation that the site employed a large number of artificial intelligence “bots” that posed as real women looking to hook up. This Fortune piece is partially paywalled, but the most important info is this:
The bots were essentially Ashley Madison’s sales force. Men who signed up for a free account would be immediately contacted by a bot posing as an interested woman, but would have to buy credits from the site to reply. Annalee Newitz, a reporter who helped expose the site’s use of bots, also uncovered internal documents showing that 80% of initial purchases on Ashley Madison were by a male user trying to communicate with a bot.
This story has fascinated me ever since it came out.
There are so many angles to it, it’s almost literary; you can imagine an amazing Jeeves and Wooster story or Seinfeld episode on this exact premise. Obviously there’s the commercial fraud aspect of it. Then there’s the way that the Internet has transformed everything about culture even as most people still have no idea how it works (or doesn’t work). But the part that’s captivated me the most over the years is a simple question: How many times has a particular idea or behavior been normalized in someone’s mind because they believed that lots of other people were doing it too—a belief that turned out to be false?
The Ashley Madison thing is extra interesting because it eventually forced the company to admit that its ratio of men to women was lopsided (5-to-1), which of course puts the irresistible force of sexual liberation right smack up against the immovable object of gender stereotypes. It turns out that wanting to cheat on your spouse with a complete stranger is something men like a lot more than women do. Stupid patriarchy! Jokes aside, how easy is it to imagine that, for a tiny sliver of time, the existence of a website like Ashley Madison convinced some men that the women in their lives were actually just as sex-crazed as they were? The bots weren’t simply trying to attract men to the site. They were telling the site’s story for it. A man who signed up on Ashley Madison received (according to the news reports) an immediate message from a bot pretending to be an interested lady. If the site just wanted to bilk guys out of their money as quickly as possible, it could have paywalled the entire thing and offered some cryptic “testimonials” as incentives. But the bots weren’t just billboards. They were part of how the company wanted its male customers to think of the world. “There really are tons of beautiful women just dying to bail on their husbands for me.” It was a narrative. Told by machines.
That machines are telling us particular stories about our world is one of the main reasons I keep coming back time and again to digital culture, epistemology, and theology. Our default posture toward the Internet is still, to this day, a posture of intuitive belief: to genuinely accept that what we see on the screen is a piece of “real life,” representative of someone who is really somewhere. And in many cases, of course, this is more or less true. But there are also very real cases where the intensity or vividness of what we see online is disproportionate to its weight or validity offline. TikTok offers perhaps the most compelling example of this: the Wall Street Journal’s investigators discovered an algorithm finely tuned to push users toward the fringe, measuring “interest” simply by the milliseconds someone slowed their scroll over a particular video. TikTok’s algorithm, in other words, was not just supplying users with stuff to feed their interests. It was trying to create users with certain kinds of interests. This makes a lot of commercial sense; the best way to ensure that someone will spend a lot of time on your platform is not to grow very intelligent about their desires and customize the platform accordingly, but to retrofit their desires toward the platform.
But how do you do that? TikTok’s algorithm suggests the answer: you show people images and video of their interests/desires being talked about or acted on. If you do this, you will not only make them more likely to keep running down the rabbit trail, you will make them believe that the rabbit trail is good, normal, healthy…maybe even part of an “identity.” And as the Ashley Madison story demonstrates, savvy developers don’t even need lots of real people to create these visions. What they need is a desire. If you want something to be true—of yourself, of those around you, of the world at large—you can find evidence online that it is true; a lot of evidence, in fact. And it may not occur to you that the world you are seeing on your screen is engineered not by a swelling community of which you are now a part, but by an algorithmic logic that puts a magnifying glass on your desired narrative and blinders around everything else.
A few days ago a friend of mine, an elder at his church, told me that a group in his congregation was getting ready to confront him and the leadership. Their crime? They were “woke.” My friend is not woke. My friend is not close to woke. He would not go to a woke church, much less serve alongside a woke leadership (I dislike typing scare quotes too frequently, but please know that they belong around the word woke in every sentence above). Put simply: I do not believe that this disgruntled group at my friend’s church has actually heard their elders say something genuinely woke. I believe they have heard someone on social media say something like, “If your pastor says [XYZ], your pastor is woke!” That social media account probably gets a lot of retweets and likes and shares. So, most reasonable people would see a post like that getting a lot of attention and conclude, “Yes, this must be true. If my pastor ever says [XYZ], they must be woke.”
Now it seems to me that in the current malaise of American political discourse, especially within evangelical spaces, much is made of numbers. An especially popular idea is that “elite” evangelicals are pushing wokeness on the “grassroots” churchgoers. The evidence for this usually consists of pull-quotes from articles, clips from YouTube and podcasts, and other portions of content that get shared very quickly online. There’s an impressionistic element to this process: if you search “Tim Keller” or “David French” at the current moment, you are about as likely to get results that are critical of or negative toward them as results from the men themselves. Why? Because those critical pieces do very, very well on social media, and Facebook’s and Twitter’s algorithms know that you are more likely to engage with, share, and click on something that is already performing well.
For all its interest in dismantling establishment and elite bastions, the populist conservative moment has curiously little to say about the effects of Silicon Valley on how ideas are shared. For example, would your opinion on whether a particular evangelical is woke change if you discovered that, of the 1,000 people who shared the article claiming he is, some 200-300 were bots and another 300-400 were the same people who share every such article the moment it comes out? Maybe your opinion on the man himself wouldn’t change, but you know what might? Your opinion on whether there really is a groundswell of outrage toward his ministry. If it turned out the narrative you saw online about this particular person was not as representative of what other people were saying as you initially thought, that might make you wonder whether the narrative itself was valid. Like a football team that pumps artificial crowd noise into its stadium on game day, algorithmic inflation just might be a kind of cover for the lack of something very important.
But does that mean “the Internet isn’t real life”? To conclude, let’s return to the Ashley Madison story. Once again, the most fascinating part of the story about Ashley Madison’s bots was not that they were getting money from men. It’s that the bots were making the idea of an affair dating website far more plausible, far more compelling, than it would have been without them. An affair website needs, ahem, women. It needs to promise sex-seekers that what they’re looking for is out there. But the problem for Ashley Madison was that it was really just a platform full of sex-seekers and very few sex-providers. From the men’s point of view, though, the bots affirmed their hopes. Here was evidence that their desires weren’t as misogynistic, selfish, or dysfunctional as some might have told them. “See? There are lots of women here. And they get me.”
These poor suckers found out the expensive way that the question is not, “Is the Internet real life?” The real question is, “Is the Internet real life for you?” That’s the power of the algorithm. That’s the power of the AI chatbot. Sitting in a darkened room with naught but your phone, real life can happen to you. It really is your beliefs that are being manipulated, your outrage that’s being engineered, your sexual addiction that’s being enabled. But you’re alone. You’re alone because you’re just one more notch on the software developer’s belt. You’re alone because “real life” keeps getting smaller and smaller. You’re alone because your ideas are not really yours. The white rabbit you thought you were following had a gun to your back the whole time. And now you can only think what it wants you to think.