Is There Ever Anything Better than Getting What You Want?
A large part of life is learning the lessons that come when we don’t receive what we want, or when we do receive it but eventually wish we hadn’t.
One way to say it would be that much moral formation comes from living attentively in the gap between desire and control. A person who can sit with a desire that they cannot fulfill, without being driven either to depression or mania, is a person who knows genuine happiness. You know the old statistic about people who win the lottery and lose it all within a few years? Most of the time we think that statistic reveals something about money. But really it reveals something about desire. The kind of person who plays the lottery enough to win it is probably also the kind of person who cannot make peace with the gap between desire and control. In their case, a $50 million windfall feels a lot like control, which is then aimed squarely at desire. When $50 million hits your account, it is very difficult to imagine you might feel differently about it five years from now.
Control, as Hartmut Rosa has beautifully explained, is ironic. We value it because it helps “tame” the world around us, and seems to protect us. But past a certain point, the control itself starts to inflict punishment. A little bit of felt control becomes a lifelong quest for more control. The only thing worse than not having control is watching what control you think you do have start to slip away. This is why, for example, careful parents become helicopter parents, who then become manipulative and guilt-tripping parents. It’s why pastors with 40 years of genuine love for a congregation end up sandbagging their churches in year 41, when they can’t lay their head down without fearing that the new generation is going to ruin everything.
But control is only one side of the equation. We look for control because we have desire. Which means that control is to desire what propane is to a fire. The most dangerous thing you can give anyone with an inordinate desire is control. Sometimes there’s no alternative—a parent having to let a child make their own mistakes, or a friend realizing that after pleading and trying, there’s nothing more to be done for their foolish beloved. But desire and control combust when they meet, and the more this union happens in our hearts, the more unable (or unwilling) all of us become to live without the combustion. Desire demands control. Control demands more.
The worst thing about being a teenager is that you have a billion manic desires, tearing holes in your body trying to get out. The best thing about being a teenager is that, if things around you are the way they’re supposed to be, you are forced to live within a wildly disproportionate ratio of desires to control. The teenage girl cannot make all of her classmates think she’s amazing. The teenage boy cannot make that beautiful girl appear, smitten and eager, right in front of him. Neither the girl nor the boy can make their parents “get it,” or make their homework do itself, or make time speed up. The intensity of the desires is at its highest when control is more or less at its lowest. Eventually, the intensity of the desires declines right as control starts to increase. This is what’s called “growing up.” Like an exhausting mountain climb, growing up makes you hurt, but if you stay with it, you’ll get a view that you can’t get anywhere else: a vista of yourself, one that might even take your breath away with gratitude for the control you didn’t get and the desires that weren’t gratified.
This brings us to AI.
At this point in the discourse, it’s very difficult to say anything new. The reality is that AI boosters and critics both inhabit a world in which AI is ascendant and will likely continue to be, with all kinds of questions, pitfalls, and problems to come. My point here is not to make yet another case for AI skepticism, but instead to think about AI’s broader symbolism in our lives.
What we think is a technology question is really a spiritual question. What kind of person would choose to live in a world where they don’t always get what they want, when they want it? That question reveals what the debate over AI is all about. It’s also the question that divides political philosophies, educational approaches, parenting styles, church worship, and much more. Is anything ever better than getting what you want? Admittedly, very few people who inhabit a Christianized society will openly deny any value in delayed gratification. But the trick is to ask the question not as a philosophically rhetorical question, but as an existentially specific one. Ask people, “Should you be able to eat whatever you want?” Or, “Would it be OK if you had to ask someone’s permission to marry or date?” Or, “Here’s a million dollars; you can have it if you’ll sign these terms of service you can’t read.” All of these questions are ways of asking if there’s ever anything better than getting what we want.
Many describe AI technology as a “revolution.” But this is misleading. The most relevant aspects of AI have been operating at will on our minds and souls for decades now. We are astonished that an AI bot can talk to us like an intimate partner or best friend, but that’s only because we’ve long since overcome our astonishment that email and instant messenger can carve out a permanent emotional home for people with no physical presence in our lives. Our jaws drop to realize that an LLM can generate a picture of literally anything or anyone. But how many years have we been summoning the most niche kinks or the most bizarre addictions with only a Google search? How many YouTube videos or Instagram accounts have we found of people who are into our exact hobby or interest? The world has been clay in our hands for some time now.
In other words, we have long been liberated from the gap between control and desire. Any desire, no matter how specific, weird, or cruel, is available with a reliable enough wifi connection. Godlike control is cheap and easy. All of us are freeze-framed into something like an inverse adolescence: rather than raging hormones and zero power, we have stultifying boredom and infinite power. And the wisdom that we’re supposed to learn as we grow—that the world does not and (we learn this later!) should not bow to our desires—feels as quaint and nonsensical as the phone book.
The divide over AI that really matters is not finally a question of whether there are or are not any legitimate purposes for it. It’s not even a question of how many people should do how many jobs versus how many machines could do those same jobs. Those are important questions. They deserve more than hasty answers. But they are not the most important ones. The dilemma we find ourselves in is a spiritual dilemma that will get worse, regardless of how many times we pull out a book instead of a chatbot. If we do not feel deeply in our heart that there is something of great worth to be gained from the thwarting of our own desires, then the world we will make is a world that reflects the values and attitudes of the technological regime.
Christianity does not teach that all our desires are bad. But it does teach that what we want now is different than what we will want 1,000 years from now. Much depends on living in light of this difference. The Christian struggle is to bring ourselves into as much alignment as possible with the realities, objective and subjective, that will remain when the realities we see right now are long gone. Some people object to the doctrine of sin because, they argue, it damages self-esteem. To which we Christians should say, “Yes, but only temporarily.” Christianity is not “This world bad, other world good.” It’s, “This world was good, has gone bad, but will be made good again.” What damns our souls is not loving what’s good in this world, but loving what will be changed later. Sin is preferring the fast food sack lunch today to the steak dinner tomorrow. Sin is the shrinking down of our entire selves so as to love shadows more than the people who cast them. It’s not ultimately that we need to think less of everything and everyone now, it’s that we need to think higher of everything and everyone later.
For Christians, there is an inescapable question that frames our walk in this world toward the next one. “What will you want later?” Some desires we have now will turn into regrets in the light of eternity. Sanctification is subjective and objective. We will ourselves to become more like Christ, whom, by the Spirit, we are always becoming more like. We are “growing up” into who we were always meant to be. And just like growing up in this world, there is a pain to inhabiting the gap between desire and control, but it is a gap we must inhabit. In order for me to have what I most need in this universe, I cannot have everything that I want on this earth. In fact, I should not even try.
We interrogate our technology by asking what kind of people it’s making us. How are we being calibrated? What do we think of as normal? What kind of existence does this technology map out for us? Too many debates over AI focus on what is licit, or on the lack of total consistency. These are distractions. The real question is whether there’s ever anything better than getting what you want, when you want it. If the answer is yes, then we live in a particular kind of world, where particular kinds of spiritual and moral habits are good, and other kinds are not good. If the answer is no, same thing. This alone doesn’t settle the argument. But we cannot refuse to answer the question.




I wonder if this argument could be strengthened by adding a kind of classification of desires. There are right desires that can be thwarted, which calls for increased faith and perseverance. There are wrong desires that can be thwarted, which is a helpful check against our sin nature. And there may be other desires that are difficult to evaluate as good or bad without the help of wisdom and experience.
I worry that emphasizing the importance or value of being thwarted may minimize the importance of knowing which desires are right to begin with and what a right response looks like. A desire to learn is usually good and worthy of pursuing, vs. a desire for distraction, which is more suspect.
This is not to undermine the point that technology has been about the business of closing the gap between our desires and our control, and it’s better to be cautious about that gap closing than to embrace it too eagerly.
Right on. AI is an accelerant that magnifies already existing tendencies and disordered human desires. AI brings to the fore the moral and spiritual formation questions that have been there all along. In some ways, we might even say that AI is the logical outworking/conclusion of the digital age/the internet; it just sort of closes the loop. As you say, we’ve become accustomed to disembodied communication and presentation of our digital identities; what’s the categorical difference if those are now artificially generated?