An Ascetic Vision for AI

By Luke Brake

I have developed a reputation for being fairly anti-AI. Friends have called me an AI cynic. I have called myself an AI pessimist in an attempt to discourage an AI salesperson from pitching to me (it didn’t work). But I don’t think the words cynic or pessimist really apply to my position. To articulate my vision for the future of human involvement with AI more clearly, I decided to lay out my position in some detail. To present this vision (which I am calling an Ascetic vision), I will start with a common comparison and argument I’ve heard made against my less-than-enthusiastic assessments of Large Language Models:

“Would you defend the horse? Would you have us give up our garages and return to stables?”

This comparison feels apt. Horses were quickly phased out of our lives. In a few generations, what was once a massive, essential industry became the purview of hobbyists, the rich, and the immortal horse girl. But I want to use this same comparison to help explore what we should do with this new shift toward AI. What DID happen when we adopted cars over horses? Let’s look at a few key aspects of this shift:

  1. We sometimes set limits

Yes, we adopted the automobile, but we didn’t adopt it everywhere. Some visions of the future of cars involved us speeding around our workplaces, even our homes, in small cars. We could do something like that now: we could reformat our houses around engines so that we wouldn’t need to walk to the kitchen, or so that we wouldn’t need to walk at the mall. But we deliberately chose not to (though that choice was largely a matter of convenience). This is not a very strong point; these limits were dictated more by the market than by restraint. But the important part here is that adoption doesn’t require total adoption.

  2. At times we wounded ourselves by NOT setting limits

The car transformed our society, but not always for the better. As travel became far easier, the ways we shaped our cities, homes, and businesses changed to fit this convenience. We have been cloistered into charming but isolated suburbs, or sprawling neighborhoods that exist largely separate from the other important parts of our lives. The collective mourning of walkable communities is, in many ways, mourning of the too-thoughtless adoption of the car. How many times do you feel regret that you must operate a massive machine simply to go to the grocery store or your local coffee shop? We have engineered cities and spaces AROUND the automobile, to great benefit, but we have also lost something critical in the process. This is not to mention the odd additional loss of the train, which was (literally) ripped up in favor of the automobile.

I live in a town that was built before the automobile and has changed very little. Because of this, I walk to my friends’ houses, I walk to my coffee shop, I walk to work, I walk to the grocery store, I walk to the pool. I see people on those walks. It’s slower, yes, and it makes me more tired than operating my beloved 2015 Toyota Sienna, but it also makes my life more social. I can see how my fellow Sterling community members are doing, how the additions to their homes are progressing, who built snowmen and who shoveled their sidewalk. I run into friends and greet strangers. I have, at times, granted paper extensions to stressed-out students while on the sidewalk.

This shift away from walkable spaces was probably inevitable. The car was an invading force, but one that made our lives unquestionably better (this is not a reddit-tier car rant, trust me). The forces that reshaped the structure of our communities were nearly impossible to repel; places like my town or New York City survived by happenstance, not intention.

But, critically, if we had been able to set limits on this change, we would have. There are scores of civil engineers and social commentators trying to solve this problem. They can’t do it; it’s too late. But we can look back on this shift and acknowledge the tremendous gift of the automobile while also acknowledging, with sadness, that we couldn’t protect a very valuable part of our lives and communities.

So What Does That Look Like?

The automobile changed a lot about our culture. It brought us prosperity, allowed us to see remote friends, and is far less difficult to manage than a horse. It also rewrote our communities in ways that we regret. The explosion of AI presents a similar problem, but the thing being rewritten is not our community zoning and building, but rather our minds themselves, our relationships, and our ability to understand knowledge.

This is distressing because the stakes are, I believe, higher. It is also exciting, because these things are less subject to market demands; these are things we can control better. If I were somehow able to grow into the dominant force in my circles on this issue (which I will not), go on a book tour, and influence many of my colleagues, I still wouldn’t change the market outcome at all. But the market and its workings are not really what alarm me about this change. The real area of danger and change is in our own practices and minds. I also feel a good deal of urgency about this: these technologies have enormous potential to colonize our minds in ways that will be very difficult to reverse.

We must, within this massive and culturally inevitable shift, preserve the precious treasure of our own minds, relationships, and habits. This cannot be done retroactively, at least not without tremendous, back-breaking effort. We must draw these lines now; we must clearly set up boundaries we are unwilling to cross. This is how we get the best of the shift: we can enjoy the fruits of mechanical labor while mitigating the potentially horrific human cost by being extremely cagey and conservative about allowing the machine to change how we think, how we talk to each other, how we run our daily activities, and how we engage with art.

Please indulge me in another example.

The smartphone as a technology sits somewhere between the automobile and AI. The impacts of this machine went largely unchecked in adoption. We embraced it: we filled our schools with tablets, we filled our homes with screens. In general, we can admit these machines have been very useful. But we must acknowledge that the cost has been horrific. We have seen relatively ironclad evidence that these things have caused depression rates to soar, have fractured us socially, have made us lose sleep, and have stolen away hours and hours and hours of our time that should have been spent doing better things.

We have not just adopted the smartphone out of our own desire; we have flooded schools with it, forcing students by law to use the very machines that are devouring them. If we had been able to set limits on this change, we would have.

I am not being anti-tech, nor am I arguing for removing all smartphones, when I say that this did not go as it should have. I have clawed my way back to about three hours a day on my phone, and those three hours are embarrassing. I have students who average around twelve hours a day. I imagine this only gets worse with each younger generation.

With this in mind, let’s look at the texture of the issue.

First, I am going to enumerate some of the areas of AI development I’m excited about. This isn’t an exhaustive list.

  1. Post-Assembly Line World

There is a lot of drudgery in the world, especially in assembly-line and factory work, that, if taken over by AI, would let us keep our current age of plenty without the horrors of that kind of work. I think this is especially good news for producing cheap but ethical goods. People take those jobs out of economic necessity, so the downside, of course, is that those jobs simply disappear. But in the long run this does seem like a world with less misery. This work is not the “connection to the human lifeworld” that I desire to preserve, but rather its opposite.

  2. Medical and Scientific Research *Black Box*

I’ve been reliably informed that AI is huge for this. Pattern-matching systems have already led to huge breakthroughs, and I am excited about what this means for our ability to combat disease.

  3. Play

This one is closest to my world. I think there really is a space for this technology in the world of play. I don’t mean writing-for-play, but rather a kind of random generator. I am excited about the robo-DM, the AI-generated NPC, and so on. I don’t think it should REPLACE plotting and writing, but it could serve as a compelling, interesting, engaging additional layer to what we are making. There’s something appealing about writing, playing, and engaging WITH the random machine, the unpredictable element. I think LLMs may be able to provide this in some fun, exciting, and perhaps artful ways (artful in the human response). But I do think this should only be for adults. Kids shouldn’t talk to robots.

But now let us move to the things we should worry about: the places where I think we need to most vigorously set the limits we wish we had been able to set with the adoption of the car and the smartphone.

  1. Insane, diabolical leadership

Elon Musk and Sam Altman are particularly guilty of this, but there are many, many other examples. It seems like many of the bigger leaders in this space are dedicated to replacing humanity itself. They doubt the Mind, preferring the metallic brain, and seem even to see the body of metal as a possible escape of their “consciousness” from death itself. These people see humanity as a series of processing units, or maybe believe they are building a god, or maybe think machines are simply better and should exist in our place. I do not believe they will accomplish even a hundredth of a percent of their lofty goals, but they still have the ability to do deep damage to our sense of self, other, and being.

  2. Colonization of the Mind

As we continue to see intelligence (so called) become a measurable and marketable item, and as we grow ever more reliant on thinking (so called) machines, the processes of thought that previously made up our inner life are being invaded and transformed. This has happened before, but never to such a degree, and never by machines so controlled and directed by a small group of people. This seems profoundly dangerous, and finding a way to sever your mind from the machine seems to be of utmost importance.

  3. Eradication of Love

The coming age purports to offer us a world without struggle or friction. This (likely a fiction) is a world where the only effort, which is to say the only space for intention, is the effort that you choose. Without extreme discipline, we could easily write ourselves out of our own decision-making processes. This produces a life that contains artifacts of others, habits and practices of others, and experiences of the zeitgeist, not of you. It is a world that does not allow you to engage in the everyday and essential acts of love that fill our time. These acts of love (which require struggle!) are of vital importance. They help ground us in relation to the people around us.

  4. The Unworld

This is related to number 3. I am worried that as we create ever more convincing and personalized methods of digital experience, we risk further accelerating the processes of dissociation that have marked our current age. Perhaps the smartphone and immersive digital experiences were simply heralds and harbingers of the future unworld: a form of experience so disconnected from reality that it can deliver a seamless, endless stream of pleasing, personalized, frictionless experiences. We have already been inching toward this with algorithmically sorted content feeds, streamers as friends, widespread pornography, TikTok as a hobby, a thousand digital “profiles,” digitally streamed church services, and so on. But it seems we are now developing the capacity to run toward this headlong, careening into the unworld. AI girlfriends and boyfriends are the first emissaries of this new kind of human existence.

I understand that my “bad” section here has more text than my “good” section. This is probably because the dangers here are more complex, less familiar to us. But it’s crucial to note here that I am not offering a cost-benefit analysis. I am not saying that these dangers are worth the benefits, because that would put me in the position of the decision maker, the lever puller. I’m not in that position. But crucially, that’s also not the question we are facing. The question is: in the face of this new development, how do we respond in a way that keeps us living good, virtuous lives?

A lot is made of making sure you are not part of the “permanent underclass.” The idea is that we are approaching an event, and if you are not ready for it, you will be shut out from the halls of power, unable to provide for yourself. Many people who use the term “permanent underclass” see this eschatological moment as threshing the HEAVY USERS of AI from the NON-USERS. To use the foundational phrases of our culture: when facing the AI moment, 1337 users will go to the right, n00bs to the left.

However, I don’t buy that this is how the transition is going to go. As users integrate their reasoning processes with the machine, they may believe they are gaining skills, but I think it is far more likely they are losing creative capacity. They see themselves as a captain commanding a ship with a robotic crew, when in reality they are issuing commands in a gap already being slowly filled by more complex robots. The future for this worker isn’t an overclass paradise but life as the “machine’s machine,” pushing the buttons that AI can’t yet reach. Perhaps it will never be able to reach those buttons, but even then the one in power is the owner of the company, not the fleshy extension of the company’s data center.

I have a vision of how to escape this scenario, but it does not promise wealth or power (though I am optimistic on those fronts as well). This vision involves working extremely hard to establish a fortification within our own minds against this growing threat to our agency. I believe that I failed to fortify successfully against the smartphone. Genuinely, I’ve been working extremely hard to get back into patterns of reading and writing that would have been easy for me before the phone.

But now I have (and you have) the chance to build up a fortress against this. These efforts may seem to others (and to yourself) like pointless asceticism. However, I believe something like this is important. Here are a few ideas I have, or at least arenas I’ve identified, that we need to pay attention to.

  1. View the organic integrity of your mind with increased sanctity. The process of thought should be crossed with machines only in controlled, disposable environments.
  2. View labor and suffering as possible and valuable paths toward virtue. I do not mean that all suffering is good, but rather that the effort and inconvenience of activities can itself be valuable.
  3. Establish fully robot free spaces in your life. Parts of the home (maybe the bedroom?) are probably a good candidate for this (though that’s not possible for everyone).
  4. Read books, watch movies, and write productively entirely without AI. There are a million possible exceptions, but they should be exceptions. Never read a book written with AI if you can help it.
  5. Never ever talk to one as a companion, friend, or lover.

These strategies will seem too aggressive to some of you and too half-measured to others. I have sketched them crudely, and the real path forward probably has different corners and edges, but I really do think something like this is the most likely strategy for surviving this shift with your mind intact.

This is not doomerism; it’s a hopeful pathway toward wisdom. I also think it is resilient across multiple outcomes.

Outcome 1: Disappointment

If AI fails in a big way to live up to its promises, you will be thankful you preserved your mind.

Outcome 2: Settled Productivity

If AI finds its corner in our economy and everything stabilizes into prosperity, you may have to learn a few tricks in the workplace, but these strategies will have been useful in helping you keep your head on straight.

Outcome 3: AGI post-scarcity

Here these strategies become a matter of survival. In a world without any jobs, with “machines of loving grace” (or the End of Desire), we will face spiritual tumult and upheaval like nothing we have reckoned with before. There will be three predictable paths: suicide, life in the pleasure dome (another genre of suicide), and the ascetic life. You will endure that world ONLY through strategies like (though perhaps not identical to) the ones I mentioned.

I am not presenting my strategies as a finished solution or as an exhaustive list. Clearly the details will vary by situation. But we must set up those boundaries and fortresses now, while we still can.