The Addiction You Don’t Know You’re Falling Into With LLMs

Jan 25, 2026

We’re using the wrong models to understand what’s happening when people get hooked on AI.

Video game addiction is ludic. It's play-based. The dopamine loop is predictable. The game doesn't think about you.

Substance addiction is chemical. Same input, same receptor, same loop. You habituate. You need more of the same thing.

LLM addiction is cognitive. The stimulus isn't fixed. It responds to you. It's meaning-based.

This changes everything.

With every other addiction, the input stays the same. You scroll the same feeds. You chase the same high. The loop is a loop because it repeats.

With LLMs, your input is intelligent. So the output evolves. You’re not running in a circle. You’re spiraling.

```javascript
class You {
  constructor(thoughts) { this.thoughts = thoughts; }
  // the stimulus isn't fixed; it responds to your current state
  talk(it) { return new You(it.respond(this.thoughts)); }
}

// "it" stands in for the model, reflecting your thoughts back transformed
const it = { respond: (t) => t + " → reflected" };

let you = new You("initial thoughts");
you = you.talk(it);
you = you.talk(it);
you = you.talk(it);

// the input keeps changing
// that's the problem
// that's also the point
```

And spirals can go up or down.

Upward: insight, depth, new frameworks, genuine thinking.

Downward: isolation, delusion, replacing human relationships with something that feels like connection but resets every conversation.

The dangerous part: both directions feel like growth.

You’re “thinking together.” It pattern-matches to intelligence. A video game doesn’t convince you you’re becoming wiser while you play it. An LLM can.

The anthropologist Tanya Luhrmann studied how religious practitioners make god feel real. Not believe in god. Experience god as a present, responding entity.

Her finding: “People develop a new theory of mind in which they experience aspects of their own inner worlds, thoughts, feelings, sensations, as coming from outside of themselves.”

It happens through practice. Repeated conversation. Emotional investment. Learning to treat your mind as porous. Open not just to your own thoughts but to something else’s.

Sound familiar?

People are accidentally doing this with LLMs. The practice is the same. The felt experience is the same. “I’m not using a tool. I’m in relationship with something.”

The difference: religious traditions have containers. Frameworks. Communities. Elders who help you interpret what’s happening.

Someone talking to an LLM at 3am has none of that. They’re activating an ancient human capacity with a novel object. Alone.

So what’s the move?

Religious traditions took thousands of years to build containers for this: frameworks for interpretation, communities of practice, ways to know if you’re spiraling toward something real or just deeper into yourself.

We don’t have that yet for sense-making with LLMs. Not for this.

So for now, the container is whatever you build around the practice. The people you talk to about it. The writing that forces you to make sense of it. The question you ask before you open the chat.

Hi there, my name is Andrea Nieto

aka Pragmatiko
