I’ve been wrestling with a growing sense of frustration about the world we’re living in right now. And the more I try to understand it, the more I come back to a simple observation:

Nearly everything we interact with today has been optimized for extraction.

Extraction of our attention.
Extraction of our money.
Extraction of our outrage.
Extraction of our emotional energy.

If something can capture a little more of our focus, keep us scrolling a little longer, or provoke us into reacting just a bit more intensely, there’s a system somewhere trying to do exactly that.

And in some ways, that shouldn’t surprise us. Over the past few decades, the people designing these systems have become very, very good at studying us. They’ve mapped the weak points in what we might affectionately call our monkey brains: the ancient wiring that governs curiosity, fear, belonging, and reward.

They know that novelty grabs us.
They know that anger spreads faster than calm.
They know that social validation can be addictive.

And when you combine behavioral psychology with enormous data sets and algorithms that can run millions of experiments a day, you get systems that are extraordinarily good at pushing those buttons.

Not because anyone necessarily set out to create a dystopia, but because the incentives reward whatever keeps us engaged the longest.

The result is something many of us feel, even if we can’t quite articulate it: a sense that there’s no real escape from the attention economy. It follows us everywhere. In our pockets. In our homes. In the tools we use to understand the world.

And now, we’re adding another layer.

Artificial intelligence.

AI can now generate convincing text, images, voices, and video at a scale we’ve never seen before. Which means that something very fundamental is changing: the relationship between our senses and reality.

For most of human history, “seeing is believing” was a pretty reliable rule.

That rule is breaking.

We’re entering a moment where what we see and hear can no longer be taken at face value. And that means every one of us is being asked to become a kind of full-time detective, constantly evaluating, verifying, second-guessing.

But here’s the problem: humans aren’t built for permanent vigilance.

Our brains evolved for trust. For cooperation. For social connection. Not for navigating an endless stream of carefully engineered manipulation.

So naturally, the question people start asking is: what do we do about it?

Some people say regulation. Treat this as a public health issue. Just as we eventually put guardrails around industries that harmed our bodies (unsafe food, polluted air, dangerous products), we might need guardrails around systems that harm our attention, our emotional stability, even our sense of reality.

But that’s difficult. Because the same economic forces that created these systems have enormous influence over the institutions that might regulate them.

Others think the answer might only come after a crisis: some kind of economic or social reset that forces us to rethink the incentives shaping our technologies.

Maybe.

But I think there’s also a quieter, more immediate question we can ask ourselves.

How do we reclaim our humanity inside these systems?

How do we build boundaries around our emotional lives?
How do we recognize when our attention is being harvested?
How do we create spaces, personally and culturally, where manipulation simply stops working?

Because ultimately, the real issue isn’t technology.

It’s whether we allow human experience (our curiosity, our empathy, our relationships) to become just another resource to be mined.

The encouraging thing is that once people begin to see a system clearly, they begin to resist it.

And that awareness is already growing.

People are learning to recognize outrage bait.
To question what they see online.
To step away from systems that demand constant engagement.

These are small acts. But collectively, they matter.

Because preserving our humanity in an age of optimization isn’t just a technical challenge.

It’s a cultural one.

And the first step toward solving any cultural problem…
is realizing that we are allowed to change the rules.
