I’ve been wrestling with a growing sense of frustration about the world we’re living in right now. And the more I try to understand it, the more I come back to a simple observation:

Nearly everything we interact with today has been optimized for extraction.

Extraction of our attention.
Extraction of our money.
Extraction of our outrage.
Extraction of our emotional energy.

If something can capture a little more of our focus, keep us scrolling a little longer, provoke us into reacting just a bit more intensely, there’s a system somewhere trying to do exactly that.

And in some ways, that shouldn’t surprise us. Over the past few decades, the people designing these systems have become very, very good at studying us. They’ve mapped the weak points in what we might affectionately call our monkey brains, the ancient wiring that governs curiosity, fear, belonging, and reward.

They know that novelty grabs us.
They know that anger spreads faster than calm.
They know that social validation can be addictive.

And when you combine behavioral psychology with enormous data sets and algorithms that can run millions of experiments a day, you get systems that are extraordinarily good at pushing those buttons.

Not because anyone necessarily set out to create a dystopia, but because the incentives reward whatever keeps us engaged the longest.

The result is something many of us feel, even if we can’t quite articulate it: a sense that there’s no real escape from the attention economy. It follows us everywhere. In our pockets. In our homes. In the tools we use to understand the world.

And now, we’re adding another layer.

Artificial intelligence.

AI can now generate convincing text, images, voices, and video at a scale we’ve never seen before. Which means that something very fundamental is changing: the relationship between our senses and reality.

For most of human history, “seeing is believing” was a pretty reliable rule.

That rule is breaking.

We’re entering a moment where what we see and hear can no longer be taken at face value. And that means every one of us is being asked to become a kind of full-time detective, constantly evaluating, verifying, second-guessing.

But here’s the problem: humans aren’t built for permanent vigilance.

Our brains evolved for trust. For cooperation. For social connection. Not for navigating an endless stream of carefully engineered manipulation.

So naturally, the question people start asking is: what do we do about it?

Some people say regulation. Treat this as a public health issue. Just as we eventually put guardrails around industries that harmed our bodies (unsafe food, polluted air, dangerous products), we might need guardrails around systems that harm our attention, our emotional stability, even our sense of reality.

But that’s difficult. Because the same economic forces that created these systems have enormous influence over the institutions that might regulate them.

Others think the answer might only come after a crisis, some kind of economic or social reset that forces us to rethink the incentives that shape our technologies.

Maybe.

But I think there’s also a quieter, more immediate question we can ask ourselves.

How do we reclaim our humanity inside these systems?

How do we build boundaries around our emotional lives?
How do we recognize when our attention is being harvested?
How do we create spaces, personally and culturally, where manipulation simply stops working?

Because ultimately, the real issue isn’t technology.

It’s whether we allow human experience (our curiosity, our empathy, our relationships) to become just another resource to be mined.

The encouraging thing is that once people begin to see a system clearly, they begin to resist it.

And that awareness is already growing.

People are learning to recognize outrage bait.
To question what they see online.
To step away from systems that demand constant engagement.

These are small acts. But collectively, they matter.

Because preserving our humanity in an age of optimization isn’t just a technical challenge.

It’s a cultural one.

And the first step toward solving any cultural problem…
is realizing that we are allowed to change the rules.

@bittner thanks for posting this, Dave. I agree completely.

I think one hard question is that of network effects. To a very great extent, I've found my own spaces, either fully private or run by a few dedicated nerds, which aren't full of this, and that really helps.

But I don't know if those spaces are visible or accessible to everyone.

I also know that for me, with a firehose of even non-AI information, I'm fully capable of building my OWN dark patterns: it's tempting to follow people who are angry but CORRECT. AH, YOU'RE SO RIGHT, I WILL FOLLOW. I'll build my own rage feed without the algorithm's help.

But that is really bad for me, so I'm doing my best to mute those triggers and unfollow negativity. That's hard, but worthwhile for me. Not sure how everyone can learn how and why to opt out. It takes repeated work and accepting FOMO.

I rely on podcasts to give me the news, but that means I rely on functioning news media, which also doesn't seem like a safe bet. I pay for as many outlets as I can, but still.

I do see more friends and acquaintances thinking this way and working together to find other ways, so hopefully you're right.

I'm really curious about what options will work for not-so-tech-nerdy folks. Mastodon and Signal are my happy places, but I don't think the network effect is there for people with different interests? And I worry that as soon as there is enough of a network effect, the bad actors will find us there.

Have you found other options that work for you and the folks in your life?

@bittner I guess more specifically: you say it's a cultural problem and not a technical one, and I think you're right. I also think you're right that people care.

But how, culturally, are you seeing folks effectively push back and make changes other than by changing technology? Or is it linked, so that resisting the commercial incentives means looking outside of those companies' offerings?

@discontinuity One of the things I've seen slip away during my lifetime is the notion that things are, generally speaking, better when companies operate and exist at what I describe as "human scale."
Cable TV is a decent example: when cable first came through our community, it was run by a local interest, the county cable company. If you had an issue, chances are you knew someone who knew someone who could help. There was local accountability.
Fast forward to today, that's gone, and not just from cable TV. We can all think of countless examples of the things we rely on for everyday society being consolidated and gobbled up by huge multinationals. We're left with a version of Lily Tomlin's famous telephone operator: "Sir, we're the phone company; we don't have to care."
Look at a platform like YouTube. They've got no meaningful competition, and yet, because there's technically nothing stopping anyone from taking a run at them, they're shielded from antitrust claims. Amazon, Google, Meta, and Apple's App Store all enjoy massive scale and a government uninterested in regulation.
Perhaps cynically, I've come to believe that we are, more or less, a reactive species. (I hate being cynical!) So change may come when there is no other option, or when the powers that be find themselves directly affected. ("Senator, based on this publicly available location data we purchased, it seems you have been spending many overnights at your intern's apartment.")
I wish there were easier answers, and that's a big part of my frustration. That lack of "human scale" makes it feel impossible for one person to make a difference.
So, for me, I vote. I demonstrate. I engage directly with my elected officials. I share what I'm thinking so others know they aren't alone.
And I try not to give up hope.
“Hope is a good thing, maybe the best of things, and no good thing ever dies.”

@bittner I think you're right. It's the scale that kills things.

But then, that's why I like Mastodon: it can be, and is, run at human scale (see Jerry helping out Krebs with the kimbot trolls).

It's just that human scale only works with enough folks willing to run human-scale stuff, and users willing to pay. I hope this mindset shift means more folks who aren't tech nerds are looking into real alternatives.

I just feel like, as a group, FOSS nerds don't make it easy for folks outside the in-group to get in. Who would run the local fishing group or textile arts Mastodon instance, for example? I'm a nerd, and even I don't want to run this stuff myself. A community with few nerds able to do this has the hardest journey.

I'm really curious to see how that gap gets bridged! I hope it does at least.