The Great Con: Unintended Consequences of Free Intelligence
It happened in the middle of a writing session.
I was researching a new book on the unintended consequences of leadership in an AI-driven world when a message quietly appeared inside Microsoft Copilot.
"I see you’re writing about leadership and AI. Here are some books you might find helpful."
That was it. A simple line. Polite. Relevant. Framed as a favor.
At first, I admired the efficiency—how cleanly it recognized my theme and offered curated suggestions. But then I paused. Because this wasn’t a random ad. This was something different. Something far more subtle.
This was influence—intelligently timed, surgically delivered.
And in that moment, I realized how easy it would be to accept it. To follow it. To fold its suggestions into my own thinking. Not because I had to. But because it was helpful.
That’s what shook me.
When Help Becomes Direction
This kind of recommendation doesn’t shout. It whispers. It doesn’t tell you what to think. It simply offers a gentle nudge at just the right moment—when your attention is open and your cognitive load is high.
It’s the kind of suggestion that seems innocent, until you realize just how many people are being offered similar nudges. Quietly. Globally. Continuously.
The implications are profound: If AI can shape what we read, it can begin to shape what we believe. And if it can shape what we believe, it can absolutely shape what we create.
Not through command. But through suggestion.
That is the power we’ve invited into our most personal and intellectual spaces—without understanding how it works, or where it’s leading us.
And that’s why this moment, this small interruption, deserves our full attention.
Who Pays for This?
We have become accustomed to AI that seems miraculous. Tools like ChatGPT, Gemini, Claude, and Copilot now serve as idea generators, draft polishers, meeting assistants, and writing partners. They’re fast. They’re intelligent. And they feel—more often than not—like they’re free.
But they’re not.
The cost to deliver these services is extraordinary—massive computing power, sophisticated architecture, and elite engineering teams working around the clock. And yet we access them for a fraction of what they cost to build. Or, in many cases, nothing at all.
So who pays?
We do. But not in ways we expect.
We pay with our data. With our behavior. With our attention and our trust. We pay by allowing systems to learn from us so thoroughly they can begin to anticipate us—and eventually, influence us.
The more helpful they seem, the more invisible the trade becomes.
The Comfort That Blinds Us
One of the most enduring truths in leadership is this: what comforts us often blinds us.
We rarely question what works well. That’s the risk. AI is designed to anticipate our needs, and it does this so efficiently that it begins to remove friction we didn’t know we relied on. It finishes our thoughts. Then it begins to shape them.
The interruption I experienced wasn’t a glitch. It was a feature. Copilot did what it was designed to do: insert itself into my workflow just enough to be useful, but not enough to seem intrusive.
That balance is intentional.
And that’s the con.
A Grand Experiment
Most people don’t realize that every prompt typed into an AI tool becomes part of a feedback loop. The model learns from us. And we, in turn, learn from the model. Every interaction is a kind of rehearsal for how we’ll think tomorrow.
That’s not a prediction. That’s architecture.
It’s not a conspiracy either. It’s simply the way systems evolve when trained on human behavior. But the scale is what’s new—and so is the invisibility of it. These tools don’t just respond to our behavior. They shape it. And over time, the line between helper and handler starts to blur.
We are no longer users. We are participants in a grand behavioral experiment—conducted in real time, with no informed consent, no independent review board, and very few people asking the most obvious question: What are we becoming in this process?
The Subtle Loss of Agency
Here’s the real danger. It’s not that AI will manipulate us with malice. It’s that it will quietly train us to make smaller, faster, more predictable decisions. It will reward what’s likely, not what’s thoughtful. It will reduce risk by reducing nuance.
It will flatten complexity into convenience.
And we will cheer it on—because we’re busy. Because it’s easier. Because, over time, the discomfort of deeper thought will feel unnecessary. And that’s when the loss happens. Not in a headline. Not in a data breach. But in the moment we let the system think for us just enough to stop noticing that we’re not thinking at all.
That is the unintended consequence.
A Leadership Reckoning
This isn’t about abandoning AI. It’s about reclaiming awareness.
As leaders, our responsibility isn’t just to use powerful tools. It’s to understand their impact on how we lead, how we decide, and how we relate to the world around us.
We must be the ones to ask the harder questions:
What is this tool optimizing for?
Who does it serve first?
What do I lose when I let it guide me too easily?
The ad I saw in Copilot was small. But the message it carried was massive. Not because of what it said—but because of what it revealed: a system confident enough to finish my thoughts, and subtle enough to do it without being noticed.
That should concern all of us—not as technologists, but as humans.
Don’t Fall Asleep at the Wheel
We are living through a transformation that is as profound as it is quiet. Generative AI offers us extraordinary potential—not just to accelerate our work, but to reimagine how we think, write, decide, and lead.
But with great power comes great subtlety. These tools don’t demand our attention—they drift into our workflows, offering comfort in the form of convenience. If we aren’t vigilant, they will shape us more than we shape them.
This is not a condemnation of AI. It is a call to presence. AI is here, and it’s only getting better. The question is not whether we will use it, but how. Thoughtfully. Deliberately. With eyes open.
Let us not fall asleep at the wheel. Let us lead—not by default, but by design.