AI · Ethics · Society

The Cost

It's important to ask "What are we giving up?" But it's easy to overlook the questions "What have we already given up?" and "What are we reclaiming?"

Nicholas Webb · March 2026

Why I'm Here

I work at a large company that's bullish on AI. For a couple of years, I wasn't. I tried integrating AI into my workflow — pair programming, code review, drafting — and the results were mixed at best. It felt clunky. I fought it more than I used it. And I wasn't alone — studies were showing that even senior-level developers were being slowed down by AI integration. That was the prevailing sentiment in my community and in the news for quite a while. The data backed up what I was experiencing.

Then I started paying attention to what other people were actually doing. Not the LinkedIn thought-leaders regurgitating press releases — the engineers and builders I respected. Teammates. Friends. People whose judgment I trusted. They weren't just enthusiastic. They were transformed. Their output had changed. Their speed had changed. The nature of their work had changed.

So I dug in. Studied the latest models. Learned how these people were actually using the tools. And I found the gap.

The Adoption Gulf

There is a real chasm between what most people do with AI and what people who understand AI are doing with it. The results are not remotely comparable.

Most people use AI like a slightly smarter Google. Ask a question, get an answer, move on. Or they use it as a pair programmer — writing code alongside them, catching errors, suggesting completions. That's useful. It's also leaving 90% of the capability on the table.

Using AI effectively doesn't mean pair programming. It doesn't mean delegating basic menial tasks you think it won't mess up. I spent months fighting the AI that was trying to help me code, because fundamentally our minds — if you want to call them that — our styles, our training, are very different. I was trying to keep my process and have AI improve it.

The process was the problem. That's what was giving me mixed results. That's what was slowing me down. Getting across the adoption gulf doesn't mean adding AI to your existing workflow. It means overhauling the workflow entirely.

Thought patterns. Processes. Habits. All of it needed to change. I needed a tabula rasa mindset — look at the work with fresh eyes and ask: "If I were starting from scratch, with these tools, how would I approach this?"

Once I started doing that, everything shifted. That's where the Beyond the Screen concepts came from. The realization that I don't need to write code. That I don't need monitors. That typing is actually the bottleneck. That I can direct AI agents with my voice while walking around the house.

And then I noticed: the people on the cutting edge are all arriving at the same conclusions. None of the best engineers I know are still writing code. They're directing systems that write code. The skill set that matters has fundamentally shifted, and it's happening right now — not in five years, not gradually. Now.

Darwinism in Real Time

To keep doing things the way we were is a sure-fire way to get left behind. That's not pessimism — it's pattern recognition. This is classic Darwinism happening in real time, in what is easily the biggest revolution in human history.

That sounds hyperbolic. It isn't. The industrial revolution mechanized physical labor over the course of a century. The internet connected information over the course of two decades. AI is transforming cognitive work itself — the thing that was supposed to be uniquely, irreducibly human — in years. Not decades. Years.

The dot-com boom and the industrial revolution will look like footnotes by comparison. Not by a small margin.

I see this clearly now. And I'm trying my best to ride the crest of that wave so as not to get drowned by it. That is why I use AI every day. That is why I'm building and writing and thinking so much about it. Ultimately, I have a family I need to help provide for, and it would be irresponsible for me not to do everything in my power to be successful on that front.

Prometheus and Fire

AI is here. It's going to saturate everything. There is no putting the genie back in the bottle.

And of course we're going to have problems. Societal breakdowns. Reorganizations. Job displacement at a scale we've never seen. Power concentrating in the hands of whoever controls the models. Privacy eroding in ways that make social media look quaint. Real costs, borne by real people.

But consider the oldest version of this story we have. Prometheus steals fire from the gods and gives it to humanity. Zeus punishes him — chains him to a rock for eternity. The message: powerful tools come with consequences. Always have.

Fire can burn and destroy. Or it can warm and protect. That is always the delicate balance we have the responsibility, as wielders of these tools, to get right. AI is a tool. It is fundamentally no different than any other tool or concept before it — just faster, broader, and more consequential.

We have to understand things if we want to use them for the better. You can't steer what you don't comprehend.

What Have We Already Given Up?

The conversation around AI tends to focus on what we're about to lose. Jobs. Skills. Privacy. Agency. These are legitimate concerns and they deserve serious attention.

But there's a question that gets asked far less often: what have we already given up? Long before AI, long before the internet, long before electricity — how many trades did humanity make, knowingly or not, that reshaped what it means to be human?

Yuval Noah Harari makes a provocative argument in Sapiens: we didn't domesticate wheat. Wheat domesticated us. Hunter-gatherers had diverse diets, flexible schedules, and varied physical demands. The agricultural revolution chained us to fields, narrowed our diets, destroyed our joints, and created social hierarchies that didn't exist before.

But it went further than that. Stored grain created surplus — and surplus created something worth stealing. Permanent settlements created territory worth defending. Dense populations living alongside livestock created the perfect breeding ground for zoonotic diseases: smallpox, influenza, measles, plague. Hunter-gatherers didn't have organized warfare. They didn't have pandemics. We traded those into existence — for grain.

It was sold as progress, and in many ways it was. But the individual human arguably got a worse deal. The species thrived. The person suffered.

Graham Hancock — a polarizing figure whose claims about ancient civilizations draw as much skepticism as fascination — nonetheless lands on something that resonates when he calls us a "species with amnesia." Whatever you think of his broader work, there's a fundamental truth in that refrain that stands on its own. We don't remember the sacrifices our ancestors made for convenience, safety, or survival. Even if we were all great students of history, recorded history represents a blip in the timeline of humanity. We can't know all the paths we've walked or the tradeoffs we've made.

Agriculture
Gained: Food security, civilization, population growth
Lost: Dietary diversity, physical variety, egalitarian social structures; stored surplus and dense settlements also created the conditions for organized warfare and epidemic disease
Source: Harari, Sapiens (2011); Jared Diamond, "The Worst Mistake in the History of the Human Race" (1987)

Writing
Gained: External memory, law, literature, science
Lost: Oral tradition, trained memory capacity, communal knowledge transmission
Source: Plato, Phaedrus — Socrates argued writing would destroy memory

Industrial Revolution
Gained: Mass production, urbanization, modern medicine
Lost: Craft skill, rural community, connection to labor's output
Source: Marx, Capital; E.P. Thompson, The Making of the English Working Class (1963)

The Automobile
Gained: Personal mobility, suburban expansion, economic freedom
Lost: Walkable communities, public transit investment, clean air; the price also includes roughly 40,000 US traffic deaths per year
Source: Jane Jacobs, The Death and Life of Great American Cities (1961)

The Smartphone
Gained: Instant information, global communication, productivity
Lost: Attention span, sleep quality, adolescent mental health, presence
Source: Jonathan Haidt, The Anxious Generation (2024)

Every one of these transitions was irreversible. Every one created winners and losers. Every one had advocates who saw only the upside and critics who saw only the cost. And every one reshaped humanity in ways that were impossible to fully predict at the time.

AI is the next entry in this table. We will gain things and lose things, and we won't fully understand the tradeoffs for generations.

What Are We Reclaiming?

Here's where my perspective diverges from the standard tech-ethics essay. Because AI isn't only taking things away. In some cases, it's giving things back.

Physical freedom

The screen chained us to desks. AI voice interfaces are unchaining us. For the first time in decades, knowledge work doesn't require a fixed posture in a fixed location. Our bodies can move again.

Time

Tasks that took hours take minutes. Not because the work is worse — because the bottleneck shifted from execution to direction. The surplus time is yours: for family, for health, for thinking, for living.

Agency for small operators

A single person with AI can compete with a team of 10. That's not just productivity — it's economic democratization. You don't need venture capital or a 50-person company to build something real anymore.

Craft at scale

Mass production killed craft. AI might bring it back. Custom products, personalized experiences, one-person businesses that deliver the quality of a boutique operation at scale — the economics are suddenly viable.

The agricultural revolution gave us food security but took our physical freedom. The industrial revolution gave us production but took our craft. The digital revolution gave us information but took our attention.

Is it possible that the AI revolution gives us back some of what the previous ones took? Physical movement. Creative agency. Time. Craft.

Nobody is going to stop using AI. That debate is already over. So if there's a chance these tools give back some of what previous revolutions took — we should be paying attention, and making sure we actually capture it.

Eyes Open: The Real Costs

None of this optimism erases the problems. They're real and they're coming fast:

1. Job displacement at unprecedented speed. Previous transitions played out over generations. AI is transforming entire job categories in years. "Learn to code" was already outdated advice. "Learn to use AI" may be insufficient too — not everyone can retool, not every role has an augmented equivalent. We need economic systems that account for this.

2. Skill atrophy is real. I notice my own coding skills fading as I direct AI agents instead of writing code. Scale that to a generation of developers, writers, and designers who never practice the craft manually. The AI becomes load-bearing infrastructure for skills humans used to own. Writing clarifies thought. Coding teaches logic. If we delegate the practice, do we lose the cognitive benefits?

3. Power is concentrating — for now. The most capable AI models are built by a handful of companies. Training costs hundreds of millions. My entire workflow runs on API calls to Anthropic — if they change pricing, terms, or behavior, my productivity takes a hit. That dependency is real and I don't love it. But the trajectory is working against concentration, not toward it. Two years ago there was one frontier model. Now there are half a dozen providers and a growing ecosystem of open-weight models you can run on your own hardware. Inference costs are cratering. Competition is doing what competition does. The risk isn't that power stays concentrated forever — tech commoditizes too fast for that. The risk is the transition window. Right now, while the market is young, bad precedents can get locked in: regulatory capture, closed-source lobbying, compute cartels. Open-source models and self-hosted infrastructure are the counterweights — and they need support while the window is still open.

4. Privacy is deepening its erosion. To get the most from an AI agent, you give it access to everything — email, files, calendar, messages, code. The convenience-privacy tradeoff isn't new, but AI takes it further because the understanding is deeper. A search engine sees your queries. An AI assistant understands your intent, your patterns, your relationships.

5. The energy bill is enormous. Training and running these models consumes extraordinary energy. Data centers are being built at unprecedented pace. The irony of freeing humans from desks with technology that pressures the planet isn't lost on me. We should demand renewable commitments and efficiency research, not pretend the footprint doesn't exist.

6. Meaning is fragile. Work gives people structure, identity, and purpose. Retirement research shows people who stop working without replacing it with meaningful activity decline rapidly. If AI handles the doing, humans need to find meaning in directing, creating, connecting — or in entirely new pursuits. That transition won't happen automatically.

Wielding Fire

I don't have a policy agenda. I'm not a researcher or a politician or a pundit on the sidelines. I'm an engineer and a trainer with skin in the game — in the arena, getting my hands dirty with these tools every day, trying to build a good life for my family while riding the biggest technological wave in human history.

Roosevelt had it right: "It is not the critic who counts; not the man who points out how the strong man stumbles. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood." The people writing think-pieces about AI from a distance aren't wrong, but they're not the ones navigating it daily. The costs I'm describing aren't theoretical to me. Neither are the gains.

What I have is a conviction: the people who use these tools have a responsibility to be honest about them. Not breathlessly optimistic. Not reflexively fearful. Honest.

AI is fire. It will burn things down and it will light the way forward. The question — the only question that's ever mattered with powerful tools — is whether we're paying enough attention to wield it well.

I'm paying attention. I hope you are too.

Further Reading

📖 Sapiens: A Brief History of Humankind · Yuval Noah Harari (2011)
The wheat domestication argument and the agricultural revolution's true cost

📖 Fingerprints of the Gods · Graham Hancock (1995)
"Species with amnesia" — what we've forgotten about our own history

📖 Phaedrus · Plato (~370 BC)
Socrates' argument that writing would destroy memory — the original tech-ethics debate

📖 The Anxious Generation · Jonathan Haidt (2024)
The smartphone's impact on adolescent mental health

📖 The Death and Life of Great American Cities · Jane Jacobs (1961)
What the automobile took from us

This is a companion piece to After the Screen, which explores what happens when AI frees us from screens and desks. Read them together. The optimism is real. So are the costs. Both matter.