A working hypothesis about where humans go after AI takes the work we used to do.
Some of my favorite memories as a kid involve a dairy farm in Minnesota.
My extended family owned it, and we'd visit when we could. To a kid who didn't grow up in that world, it was a fantasy. Hay lofts that felt cathedral-sized when you were eight. Rope swings dangling from rafters that creaked every time you launched. Cattle that didn't care you existed until they did, and then they really did. Tractors my uncle would let us drive in slow circles around the yard while he pretended not to watch.
I loved those visits. Not because I understood farming, but because the whole place ran on a logic I could feel even if I couldn't articulate it. Animals needed feeding. Crops needed tending. Equipment needed maintaining. Everything had a job. Everything had its place. And the people who lived there knew exactly what they were for.
What I didn't understand at the time was that I was visiting a museum.
Not the building. The role.
In 1790, 9 out of every 10 Americans were farmers. By 1900, the number had dropped to about 38%. By 1930, 21%. Today, less than 2% of Americans farm. My uncle's farm wasn't a window into how most Americans still lived. It was a window into how almost all of us used to live, preserved by a small minority who kept the lights on while the rest of us moved into something else.
I've been thinking about that farm a lot lately. Not for nostalgic reasons. For a reason that took me a while to admit.
The Realization I Wasn't Ready For
I was sitting at my desk a few weeks ago. I had three browser tabs open, all of them AI tools. I was prompting Claude to help me think through a client positioning problem. I was feeding ChatGPT a transcript to extract themes from. I had just published an article on bensaibrain.com that I knew, statistically, was being scraped by training crawlers somewhere in the world before I'd even closed the tab.
And I had this strange moment of recognition.
I wasn't using AI. I was feeding it. Every prompt I wrote was a small parcel of human reasoning a system somewhere was learning from. Every article I published was potential training data. Every Slack message, every LinkedIn post, every voice note I dictated to my phone, every long-form thought I worked out in writing, all of it was, in some real sense, content. Crops. Output. Material being grown by humans and consumed by machines.
I sat with that for a minute. Then I caught myself thinking the thing I want to write about today.
What if this is the next chapter? What if becoming content farmers is what humans actually do on the other side of all this?
I want to be careful here, because I argued something pretty pointed in my last article on the Industrial Revolution. I pushed back hard against the comfort reading of history, the "we adapted before, we will adapt again" line that executives reach for right before they cut a department. I stand by that. The gap between productivity gains and human gains is real, and pretending it isn't is morally lazy.
So I'm not writing this to walk that back. The gap is still real. Some of the people reading this are inside it right now.
But there's a different question that piece didn't try to answer, and it's the one I keep circling: what do humans actually do, on the other side of the gap? Not what the spreadsheet says. What does the role look like? What do we wake up and do every morning, once the work that defined the last century has been redefined out from under us?
Content farming is my working hypothesis.
What I Mean By That
I don't mean the SEO version of "content farms," the junk-content sweatshops that gamed Google for ad pennies in 2010. That term has baggage I'm not interested in defending.
I mean something closer to the literal one. Farmers grew the food that fed everyone. They worked the soil, tended the animals, harvested the crops, and shipped them to a society that needed them to eat. Most of human civilization, for most of human history, organized itself around that work.
What I'm wondering is whether we're watching a similar role emerge, at scale, across every industry, mostly without anyone naming it yet, where humans grow, tend, curate, and ship the content that feeds the AI systems an entire economy is now organizing itself around.
Not as a side hustle. As a primary role. The way farming was a primary role.
That's the hypothesis. The rest of this piece is me trying to figure out whether it holds up, what it would mean if it does, and what kind of content farmer any of us actually want to be, because if this transition follows the pattern of every other one, that question is going to matter a lot more than most people are ready for.
The Signals That Made Me Take This Seriously
I didn't pull this idea out of the air. I started looking around at what's actually happening in 2026, and the signals are louder than I expected.
The displacement is real, and companies are increasingly willing to say it out loud.
In Q1 2026, 78,557 workers were laid off in tech, and roughly 47.9% of those cuts were attributed to AI and workflow automation. By early 2026, Block had cut its headcount nearly in half, from 10,000 to under 6,000, with Jack Dorsey writing that "intelligence tools have changed what it means to build and run a company." Atlassian cut 1,600 workers, framed explicitly as an AI-era restructuring. Meta announced 8,000 cuts in April 2026 and said the freed compensation budget would go to AI research. Accenture told its workforce that "those we cannot reskill will be exited." Citigroup is planning to reduce headcount by 20,000 through automation. Freshworks just laid off 11% after its CEO publicly said more than half of the company's code is now written by AI.
This is not the euphemistic "restructuring" language of past layoff cycles. This is corporate America openly attributing job elimination to a specific technology.
That's the dust on the horizon.
Meanwhile, on the other side of the same economy, something else is growing fast.
The creator economy hit somewhere between $214 billion and $314 billion in 2026, depending on which analyst you ask, and credible projections put it past $1 trillion within the next eight to ten years. There are 207 to 303 million content creators globally. In the U.S. alone, 162 million people now identify as content creators, with about 45 million working at it professionally. 84% of those creators are already using AI tools in their workflow, and 68% plan to expand that usage in 2026.
And then there's the part that genuinely surprised me when I dug into it. AI companies are now signing real, eight- and nine-figure deals to license content from the platforms and publishers whose material they need.
Reddit alone has pulled in more than $203 million in licensing agreements, with an estimated $60 million per year coming from Google. News Corp signed a five-year deal with OpenAI, reportedly worth more than $250 million. The New York Times is taking $20 to $25 million per year from Amazon. Anthropic settled a copyright class action for $1.5 billion (about $3,000 per work) over training data. Shutterstock has earned over $100 million from training deals across major LLMs. A new licensing standard called Really Simple Licensing launched in September 2025, modeled directly on the music industry's ASCAP/BMI structure. New marketplaces (TollBit, ProRata, Cloudflare's Pay-Per-Crawl, and Microsoft's content marketplace) are being built specifically to charge AI systems for the content they consume.
In other words, the economy is starting to formally compensate for the production of content for AI consumption.
That is the strangest sentence I've written in a while. Read it again.
The economy is starting to formally compensate for the production of content for AI consumption.
That is not a side note. That is the shape of a primary role emerging.
The Engels' Pause Problem
Now, before I get carried away, I have to say the part I said in the 1830 piece, because it still applies.
Productivity gains and human gains do not arrive on the same timeline. They never have. The British workers of 1820 watched output per worker climb sharply for sixty years before median wages caught up, and that's the chapter of the Industrial Revolution everyone forgets when they reach for it as a comfort story.
The same pattern is showing up right now in the data on AI.
PwC's 2026 Global AI Jobs Barometer found that productivity growth in AI-exposed industries jumped from about 7% in the 2018-2022 period to 27% in the 2018-2024 period, nearly quadrupling. The gains are real. They are showing up. The question, the same one Engels' Pause asked in 1820, is who's getting them.
So far, the answer looks uneven, and the creator economy is the cleanest evidence of how uneven.
About 50% of full-time creators earn less than $15,000 a year. Only 4% earn over $100,000. 57% of full-time creators earn below a living wage from platform monetization alone. The average creator takes about six and a half months to earn their first dollar. Reddit, the platform, is collecting $60 million a year from Google. The Reddit users whose content actually fills the platform are collecting nothing, because as a non-exclusive licensee, Reddit holds the rights it needs to cut deals on content its users created.
If content farming is the role that's emerging, the field is already laid out the way every other transition has laid itself out. There are landowners. There are sharecroppers. And there are day laborers. The transition isn't going to decide which one you are. The decisions inside the transition will.
That's the part I want to sit with for a minute.
Three Positions in a Field That Already Exists
When I started thinking through the analogy seriously, the part that locked it in for me was realizing the economic positions are already there. Not as predictions. As current observable facts.
The landowners are the platforms and publishers with enough leverage to license at scale. Reddit, News Corp, the New York Times, Shutterstock, Axel Springer. They control either a corpus of training-quality content (decades of journalism, libraries of stock imagery) or a flow of fresh content the AI systems can't get anywhere else (live conversations, real-time news). They're cutting eight- and nine-figure deals because they can. The infrastructure to do this barely existed three years ago. Now there's a standardized licensing protocol, multiple marketplaces, and a $1.5 billion settlement establishing a per-work valuation floor.
The sharecroppers are the individual creators producing the content the landowners are licensing. The Reddit user whose post made it into a Google AI Overview. The blogger whose archive got scraped before any deal existed. The freelance journalist whose byline appears under a license they didn't sign. They're producing the crop. The land they're producing it on belongs to someone else. And as legal scholar Matthew Sag has argued in detail, the structural problem is that content licensing at internet scale doesn't realistically work for individuals. The transaction costs of paying a hundred million people a few dollars each are higher than the dollars themselves.
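Sag's argument is, at bottom, arithmetic, and it's worth making concrete. Here's a minimal back-of-the-envelope sketch. Every number in it is a hypothetical assumption for illustration; none comes from any actual deal or filing:

```python
# Back-of-the-envelope sketch of the transaction-cost argument.
# All figures below are hypothetical assumptions, not data from any licensing deal.

num_creators = 100_000_000      # "a hundred million people"
payout_per_creator = 3.00       # "a few dollars each"
cost_per_payment = 5.00         # assumed cost of identity checks, tax forms,
                                # payment fees, and support per recipient

total_payout = num_creators * payout_per_creator
total_transaction_cost = num_creators * cost_per_payment

print(f"Dollars reaching creators: ${total_payout:,.0f}")
print(f"Cost of moving them:       ${total_transaction_cost:,.0f}")

# Whenever cost_per_payment exceeds payout_per_creator, the system spends
# more delivering the money than the money is worth. That's the structural
# problem in miniature.
```

Swap in your own assumptions and the conclusion is hard to escape: at internet scale, individual payouts get eaten by the plumbing.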
The day laborers are the people doing the work that's emerging downstream of all this. The "human in the loop" annotators reviewing AI outputs for accuracy. The AI-literate editors catching hallucinations. The content QA leads making sure brand voice survives the model. There are hundreds of human-in-the-loop job listings on Indeed right now, paying anywhere from $37k to $253k, depending on what kind of judgment you're being paid to apply. This is real work, and a lot of it is good work. It's also work where you don't own the field, you don't own the crop, and you can be replaced by the next round of model improvements that automate your judgment, too.
I'm watching this play out in my own businesses in real time. At Rocket Media, the writers who are thriving aren't the ones competing with AI on volume. They're the ones who own a perspective, a voice, a body of original thinking the model can't fake. At Digital Ignitor, the strategy work that holds value is the part that requires synthesizing across a client's actual situation, not the part that the model can produce from a generic prompt. At Modern Moments, the value is in the relationship and the experience, not in any artifact AI could generate.
In each case, the work that lasts is the work where I own something. A relationship. An asset. A point of view. A body of evidence from real implementation. The work that's getting commoditized is the work where I'm just producing more of what already exists in abundance.
The pattern is the same one that's playing out at the platform level, just smaller. And the question it's asking is the same one.
The Question Worth Asking
I'm not writing this to tell you which position to aim for. I genuinely don't know what the right answer is, partly because I think the answer depends on what you're already good at, what you have leverage over, and what stage of life and career you're in.
What I am writing this to ask is whether you've thought about which position you're currently in.
Most people I talk to about AI haven't framed it this way. They're either anxious about getting displaced (a real concern, not a phantom one), or they're optimizing how to use the tools (also reasonable), or they're trying to figure out a "new skill" to learn that will keep them safe (the framing the 1900s answered with "go to high school," which worked for one transition and may not work for this one).
The frame that's been most useful for me is this: I'm already a content farmer. I have been for years. Most of you reading this are, too. The question isn't whether to enter the field. The question is what kind of farmer you're going to be on it.
Do you own the land you're working on, or are you working someone else's? Are you growing a crop the market specifically wants from you, or are you producing a commodity? Are you building toward leverage, or are you renting out your judgment by the hour? Is what you're producing going to be more valuable in five years because you produced it, or less valuable because anyone with a model can replicate the surface of it?
These aren't new questions. Every farmer in 1880 was asking them, with different vocabulary. Every craftsman in 1920 was asking them. Every clerical worker in 1985 was asking them. The categories shift. The question doesn't.
What I'm Doing About It
I want to be honest about where I've landed, because I think the worst thing a piece like this can do is name a problem and offer no skin in the game.
Here's what I've changed in the last six months, knowing I might be wrong about half of it.
I'm publishing more in places I own. bensaibrain.com is the first field I work in. Social platforms are distribution, not a home base. If I disappear from LinkedIn tomorrow, I want the body of work to still exist somewhere I control. That's a landowner move, even on a small scale.
I'm writing about things I've actually tested, not things I've researched. The model can do the research. The model cannot have spent eight months building voice-first lead-capture systems across three businesses and then watched what broke. That experiential layer is the crop the model can't grow on its own, and I'm leaning into it harder.
I'm letting AI handle the work I used to consider core, and reorganizing my time around the tasks the model can't do. First drafts, research synthesis, format conversion, those are tractor work now. The work is in the framing, the synthesis, the choices about what's worth saying, the relationships the writing serves. That's harder to defend at a budget meeting, like I said in the last piece. It's also where I think the durable value is.
I'm watching the licensing economy carefully. Not because I think most individual creators will get paid (the math doesn't work, per Sag), but because the infrastructure being built around it will determine what content "counts" in five years. RSL, TollBit, ProRata, Cloudflare's pay-per-crawl, those are the equivalent of the early agricultural commodity exchanges. They're going to shape who gets paid and who doesn't.
And I'm trying not to pretend I know how this ends.
The last article ended with two questions I taped to the inside of my notebook. I'm going to add a third.
What is the next thing I am going to stop counting as work?
Who decides?
And what am I planting today that I'll still own when the field changes?
The dairy farm in Minnesota didn't survive because the family worked harder than everyone else. It survived because they kept asking what kind of farm it would be, decade after decade, while the world reorganized itself around new ones.
I'm not sure content farming is the right name for what we're becoming. I'm sure we're becoming something. And I'd rather wonder about it out loud, with you, than wait to be told.
If you're trying to think through what your own portfolio of work looks like on the other side of this transition, that is the conversation I'm having every day over at Digital Ignitor. Come find me.
Sources
Tom's Hardware, Tech industry lays off nearly 80,000 employees in Q1 2026
Tech Insider, Tech Layoffs 2026: How AI Is Driving the Biggest Workforce Shift
Will Scott, How AI Licensing Deals Determine Search Visibility in 2025
ProMarket, The False Hope of Content Licensing at Internet Scale (Matthew Sag)
Parseur, Future of Human-in-the-Loop AI 2026 (PwC Global AI Jobs Barometer)
Created with ❤️ by humans + AI assistance 🤖