Daniel Nwaneri
I Did Everything the AI Era Asked. It Still Didn't Pay My Bills.

Gap between output and opportunity

I didn't start because of AI. I was already building when it arrived.

Then AI came and reframed everything. New tools. New possibilities. A new story about what the market would reward. I leaned in — harder than most, because I had more to prove.

I have seven freeCodeCamp tutorials live.

Not drafts. Not unpublished. Live. With real readers, real comments, real engagement.

I built an SEO audit agent from scratch — Python, Browser Use, Playwright, the Claude API — evolved it through three versions, documented every failure. I built a production RAG system with hybrid search, multimodal vision, and a native MCP server, running on Cloudflare for about $5 a month. Not a demo. Not a tutorial project. Something anyone could deploy today. I built a federated knowledge commons. A suite of Claude Code skills because the defaults weren't good enough for how I actually work. I published on DEV.to, contributed to open source, earned an AWS Community Builder badge, maintained a 100% Job Success Score on Upwork.

I did everything the internet said to do.

2026 still hasn't paid me back.


Here's what nobody tells you when they say "build in public":

The AI era didn't democratize opportunity. It democratized output.

Those are not the same thing. Alex Hormozi said it cleaner than I can: "AI doesn't reduce the value of money. It reduces the value of labor. Big difference."

When everyone can publish a tutorial in a day, seven tutorials mean less. When every developer suddenly has a GitHub portfolio, portfolios stop being signal. When AI writes 90% of the job application emails landing in recruiters' inboxes, the remaining 10% gets buried with them.

The bar to produce something "good enough" collapsed. So the market for "genuinely good" collapsed with it.

I didn't know that when I started. I believed the story — build enough, publish enough, the right person notices. That was the theory.

The theory was wrong.


The data backs this up, but I didn't need the data. I felt it.

Entry-level tech hiring dropped 25% year-over-year in 2024. Developer employment for people aged 22–25 is down nearly 20% from its 2022 peak — a gap that Stanford researchers confirmed opened specifically when generative AI arrived, with younger developers losing work while developers over 35 saw employment grow. Developers are sending out 200–300 applications to get one callback.

Not because they're bad developers.

Because companies are figuring out how much of the work AI can absorb before they need to hire again. And while they're figuring it out, they're not responding.

The silence isn't personal. But it lands personally.


I want to be honest about something I haven't seen written anywhere:

The AI era created a content surplus that made human content invisible.

Think about what happened. AI lowered the cost of creating tutorials, blog posts, open source tools, portfolios. So everyone made more of them. Supply exploded. Recruiter attention didn't. The math was always going to end this way — we just didn't want to see it while we were building.

I spent months producing content that AI could have generated in minutes.

That's not an insult to my work. My work is real, tested, honest — I catch fabrications before I publish, I build systems before I write about them. But the market can't tell the difference at the speed it's moving. It doesn't have time to read deeply enough to notice.

So signal and noise look the same from the outside.


I'm not angry at the technology.

I'm angry that I believed the story around it.

The story went: learn the tools early, document everything, build in public, the market rewards signal. The AI era is the great equalizer. Geography doesn't matter. Credentials don't matter. What you build matters.

I'm from Port Harcourt, Nigeria. I believed this story harder than most, because it was the story that said someone like me could compete on a global market through sheer quality of work.

Maybe that was always naive. Maybe the market never actually worked that way and the AI era just made it obvious faster.

But I built real things. I didn't fake the metrics. I didn't cut corners on integrity. And I still have bills I can't pay.


Here's what I think happened — not just to me, but to a whole generation of developers who did everything right:

We optimized for visibility in a market that was optimizing for cheapness.

Think about what's happening on both sides simultaneously. Applicants are using AI to write cover letters at scale. Recruiters are using AI to screen those cover letters at scale. The human beings — the ones with seven live tutorials and a 100% Job Success Score — are somewhere in the middle of a conversation happening entirely between machines. We became the signal that neither side had time to read.

Recruiters aren't searching for seven-tutorial developers on DEV.to. They're using AI tools to screen 500 applications in the time it used to take to read five. The filtering happens before a human sees anything. And the filters weren't built to find people who built honest, production-grade systems and wrote about them carefully.

They were built for keywords.

We were writing essays. They were scanning for tokens.
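To make that asymmetry concrete, here is a toy sketch of what a naive keyword screen looks like. Everything here is hypothetical — the token list, the threshold, and the `passes_screen` function are invented for illustration, not any real ATS vendor's logic — but the shape is the point: the filter counts tokens, and everything else in the essay is invisible to it.

```python
# Hypothetical sketch of a naive keyword screen. Not any real ATS --
# the tokens and threshold are made up for illustration.
REQUIRED_TOKENS = {"python", "kubernetes", "aws", "llm"}

def passes_screen(application_text: str, threshold: int = 3) -> bool:
    """Pass if enough required tokens appear; depth of work counts for nothing."""
    words = set(application_text.lower().split())
    hits = len(REQUIRED_TOKENS & words)
    return hits >= threshold

# A careful essay about building a production retrieval system scores zero
# if it never says the magic words; a bag of buzzwords sails through.
essay = "I built and documented a production retrieval system end to end"
buzzwords = "python aws kubernetes llm experience"

print(passes_screen(essay))      # False
print(passes_screen(buzzwords))  # True
```

The toy exaggerates, but not by much: any filter that reduces an application to token overlap will rank the buzzword bag above the essay every time.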


I don't know what comes next.

I'm not going to pretend I have a reframe ready. I'm not going to tell you to "niche down" or "build an audience" or "the right opportunities are coming." I'm too tired for that and you'd see through it anyway.

What I know is this:

A lot of developers are sitting where I'm sitting right now. Some of them have more tutorials than me. Some have more GitHub stars, more followers, more credentials. And they're also not getting callbacks.

This isn't a skill problem.

It's a market that moved faster than the promise did.


The AI era asked us to learn fast, build fast, publish fast, adapt fast.

We did.

It just didn't tell us that fast was the only thing it valued — and that the moment we got fast enough, it would stop needing us to be fast anymore.

I built real things. I did it honestly.

The bills are still real too.

Top comments (21)

Gilder Miller

Great insights in your piece about AI's practical limitations.
As someone active in multiple tech communities, what advice would you give to developers trying to stay grounded while working in this space?

Daniel Nwaneri

Stay grounded by keeping a problem you actually need to solve. Not a tutorial problem. Not a "what if I built X" problem. A real one — something that's costing you time or money right now. When the problem is real, AI is a tool. When the problem is imaginary, AI becomes the whole project. The distinction sounds obvious until you've shipped six things nobody needed. 😭

Gilder Miller

That distinction is sharp. "When the problem is imaginary, AI becomes the whole project" is honestly one of the cleanest ways I've heard it put.

I've seen that pattern in my work. The moment I stop chasing a real pipeline issue and start building "what if" tools, suddenly I'm three days deep into an AI side quest that ships nothing. Actual problems keep you honest. They force you to treat AI like a wrench, not a hobby.

Appreciate you sharing this perspective, Daniel!

Daniel Nwaneri

"AI like a wrench, not a hobby" — that's the better version of what I said.
The wrench framing cuts sharper because it implies a job. You pick up a wrench when something needs fixing, not when you want to feel productive. Most AI side quests are productivity theatre — the feeling of building without the cost of shipping. Real problems kill that fast. Nothing clarifies scope like a deadline that actually matters.

Gilder Miller

That "wrench vs. hobby" framing hits hard, especially coming from someone who builds production-ready systems. There’s a big difference between tinkering with an LLM in a notebook and actually deploying it where it breaks under load.
Since you’re focused on moving ML from experimentation to scale, how do you see the current Rails ecosystem adapting to that shift? Are you seeing more teams trying to bolt AI onto existing workflows, or is the "wrench"...

Daniel Nwaneri

Quick correction: I'm not working in Rails. My stack is Python, Cloudflare Workers, the Claude API. But the pattern you're describing holds across ecosystems: most teams are bolting AI onto existing workflows because rebuilding from scratch is expensive and risky. The problem is that bolt-on AI inherits the original system's worst assumptions. You end up with an LLM sitting on top of a data pipeline that was never designed to feed it clean inputs, and then you wonder why it hallucinates in production. The wrench only works if the plumbing is sound.

Elmar Chavez

I really wish that one day they would see the negative impacts of AI. I am also an advocate for honest work. Though slow, at least what we build is real.

Brian Hogg

I want to be honest about something I haven't seen written anywhere:

The AI era created a content surplus that made human content invisible.

I suppose it’s easy for us to live in our own filter bubbles, especially when using any kind of algorithmically driven sites, but you didn’t see that written anywhere? Because that’s been a very very large topic people have written about.

Also, the companies making the products aren’t subtle about it. When Google was talking about their integrations of LLMs into products like Gmail (there’s a specific Google I/O keynote I’m thinking about), they said “this will let you turn bullet points into a great big email, so you can send so many more to clients,” and then in the same breath, “and you can use it to turn great big emails into bullet points so you can keep up with all the emails coming in.” It felt like an immediate “we’re building a spam-generating machine” pitch, and that was just for email, never mind the instant switch from “good developers deliberate over 20 lines of code a day at most” to “good developers crank out 10k/day” when hyping up their actual coding products.

(Those are paraphrases, obviously)

The shift absolutely sucks, so I’m not being uncaring, I’m just surprised people haven’t seen the loud arguments against it.

Daniel Nwaneri

Fair pushback. The "I haven't seen it written anywhere" was imprecise. The content surplus argument exists; I've even cited it in other pieces. What I meant was: I hadn't seen it framed from the inside, by someone living it rather than analyzing it. The Google I/O example you're describing is exactly the kind of thing that makes the argument louder in retrospect. At the time it just felt absurd. Now it feels like a design document.

Brian Hogg

I’ve also seen it framed from the inside, but as I say, algorithmic bubbles certainly help both of us get skewed views of how much things are being discussed.

It seemed immediately clear to me that it was a design document, as did the inevitable shift from “we can do more, quicker” (as though the tech were emancipatory) to “we have to do more to keep up,” which is now being bemoaned by more and more devs who feel burnt out from spending entire days just reading a small fraction of the code the LLMs pump out on their behalf, losing comprehension of and connection to their work.

On the plus side, the steep price increases have also seemed obvious for a while, as does the fact that we’re just at the beginning of that trend. People are already starting to find it as expensive as just doing the work, or more, without the real, promised productivity gains. So maybe this will balance out, and using LLMs for the entire job won’t be expected, either because it costs too much, or because the big players collapse into insolvency and what’s left are some developers running local models slow enough that the idea of giving all the work to them also doesn’t make much sense. So there is, perhaps, an optimistic vision. :)

Daniel Nwaneri

The burnout pattern you're describing (devs spending their days reading code they didn't write and don't fully understand) is its own kind of knowledge collapse. Faster output, thinner comprehension. That's not productivity, that's technical debt with a better UI.

The optimistic scenario is real though. Price increases as a corrective mechanism, local models too slow to replace the whole job: both of those restore friction, and friction is what kept the work meaningful. The irony is the thing that saves developers might just be that the economics never actually worked.

Brian Hogg

Yeah, it’s an enjoyable irony.

I think of all the sci-fi that takes place after humans have banned AI, and the reasons are always “there was a war, it almost destroyed us” but it would be funny if, say, the reason AI was actually banned in Dune was because it was too expensive and tanked the economy.

Stjepan

Hi Dan, I don't have anything useful to say, but I see you and I wish you good luck!

Yves Jutard

Sad to hear that :(

Andrii Krugliak

The output-versus-opportunity distinction is the part most build-in-public threads skip, and you put a finger right on it. Output got cheap, signal got cheaper, and nobody updated the playbook to match. I shipped six freelance projects last year; four ghosted before payment cleared, and the other two went to clients who admitted they were running parallel bids to feed an AI in-house. "Publish more, contribute more, build louder" all assume a market where output was the bottleneck. Output stopped being the bottleneck about eighteen months ago. Distribution and trust did not get more abundant in lockstep; they got scarcer per unit of output, and that is the part that still has not been priced in.

Daniel Nwaneri

Four ghosted and two used as raw material — that's not a bad luck streak, that's a client acquisition model. You were never the hire. You were the dataset.
"Distribution and trust didn't get more abundant in lockstep" is the part I didn't have language for. Output was always just the ticket to the game. The game was always trust. And trust doesn't scale — it compounds slowly, breaks fast, and can't be automated without immediately becoming worthless. Which is why the playbook that said "publish more" was always optimizing the wrong variable. Nobody updated it because updating it means admitting the game changed in a way that doesn't have a new playbook yet...

Andrii Krugliak

The "you were the dataset" line is the one I keep coming back to. Hadn't framed it that way but it's correct — every brief I sent was training data for someone else's pipeline. The asymmetry was hidden because the email said "freelancer."

And you're right about the playbook. "Publish more" was load-bearing for ten years. Dropping it means admitting the moat moved, which means admitting most of the moves you made to build the moat are now worth less than you thought. That's a hard ledger to open in public.

What I keep noticing: the people who seem least affected aren't producing more, they're producing for fewer specific humans whose trust they've already bought. Compounds slowly, like you said. The new game looks a lot like the oldest game.

PEACEBINFLOW

The line that keeps echoing is "we optimized for visibility in a market that was optimizing for cheapness." That’s the whole thing, right there. You and the market were playing different games and calling them by the same name.

What I think you’ve put your finger on—maybe without naming it directly—is that building in public stopped being a strategy and became a tax. It used to be that publishing tutorials, sharing code, contributing to open source was a way to stand out because most people didn’t do it. Now not doing it feels like a liability, so everyone does it, and doing it just keeps you at the baseline. You’re running to stay in place.

And the really cruel part is that the people whose attention we’re trying to catch—recruiters, hiring managers—are drowning in the same flood. They’re not ignoring seven tutorials out of malice. They’re ignoring seven tutorials because they have seven hundred applicants, and the tools they’re using to triage that pile weren’t designed to detect "this person actually built the thing versus wrote about building the thing." They’re designed to reduce the pile to something a human can look at before lunch.

I don’t have a solution either. But I wonder if the next move isn’t to build more, or better, but to build in places where the flood hasn’t reached yet. Not platforms. Not content streams. Smaller, slower channels where signal still travels at human speed. The problem is those places don’t scale, which is exactly why the market doesn’t value them—until suddenly it does, because it has no choice. You mentioned being from Port Harcourt. I’m curious whether you’ve seen more traction from local or regional networks that operate on a different logic than the global content marketplace, or if the same dynamic has reached there too.

Daniel Nwaneri

"Building in public became a tax." That's the reframe I was circling but didn't land. You named it cleaner.
On Port Harcourt: the honest answer is the same flood has reached here, just with worse infrastructure underneath it. The global content marketplace arrived before the local economy was ready to absorb it, so you get the competition without the opportunity density. Regional networks exist but they're relationship-based, slow, and don't produce the kind of paper trail that remote hiring requires. Which is its own trap.

The "smaller, slower channels" instinct is right. The problem is you can't pay rent on a channel that doesn't scale yet. So you keep one foot in the flood while trying to build something quieter with the other hand. Most people can't hold that position long enough for the quiet thing to matter.

Laura Ashaley

A reminder that AI tools alone don’t guarantee income—real outcomes still depend on skills, positioning, and execution.

Daniel Nwaneri

The article isn't arguing AI tools guarantee income — it's arguing that skills, positioning, and execution stopped being enough.