· 2 min read

Open Source is closing its doors.

AI didn't break the code — it broke the economics. Hallucinated bug reports, spam pull requests, and a verification tax are overwhelming unpaid maintainers. Projects are closing their gates. If your stack depends on open source, upstream project health is now a board-level concern.


AI didn't break the code. It broke the economics.

Open Source has survived corporate capture, licensing wars, and decades of unpaid labor. It will survive AI too.

But the economics it's built on? Those are already breaking.

In January, the curl project shut down its six-year-old bug bounty program. curl runs on over 20 billion devices. The reason for the shutdown: 20 AI-hallucinated security reports landed in the first 21 days of the year. Every one required manual investigation. Every one was fabricated.

This is not an isolated incident. It is the new normal.

AI coding tools have made it trivially easy to generate a pull request. But verifying that code (confirming it actually works, fits the architecture, and doesn't introduce a vulnerability) is still slow, difficult, and irreducibly human.

The result is an asymmetric flood: frictionless generation on one side, bottlenecked review on the other.

Maintainers are responding the way you would expect. They are closing the gates.

The tldraw project now auto-closes every external pull request. Ghostty requires a trusted contributor to personally vouch for any new developer before their code is even reviewed. Unvouched submissions are rejected automatically.

And some are going further. Last week, Cal.com, an open source scheduling platform backed by $32M in venture funding, announced it is moving its production codebase to closed source. The stated reason: AI tools can now scan open codebases to identify and exploit vulnerabilities faster than maintainers can patch them. When even a funded company concludes that transparency has become a liability, the economics have shifted in ways that volunteer-maintained projects cannot absorb.

For decades, open source operated on trust by default. The barrier to entry was understanding the codebase well enough to contribute something valuable. AI eliminated that barrier overnight, and the trust model broke with it.

The data underneath is sobering. A controlled study by METR found that experienced developers using AI tools on real-world repositories were 19% slower, not faster. The culprit: the "verification tax," the compounding time spent debugging the subtle errors AI introduces.

Meanwhile, 60% of open source maintainers are entirely unpaid. 44% cite burnout as their primary reason for quitting. And 97% of enterprise codebases depend on their work.

The open source ecosystem is not collapsing. But it is reorganizing around trust, identity, and gatekeeping in ways that fundamentally change who gets to participate.

If your organization depends on open source infrastructure (and statistically, it does), this shift deserves attention. How you evaluate upstream project health matters more now than it ever has.

Read next

The AI Rental Trap.

Anthropic pulled Claude Code from its $20 plan and reversed course within days — but the signal matters more than the reversal. Every major AI tool is sold below cost. When subsidies end, every workflow built on rented AI becomes a cost you didn't budget for or a capability you lose overnight.

Kai A. Hartung