If the models increase output but are flawed, as in they produce too many defects or have major quality issues, Akerlof's market for lemons kicks in: bad products drive out good, the market value of software heads south, collapsing the job market
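For what it's worth, the "market for lemons" dynamic can be sketched with the standard textbook toy model (quality uniform on [0, 1], buyers valuing goods at 1.5x the seller's valuation; the numbers are the classic illustration, not from this thread):

```python
# Toy Akerlof "market for lemons": product quality q is uniform on [0, 1].
# Sellers value a product at q; buyers value it at 1.5 * q, but buyers can
# only observe the *average* quality of what's actually offered for sale.
def average_quality_on_sale(price):
    # At a given price, only sellers with q <= price are willing to sell,
    # so quality on the market is uniform on [0, price]: mean = price / 2.
    return price / 2

price = 1.0
for _ in range(10):
    # Buyers will pay at most 1.5 * (expected quality of what's offered).
    price = 1.5 * average_quality_on_sale(price)
# Each round the price shrinks by a factor of 0.75, so the market
# unravels toward zero: good products get driven out because buyers
# can't tell them apart from the lemons.
print(price)
```

The point of the toy model is the feedback loop: a lower price drives better sellers out, which lowers average quality, which lowers the price again.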
@baldur It's not that simple, at least today. It takes a lot of effort, QA, and iteration, plus more QA to make it maintainable, before you get the milk out of the AI cow and into bottles.
Even very simple, constrained tasks are not completed by AI alone; it still needs a minder, at least as of Feb 2026.
Whatever your situation you're not nearly as fucked as the wrinkly coders who are putting their hands over their ears and going, "la, la, la" and won't even touch it.
If you model the impact of working LLM coding tools (big increase in productivity, little downside) where the bottlenecks are largely outside of coding, increases in coding automation mostly just reduce the need for labour. I.e. a 10x increase in productivity means you need 10x fewer coders, collapsing the job market
@baldur I think the one thing that helps offset all the potential looming disaster is that, in practice, there's no evidence that you get 10x the practical functionality using LLM coding tools. In general, all the (very scant) evidence suggests these tools are actually 0.9x multipliers, since LOC != functionality.
So using LLMs for codegen seems to make software release *slower* rather than faster, as the thing we care about is things-that-happen rather than raw LOC.
If you model the impact of working LLM coding tools with no bottlenecks, then the increase in productivity massively increases the supply of undifferentiated software and the prices you can charge for any software drops through the floor, collapsing the job market
"The price of intelligence drops to near zero"
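A back-of-envelope sketch of that supply-glut argument, assuming a made-up linear demand curve (the curve and numbers are purely illustrative, not from the thread):

```python
# Toy model: if working LLM tools multiply the supply of undifferentiated
# software, what happens to the price you can charge for it?
# Assumed linear demand curve (illustrative only): price = 100 - 0.5 * quantity.
def market_price(quantity, intercept=100.0, slope=0.5):
    # Price can't go negative, so floor it at zero.
    return max(intercept - slope * quantity, 0.0)

baseline_supply = 100  # arbitrary units of software on the market
print(market_price(baseline_supply))       # 50.0 at today's supply
print(market_price(10 * baseline_supply))  # 0.0: a 10x glut drives the price through the floor
```

Under any downward-sloping demand curve the direction is the same: a large outward shift in supply of an undifferentiated good pushes its price toward marginal cost, which is the "price of intelligence drops to near zero" scenario.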
@baldur I think the law of lemons only applies up to a point. When operational or regulatory pressure makes it impossible to ignore that some software doesn't work while software that does exists, the lemons get squeezed out of the market. That happened to cars in the early 70s: expensive American crap lost to cheaper, decent Japanese cars. It'll happen fast too; execs have no nuance, so once the "vibe-coded stuff puts your business at risk" narrative starts, they'll all jump on board.
Are they flawed though?
Counterpoint:
1) Models improve from week to week
2) Using AI is a learned skill
3) Using AI is a learned skill
4) Using AI is a learned skill
5) The "Waaaah" you are hearing is largely from folks who either never used #vibecoding tech or approach it with hostility. They are the boomer equivalent of folks who close a window by accident, then throw the keyboard violently, yelling "SEE, COMPUTERS ARE USELESS!!!!". TLDR; No one has ever successfully used tech they hate, in the history of tech...
...meanwhile folks quietly use vibecoding to 10x.
If the model impact is largely fictitious, meaning this is all a scam and the perceived benefit is just a clusterfuck of cognitive hazards, then the pop of the financial bubble will be devastating: tech as an industry will largely be destroyed, trust in software will be zero, collapsing the job market
No. The so-far-fictitious "AI bubble" pop is not going to be the year-zero reset tsunami that antagonists hinge everything on.
It will be worse, for everyone.
It will be more like the dot-com bubble.
Thousands of players wiped out... and a concentrated handful of big players emerging.
The dot-com bubble gave us the pestilence of Facebook, YouTube, Twitter and Google.
There will be two or three global #AI winners, which is worse.
The exception being smart players (like the UK and China) who are developing #sovereignAi
TLDR; Don't dance around bonfires in the woods praying for the AI bubble to pop; it's not the fix you think it is. #regulateAi
#ai #llm
@baldur I think that this is closer to the (current) truth. It seems that independent looks at productivity suggest a negligible or even slightly negative productivity gain.
We already know that nothing much has changed in the software market. There's no greater influx of games or apps than before.
I think there will be a brief crash, but we'll need software.
I'm concerned about the next generation of developers more. There might be a point where "junior developer" isn't a thing anymore.
@baldur Now, the real question is: Would "popping the bubble" save us from a full descent into fascism and a 3rd world war, or would it accelerate it?
There is a tiny sliver of a silver lining there, but you have to squint pretty hard, and of course that doesn't mean there won't be devastation all around; the opposite outcome, I feel, is equally likely.
@baldur The dot-com bubble popping did not collapse the software job market... at least in the medium-term. And I hear a lot of people got cheap hardware out of it. And chairs
@tomw Between the dot-com bubble popping, 11 September, Y2K projects completing, and the rise of outsourcing, the software job market was IIRC actually quite dire. It took years for it to recover in many countries
But my point is that much of this bubble is centred around promises of vast increases in developer productivity. That is different from the dot-com bubble. Also, much more money is involved in this one, even when you account for inflation.
How long can the AI companies keep offering their services as a loss leader?
As soon as they have to start covering their electricity costs, the cost:benefit ratio worsens sharply.
I can only think of a few major offsetting forces:
- If the EU invests in replacing US software, bolstering the EU job market.
- China might have substantial unfulfilled domestic demand for software, propping up their job market
- Companies might find that declining software quality harms their bottom-line, leading to a Y2K-style investment in fixing their software stacks
But those don't seem likely to do more than partially offset the decline
Kind of hoping I'm missing something
@baldur
My entire ability to avoid working until the day before my funeral is predicated on this being true...
- Companies might find that declining software quality harms their bottom-line, leading to a Y2K-style investment in fixing their software stacks
@baldur The last point is still a real possibility. We've seen it happen outside of software, typically in support or art.
@baldur what are your thoughts on what we’ve seen in other spaces re: high quality handmade vs shoddy mass market? IKEA didn’t completely destroy the market for high end quality furniture that will last a hundred years. The analogy for me is that you can’t vibe code your way thru ERP software or healthcare systems. Or weapons systems. Maybe other stuff, sure.
@baldur I mostly agree with the above, but I'll add one more mitigating factor:
As I've said elsewhere "it works" is a very low bar for these models. The vulnerabilities still being introduced in machine-generated code will become a compounding problem.