I'm worried about AI psychosis. Specifically, I'm worried about the psychosis that makes "capital allocators" spend *$1.4T* on the money-losingest technology in human history, in pursuit of a bizarre fantasy that if we teach the word-guessing program enough words, it will take all the jobs.

--

If you'd like an essay-formatted version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:

pluralistic.net/2026/04/13/alw…

1/

in reply to Cory Doctorow

I have been describing AI as a business cult. CEOs are signing up to spend mountains of money, jettisoning any analytical discipline. I chalk it up to psychosis born of unchecked monopoly. They're all at the Davos circle jerk getting played by the most sociopathic grifters among them.

What could go wrong?

in reply to CaliCarol

It's difficult for CEOs to escape AI when their board or investors are asking how they're going to protect the company from a potentially instant devaluation of their codebase. If the AI bet turns out to work, then a ten-year-old codebase could lose 50% of its value because software can now be built in half the time using AI agents. Not to mention the commodification of software features that would occur once the barrier to entry for building software is lowered.
in reply to Davidson

It doesn't help that investors are now being pitched proofs of concept (POCs) that were built entirely using AI agents, which becomes an incentive to put pressure on existing investments to increase productivity using AI. Of course, what's missing from that picture is what happens after the POC becomes a product, and security and compliance enter the picture.
in reply to Davidson

So we have developers implementing AI because they're being pressured to optimize, CEOs implementing AI because they're pressured to protect their companies, and investors buying into AI because they fear the devaluation of their investments.

Some data seem to indicate that companies believe AI adoption is still in its nascent stage, even though their own bets have paid off only marginally, so it looks like this is going to continue for the foreseeable future.

in reply to Davidson

@davidsonsr @jawarajabbi The problem is similar to that of the 00s and the 10s: slapping "cloud" on a proposition to attract investment, then slapping "blockchain" on everything, and now it's GenAI. The investors involved cannot be relied upon to take a sufficiently well-researched and scientific approach. Like all of us, they rely on mediated experience to reach judgements. But it's musical chairs now, given the level of untruth and misrepresentation involved.
in reply to Bruce Simpson, Ph.D.

@bms48 @jawarajabbi
I'd say that this is different because it's not just a marketing gimmick, it's a variable that promises an unknown amount of cost optimization.

For people who have spent their lives amassing a fortune, the prospect of losing a significant portion of it (or all of it) becomes an incentive to spend a comparatively insignificant amount of that fortune on AI in order to hedge the bet.

in reply to Davidson

@davidsonsr @jawarajabbi There is anecdata suggesting the beginnings of a pushback from professional risk-control people, credit reference agencies, and reinsurers.
in reply to CaliCarol

@jawarajabbi
Leveraging FOMO has always been the key skill of grifters. Never has that been more evident than with the rush to “AI” everything.
in reply to Cory Doctorow

Absolutely this. I'm sure there's a straight line to be drawn from the 2008 financial crisis (and particularly the government's response of austerity) to the 2016 Brexit vote, through to today's horrifying position of Reform potentially coming into power in a couple of years. Eighteen years of declining public services, flatlining wages and visible corruption of parliament (most apparent under Johnson) have left people bitter and angry.
in reply to G-Squirrel

Something similar in Italy.
That, and turning words like "socialism" into political slurs, for some fucking reason.
in reply to G-Squirrel

@gsquirrel
It's so obvious it's under our noses.

Mandelson and Epstein circa 2008

George Osborne, Chancellor from 2010 onwards - photographed partying on Epstein's yacht.

Both main parties captured by this group of paedo billionaires, all agreeing austerity is great.

in reply to Dar

@dar @gsquirrel
"People as things, that’s where it starts."
(Carpe Jugulum)
in reply to Dar

@dar @gsquirrel On 10 May 2010, Mandelson revealed to Epstein the existence of a secret underground tunnel between 10 Downing Street and the Ministry of Defence. printernational.co.uk/timmann2…
in reply to Cory Doctorow

Fascism – what Hannah Arendt called 'organized loneliness' – can only take root when people stop believing that their society will reward their lawfulness with an orderly and humane existence.

Thanks for that Hannah Arendt quote. TIL.

#Fascism #Collapse #Austerity

in reply to Cory Doctorow

This technology needs rules and regulation before being set loose on the public.

But big tech is only seeing the $$$

What could possibly go wrong.

LLMs can be useful, but only in the right hands with the right skills, and those are exactly the skills we will lose if they are used in the present scheme of things.

The WWW has democratized access to information (though it shows the same problem, as it is also an accelerator for nonsense).

LLMs will now take that away. Not what we need!
in reply to xs4me2

@xs4me2 There is a reason why all the tech bros went absolutely bonkers after Trump was elected: they knew perfectly well that he and his government won't regulate anything as long as they lick his boots nicely enough.
in reply to Cory Doctorow

I hear what you're saying and it's broadly right, but I want to push back a little bit. I think you're looking at a symptom and not the disease. If AI undeniably made the life of the masses better, we wouldn't object to it. But that's not what's happening, so what are we objecting to? It's extraction: when the wealthy use their power to acquire wealth instead of generating it.

1. fosstodon.org/@ovid/1163348669…
2. curtispoe.org/projects/extract…

in reply to Cory Doctorow

🧵 Reading your part about the NHS makes me (in France) realise: I'm not worried that any billionaires might have an "AI" psychosis (they could get therapy and medication).
I'm deeply worried about mentally ill people in *real* psychosis who don't get therapy because there aren't enough specialist doctors, support services, or places. Whose families are already being lulled into a false sense of security by the idea that AI-powered counselling and therapy will surely be available soon.

in reply to Cory Doctorow

People are always betting that physics is wrong, but this bet is especially daft and wasteful. It's exhausting.
in reply to Cory Doctorow

It's an arms race right now to stake claims. Another way to think about it might be to consider the character of the people/entities trying to stake the claims. The companies putting in the billions have apparently been able to capitalize on this sort of bet in the past.
in reply to Cory Doctorow

😆

There is such a thing as too much computing power. People are in such a rush nowadays. Well, the greedy ones are, anyway.

As a 60-year observer of this increasingly insane industry I fear they are in the spiral arm of a whirlpool whose efflux will pollute the economy for decades.