the ai will break you
I've spent the last few days obsessed with this story of a man live-streaming his own psychotic break as he leads the police on a 110mph chase with his five children in the back seat. (The link is to an Internet Archive mirror of the page, because the New Hampshire Union Leader is one of those weird US publications that are terrified of GDPR. Ah well.)
The tale itself is terrifying enough:
A Massachusetts man arrested after leading police on a chase with his five children in the vehicle live-streamed some of the incident on Facebook before allegedly ramming a cruiser and crashing into a tree in North Hampton.
“We don’t want to die,” one of his daughters screamed at one point as she pleaded with him to stop during Thursday’s frightening ordeal.
Police had received a report that a woman was thrown from the vehicle in Massachusetts, which prompted the chase, but Rockingham County Sheriff’s Chief Deputy Al Brackett said they later learned that the woman was [Alpalus] Slyman’s wife and that it appeared she jumped out while it was moving because she was concerned about the way he was acting.
The details in this Twitter thread, which, as is increasingly common, traces the extremely public nature of one man's descent into psychosis – first slowly, then quickly – are more alarming still. Marc-André Argentino does a sterling job of piecing together that history.
As far back as 2011, Slyman was a conspiracy theorist, with a line in anti-vaxx, 9/11 trutherism and illuminati theories.
And over the years, he accumulated a veritable grab-bag of other delusions. This post, a picture from last year of a list of everything he believes, reads like a bad piece of diegetic storytelling from a middling walking simulator.
But things really went off the rails on June 6th – less than a week before he got in his car and nearly killed his family. That's when, according to Argentino's analysis, Slyman was first introduced to QAnon.
A brief précis for those not versed in this particular branch of internet insanity: QAnon is, narrowly construed, a conspiracy theory holding that there is a great conspiracy to keep Donald Trump from ridding the world of the illuminati-esque cabal of paedophiles and murderers who currently pull the strings as part of the Deep State.
Broadly construed, I think it would be fair to call QAnon the first religion of the third millennium. It has its prophet – the pseudonymous Q, someone who has spent the last couple of years posting cryptic predictions on the bottom-feeding image board 8chan. Q's predictions have never come true, of course, but each failed prognostication only increases the zeal of the converts.
I would be willing to bet that in thirty years' time, long after Donald Trump becomes the second most famous person in history to die on the toilet, people will be hawking books of Q's posts on street corners, reading new meaning into them.
So: Slyman, Argentino believes, first watched a video from the QAnon community on June 6th. Then, "very likely he was red-pilled into QAnon in the early hours of June 8 when he binge watched 'Fall of the Cabal' until 4am," Argentino writes. From there, he descends further, latching on to one particularly niche theory that Hillary Clinton skinned and ate a child on camera for the illicit high gained from consuming a young person's adrenaline.
Five days after he watches his first Q video, he is live-streaming his belief that the local radio station is sending him coded messages from Q. Later that day, the song You Spin Me Round by Dead Or Alive convinces him the Deep State is coming to kill him, and he gets in the car with his wife and kids and begins his drive.
The question that matters, besides "how much worse could this have all been", is this: was Slyman radicalised by QAnon, or was QAnon just the closest thing available at the moment a man who had been hovering on the edge of sanity finally tipped over?
There is no doubt that people have been radicalised by the internet, and by this particularly horrible corner of it. There are just too many cases like Slyman's, where we can see, in the pattern of YouTube likes, Facebook groups and Twitter follows, someone entering the funnel at one end – watching Jordan Peterson videos, or listening to the Joe Rogan Experience – and then, six months or a year later, fully "red-pilled", accusing Hillary Clinton of child murder or calling for a second civil war in the US.
(One particularly curious thing, as a Brit, is that this is also the radicalisation journey of much of the UK far right. God knows we have our own pathways too – with Tommy Robinson and Katie Hopkins playing major parts – but the number of Trump t-shirts and MAGA hats at British fascist gatherings is wild.)
But in this case, six days just feels too quick for the normal radicalisation narrative to fit. This was someone who was pushed over the edge; if, instead of finding QAnon, he'd been accosted by Hare Krishnas at an airport, or had a door-to-door missionary visit, the results would have been far less violent, but possibly just as transformative to his life.
The reason I can't stop thinking about the case, though, is because of another aspect to the question. What is the role of algorithmic curation in this?
I don't mean in general, though the negative consequences of optimising for engagement are well-documented. If you preferentially promote things that are likely to garner comments, likes, and responses, you are likely to push discourse to extremes.
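To make the mechanism concrete, here is a toy sketch of engagement-based ranking – not any platform's actual algorithm, and the posts, weights, and numbers are all invented for illustration. The point is only that a feed sorted purely on predicted engagement will reliably float inflammatory content to the top, because it provokes the most comments and shares.

```python
# Toy model of an engagement-ranked feed. All posts, weights and
# engagement counts are invented; this is illustrative, not any
# real platform's ranking system.

def engagement_score(post):
    # Weight comments and shares above likes, since they tend to
    # generate further activity (replies, re-shares, arguments).
    return post["likes"] + 3 * post["comments"] + 5 * post["shares"]

posts = [
    {"title": "Local bake sale raises funds",  "likes": 120, "comments": 4,  "shares": 2},
    {"title": "Measured take on a policy",     "likes": 80,  "comments": 15, "shares": 5},
    {"title": "OUTRAGEOUS conspiracy claim",   "likes": 60,  "comments": 90, "shares": 40},
]

# Rank the feed purely by predicted engagement, highest first.
feed = sorted(posts, key=engagement_score, reverse=True)

for post in feed:
    print(post["title"], engagement_score(post))
```

Even though the conspiracy post has the fewest likes, its comment and share counts put it at the top of the feed – which is the whole problem in miniature.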
I mean in particular. I mean: if YouTube's recommendation algorithm had learned to recognise the signs of someone on the edge of a psychotic break, and had learned that if you show them a lot of QAnon videos at that stage in their life engagement goes through the roof, what would be different from the tale we've just heard?
"A pair of very powerful AIs are regularly driving people to insanity, death and murder" is a pitch for a sci-fi short story but it is also a very literal description of the world today.
I don't think Facebook's algorithms are deliberately breaking people – in part because I don't think Facebook's algorithms are actually that smart. When it comes to ad targeting, the company would absolutely try to profile users with the level of granularity required to find people on the edge of a breakdown, and I honestly believe that if it were palatable to sell adverts targeted at that group, the company would already have that product available.
But I'm not sure they aren't doing it accidentally. Because fundamentally, what Facebook's algorithms saw was a man watch a couple of videos and then post more than 16 hours of livestreams in five days. That is phenomenal engagement. What's not to like?
Other than, y'know. Everything.
If you like this email, please pass it on to a friend. If you are the friend to whom this was passed on, why not subscribe here?