If there’s a good path through the woods here, I think it’s going to require a lot of social technology.
Our personal habits, beliefs, and cultural understanding of the web and how we interact with it will need to change. (fwiw, I’m cautiously optimistic. My friend’s teenage daughter already doesn’t believe anything online is necessarily real, and practices things like phone hygiene. So, in the same way we’ve adapted to the pitfalls of limitless calories, maybe there are some cultural adaptations we’ll develop here.)
The incentive structure for web2 platforms is also a big part of this. Ad-based revenue models led to products designed to aggregate attention, which led to social media becoming an exercise in amassing the largest following possible. I’m not sure what kind of institutional changes we’d need to make to head off the chain of bad incentives → dark patterns → crap timeline, but I currently believe there’s a lever there.
More pessimistically, there might be institutional changes to be made taking our broken brains and flooded zones as a given. I’m similarly unsure how we might think about rewiring democratic procedures for the world we’ve wrought, but that might end up being another point of intervention.
The printing press and widespread literacy launched a trillion lies. Who needs an LLM when you can print a pamphlet and distribute copies across the neighborhood? Or, for the cheapskates, tell Aunty Mabel.
There's a confluence of ugly, global moments preceding LLMs that gave the new tools particular potency. Eight years ago, Americans decided that anyone was better than a woman, so they made Trump their chief misinformation overlord. Then there was a pandemic, which saw billions of people acquire, overnight, PhDs so they were qualified to 'do their own research', rather than rely on the fools who spent years studying. Why read footnotes when you can make up your own shit, or embrace the shit that some other idiot made up.
LLMs are kerosene on the fire, not the cause of the fire.
Fab article so all I can add is relatable videos:
'Beyond Utopia' is an excellent documentary about escaping brainwashing - https://www.youtube.com/watch?v=sVmKew4YYSY
Maria Ressa is on Al Jazeera for A.I. - https://www.youtube.com/watch?v=QopoJRt-wH0
My favourite docuseries last year is about shitty people in the USA using humans in the build-up to robocallers - https://www.youtube.com/watch?v=nKLveXWvb2s
Right there with you. Last year I penned a piece about this very subject and its looming implications.
A year later, I’m trying to answer the question of what we can do about it. To my surprise, that answer is showing up as a novel: The Age of Disbelief.
https://open.substack.com/pub/corsonfinnerty/p/chapter-one-timecode?r=ql3e4&utm_medium=ios&utm_campaign=post
I felt that second paragraph so hard.
Surprised to see you here, thought you decamped
I did! On beehiiv now. TheRacket.news