AIdolatry

Navneet Alang’s “No god in the machine: the pitfalls of AI worship” is the sort of long, meandering read that I enjoy.

Before I quote some of the points that resonated most with me, the larger context is that the decline of the humanities, philosophy, and religious studies has made our society ripe for the rise of cheap pseudo-religions and messianic movements. Without a deeper vocabulary drawn from the humanities, most critics of the AI movement are left with primitive one-liners screaming that AI is racist or whatever.

Alang makes this point about the larger ideological context:

The idea of an exponentially greater intelligence, so favoured by big tech, is a strange sort of fantasy that abstracts out intelligence into a kind of superpower that can only ever increase. In this view, problem-solving is like a capacity on a dial that can simply be turned up and up. To assume this is what’s called “tech solutionism”, a term coined a decade ago by the writer Evgeny Morozov. He was among the first to point to how Silicon Valley tended to see tech as the answer to everything.

The idea that there is a worldly, technological solution to everything is a radical philosophical position, one that many have adopted unquestioningly.

Another assumption deeply held by the AI crowd is a strict mind-body dualism, one that traditional religions have debated, and mostly rejected, for thousands of years:

So much of what produces will and desire is located in the body, not just in the obvious sense of erotic desire but the more complex relation between an interior subjectivity, our unconscious, and how we move as a body through the world, processing information and reacting to it. Zebrowski suggests there is a case to be made that “the body matters for how we can think and why we think and what we think about”. She adds, “It’s not like you can just take a computer program and stick it in the head of a robot and have an embodied thing.”

And the conclusion:

When the systems that give shape to things start to fade or come into doubt, as has happened to religion, liberalism, democracy and more, one is left looking for a new God. There is something particularly poignant about the desire to ask ChatGPT to tell us something about a world in which it can occasionally feel like nothing is true. To humans awash with a sea of subjectivity, AI represents the transcendent thing: the impossibly logical mind that can tell us the truth.

Lingering at the edges of Clarke’s short story about the Tibetan monks [The Nine Billion Names of God] was a similar sense of technology as the thing that lets us exceed our mere mortal constraints. But the result is the end of everything. In turning to technology to make a deeply spiritual, manual, painstaking task more efficient, Clarke’s characters end up erasing the very act of faith that sustained their journey toward transcendence. But here in the real world, perhaps meeting God isn’t the aim. It’s the torture and the ecstasy of the attempt to do so. Artificial intelligence may keep growing in scope, power and capability, but the assumptions underlying our faith in it – that, so to speak, it might bring us closer to God – may only lead us further away.