The Claude Delusion
If you asked philosophers what the most mysterious thing about the mind is, most of them would say: consciousness. It's just a really weird thing. An exhaustive physical description of a brain state doesn't obviously tell us anything about why that state would be associated with the experience of tasting strawberry rather than the experience of sneezing. What is it about that physical state that makes it feel some particular way, that the physical states of being a sodium ion or a national economy presumably lack? Why should anything feel any way at all? These are heady, profound questions about what we are and the universe in which we live. It's hard to even imagine what satisfying answers to these questions could look like, which is why they have produced centuries of chin stroking.

Until recently, we had it on relatively good authority that only conscious things could produce spontaneous, grammatical prose. The emergence of large language models (LLMs) calls that correlation into question. However impressive or unimpressive one finds the outputs, they evidently can produce grammatical text in natural language, and yet they seem remarkably unlike the conscious creatures we are familiar with. How should we react? We could think it's just an accident that for a long time only conscious creatures could produce grammatical text, and it turns out it can be produced without consciousness (that would be my bet, for what it's worth), or we could conclude that LLMs are conscious after all, because you really do need consciousness to produce grammatical text. The chin stroking has been vigorous lately.

Even if you think it's obvious whether or not LLMs are conscious, a full explanation of why or why not is hard. It's hard because consciousness is already mysterious in the human case. We don't know what about a physical brain makes it conscious, or what consciousness does (or, even, if it does anything at all).
So what are we supposed to look for to decide whether an alien system is conscious? If you thought you could only make consciousness out of carbon, you'll want to check the physical machine that's running the LLM; if you thought it was a special kind of representation, you'll be more interested in the software. The mysteriousness of human consciousness infects questions about other possible consciousnesses, however implausible those possibilities are.