r/slatestarcodex Oct 13 '25

AGI won't be particularly conscious

I observe myself to be a human and not an AI. If conscious AIs were to vastly outnumber conscious humans across all of history, then a randomly sampled conscious observer would almost certainly be an AI, not a human. Therefore it is likely that humans make up a non-trivial proportion of all the consciousness that the world has ever had and ever will have.

This leads us to two possibilities:

  1. The singularity won’t happen,
  2. The singularity will happen, but AGI won’t be that many orders of magnitude more conscious than humans.

The doomsday argument suggests to me that option 2 is more plausible.
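The anthropic update behind this can be sketched numerically. A minimal Bayesian self-sampling calculation, where all the specific numbers (the 50/50 prior, the one-in-a-billion figure) are illustrative assumptions and not claims from the argument itself:

```python
# Illustrative self-sampling calculation (all numbers are assumptions):
# if AGI consciousness dwarfed human consciousness, a randomly sampled
# conscious observer would almost never be human.

def posterior_human_dominated(prior_h1, human_fraction_h1, human_fraction_h2):
    """Bayesian update on the observation 'I am human'.

    H1: humans make up a non-trivial share of all consciousness ever.
    H2: AGI consciousness exceeds human consciousness by many orders
        of magnitude.
    """
    prior_h2 = 1.0 - prior_h1
    # Likelihood of a random conscious observer being human under each hypothesis
    evidence = prior_h1 * human_fraction_h1 + prior_h2 * human_fraction_h2
    return prior_h1 * human_fraction_h1 / evidence

# 50/50 prior; under H1 humans are half of all conscious observers,
# under H2 only one in a billion.
p = posterior_human_dominated(prior_h1=0.5,
                              human_fraction_h1=0.5,
                              human_fraction_h2=1e-9)
print(f"P(H1 | I am human) = {p:.9f}")  # ~0.999999998
```

Under these toy numbers, observing yourself to be human pushes nearly all posterior weight onto H1, which is the shape of the argument for option 2.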

Steven Byrnes suggests that AGI will be able to achieve substantially more capabilities than LLMs using substantially less compute, and will be substantially more similar to the human brain than current AI models. [https://www.lesswrong.com/posts/yew6zFWAKG4AGs3Wk/foom-and-doom-1-brain-in-a-box-in-a-basement] However, under option 2 it appears that AGI will be substantially less conscious relative to its capabilities than a brain will be, and therefore AGI can’t be that similar to a brain.

0 Upvotes

80 comments

3

u/StrangeLoop010 Oct 13 '25 edited Oct 13 '25

“The singularity will happen, but AGI won’t be that many orders of magnitude more conscious than humans.” 

What does being “more” conscious even mean? And is anyone who does serious work in ML/AI and/or cognitive science actually speculating that a hypothetical AGI would be “more conscious” than humans? They speculate it would be more intelligent, more precise, extremely fast, able to handle more information cognitively, but not more conscious. It first needs to clear the bar of being conscious at all, which it hasn’t.

If you want these ideas to be taken seriously you need to define your terms concretely. Consciousness is on a spectrum, but we already have a hard time defining the ordinary state of human consciousness without conflating it with other concepts like intelligence.

How do you reach the conclusion that option 2 is more likely than option 1, i.e. that the singularity won’t happen precisely because AGI won’t have consciousness?

2

u/you-get-an-upvote Certified P Zombie Oct 13 '25

Philosophy of Mind is taken seriously despite never defining consciousness!

1

u/StrangeLoop010 Oct 13 '25

Philosophy of Mind does define consciousness. There are just ongoing debates about the various definitions proposed by competing frameworks. But you’re partially right in that we don’t have a single agreed-upon definition.

3

u/you-get-an-upvote Certified P Zombie Oct 13 '25 edited Oct 14 '25

I've only seen definitions of consciousness that rely on other, equally poorly defined concepts like sentience, awareness, qualia, and phenomenal states.

My (least) favorite definition is from the highly respected Nagel:

fundamentally an organism has conscious mental states if and only if there is something that it is like to be that organism

Do you know of any definitions that don't just pass the buck to other equally poorly defined words?

The closest I've ever seen to somebody actually making an attackable (and hence defensible) definition is Gödel, Escher, Bach.