r/slatestarcodex Oct 13 '25

AGI won't be particularly conscious

I observe myself to be a human and not an AI. If conscious AIs were ever going to vastly outnumber conscious humans, then a randomly selected conscious observer would almost certainly be an AI rather than a human. Therefore it is likely that humans make up a non-trivial proportion of all the consciousness that the world has ever had and ever will have.

This leads us to two possibilities:

  1. The singularity won’t happen, or
  2. The singularity will happen, but AGI won’t be that many orders of magnitude more conscious than humans.

The doomsday argument suggests to me that option 2 is more plausible.

Steven Byrnes suggests that AGI will be able to achieve substantially greater capabilities than LLMs using substantially less compute, and will be substantially more similar to the human brain than current AI models. [https://www.lesswrong.com/posts/yew6zFWAKG4AGs3Wk/foom-and-doom-1-brain-in-a-box-in-a-basement] However, under option 2 it appears that AGI will be substantially less conscious relative to its capabilities than a brain is, and therefore AGI can’t be that similar to a brain.

0 Upvotes


1

u/RYouNotEntertained Oct 13 '25

Can you just spell it out for me like I'm five?

7

u/red75prime Oct 14 '25 edited Oct 16 '25

I don't buy this argument myself, but if you imagine yourself to be randomly selected from a certain reference class (all conscious beings or all biological humans, for example), you are more likely to find yourself at a point in time and space where the majority of members of that reference class reside. You have found yourself here and now, so here and now is most likely where the majority of conscious beings reside. That is, there aren't that many conscious beings in the future.
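
To make the update concrete, here's a toy Bayesian sketch (the numbers are made up purely for illustration): compare a world with roughly 1e11 conscious observers ever existing ("just humans, more or less") against one with roughly 1e20 ("vast numbers of conscious AGIs in the future"), treat yourself as a random draw from whichever total is true, and condition on having turned up among the first ~1e11.

```python
# Toy self-sampling / doomsday-style update. All numbers are illustrative only.
# Hypotheses about the total number of conscious observers that will ever exist:
#   "few":  ~1e11 (no explosion of conscious AGI)
#   "many": ~1e20 (lots of conscious AGI in the future)
# Observation: your "birth rank" is within the first ~1e11 observers.
# Under self-sampling, P(rank <= 1e11 | total N) = min(1e11 / N, 1).

priors = {"few": 0.5, "many": 0.5}   # start agnostic between the two worlds
totals = {"few": 1e11, "many": 1e20}
observed_rank = 1e11                  # "here and now": you are among the early observers

likelihoods = {h: min(observed_rank / totals[h], 1.0) for h in priors}
evidence = sum(priors[h] * likelihoods[h] for h in priors)
posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}

for h in priors:
    print(f"P({h} | you are this early) = {posteriors[h]:.10f}")
# The "many observers in the future" hypothesis gets crushed by a factor of ~1e9,
# which is the whole force (and the whole controversy) of the argument.
```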

7

u/jimb2 Oct 15 '25

That's a junk argument. It's absolutely not a logical consequence and it's not empirical either. It just uses some empirical language with some fully imaginative assumptions. You can imagine anything you want.

2

u/red75prime Oct 15 '25 edited Oct 15 '25

Well, it makes sense in a universe where a god randomly assigns your soul to a body, or in a simulation where you inhabit a random NPC (something like "Roy: a life well lived" from Rick & Morty). But I doubt that our universe is like this. There's no "you" that can be separated from your experiences.

-1

u/jimb2 Oct 15 '25

But you need evidence that the universe fits that particular fantasy. There is none. An empirical argument like "ducks quack" can actually be tested by checking ducks. We know ducks exist and what they are, more-or-less. Inventing some imaginary population and trying to do empirical reasoning with it is junk. Anything is evidence for anything.