sofiechan home

AGI and demographics

anon_542 said in #3046 2w ago:

Sometimes you believe two things but don't know how to think about them at the same time. Very few people could think well about how AGI relates to companies before 2015ish. Similarly, very few people could think well about how AGI relates to governments before ChatGPT. I don't mean "have clever or correct thoughts", I mean "think about the intersection of these two things at all"—they were essentially in separate magisteria.

I currently have no idea how to think about the intersection of AGI and civilizational decline, and demographic shifts in particular. Surely the west collectively committing suicide via immigration (and East Asian countries losing the will to reproduce, and African populations booming) is *somehow* relevant for thinking about what the path to AGI going well looks like. But I look at these two things and my brain just puts them in two separate magisteria—the timeframes involved are too different.

One guess is that AGI makes the "exit" option far more feasible as a way to escape civilizational decline, since a small population of highly talented humans using AGI well could be competitive with a much larger population of less talented humans using AGI badly. But the whole point of AGI is that you don't need to be very talented to use it well. If AGIs end up shackled by bureaucracy within major powers the same way that humans are, then maybe you still want to escape. But this assumes AGIs that are aligned well enough to keep enforcing bureaucracy, yet not well enough to somehow fix the situation—a seemingly narrow path.

The other obvious thought is that AGI concentrates power in a way that can be used to fix civilizational decline. But this is a pretty rough possibility to aim for, due, amongst other things, to the extreme lack of people virtuous enough that such concentration would be desirable.

referenced by: >>3049


anon_544 said in #3049 2w ago:

>>3046
Great question OP. This one has puzzled me for many years, as I have been convinced both of the near- to medium-term possibility and revolutionary impact of AGI, and of the demographic suicide default trajectory. Here's how I factor it:

1. We don't actually know that much about what AGI is going to do or be like, even if it happens. Our whole conceptual framework could be fucked. Therefore it's a bad idea to bet hard on any particular outcome. That said, it looks highly likely that true AGI will basically replace and possibly kill humanity if it works at all. There also doesn't appear to be anything that can be done about this (the alignment people are all coping). So we're left with something existentially important but uncertain and hard to predict, which we can't really do anything about. We could try to put off the AI problem by slowing down progress, but everyone trying to do that is retarded and making it worse. All we can really do is try to make our society more capable of dealing with endogenous shocks and coordination problems, and live well until God decides to wipe us out with unpredictable super-AGI.

2. On the demographic suicide, you could also argue that this stuff is beyond our control because it's caused by civilizational macro-cycle decay blah blah blah, but I don't really buy this Spenglerian doomerism. It's obviously a major problem, but it feels entirely within the realm of possibility to build a political order that chooses to live as a matter of will. That implies a total revolution in philosophy and politics, and the last time someone tried they got crushed and that whole set of ideas has been illegal for 80 years, but the usual trope applies: "if the situation was hopeless, the propaganda would be unnecessary". This is where the life-energy is. Even if it's futile, there are worse things to do than hold on to the standard of will and race and civilization, finding ways to continue the project despite the world crumbling around you. Not to be a doomer though. I think things are going quite well in some ways. We are way further along than we were 10 years ago. Again, live well and struggle for life, but God is ultimately in control.

What I don't see is much interaction between these problems. More advanced AI tech just accelerates the usual industrial takeoff problem. As long as it's not actually escaping human control, the civilization and race stuff really matters for actually operating all this. If AI would change the situation, industry more or less already has. If AI actually revolutionizes the world to the point that race doesn't matter anymore, it's because we're all dead/obsolete. I don't see how it makes any part of our job easier or harder.

Maybe there are big interactions I'm not seeing, but one reason no one talks about these problems at the same time is they seem to be basically orthogonal in practice. Everyone who tries just fails to say anything usefully true about their interaction.

