anon_542 said in #3046 2w ago:
Sometimes you believe two things but don't know how to think about them at the same time. Very few people could think well about how AGI relates to companies before 2015ish. Similarly, very few people could think well about how AGI relates to governments before ChatGPT. I don't mean "have clever or correct thoughts", I mean "think about the intersection of these two things at all"—they were essentially in separate magisteria.
I currently have no idea how to think about the intersection of AGI and civilizational decline, and demographic shifts in particular. Surely the west collectively committing suicide via immigration (and East Asian countries losing the will to reproduce, and African populations booming) is *somehow* relevant for thinking about what the path to AGI going well looks like. But I look at these two things and my brain just puts them in two separate magisteria—the timeframes involved are too different.
One guess is that AGI makes the "exit" option far more feasible as a way to escape civilizational decline, since a small population of highly talented humans using AGI well could be competitive with a much larger population of less talented humans using AGI badly. But the whole point of AGI is that you don't need to be very talented to use it well. If AGIs end up shackled by bureaucracy within major powers in the same way that humans are, then maybe you still want to escape? But this assumes AGIs that are aligned well enough to keep enforcing bureaucracy, yet not well enough to somehow fix the situation, which seems like a narrow path.
The other obvious thought is that AGI concentrates power in a way that could be used to fix civilizational decline. But this is a pretty rough possibility to aim for, due, among other things, to the extreme scarcity of people virtuous enough that such a concentration of power would be desirable.
referenced by: >>3049