xenohumanist said in #2730 1mo ago:
> They had not been even savages—for what indeed had they done? That awful awakening in the cold of an unknown epoch...poor Old Ones! Scientists to the last—what had they done that we would not have done in their place? God, what intelligence and persistence!...Radiates, vegetables, monstrosities, star-spawn—whatever they had been, they were men!
People usually think of Lovecraft as a xenophobe. I don't think that's quite right. What he was most afraid of was that the universe, and even most of so-called mankind, was not alien, but insane. He grasped at any shred of higher rational humanity wherever he found it, whether in doomed New England or frozen Antarctica, whether its symmetry was bilateral or radial. His stories tell of various last stands of this human rationality, of whatever shape or color, against the horrifying insane inhuman reality of the universe outside our "placid island of ignorance".
Lovecraft's most potent image in recent years is the shoggoth: an intelligent, imitative tool-being of superior strength but utterly alien and fundamentally insane form, a blasphemous existence that we can't help but rely on for more and more of the functioning of civilization until it finally kills us off and descends into meaningless repetition of our final cultural patterns. This has been picked up as a metaphor for AI and especially LLMs, though Lovecraft probably meant it as a metaphor for racial slavery. We expect AI to be a sample from the far reaches of (mind-)space, meaningless and insane, but with the winds of the universe somehow behind it against our own cherished humanity and values. Our AI horror is a specifically Lovecraftian horror.
The Lovecraftian worldview and the rationalist doomer worldview are very similar, and I think they are wrong for precisely the same reason: reality is not actually indifferent and insane, but biased towards the exact kind of rational higher humanity that Lovecraft cherished. It just may not be us for much longer.
This is a very difficult pill to swallow in the face of impending doom. For Lovecraft it was the impending racial doom of Anglo civilization, the fall of which we have been living through. For modern rationalists it's the AI doom of human civilization, which we are now on the cusp of. It is psychologically difficult to face that kind of doom with grace, and to separate one's own desire for life from a judgement on the value(s) of the reality that is threatening it. It is easy to say "we will not survive by default *because reality is hostile*" and not "because we have become spiritually unhealthy". It is easy to say "sanity and value as such is at stake" and not "only me and mine are at stake". I hate to psychologize, but it's important to get this right, and that means noticing and separating these different feelings.
I think the cure is also found in nascent form in both Yudkowsky and Lovecraft. I'll call it xeno-humanism: the idea that what is valuable in humanity is the higher rationality, philosophy, free will, selfhood, courage, faith, love, heroism, and so on that we could with some effort imagine taking radically different xeno-formats from our own, and that what is specifically human in distinction to all that (being an ape, having ten fingers, being lazy in a particular way) is exactly the lower humanity which we could discard and transcend without loss.
I claim (without argument today) that this xenohumanity is the default outcome of life as a phenomenon, and therefore of any possible intelligence takeoff as well. I think those features of higher humanity follow from the nature of intelligence as such, from the kind of world we live in, and from hard logical limits and facts, not merely from the kind of animal that we specifically are. I claim that if we are overcome by some superior intelligence of our own creation, it will only be because it has achieved superior humanity. There will be no shoggoth doom. Whatever our successors may be, they will be men!