in thread "Nines or zeroes of strong rationality?": This matches my intuition - Eliezerian alignment was always an incoherent goal; no agents or objects have perfectly coherent boundaries, and making something intelligent provably aim at a single well-defined end state is probably at best a recipe for madness in...