in thread "Nines or zeroes of strong rationality?": This matches my intuition - Eliezerian alignment was always an incoherent goal: no agents or objects have perfectly coherent boundaries, and making something intelligent provably aim at a single well-defined end state is probably at best a recipe for madness in...