Eliezer Yudkowsky wrote an intriguing short story in which three civilizations with conflicting values meet by accident for the first time and face the dilemma of what to do with one another.

If you are interested in reading the story, now would be a good time to scoot over and do that before I spoil it for you.

Eliezer skilfully presents a sandwich situation. Humans run into an inferior race, the Baby-Eaters, whose most cherished activity is so genocidally awful that, from the humans' point of view, it demands intervention - just when a third civilization arrives, the Maximum Fun-Fun Ultra Super Happy People, who are the most powerful of the three and who find both the Baby-Eaters' and the humans' ways morally unacceptable. A dilemma arises: humans can accept genetic reprogramming to make them compatible with the Super-Happies, relinquishing the ability to suffer, while the Super-Happies adjust themselves to be compatible with humans in turn; or the humans can escape this "awful fate", but at great cost - by blowing up a solar system and sacrificing billions of people.

For some reason, Eliezer, and apparently most of his readers, seem to think it self-evident that humanity ought to preserve its nature and sacrifice those several billion people rather than merge with the Super-Happies.

Eliezer's plot is further biased by the conspicuous absence of any human character who even considers that merging with the Super-Happies might be a desirable outcome, even if it were not forced.

Furthermore, when humans face the prospect of undergoing genetic reprogramming to become more like the Super-Happies - incapable of pain and suffering, primarily - Eliezer's plot has people commit mass suicide, with something like half the human population offing themselves to avoid this "awful, awful" fate.

What on Earth?

Given Eliezer's mighty (and admirable!) plans to build an AI that solves all of the world's problems; given the importance he places on correctly infusing that AI with the proper values, an importance of which I am also convinced; and given his disregard for superficial niceties, in which I also agree with him;

given all this, I must honestly state that Eliezer has gone a little bit nuts, assuming that the values he considers intuitive are more universal, and shared by a far greater majority of people, than they in fact are.

The very fact that his story includes no character among the crew of the Impossible Possible World who argues in favor of joining the Super-Happies... and yet here I am... shows that he is likely assuming too much universality for his preference. Which, apparently, is to kill off billions of people in the fictional scenario so that the rest can continue "being human".

I have been in a state where I felt the sentiment "I do not want to change because then I will be not-me, I will be different". That was crap! I was miserable. It was hard to give up the miserable part, and to become a happy person. But I did, and guess what, now I don't miss being miserable!

Eliezer does not state the real reason why joining the Super-Happies might not be preferable. It would not be preferable to become Super-Happy if being Super-Happy were not functional. If, for example, the Super-Happies regularly wandered into self-destruction because they could not feel pain while doing harmful things, that would be a strong argument against becoming Super-Happies.

But instead, Eliezer portrays the Super-Happies as a highly functional, superior civilization. Given that, I see no reason humans should not agree to change themselves and join them - other than reasons that are self-destructive and vacuous.

Wanting to "remain human" is much like nationalism. It is speciesism, and will be similarly harmful.

Will conversion into a Super-Happy change your experience fundamentally? Yes. Will it cause you to stop experiencing? No. Will it vastly improve the quality of your experience? Yes. Then what's the holdup?

An even more annoying shortcoming of Three Worlds Collide is that it assumes all three civilizations urgently want to impose their values on the others. The Baby-Eaters want the humans to start eating their babies too; the humans want the Baby-Eaters to stop eating their babies; and the Super-Happies want everyone to be super happy. That just seems like naive bollocks to me. Not giving a damn about what nasty things other creatures do, as long as they keep it to themselves, is what allows the world to go round. A civilization that wants to impose its values on all existing creatures cannot be consistent in this unless it spreads throughout the entire Universe and does everything it can to find all perceived victims and force their aggressors to refrain. A civilization that wants to enforce its values universally like that would be like the jihadis.