Immortality would not necessarily eliminate the need for morality. There is no logical reason that morals would become futile for an immortal. Why can an immortal volitional robot not choose clean air because breathing it, while perhaps no longer necessary, feels better than traveling to the moon without a space suit? Maybe reminiscing on a yacht by the beach on a sunny day, paid for with money earned providing medical care to mortals or teaching philosophy, is better than hanging out in a trailer watching black-and-white TV. Just because death, in the conventional sense, has been removed from the situation does not mean that life and the pursuit of happiness lose meaning.

Will there not come a time, say 1,000 years from now, when humanity's children will have the technology to become impervious to disease, to pressure and temperature extremes, and to aging? Keep in mind that here in the early 21st century, more than a couple of people have been walking around with artificial hearts for decades now. Can those with arbitrarily extended life spans not live as humans? Would they no longer need a morality to be happy, to live better, and to be good rather than evil, unproductive, and unhappy? And I see no significant difference in the need for morality whether the sentience in question is carbon-based, silicon-based, or anything else.