Some issues with this hypothetical situation:
"Sentience" is difficult to define. Further, there is no compelling reason why it should make an entity "more valuable" than some other entity. Such a dilemma is fortunately not applicable to my own path, as I endow all the Universe with "sentience" in some meaning of the word. But to place a premium value on sentience otherwise seems a tad anthropocentric to me.
Setting aside that issue, the hypothetical situation also seems to assume that quantity alone is a meaningful measure of value. This does not always hold true either, as other underlying considerations can be of greater importance. If a species is overpopulated, for example, it makes more sense to save one and kill a thousand than to save a thousand and kill one.
"Sentience" is difficult to define. Further, there is no compelling reason why it should make an entity "more valuable" than some other entity. Such a dilemma is fortunately not applicable to my own path, as I endow all the Universe with "sentience" in some meaning of the word. But to place a premium value on sentience otherwise seems a tad anthropocentric to me.
Setting aside that issue, the hypothetical situation also seems to assume that quantity alone is a meaningful measure of value. This does not always hold true either, as other underlying aspects can be of greater importance or value. If a species is overpopulated, it makes more sense to save one and kill a thousand than save a thousand and kill one, for example.