gfish: (Default)
gfish ([personal profile] gfish) wrote 2017-03-13 12:52 pm

A creature of perfect morality

(This exploration was inspired by de Beauvoir's Ethics of Ambiguity, coincidentally reinforced by parts of Palmer's Seven Surrenders, plus some of the style of Kierkegaard's Fear and Trembling. So if it's a little pretentious, well, you now know why.)

How do we value human life?

I'm walking down the street. Two cars crash and burst into flames. I help the driver in the first car, as it is slightly closer to me. While I get the first driver to safety, two people in the other car burn to death.

When on the bus, I sometimes look at the other passengers and simply try to appreciate the human mind. Each is a commonplace miracle, yes, but a miracle nonetheless. Each is a unique collection of experiences and thoughts, never repeated before or since. Each a completely distinct portal into this world. In a very real sense, each is its own universe. Trying to put a value on one is like putting a value on an entire library of lost works.

Human lives are the only thing that we know for certain adds meaning to the universe. How do you assign a value to that which creates value? It's literally priceless, yet we've never found a moral/political/economic system that doesn't demand constant appraisals. How much of the budget should be dedicated to eradicating a specific cancer? How slow are you willing to drive to keep all pedestrians safe?

I'm walking by some train tracks. An out-of-control trolley is rolling towards a switch in the track. On one side of the switch there is a person tied to the tracks. On the other side, there are 10 people. Without looking to see what position the switch is in, I pull out my phone and call 911, trying to save a few seconds of response time. In the background, screams.

It's tricky, working with infinity. The promise of a future paradise has led to some of the worst atrocities in all of human history. It doesn't matter if it is of the heavenly or earthly kind, religious or secular: the danger of paradise is that it assigns an infinite value to begin with. Any number of deaths, any conceivable horror or depravity, can be justified in the here and now if you believe it leads to an infinite reward.

Even mathematically, it took us a long time to learn how to safely use the concept of infinity. Zeno's Paradox was a legitimate problem until infinite sums were conquered, a process that took 2000 years. We're pretty good at it now, calculating with the infinitely large and small, an achievement that has led to wonders. The modern world simply couldn't happen without calculus.
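The infinite-sum resolution can be sketched numerically. A quick illustration (the `partial_sum` helper is my own, purely for demonstration):

```python
# Zeno's dichotomy: before crossing a room you cover 1/2 of it,
# then 1/4, then 1/8, and so on -- infinitely many steps.
# The geometric series 1/2 + 1/4 + 1/8 + ... sums to exactly 1,
# so infinitely many steps add up to a finite distance.
def partial_sum(n_terms):
    """Sum of the first n_terms of 1/2 + 1/4 + 1/8 + ..."""
    return sum(0.5 ** k for k in range(1, n_terms + 1))

for n in (1, 4, 16, 64):
    print(n, partial_sum(n))  # marches toward 1, never past it
```

The partial sums climb toward 1 but never exceed it, which is exactly the behavior that made the paradox dissolve once limits were formalized.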

On my left, a cruise ship is sinking. On my right, a rowboat. I flip a coin to decide which I should help.

It could be that human lives are of infinite value, and we just haven't developed the appropriate analytical tools yet. There could be a rigorous calculus of ethics out there, waiting to be invented.

Can we imagine what that would be like, living with a consistent moral system that really sees all lives as having unbounded value? I don't think it would be recognizably human. The creature you would have to become, who cannot see the difference between a single death and a million -- is it a saint, or a monster?

From my limited, mortal perspective, I think it's best to just keep pushing up the value of human life. Regular, monotonic, finite increments. Let it reach infinity when t itself does. That might be soon enough.

I'm performing CPR on an octogenarian. On the other side of the room, an ICBM launch starts counting down. I wonder if anyone else will arrive in time to deal with it.
maellenkleth: (computer-blargle)

[personal profile] maellenkleth 2017-03-13 08:49 pm (UTC)(link)
These are the sorts of questions that are asked by reliability-programme psychologists. For what it's worth, in **that** setting, 'by the book' is the only allowable answer. :/

And yes, at some point I might abandon the CPR in order to go stop the countdown. But that's training asserting itself.

[identity profile] 2017-03-14 07:39 am (UTC)(link)
Infinities do not have to behave like that -- this 10 lives = 1 life thing -- if you don't want them to. So that's not a knockdown argument against treating human life as of infinite value. Because you could track your moral value as two components, lives plus other.
(10 * infinity + 500) > (1 * infinity + 2000)
(10 * infinity + 500) < (10 * infinity + 2000)
and so on.
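A minimal sketch of that two-component bookkeeping, assuming lives strictly dominate any finite "other" value. Python tuples happen to compare lexicographically, which gives exactly this ordering:

```python
# Track moral value as a (lives, other) pair. Tuple comparison is
# lexicographic: any difference in lives outweighs any finite
# difference in "other", matching the inequalities above.
def moral_value(lives, other):
    return (lives, other)

assert moral_value(10, 500) > moral_value(1, 2000)   # more lives wins outright
assert moral_value(10, 500) < moral_value(10, 2000)  # equal lives: "other" decides
```

No weighting of "other" against lives is ever needed; the first component settles every comparison unless it ties.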

If you're thinking I'm being mathematically improper, I ain't. There are infinities for any purpose. Cardinal infinities behave like you describe, 2 * infinity = infinity (same number of odd integers as there are of all integers). They're a blunt family of infinities. Ordinal infinities are finer, distinguishing the sequence
1 < 2 < ... < infinity < inf + 1 < inf + 2 < ... < 2 * inf < 2 * inf + 1 < ... < inf * inf < ... (there's a lot of room up here)
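The "same number of odd integers as all integers" claim rests on a bijection, which is easy to exhibit (a toy sketch, not anything from the comment):

```python
# Cardinal bluntness: n -> 2n + 1 pairs each non-negative integer
# with a distinct odd number, so the odds and the naturals have the
# same cardinality even though the odds "skip half" of the integers.
def to_odd(n):
    return 2 * n + 1

sample = [to_odd(n) for n in range(8)]
print(sample)  # one distinct odd number per integer
assert len(set(sample)) == len(sample)  # injective on this sample
```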

For most people, the problems with infinitely weighting human life come in how you value it against other things. Is a 1% chance of saving an anonymous life from malaria worth more or less than going to a concert? Is a 1% chance of truncating my own life worth heli-skiing that includes avalanche risk? Is saving a species of beetle worth a risk to human life? Is an expected one day of life extension worth more than any amount of happiness? Must all available resources be devoted to heroic medical technology that extends the process of dying? At some point, everybody says they don't value life like that.
Edited 2017-03-14 07:39 (UTC)

[identity profile] 2017-03-16 04:01 am (UTC)(link)
Cool post. I like it.

Big thoughts. Thoughts are everything!
Big thoughts about the value of valuing thoughts that create value?