Eliezer Yudkowsky, a research fellow over at the Singularity Institute for Artificial Intelligence (see my previous post warning them about security issues), asks a very interesting question over at Overcoming Bias:
What’s the least bad, bad thing that can happen? Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck. […]
3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall. You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you’ve exponentiated 7625597484987 times. That’s 3^^^3. It’s the smallest simple inconceivably huge number I know.
Now here’s the moral dilemma. If neither event is going to happen to you personally, but you still had to choose one or the other:
Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
I think the answer is obvious. How about you?
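Eliezer's ^^ is Knuth's up-arrow notation, where each extra arrow iterates the previous operation. As a quick illustration (the `up_arrow` function name and structure are my own, not anything from the post), here's a minimal Python sketch that verifies the 3^^3 = 7,625,597,484,987 tower height mentioned in the quote:

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow: a ↑^n b. n=1 is plain exponentiation;
    n=2 is tetration (a tower of b copies of a); and so on."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    # a ↑^n b  =  a ↑^(n-1) (a ↑^n (b-1))
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# 3^^3 = 3^(3^3) = 3^27 = 7,625,597,484,987
print(up_arrow(3, 2, 3))  # → 7625597484987

# 3^^^3 = 3^^(3^^3): a tower of 7,625,597,484,987 threes.
# Even the second layer, 3^^4 = 3^7625597484987, is already far
# too large to compute or store, so don't call up_arrow(3, 3, 3).
```

Python's arbitrary-precision integers handle 3^27 trivially, but the recursion makes plain why 3^^^3 is "inconceivably huge": the tower's *height* is itself the result of a smaller tower.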
Below I’m going to reproduce what I wrote in the comments, but I encourage you to head over and read all the comments.
My first comment:
The dust speck is described as "barely enough to make you notice", so no matter how many people it happens to, it seems preferable even to something far less severe than 50 years of horrible torture. There are so many irritating things that a human barely notices in his/her life; what's an extra dust speck?
I think I’d trade the dust specks for even a kick in the groin.
But hey, maybe I’m missing something here…
Posted by: Michael G.R. | October 30, 2007 at 12:12 AM
Here’s the second:
Humans experience barely noticeable "dust speck equivalent" events so often in their lives that the number of people in Eliezer's post is irrelevant; it's simply not going to change their lives, even if it's a gazillion lives, even with a number bigger than Eliezer's. (Even considering the "butterfly effect", you can't say whether the dust speck is going to change them for the better or the worse; with 50 years of torture, you know it's going to be for the worse.)
Subjectively for these people, it’s going to be lost in the static and probably won’t even be remembered a few seconds after the event. Torture won’t be lost in static, and it won’t be forgotten (if survived).
The alternative to torture is so mild and inconsequential, even if applied to a mind-boggling number of people, that it’s almost like asking: Would you rather torture that guy or not?
Posted by: Michael G.R. | October 30, 2007 at 10:42 AM
Hmm, thinking some more about this, I can see another angle (not the suffering angle, but the “being prudent about unintended consequences” angle):
If you had the choice between changing the lives of a huge number of people very, very slightly or changing the life of only one person a lot, the prudent choice might be to change the life of only one person (as horrible as that change might be).
Still, with the dust speck we can’t really know if the net final outcome will be negative or positive. It might distract people who are about to have genius ideas, but it might also change chains of events that would lead to bad things. Averaged over so many people, it’s probably going to stay very close to neutral, positive or negative. The torture of one person might also look very close to neutral if averaged with the other 3^^^3 people, but we *know* that it’s going to be negative. Hmm..
Posted by: Michael G.R. | October 30, 2007 at 11:03 AM
Then Eliezer asked: “Followup dilemmas: For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?”
Here’s my answer:
To avoid *all* the dust specks, yeah, I’d pay a penny and more. Not a penny per speck, though 😉
The reason is to avoid having to deal with the “unintended consequences” of being responsible for that very very small change over such a large number of people. It’s bound to have some significant indirect consequences, both positive and negative, on the far edges of the bell curve… the net impact could be negative, and a penny is little to pay to avoid responsibility for that possibility.
Posted by: Michael G.R. | October 30, 2007 at 02:36 PM
Some of the other commenters picked torture using utilitarian reasoning that leads to the conclusion that the dust specks are worse than the torture. For that to be the case, the specks somehow need to have a cumulative effect, which I don't think they do. Maybe they would if you looked at all humans as a whole, but I'm not sure why you should do that, since experiences of pain and discomfort are subjective; the important difference between the choices is qualitative, not quantitative.
Also, if the dust specks were taken in isolation, I'd give a bit more weight to them (though probably not enough to change my conclusion), but I've been considering them as something added to a real human's life, which is already bathed in a "static" of events like the dust speck. The effect of one more will be lost in that static, and the only reason "unintended consequences" becomes important is the huuuuuuge number of people that would be affected (which is why I'd pay a penny or more just to be safe).
How would you answer Eliezer’s question? Don’t be shy, go leave a comment.
Update: Eliezer posted his answer here.
In the new post, Eliezer mostly talks about those who refused to answer the dilemma:
If you actually answer the dilemma, then no matter which option you choose, you’re giving something up. If you say SPECKS, you’re giving up your claim on a certain kind of utilitarianism; you may worry that you’re not being rational enough, or that others will accuse you of failing to comprehend large numbers. If you say TORTURE, you’re accepting an outcome that has torture in it.
I falsifiably predict that of the commenters who dodged, most of them saw some specific answer – either TORTURE or SPECKS – that they flinched away from giving. Maybe for just a fraction of a second before the question-confusing operation took over, but I predict the flinch was there. (To be specific: I’m not predicting that you knew, and selected, and have in mind right now, some particular answer you’re deliberately not giving. I’m predicting that your thinking trended toward a particular uncomfortable answer, for at least one fraction of a second before you started finding reasons to question the dilemma itself.)