Moral intuitions can lead us astray in just the same way that, according to the biases literature, we can be led astray in making financial decisions.
On this view, Foot, Thomson, and Edmonds go wrong by treating our moral intuitions about exotic dilemmas not as questionable byproducts of a generally desirable moral rule, but as carrying independent authority and as worthy of independent respect. And on this view, the enterprise of doing philosophy by reference to such dilemmas is inadvertently replicating the early work of Kahneman and Tversky, by uncovering unfamiliar situations in which our intuitions, normally quite sensible, turn out to misfire. The irony is that where Kahneman and Tversky meant to devise problems that would demonstrate the misfiring, some philosophers have developed their cases with the conviction that the intuitions are entitled to a great deal of weight, and should inform our judgments about what morality requires.
Oh, the irony! Except for the fact that other researchers in Kahneman and Tversky’s own field have a position that roughly mirrors that of the philosophers who “go wrong” in Sunstein’s estimation.
This all makes utilitarianism seem oh-so-scientific, but of course, there is an obvious problem: why should we accept that maximizing welfare or happiness (by some agreed-upon specification of either) is the telos of moral philosophy (or moral science, if Sunstein is to be taken literally)?
Perhaps it rests on intuitions such as “happiness (or welfare) is good, and happiness (or welfare) for more people is better than happiness (or welfare) for fewer people.”
It’s hard to understand why (scientifically speaking) this should be treated as somehow more special or more objective than the moral intuitions emphasized by the virtue ethicists and others.
Of course, Sunstein admits that we’re in murky waters here, so I don’t want to put words in his mouth. But implying that violating consequentialist logic is similar to the logical errors found by Kahneman seems to me to be rather…reaching.