2006-12-10 The Modesty Argument

2006-12-10 Eliezer Yudkowsky LessWrong \fallacy of moderation\Aumann's Agreement Theorem http://lesswrong.com/lw/gr/the_modesty_argument/ The Modesty Argument  The Modesty Argument states that when two or more human beings have common knowledge that they disagree about a question of simple fact, they should each adjust their probability estimates in the direction of the others'. (For example, they might adopt the common mean of their probability distributions. If we use the logarithmic scoring rule, then the score of the average of a set of probability distributions is better than the average of the scores of the individual distributions, by Jensen's inequality.) ... I've always been suspicious of the Modesty Argument. It's been a long-running debate between myself and Robin Hanson.
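The Jensen's-inequality claim in that excerpt is easy to check numerically. A minimal sketch, with toy probabilities of my own choosing (not from the post): because log is concave, the log score of the pooled (averaged) forecast is at least the average of the individual log scores.

```python
import math

# Toy numbers (mine, not from the post): three forecasters assign
# different probabilities to an event that turns out to be true.
probs = [0.9, 0.6, 0.3]

# Average of the individual log scores
avg_of_scores = sum(math.log(p) for p in probs) / len(probs)

# Log score of the averaged (pooled) forecast
pooled = sum(probs) / len(probs)
score_of_pooled = math.log(pooled)

# Jensen's inequality for the concave log: log(mean p) >= mean(log p),
# so pooling can never hurt the group's average log score.
assert score_of_pooled >= avg_of_scores
print(f"pooled: {score_of_pooled:.4f}  average: {avg_of_scores:.4f}")
# pooled: -0.5108  average: -0.6067
```

The same inequality holds no matter which way the event comes out, which is what gives the Modesty Argument its force at the level of the group.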

Robin seems to endorse the Modesty Argument in papers such as Are Disagreements Honest? I, on the other hand, have held that it can be rational for an individual not to adjust their own probability estimate in the direction of someone else who disagrees with them. ... Suppose a creationist comes to me and offers: "You believe that natural selection is true, and I believe that it is false.  Let us both agree to assign 50% probability to the proposition." And suppose that by drugs or hypnosis it was actually possible for both of us to contract to adjust our probability estimates in this way. This unquestionably improves our combined log-score, and our combined squared error. If as a matter of altruism, I value the creationist's accuracy as much as my own - if my loss function is symmetrical around the two of us - then I should agree. But what if I'm trying to maximize only my own individual accuracy? In the former case, the question is absolutely clear, and in the latter case it is not absolutely clear, to me at least, which opens up the possibility that they are different questions. Yudkowsky does finally work around to the idea that it is reasonable not to compromise in this way because the creationist might not be an honest Bayesian, but he seems to be assuming that there is no objective way of detecting rationality.
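The asymmetry Yudkowsky points at can be made concrete with toy numbers (my own, not from the post): take my credence in natural selection as 0.999 and the creationist's as 0.001, and suppose the proposition is true. The 50/50 compromise improves the pair's combined log score and squared error, yet strictly worsens my individual score.

```python
import math

# Toy numbers (mine, not from the post): I assign 0.999 to the
# proposition, the creationist assigns 0.001, and it is in fact true.
mine, theirs, compromise = 0.999, 0.001, 0.5

# Combined log score before and after both move to 50%
combined_before = math.log(mine) + math.log(theirs)   # about -6.91
combined_after = 2 * math.log(compromise)             # about -1.39
assert combined_after > combined_before  # the pair's score improves

# Combined squared error also improves
sq_before = (1 - mine) ** 2 + (1 - theirs) ** 2       # about 0.998
sq_after = 2 * (1 - compromise) ** 2                  # 0.5
assert sq_after < sq_before

# But my own accuracy gets strictly worse
assert math.log(compromise) < math.log(mine)
```

By symmetry the combined scores improve regardless of which of us is right; the compromise only loses for whichever party was in fact closer to the truth, which is exactly why the altruistic and self-interested framings can come apart.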

In response to the "scientist advocating a pet theory" scenario: if 100 people are trapped in a maze, and one of them finds a way out, should that person refrain from advocating that way out simply because the average position would be "I'm pretty sure there's no way out"?