More on Those Brain Teasers
I'm still getting email about my last NYT column, specifically the question about bats and balls: "A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?" The answer is 5 cents. If you don't believe it, consider the two equations: 1.05 + 0.05 = 1.10 and 1.05 - 0.05 = 1.00.
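For anyone who wants the algebra spelled out, here's a tiny Python sketch. It uses nothing beyond the two equations above; adding them gives 2 × bat = 2.10, so the bat is $1.05 and the ball is $0.05.

```python
# The two facts in the puzzle:
#   bat + ball = 1.10   (the total cost)
#   bat - ball = 1.00   (the bat costs a dollar more than the ball)
# Adding the equations gives 2 * bat = 2.10, so bat = 1.05 and ball = 0.05.
bat = (1.10 + 1.00) / 2
ball = 1.10 - bat
print(f"bat = {bat:.2f}, ball = {ball:.2f}")   # bat = 1.05, ball = 0.05

# Sanity checks (allowing for floating-point rounding):
assert abs(bat + ball - 1.10) < 1e-9
assert abs(bat - ball - 1.00) < 1e-9
```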
In related news, here's a clarification from Shane Frederick, who raises the concern that some readers may misunderstand the article's ending. (I understood his point, but may not have qualified it sufficiently, especially for quick readers.)
One of the major findings of decision theorists is that subtle, normatively irrelevant wording changes can dramatically influence preferences -- something often termed "framing effects." Postrel's article may be interpreted as suggesting that such effects can typically be explained by differences in intellectual ability. This is certainly false, since assignment to conditions is typically randomized.

However, researchers do sometimes also attempt to account for "cross-study" framing effects, which arise when different researchers coincidentally or deliberately use different procedures, wordings, or response formats. An issue raised, but not discussed, in the JEP article is that if those researchers are located at two universities whose students differ markedly in intellectual abilities, it could be this, rather than the difference in procedure, that accounts for the difference in results.

To use a fanciful example, suppose two different researchers asked an identical question (e.g., How much would you be willing to pay for a coin flip which pays $100 if "HEADS" falls?) to students at the University of Toledo and at Princeton, but each happened to, or chose to, print it on different-colored paper. Someone who noted the methodological details but ignored the fact that the studies were conducted at different universities might conclude (probably falsely) that the color of the paper somehow affects risk preferences. The interpretational confound could easily be eliminated by making paper color (or whatever) an experimental variable. But this, of course, is rarely done in practice, and thus the problem remains for those looking across multiple studies conducted at different institutions.
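To make the point concrete, here's a toy simulation in Python. The numbers are entirely made up (I'm just assuming, for illustration, that one student population bids higher on average and that paper color does nothing). When each school sees only one color, a cross-study comparison "finds" a paper-color effect; randomizing color within each school makes that spurious effect vanish.

```python
import random
import statistics

random.seed(0)

def willingness_to_pay(school):
    # Made-up model: the populations differ in average bids;
    # paper color has no effect at all.
    base = 40 if school == "Princeton" else 25
    return base + random.gauss(0, 5)

# Confounded design: each school happens to see only one paper color.
white_study = [willingness_to_pay("Princeton") for _ in range(200)]  # white paper
blue_study = [willingness_to_pay("Toledo") for _ in range(200)]      # blue paper
print("white minus blue (confounded):",
      round(statistics.mean(white_study) - statistics.mean(blue_study), 2))
# A reader comparing the two studies might attribute this gap to paper color.

# Experimental fix: make paper color a randomized variable within each school.
white, blue = [], []
for school in ["Princeton", "Toledo"]:
    for _ in range(200):
        (white if random.random() < 0.5 else blue).append(willingness_to_pay(school))
print("white minus blue (randomized within school):",
      round(statistics.mean(white) - statistics.mean(blue), 2))
# Now the apparent "color effect" is near zero, as it should be.
```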
Now it's time to work on my next column, which will be published Thursday.