Tuesday, February 14, 2017

Against panaceas

Some academic thinking, and especially its translation into policy, is very prone to what Elinor Ostrom used to call "panacea thinking." A very interesting 2007 special issue of the Proceedings of the National Academy of Sciences was devoted to the risks of panaceas. In it, Ostrom focuses on the failure of panaceas applied to the collective action problems of managing "commons." The so-called "tragedy of the commons" predicted by Hardin can be overcome in a number of ways, but the precise mechanisms depend on local conditions, history and culture. Hardin himself thought that the panacea was state ownership of rival but non-excludable goods. Others suggested that the panacea would be property rights, meaning privatization of the common property so that a single owner would internalize all the relevant externalities. Still others, perhaps believing that they were interpreting the work of Ostrom, thought that the panacea would be management by user groups. In fact, different solutions apply to different cases, and sometimes complex jurisdictional systems combine different solutions at different scales in an overlapping way.

For some time, one of the panaceas was to rely on insulated expert agencies, but as we know, this panacea too is now in crisis. This is what William Easterly had to say about it in the Financial Times, in his review of Daniel Kahneman's book "Thinking, Fast and Slow":

"Kahneman regards even the experts as prone to the mistakes of System 1 listed above, and cheerfully admits that he is no exception. But he wants to know whether this view can be reconciled with cases such as that of the firefighting captain. So he engages one of his vehement critics on this issue and they debate their way to a joint paper. Their answer is that expertise can be learnt by prolonged exposure to situations that are “sufficiently regular to be predictable”, and in which the expert gets quick and decisive feedback on whether he did the right or the wrong thing. Experts can thus train their unconscious “pattern recognition” mechanism to produce the right answer quickly. So this certainly applies to chess, and it certainly does not apply to predicting the course of Middle East politics.

Another classic bias is called the “halo effect”, when somebody very good at some things is falsely assumed to be good at everything. This book itself could benefit from something similar, as amid its general excellence a few stumbles are easily overlooked. The main flaw comes predictably in the final section in which, according to some mysterious universal law, all authors in the social sciences are required to produce a public policy fix for the problems they have identified.

Kahneman’s endorsement of “libertarian paternalism” contains many good ideas for nudging people in the right direction, such as default savings plans or organ donations. But his case here is much too sweeping, because it overlooks everything the rest of the book says about how the experts are as prone to cognitive biases as the rest of us. Those at the top will be overly confident in their ability to predict the system-wide effects of paternalistic policy-making – and the combination of democratic politics and market economics is precisely the kind of complex and spontaneous order that does not lend itself to expert intuition."