Economics

Book review: Poor Economics: A Radical Rethinking of the Way to Fight Global Poverty by Abhijit V. Banerjee and Esther Duflo.

This book gives an interesting perspective on the obstacles to fixing poverty in the developing world. The authors criticize both Jeffrey Sachs and William Easterly for making simple, sweeping generalizations that overstate how easy or hard it is to provide useful aid to the poor; Banerjee and Duflo instead want us to look carefully at evidence from mostly small-scale interventions, which sometimes produce decent results.

They describe a few randomized controlled trials, but apparently there aren’t enough of those to occupy a full book, so they spend more time on less rigorous evidence of counter-intuitive ways that aid programs can fail.

They portray the poor as mostly rational and rarely making choices that are clearly stupid given the information that is readily available to them. But their cognitive abilities are sometimes suboptimal due to mediocre nutrition, disease, and/or stress from financial risks. Relieving any of those problems can sometimes enable them to become more productive workers.

The book advocates mild paternalism in the form of nudging weakly held beliefs about health-related questions where people can’t easily observe the results (e.g. vaccination, iodine supplementation), but probably not birth control (the poor generally choose how many children to have, although there are complex issues influencing those choices). They point out that the main reason people in developed countries make better health choices is due to better defaults, not more intelligence. I wish they’d gone a bit farther and speculated about how many of our current health practices will look pointlessly harmful to more advanced societies.

They give a lukewarm endorsement of microcredit, showing that it needs to be inflexible to avoid high default rates, and only provides small benefits overall. Most of the poor would be better off with a salaried job than borrowing money to run a shaky business.

The book fits in well with GiveWell’s approach.

Book review: How China Became Capitalist, by Ronald Coase and Ning Wang.

This is my favorite book about China so far, due to a combination of insights and readability.

They emphasize that growth happened rather differently from how China’s leaders planned, and that their encouragement of trial and error was more important than their ability to recognize good plans.

The most surprising features of China’s government after 1978 were a lack of powerful special interests and freedom from ideological rigidity. Mancur Olson’s book The Rise and Decline of Nations suggests how a revolution such as Mao’s might free a nation from special interest power for a good while.

I’m still somewhat puzzled by the rapid and nearly complete switch from a country blinded by ideology to a country pragmatically searching for a good economy. Coase and Wang attribute it to awareness of the harm Maoism caused, but I can easily imagine that such awareness could mainly cause a switch to a new ideology.

It ends with a cautiously optimistic outlook on China’s future, with some doubts about freedom of expression, and some hope that China will contribute to diversity of capitalist cultures.

Automated market-making software agents have been used in many prediction markets to deal with problems of low liquidity.

The simplest versions provide a fixed amount of liquidity. This either causes excessive liquidity when trading starts, or too little later.

For instance, in the first year that I participated in the Good Judgment Project, the market maker provided enough liquidity that there was lots of money to be made pushing the market maker price from its initial setting in a somewhat obvious direction toward the market consensus. That meant much of the reward provided by the market maker went to low-value information.

The next year, the market maker provided less liquidity, so the prices moved more readily to a crude estimate of the traders’ beliefs. But then there wasn’t enough liquidity for traders to have an incentive to refine that estimate.

One suggested improvement is to have liquidity increase with increasing trading volume.
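One way to sketch that improvement (a hypothetical illustration, not the Good Judgment Project’s actual mechanism) is Hanson’s logarithmic market scoring rule, with the liquidity parameter b scaled up as cumulative volume grows; the scaling rule and its parameters here are assumptions:

```python
import math

def lmsr_price(q_yes, q_no, b):
    """Instantaneous YES price under an LMSR with liquidity parameter b.

    Larger b means deeper liquidity: the price moves less per contract traded.
    """
    e_yes = math.exp(q_yes / b)
    e_no = math.exp(q_no / b)
    return e_yes / (e_yes + e_no)

def volume_scaled_b(base_b, volume, scale=0.1):
    """Hypothetical rule: liquidity grows linearly with cumulative volume."""
    return base_b * (1.0 + scale * volume)
```

With base_b small, early trades move the price cheaply toward the consensus; as volume accumulates, b grows and later traders face deeper liquidity, so refining the estimate pays more.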

I present some sample Python code below (inspired by equation 18.44 in E.T. Jaynes’ Probability Theory) which uses the prices at which traders have traded against the market maker to generate probability-like estimates of how likely a price is to reflect the current consensus of traders.

This works more like human market makers, in that it provides the most liquidity near prices where there’s been the most trading. If the market settles near one price, liquidity rises. When the market is not trading near prices of prior trades (due to lack of trading or news that causes a significant price change), liquidity is low and prices can change more easily.

I assume that the possible prices a market maker can trade at are integers from 1 through 99 (percent).

When traders are pushing the price in one direction, this is taken as evidence that increases the weight assigned to the most recent price and all others farther in that direction. When traders reverse the direction, that is taken as evidence that increases the weight of the two most recent trade prices.

The resulting weights (p_px in the code) are fractions which should be multiplied by the maximum number of contracts the market maker is willing to offer when liquidity ought to be highest. There is one weight for each price at which the market maker might position itself (in practice the market maker will quote two prices, a bid and an ask; perhaps the two weights ought to be averaged).

There is still room for improvement in this approach, such as giving less weight to old trades after the market acts like it has responded to news. But implementers should test simple improvements before worrying about finding the optimal rules.

trades = [(1, 51), (1, 52), (1, 53), (-1, 52), (1, 53),
          (-1, 52), (1, 53), (-1, 52), (1, 53), (-1, 52)]
p_px = {}       # probability-like weight for each price
num_agree = {}  # count of trades consistent with each price

probability_list = range(1, 100)  # possible prices: 1 through 99 percent
num_probabilities = len(probability_list)

# start with a uniform prior over prices
for i in probability_list:
    p_px[i] = 1.0 / num_probabilities
    num_agree[i] = 0

num_trades = 0
last_trade = 0
for (buy, price) in trades:  # test on a set of made-up trades
    num_trades += 1
    for i in probability_list:
        if last_trade * buy < 0:  # change of direction:
            # credit only the two most recent trade prices
            if buy < 0 and (i == price or i == price + 1):
                num_agree[i] += 1
            if buy > 0 and (i == price or i == price - 1):
                num_agree[i] += 1
        else:
            # continuing direction: credit the trade price and
            # all prices farther in that direction
            if buy < 0 and i <= price:
                num_agree[i] += 1
            if buy > 0 and i >= price:
                num_agree[i] += 1
        # Laplace-style smoothing, as in Jaynes' rule of succession
        p_px[i] = (num_agree[i] + 1.0) / (num_trades + num_probabilities)
    last_trade = buy

for i in probability_list:
    print(i, num_agree[i], '%.3f' % p_px[i])
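As one sketch of the weight-to-liquidity mapping described above, the weights could scale a maximum contract count; both max_contracts and the normalization by the peak weight are my assumptions, not specified in the text:

```python
def offered_size(p_px, price, max_contracts=1000):
    """Contracts offered at `price`, proportional to that price's weight.

    Normalized so the most-traded price gets the full max_contracts.
    Both max_contracts and the peak normalization are assumptions.
    """
    peak = max(p_px.values())
    return int(round(max_contracts * p_px[price] / peak))
```

This keeps liquidity highest where trading has clustered and lets prices move cheaply elsewhere.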

Charity for Corporations

In his talk last week, Robin Hanson mentioned an apparently suboptimal level of charitable donations to for-profit companies.

My impression is that some of the money raised on Kickstarter and Indiegogo is motivated by charity.

Venture capitalists occasionally bias their investments towards more “worthy” causes.

I wonder whether there’s also some charitable component to people accepting lower salaries in order to work at jobs that sound like they produce positive externalities.

Charity for profitable companies isn’t likely to become a popular concept anytime soon, but that doesn’t keep subsets of it from becoming acceptable if framed differently.

Book review: Why Nations Fail: The Origins of Power, Prosperity, and Poverty, by Daron Acemoglu and James Robinson.

This book claims that “extractive institutions” prevent nations from becoming wealthy, and “inclusive institutions” favor wealth creation. It is full of anecdotes that occasionally have some relevance to their thesis. (The footnotes hint that they’ve written something more rigorous elsewhere).

The stereotypical extractive institutions certainly do harm that the stereotypical inclusive institutions don’t. But they describe those concepts in ways that do a mediocre job of generalizing to non-stereotypical governments.

They define “extractive institutions” broadly to include regions that don’t have “sufficiently centralized and pluralistic” political institutions. That enables them to classify regions such as Somalia as extractive without having to identify anything that would fit the normal meaning of extractive.

Their description of Somalia as having an “almost constant state of warfare” is strange. Their only attempt to quantify this warfare is a reference to a 1955 incident where 74 people were killed (if that’s a memorable incident, it would suggest war kills few people there; do they ignore the early 90’s because it was an aberration?). Wikipedia lists Somalia’s most recently reported homicide rate as 1.5 per 100,000 (compare to 14.5 for their favorite African nation Botswana, and 4.2 for the U.S.).

They don’t discuss the success of Dubai and Hong Kong because those governments don’t come very close to fitting their stereotype of a pluralistic and centralized nation.

They describe Mao’s China as “highly extractive”, but it looks to me more like ignorant destruction than an attempt at extracting anything. They say China’s current growth is unsustainable, somewhat like the Soviet Union’s (but they hedge and say China might succeed by becoming inclusive as South Korea did). I predict instead that China’s relatively decentralized planning will be enough to sustain modest growth, though it will be held back somewhat by the limits to the rule of law.

They do a good (but hardly novel) job of explaining why elites often fear that increased prosperity would threaten their position.

They correctly criticize some weak alternative explanations of poverty, such as laziness. But they say little about explanations that partly overlap with theirs, such as Fukuyama’s Trust (a bit odd given that the book contains a blurb from Fukuyama). Fukuyama doesn’t seem to discuss Africa much, but the slave trade seems to have had large, long-lasting consequences for social capital.

For a good introduction to some more thoughtful explanations of national growth such as the rule of law and the scientific method, see William Bernstein’s The Birth of Plenty.

Why Nations Fail may be useful for correcting myths among people who are averse to math, but for people who are already familiar with this subject, it will just add a few anecdotes without adding much insight.

The CFTC is suing Intrade for apparently allowing U.S. residents to trade contracts on gold, unemployment rates and a few others that it had agreed to prevent U.S. residents from trading. The CFTC is apparently not commenting on whether Intrade’s political contracts violate any laws.

We U.S. traders will need to close our accounts.

The email I got says

In the near future we’ll announce plans for a new exchange model that will allow legal participation from all jurisdictions – including the US.

(no statement about whether it will involve real money, which suggests that it won’t).

I had already been considering closing my account because of the hassle of figuring out my Intrade income for tax purposes.

Book review: The Signal and the Noise: Why So Many Predictions Fail – but Some Don’t, by Nate Silver.

This is a well-written book about the challenges associated with making predictions. But nearly all the ideas in it were ones I was already familiar with.

I agree with nearly everything the book says. But I’ll mention two small disagreements.

He claims that 0 and 100 percent are probabilities. Many Bayesians dispute that. He has a logically consistent interpretation and doesn’t claim it’s ever sane to believe something with probability 0 or 100 percent, so I’m not sure the difference matters, but rejecting the idea that those can represent probabilities seems at least like a simpler way of avoiding mistakes.
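The practical point can be illustrated with Bayes’ rule in odds form: a prior of exactly 0 or 1 is unmoved by any finite evidence, which is one reason to exclude the endpoints. A minimal sketch (the function name is mine):

```python
def bayes_update(prior, likelihood_ratio):
    """Posterior probability via odds form: posterior odds = prior odds * LR."""
    if prior == 0.0 or prior == 1.0:
        return prior  # no finite likelihood ratio can move an extreme prior
    odds = prior / (1.0 - prior)
    posterior_odds = odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)
```

Any agent that ever assigns 0 or 1 has, in this sense, stopped being able to learn about that proposition.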

When pointing out the weak correlation between calorie consumption and obesity, he says he doesn’t know of an “obesity skeptics” community that would be comparable to the global warming skeptics. In fact there are people (e.g. Dave Asprey) who deny that excess calories cause obesity (with better tests than the global warming skeptics).

It would make sense to read this book instead of alternatives such as Moneyball and Tetlock’s Expert Political Judgment, but if you’ve been reading books in this area already this one won’t seem important.

Book review: The Institutional Revolution: Measurement and the Economic Emergence of the Modern World, by Douglas W. Allen.

What do honor duels, purchases of commissions in the army, and privately managed lighthouses have in common?

According to Allen, they were institutions which made sense in the pre-modern age, but were abandoned when improvements in measurement (of labor quality, product quality, time, location, etc) made them obsolete in the nineteenth century.

Allen presents a grand theory: before roughly 1800-1850, large variations in job performance, product quality, etc. created large transaction costs, which caused widespread differences from modern life and explain a wide variety of institutions that seem strange enough to people with a presentist bias that most have dismissed them as obviously foolish.

What starts out as an inquiry into some apparently quirky and unusual practices finishes as an ambitious attempt to explain the industrial revolution as a revolution whose institutional changes were more pervasive and valuable than the technological advances which triggered them.

The book convinced me that it explains the timing of some important and often forgotten social changes. But the frequent implication that the institutions in question were the most rational way to deal with the limitations of pre-industrial life seems overdone. I suspect that there was often a mixture of reasons behind those institutions, including some foolishness and some catering to special interests.

For example, his theory requires that honor duels be designed so that skill at dueling is fairly unimportant compared to random luck. He provides some evidence that people tried to introduce randomness into the dueling process, but leaves me doubting that it made skill unimportant.

The book provides a framework that might be valuable in predicting future institutional changes as technological change further reduces transaction costs, and does a valuable job of offsetting the tendencies of economists other than Coase to downplay the importance of transaction costs.

This was the first book I’ve read in several years that seems too short.

Book review: Manias, Panics and Crashes: A History of Financial Crises 6th ed., by Charles P. Kindleberger and Robert Aliber.

The book starts with a good overview of how a typical bubble develops and bursts. But I found the rest of the book poorly organized. I often wondered whether the book was reporting a particular historical fact as an example of some broad pattern – if not, why weren’t they organized in something closer to chronological order? It has lots of information that is potentially valuable, but not organized into a useful story or set of references.

Book review: Thinking, Fast and Slow, by Daniel Kahneman.

This book is an excellent introduction to the heuristics and biases literature, but only small parts of it will seem new to those who are familiar with the subject.

While the book mostly focuses on conditions where slow, logical thinking can do better than fast, intuitive thinking, I find it impressive that he was careful to consider the views of those who advocate intuitive thinking, and that he collaborated with a leading advocate of intuition to resolve many of their apparent disagreements (mainly by clarifying when each kind of thinking is likely to work well).

His style shows that he has applied some of the lessons of the research in his field to his own writing, such as by giving clear examples. (“Subjects’ unwillingness to deduce the particular from the general was matched only by their willingness to infer the general from the particular”).

He sounds mildly overconfident (and believes mild overconfidence can be ok), but occasionally provides examples of his own irrationality.

He has good advice for investors (e.g. reduce loss aversion via “broad framing” – think of a single loss as part of a large class of results that are on average profitable), and appropriate disdain for investment advisers. But he goes overboard when he treats the stock market as unpredictable. The stock market has some real regularities that could be exploited. Most investors fail to find them because they see many more regularities than are real, are overconfident about their ability to distinguish the real ones, and because it’s hard to distinguish valuable feedback (which often takes many years to get) from misleading feedback.

I wish I could find an equally good book about the overuse of logical analysis when I want the speed of intuition (e.g. “analysis paralysis”).