African poverty is falling…much faster than you think (voxeu.org)
71 points by cwan on Dec 7, 2010 | 23 comments


This gives some hope, especially the Gini graph. I'll wait for more information before forming a more definitive opinion, though, since autonomy is not always well captured by such quantitative data.

For example, if more people live in big cities, is this growth enough? ($1 can mean different things: in New York, $400 certainly makes you poor…). Is this growth compensating for formerly hidden markets, like subsistence food production and local non-monetary exchange? (You can sell cotton and have more money, while still finding it harder to buy food.) Does their subsistence depend more or less on external factors like speculation? Are there fewer wars/civil wars?


Right: how do they calculate the <$1/day rate? Its nearly inverse relationship with the GDP-per-capita graph implies that, deep down in the data, the basis for this $1/day rate could essentially be GDP divided by population, or something similar. Without seeing where that data comes from, this is a real possibility. However, I would assume these folks are doing a rigorous analysis, so it's probably unlikely.


Since they point out this inverse relationship as remarkable, I don't suppose a relationship of this sort between the two is expected. Certainly there would be a negative correlation, but I don't believe the two come from the same data.


Uhm, they define poverty as living on less than one dollar per day. Then they show it dropping over the last 20 years ... but don't control for inflation. They show a 10% improvement against a benchmark that (aside from being meaningless as a way of measuring abject poverty) lost more than 50% of its value, and they label it progress. This suggests basic innumeracy to me. It's entirely unclear from the data presented whether real poverty has been reduced at all.


These statistics are normalized across countries and over time by purchasing power parity. I don’t know what the precise calculation is, but it is relatively sophisticated. I’m sure if you search around you could find the details. They are just reported in simpler terms to make the comparison easily comprehensible to the non-technical. See http://en.wikipedia.org/wiki/Poverty_threshold and also http://en.wikipedia.org/wiki/Geary-Khamis_dollar

(In other words, there are at least a few statisticians working for the UN and the World Bank, and they tend not to be “basically innumerate”.)
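
To give a sense of the mechanics, here's a rough sketch in Python. The poverty line and the conversion factor below are made up for illustration; the real Geary-Khamis calculation is far more involved.

    # Hypothetical sketch: convert a daily expenditure in local currency
    # into "international dollars" via a PPP conversion factor, then
    # compare it against a fixed international poverty line.
    # All numbers are invented for illustration, not taken from the paper.

    POVERTY_LINE_INTL = 1.00  # international dollars per day (the "$1/day" line)

    def below_poverty_line(local_expenditure, ppp_factor):
        """local_expenditure: daily spending in local currency units.
        ppp_factor: local currency units per international dollar."""
        intl_dollars = local_expenditure / ppp_factor
        return intl_dollars < POVERTY_LINE_INTL

    # e.g. 450 local units/day with a PPP factor of 520 -> ~0.87 intl $/day
    print(below_poverty_line(450, 520))  # True: below the line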


Here's the actual paper, which does name its units (international poverty line):

http://www.salaimartin.com/media/pdf/Africa_Poverty_NBER.pdf

And criticism of their results from someone working on producing similar numbers for the World Bank:

http://blogs.worldbank.org/africacan/is-african-poverty-fall...

I must say that it irks me that they would use such an easy-to-understand-and-wrong ($1/day, when they mean "below the international poverty line") unit in their writing without specifically qualifying it.


It all depends on how low you set the poverty threshold. This morning I saw my old Economic History professor Stephen Broadberry get news coverage for a paper that suggests that on average people in several sub-Saharan African countries were not only poorer, but much poorer than the medieval English (less than half GDP per capita, using access to food as one of the main barometers).


On the other hand, medieval England wasn't one of the worst places.


True, but it's an indication of how low a poverty threshold of a PPP-adjusted dollar a day is when it's below the level of a mostly-peasant population that didn't have access to adequate sanitation, any form of energy other than wood-burned heat, or anything invented after the 1300s.


So does this mean religious charities will finally stop trying to guilt trip me with starving children when I'm trying to have a few drinks while watching television?

I didn't think so.


Only because you can be sure of two things:

1) People who use guilt trips as a technique will always find new fodder with which to guilt trip you. They will never say, "Oh, OK, that's been resolved, so we're done now."

2) (Sadly,) There will always be a child starving somewhere


I'm a strong supporter of helping where you live. There are enough homeless and starving people within 50 km of you that your money is guaranteed to reach and help them, rather than sending it halfway around the world and hoping your aid actually goes to someone in need instead of falling into the hands of a corrupt regime.

I'd prefer to give money to something like Child's Play for Toronto Sick Kids than send it somewhere in Africa, and I know many people will disagree with me for it, but IMO it's worse being the 1-in-100 kid who is gravely ill than the 1-in-2 kid who is starving. Throwing money at the norm isn't going to resolve it, but throwing money at the abnormal can indeed resolve it.

However, if you're an orangutan I'll donate to help you wherever you are - I like them much better than people.


Don't forget, most aid is actively harmful.

http://lesswrong.com/lw/z2/another_call_to_end_aid_to_africa...


Yeah, because laying out facts in a sterile and objective manner is going to compel you otherwise? Don't confuse being presented with the truth and a guilt trip; sometimes we need a wake-up call, and it's not always comfortable.


You have to love it when authors use misleading axis scaling in graphs. The vertical axis should almost always start at zero.


Not sure that I agree here. I don't know about you, but my understanding of the variety of metrics they used in this article is quite minimal. Instead, it is instructive to just see the relative variation in the past five years or so compared to the 30 before that.

For instance, I have no idea exactly what the metrics in Figs 2 and 3 are, but I can see that the inequality metric in Figure 2 has fallen back to approximately 1970 levels and that, since 1995, the welfare metric in Figure 3 has increased by (very roughly) twice the standard deviation of the past 30 years' measurements.

Of course scale is important, but it is folly to trot out the same argument every time a graph does not include the origin. Without knowing something about the y-axis variable we cannot make this type of claim.
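
For what it's worth, here's the kind of comparison I mean, as a quick Python sketch. The series and the recent change are invented; they're not the article's data.

    # Hypothetical sketch: compare the change since 1995 against the
    # standard deviation of the previous 30 years of measurements.
    # All values are made up for illustration.
    from statistics import stdev

    history = [3.1, 3.0, 3.2, 2.9, 3.1, 3.0, 3.2, 3.1, 2.9, 3.0]  # pre-1995 values
    recent_change = 0.4  # change in the metric since 1995

    sigma = stdev(history)
    # -> "change is 3.7 standard deviations of the history"
    print(f"change is {recent_change / sigma:.1f} standard deviations of the history")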


I agree with you, kermit_de_fro, but it could be better. If the intent is to show relative changes, then it should show percentages or have a hard baseline that indicates "zero change".

I don't think "the bottom of the Y-axis isn't zero" is a guaranteed bad graph, especially considering some of the horrendous graphs out there. These ones are not that bad.


As a slight counterexample:

Suppose we want to show changes in temperature: unless you use the Kelvin scale, showing percentage changes is going to be bogus. Showing change relative to recent volatility seems useful.
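
To make that concrete (my own numbers, not from the article): the same one-degree warming reads as a very different "percentage change" depending on where the scale puts zero.

    # Hypothetical illustration: a 1-degree rise from 10 C looks like a
    # 10% change in Celsius, but only ~0.35% in Kelvin.
    celsius_before, celsius_after = 10.0, 11.0
    kelvin_before, kelvin_after = celsius_before + 273.15, celsius_after + 273.15

    print((celsius_after - celsius_before) / celsius_before * 100)  # 10.0 (%)
    print((kelvin_after - kelvin_before) / kelvin_before * 100)     # ~0.35 (%)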


If the meaning of the graphs isn't made clear, and the origin is not included (obscuring significance), it is not terribly unreasonable to suspect manipulation.


What if this is caused by inflation? That is, $1 per day is not comparable between years.


Hail Internet/Globalization.


When you're down, the only way is up.


I wish that were true, but it simply isn't. It's always possible to sink lower, and that's the case way too often.



