The World Bank began making international comparisons of poverty only about two decades ago. For obvious reasons of convenience, it developed two simple notions of poverty. The lower poverty line was set at $1 a day per capita. Those below it were considered to be “the poorest of the poor.” The upper poverty line was set at $2 a day. Those living on $1 to $2 a day were still poor, but not as badly off.
However, there was a problem. It was realized that $1 goes much farther in purchasing necessary items of consumption in a poor country than in a rich one. To make purchasing power comparable across countries, economists developed what is known as the PPP (purchasing power parity) index. Taking into account the lower cost of living in impoverished countries, a conversion factor is now applied to market exchange rates to calculate what is minimally necessary to survive there. Using World Bank numbers, this conversion factor for India means, in effect, that surviving on 1 PPP dollar a day in that country is equivalent to being handed 20 cents a day in the U.S.
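To make the arithmetic concrete, here is a minimal sketch of that conversion. The exchange rates are illustrative assumptions, chosen only to reproduce the 20-cent figure; the World Bank's actual conversion factors vary by country and year.

```python
# Sketch of the PPP-to-market-dollar conversion described above.
# Both rates below are assumptions for illustration, not World Bank figures.

MARKET_RATE_INR_PER_USD = 45.0  # assumed market exchange rate: rupees per U.S. dollar
PPP_RATE_INR_PER_USD = 9.0      # assumed PPP conversion factor: rupees per PPP dollar

def ppp_dollars_to_market_dollars(ppp_dollars: float) -> float:
    """Convert a daily income in PPP dollars to its value at market exchange rates."""
    rupees = ppp_dollars * PPP_RATE_INR_PER_USD  # local-currency income per day
    return rupees / MARKET_RATE_INR_PER_USD      # what those rupees buy at market rates

if __name__ == "__main__":
    # Living on 1 PPP dollar a day in India ...
    print(f"${ppp_dollars_to_market_dollars(1.0):.2f} per day")  # ... is about $0.20 at market rates
```

Under these assumed rates, the rupees that constitute 1 PPP dollar in India would exchange for only about 20 U.S. cents, which is the comparison the paragraph above draws.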
The dominant impression is that the poor are living on less than $1 a day. In fact, in plain everyday English, it would be far more accurate to say that the poor across the world are living on less than 20 cents a day. The reason this is not done is obvious: It would give an even more alarming picture of the scale and depth of poverty in this enormously wealthy world. Most decent people are shocked enough by the understated numbers in the form in which they are widely quoted. More reality would numb and paralyze even the grittiest of activists. “Humankind,” T.S. Eliot wrote, “cannot bear very much reality.” He had the privileged in mind.