2014 Human Development Report, p. 165.
Charles Kurzman, “Syria’s Human Development Crisis,” March 31, 2015.

By early 2013, Syria’s “Arab Spring” had turned into a horrendous civil war. Entire neighborhoods and villages were destroyed. Millions of people had been displaced. Schools and medical facilities barely functioned in many parts of the country.

According to the United Nations Development Programme, however, none of this affected Syria’s level of human development. The UNDP’s most recent Human Development Report, released in July 2014, rated Syria’s human development index at .658 for 2013, virtually the same as every year since 2005. How could this be?

The answer points to a fundamental problem with statistics from international agencies: too many of them rely on government reporting and are not granted enough autonomy to question official figures and explore alternative data.

Syria is not the only example where these intergovernmental procedures generate macabre results. The 2014 Human Development Report gives Rwanda the same rating in 1990 and 1995, despite the genocide that slaughtered one-seventh of the country’s population in 1994. It rates Iraq higher in 2005, after the U.S.-led invasion destroyed the country’s power grid and most of its government institutions, than before the invasion, and it gives the country stable or higher ratings each year since, despite ongoing civil war.

Selim Jahan, director of the UNDP’s Human Development Report Office, emphasizes that its index is not intended to “measure short-term human development achievements,” but instead “is composed of long-term human development outcomes.” This seems fair: most countries are unlikely to see dramatic year-to-year shifts in the indicators that make up the index: life expectancy at birth, mean years of schooling, expected years of schooling, and gross national income (GNI) per capita.
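
The report’s technical notes describe how these four indicators are combined into a single number. As a rough illustration, here is a minimal sketch of that calculation in Python, following the 2014 technical notes as I read them; the goalposts (the minimums and maximums used to normalize each indicator) and the simple averaging of the two schooling measures are taken from that note and should be treated as illustrative assumptions rather than a definitive implementation.

```python
from math import log

def hdi(life_expectancy, mean_years_schooling, expected_years_schooling, gni_per_capita):
    """Sketch of the human development index: the geometric mean of three
    dimension indices, using the goalposts listed in the 2014 technical notes."""
    # Health: life expectancy at birth, normalized between 20 and 85 years.
    health = (life_expectancy - 20) / (85 - 20)

    # Education: simple average of mean years of schooling (maximum 15)
    # and expected years of schooling (maximum 18).
    education = (mean_years_schooling / 15 + expected_years_schooling / 18) / 2

    # Income: natural log of GNI per capita, normalized between $100 and $75,000 (PPP).
    income = (log(gni_per_capita) - log(100)) / (log(75000) - log(100))

    # The index is the geometric mean of the three dimension indices.
    return (health * education * income) ** (1 / 3)
```

The result is a number between 0 and 1, which is how a figure as precise-looking as Syria’s .658 gets produced, whatever the quality of the underlying inputs.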

However, some countries do experience dramatic year-to-year shifts — downward shifts — as a result of war or other catastrophes. We may not always know the precise scale of these shifts, since such crises hamper the government’s ability to collect accurate data. The Syrian government, for example, is hardly in a position to assess the health, educational, and economic situation of regions it no longer controls, and the revolutionary groups that control these regions are not capable of collecting international-quality measurements, even if they wanted to.

Debates over statistics may not be our greatest concern in the face of a humanitarian crisis. But since policy makers and social scientists often rely on information sources like the Human Development Report, the most prudent approach to data in such situations would be to report the figures as unknown.

What would trigger such a judgment? At what point should the Human Development Report decide that government statistics are no longer valid?

This raises a tricky question about the relationship between intergovernmental organizations such as the United Nations and the national governments that are their members. Member “nations” — actually the governments that claim to represent nations — are generally loath to cede authority to intergovernmental agencies. That’s why the UN Human Rights Council includes some of the world’s worst human rights offenders (as assessed by independent groups). That’s why the permanent members of the Security Council veto resolutions that question their own actions.

Many governments are sensitive about development statistics, especially numbers that make the government look bad. In the final years of the Pahlavi monarchy in Iran, for example, top officials ordered their departments to generate rosy numbers, or spouted fake figures on their own. “Unwilling to reform the condition of life in Iran,” one report commented, Iranian authorities “kept reforming the data.” And that was before indexes like the Human Development Report became widespread.

Yet there are precedents for calling government data into question, even at United Nations agencies. The 1997 Human Development Report listed Rwanda’s rating for 1994 as half of its 1993 rating; the following year Rwanda was removed from the list entirely, presumably for lack of reliable data. Iraq was absent from the human development index for many years after the 2003 invasion, reappearing only in 2011.

An even bolder approach was adopted by the World Income Inequality Database, which is produced by the United Nations University’s World Institute for Development Economics Research in Helsinki. The WIID includes an assessment of the quality of each figure, based on experts’ judgment of the methods used to create the data.
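
To make the idea concrete, the sketch below shows, in purely hypothetical terms, how a quality rating might travel with each observation and how an analyst could set low-quality figures aside; the field names and rating labels are invented for illustration and are not the WIID’s actual schema.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    country: str
    year: int
    value: float   # e.g., an inequality measure or an index component
    quality: str   # expert rating of the underlying methods: "high", "average", or "low"

data = [
    Observation("Country A", 2013, 0.41, "high"),
    Observation("Country B", 2013, 0.38, "low"),   # methods judged unreliable
]

# Treat low-quality figures as unknown rather than as precise estimates.
usable = [obs for obs in data if obs.quality != "low"]
```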

The United Nations Statistical Commission has been working on data quality for years, delineating fundamental principles for official data and promulgating quality assurance frameworks. At its most recent session, held in New York earlier this month, the commission commented diplomatically on “the possibility of measurement and capacity constraints of member states.”

These constraints ought to be acknowledged in the United Nations’ reports and datasets. The credibility of these products is not well served by false confidence and false precision in figures like Syria’s human development index of .658.