
Wednesday, June 22, 2011

"Failed State Index" - pseudo-science?

Foreign Policy and The Fund for Peace have come out with their Failed State Index 2011.

According to their criteria, Israel is a "borderline" state, ranked 53rd worst, slightly better than Egypt and worse than Zambia.

The question is: what are their criteria?

They seem to include the West Bank with Israel, but it is unclear whether they include Gaza.

Given that, here are their stated criteria:
Demographic Pressures, Refugees/IDPs, Group Grievance, Human Flight, Uneven Development, Economic Decline, Delegitimization of the State, Public Services, Human Rights, Security Apparatus, Factionalized Elites, and External Intervention
And how do they score each of these criteria?
Full-text data are electronically gathered from a range of publicly available print, radio, television and Internet sources from all over the world, including international and local media reports, essays, interviews, polling and survey data, government documents, independent studies from think tanks, NGOs and universities, and even corporate financial filings. The software determines the salience of the 12 indicators as well as hundreds of sub-indicators by calculating the number of "hits" as a proportion of the sample for a given time period. Quantitative data is also included, when available. Subject-matter experts then review each score for every country and indicator, as well as consult the original documents, when necessary, to ensure accuracy.

...The raw data are from millions of news articles and reports. As a practical matter, it is not readily transferable without the methodology and the software.
Essentially there is a black box where data from the media, interviews, essays, NGOs and so forth are magically converted into numbers. But exactly how this is done is hidden, so others cannot critique the methods. And then they have "experts" ready to massage the numbers if they seem out of whack!

To even call this a "methodology" seems a stretch. I could do just as well by Googling country names and keywords, graphing how often the pairs are found in web searches, and then modifying the results to make sure that they don't look too outrageous.
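That do-it-yourself version can be caricatured in a few lines. The sketch below is purely illustrative: the corpus, countries, and keywords are invented, and this is in no way the Fund for Peace's actual software or data. It simply counts keyword "hits" per country as a proportion of the sample, which is all their public description of the method amounts to:

```python
# Toy "index": count keyword hits per country in a sample corpus and
# rank by the proportion of hits. Corpus, countries, and keywords are
# all invented for illustration -- NOT the actual FfP methodology.
from collections import Counter

corpus = [
    "Country A protests over refugees and economic decline",
    "Country B reports steady growth in public services",
    "Country A factional violence and group grievance worsen",
    "Country B refugees arrive amid regional conflict",
    "Country C uneven development and human flight continue",
]

countries = ["Country A", "Country B", "Country C"]
keywords = ["refugees", "grievance", "decline", "human flight"]

hits = Counter()
for story in corpus:
    for country in countries:
        if country in story:
            hits[country] += sum(kw in story.lower() for kw in keywords)

# "Score" = hits as a proportion of the sample, mimicking the stated
# "number of hits as a proportion of the sample" for a time period.
scores = {c: hits[c] / len(corpus) for c in countries}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # → ['Country A', 'Country B', 'Country C']
```

Note that the "most failed" country here is simply the one mentioned most often in bad-news stories, which is exactly the objection: media coverage volume is not a measurement of state failure.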

Does the index consider the UNRWA definition of "refugee" to be operative when it applies to no other region? How exactly does a news report get converted into data - is it the number of stories? Which NGOs are considered reliable and which are not?

Moreover, including the territories with Israel here means that they are combining data from two (or three) governments, each with different agendas and priorities. But guess how the results will be reported? No question, it will be looked upon as if Israel is responsible for the government of the PA and Hamas.

Science, by definition, must allow independent researchers to reproduce the results. By putting a numeric ranking next to each country, this index fools the casual reader into believing that these are all measured objectively. It takes some digging to find out that it is literally impossible for this index to be objective. Other people looking at the identical data could easily come up with different conclusions.

Which is what makes these exercises in quantifying the world's problems often worse than useless.
(h/t Ron)