In their new article, "Human Capital Index and the hidden penalty for non-participation in ILSAs," Ji Liu (PhD 2018) and Gita Steiner-Khamsi draw attention to the World Bank's problematic methodology for calculating the new Human Capital Index (HCI) and reflect on its troubling consequences for the shifting dynamics between countries and International Large-Scale Assessments (ILSAs).

 

 

Source: World Bank (2020). Author's compilation.

 

International Large-Scale Assessments (ILSAs) have become increasingly popular. The OECD, the IEA, and the global testing industry have created new discourse incentives, while globalization and political pressure to 'fit in' have added to countries' anxiety over ILSA outcomes. In their new research article, Liu and Steiner-Khamsi show that this equilibrium is about to shift, owing to the World Bank's recent bet on ILSAs as the answer to the global #learningcrisis.

 

In late 2018, the Bank released its Human Capital Index (HCI), which aims to provide a "direct measure of school quality and human capital" and acts as a barometer for many of the Bank's lending programs. The measure harmonizes different ILSA and regional assessment scores onto a single scale by producing a ratio-linking 'learning exchange rate' based on test-overlap systems: the HCI relies on systems that took part in multiple ILSAs or regional assessments to convert scores for partial- and non-participants.
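The ratio-linking idea can be sketched as follows. This is only an illustration of the general technique: the country scores, the two-test setup, and the function name are all hypothetical, and the Bank's actual HCI procedure involves additional steps and adjustments beyond a simple ratio of means.

```python
# Illustrative sketch of ratio linking ('learning exchange rate').
# All numbers are hypothetical; this is not the World Bank's exact procedure.

def learning_exchange_rate(overlap_scores):
    """Ratio of mean scores across 'test-overlap' systems that sat both
    assessments: (mean reference-ILSA score) / (mean regional-test score)."""
    ref_mean = sum(ref for ref, _ in overlap_scores) / len(overlap_scores)
    reg_mean = sum(reg for _, reg in overlap_scores) / len(overlap_scores)
    return ref_mean / reg_mean

# Hypothetical overlap systems: (reference-ILSA score, regional-test score)
overlap = [(520, 480), (495, 455), (510, 470)]
rate = learning_exchange_rate(overlap)   # ~1.085

# A non-overlap country's regional-test score, converted onto the
# reference scale using the exchange rate derived from other systems.
harmonized = 440 * rate                  # ~477.6
```

The conversion thus hinges entirely on the overlap sample: if the systems that sat both tests are unrepresentative of the countries whose scores are being converted, the exchange rate, and every harmonized score built on it, inherits that bias.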

 

The authors empirically examine ILSA (in)comparability by cross-checking test sample characteristics. They find that test-overlap systems differ drastically from those that are more selective in test participation; even within the same system and the same test year, different ILSAs draw markedly different test samples.

 

As a byproduct of the Bank's crude methodology for harmonizing ILSA scores, this stylized approach generates new penalties that may coerce partial- and non-participants, most of which are low- and lower-middle-income countries, into rethinking their decisions. The problem is exacerbated by the Bank's choice to use this information as an instrument for assessing or reevaluating lending programs. Notably, test participation type alone accounts for about 58 percent of the variation in harmonized scores, and the score penalties associated with partial ILSA participation equate to at least one full year of learning.

 

Improving learning is critical, but the World Bank's flawed harmonization of ILSA scores in effect penalizes governments that have chosen alternative, nonstandardized pathways for measuring learning. In an era of global monitoring of education development, these "methodological glitches" have far-reaching political, social, and economic consequences that need to be brought under scrutiny.

 

Check out this podcast to learn more about this project.