The Rosetta Stone Study is designed to measure global progress towards SDG 4.1.1 by relating national and regional learning assessments to international learning assessments. It is named after the famous archaeological discovery that enabled translation between different written languages: the Rosetta Stone. The goal of the study is to provide countries that have participated in regional or national assessments, but not in international assessments, with internationally comparable information about the proportion of primary school students who have achieved a minimum level of competency in literacy and numeracy (SDG 4.1.1).
In a first effort to implement this approach and establish concordance tables, the regional assessments, the Regional Comparative and Explanatory Study (ERCE) and the Programme d'analyse des systèmes éducatifs de la CONFEMEN (PASEC), are linked to two international assessments of the International Association for the Evaluation of Educational Achievement (IEA): the Trends in International Mathematics and Science Study (TIMSS) for mathematics and the Progress in International Reading Literacy Study (PIRLS) for reading. All the results are presented in the publications section of this page.
This document presents an executive summary of the first results of the Rosetta Stone Study: the establishment of a concordance table that projects the score distributions estimated from two regional assessments onto the TIMSS and PIRLS scales. These regional assessments are UNESCO's Regional Comparative and Explanatory Study (ERCE; Estudio Regional Comparativo y Explicativo) in Latin American and Caribbean countries and the Programme for the Analysis of Education Systems (PASEC; Programme d'Analyse des Systèmes Éducatifs de la CONFEMEN) in francophone sub-Saharan African countries.
Analytical Reports
ERCE - Regional Comparative and Explanatory Study (Estudio Regional Comparativo y Explicativo)
ERCE is a regional assessment conducted in 19 countries of Latin America and the Caribbean. It measures students’ achievement in the sixth grade.1 The mathematics content of ERCE is expected to align well with the TIMSS fourth grade assessment in numeracy and mathematics, and the reading component of ERCE is expected to align with the PIRLS fourth grade assessment in literacy and reading comprehension. The overarching goal is to construct a concordance table in which scores from ERCE in mathematics and reading are equated to scores from TIMSS and PIRLS, respectively. The concordance tables represent the “Rosetta Stone”, providing a translation between countries’ ERCE results and the TIMSS and PIRLS achievement scales. Countries participating in ERCE will be able to use the tables to determine the percentage of their students expected to reach the TIMSS and PIRLS international benchmarks, or any other benchmarks that can be measured on the TIMSS and PIRLS scales. An implementation of the Rosetta Stone methodology was conducted in Colombia and Guatemala by specialists from UNESCO Santiago, IEA Hamburg and Boston College.
1 ERCE also measures achievement at a lower grade level (third grade).
*The implementation of the project in Chile was postponed.
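To illustrate how a country might use such a concordance table, the sketch below converts hypothetical ERCE mathematics scores into projected TIMSS-scale scores and computes the share of students expected to reach the TIMSS Low International Benchmark (400 points). The concordance entries and the score distribution are invented for illustration and do not reflect the actual Rosetta Stone results.

```python
import numpy as np

# Hypothetical ERCE-to-TIMSS concordance entries (illustrative values only,
# not the actual Rosetta Stone tables): each ERCE mathematics score is paired
# with a projected score on the TIMSS reporting scale.
CONCORDANCE = {600: 310, 650: 345, 700: 385, 750: 425, 800: 470, 850: 515}

def project_to_timss(erce_score: float) -> float:
    """Interpolate a projected TIMSS-scale score for a given ERCE score."""
    xs = np.array(sorted(CONCORDANCE))
    ys = np.array([CONCORDANCE[x] for x in xs])
    return float(np.interp(erce_score, xs, ys))

# Simulated national ERCE mathematics results (illustrative only).
erce_scores = np.random.default_rng(0).normal(loc=710, scale=60, size=2000)
projected = np.array([project_to_timss(s) for s in erce_scores])

# Share of students expected to reach the TIMSS Low International Benchmark (400).
share_low = (projected >= 400).mean()
print(f"Expected share at or above the TIMSS Low benchmark: {share_low:.1%}")
```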
PASEC - Programme d'analyse des systèmes éducatifs de la CONFEMEN
PASEC is a regional assessment conducted in approximately fifteen countries, mainly in francophone sub-Saharan Africa. For the purpose of developing concordance tables with PASEC, testing instruments from PASEC 2019 were administered in three African countries - Burundi, Guinea and Senegal - in March and April 2020, together with the instruments used to create the Rosetta Stone concordance tables, TIMSS and PIRLS. Based on the results of the preparatory implementation in Senegal, the most appropriate testing instruments related to the IEA tests (TIMSS and PIRLS) are determined for the main data collection phase. This phase consists of the re-administration of all the test instruments from PASEC 2019 in conjunction with a separate test session for the Rosetta Stone instruments, TIMSS and PIRLS. It is implemented in approximately 100 schools per country, generating data for approximately 2,300 to 2,500 students per country. The data collected from the administration of the PASEC assessments are entered, cleaned, scaled and weighted by the PASEC team, using the same procedures as those used for PASEC 2019. IEA and the TIMSS & PIRLS International Study Center at Boston College then take on the data analysis and apply the necessary Item Response Theory (IRT) procedures to establish the links between the assessments.

On 27 June 2022, policymakers and stakeholders were invited to learn how Rosetta Stone can help compare regional and international assessment programmes, ensuring that countries report on SDG 4 global indicator 4.1.1 using adequate data. It was an opportunity to hear from the regional assessment programmes that participated in the study: the Programme d'analyse des systèmes éducatifs de la CONFEMEN (PASEC) and the Estudio Regional Comparativo y Explicativo (ERCE), which is managed by the Regional Bureau for Education for Latin America and the Caribbean (OREALC/UNESCO Santiago). Countries that used Rosetta Stone shared their experiences, challenges and outcomes.
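As a simplified illustration of the linking step described above, the sketch below builds a small concordance table from simulated scores of students who took both a regional instrument and an IEA instrument. It uses a basic equipercentile mapping rather than the IRT procedures applied in the actual study, and all data are simulated for illustration only.

```python
import numpy as np

# Simulated scores for students who took both the regional (PASEC-style) and
# the IEA (TIMSS-style) instruments. All values are made up for illustration.
rng = np.random.default_rng(42)
regional = rng.normal(500, 100, size=2400)             # regional-assessment scale
iea = 0.8 * regional + rng.normal(80, 40, size=2400)   # same students, IEA-style scale

def equipercentile_table(x, y, points):
    """Map each reference score in `points` to the y-scale score at the same percentile rank."""
    table = {}
    for p in points:
        rank = (x <= p).mean() * 100              # percentile rank of score p on scale x
        table[p] = float(np.percentile(y, rank))  # y-scale score with the same rank
    return table

table = equipercentile_table(regional, iea, points=range(300, 701, 50))
for reg_score, iea_score in table.items():
    print(f"Regional {reg_score} -> projected IEA-scale {iea_score:.0f}")
```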
Presentations
- Rosetta Stone: Improving the global comparability of learning assessments
- Silvia Montoya (UNESCO Institute for Statistics)
- Rosetta Stone: Linking assessment programmes for reporting of SDG 4.1.1
- Silvia Montoya (UNESCO Institute for Statistics)
- Establishing a Concordance between Regional Assessments and TIMSS/PIRLS
- Lale Khorramdel and Matthias von Davier (TIMSS & PIRLS International Study Center, Boston College)
- IEA's Rosetta Stone: Project and Implementation
- Oliver Neuschmidt (IEA)
- ERCE 2019 and Rosetta Stone Project
- Carlos Cofre Cayuman (LLECE)
- Rosetta Project: PASEC Experience
- Hilaire Hounkpodote (CONFEMEN - PASEC)