At Pencils of Promise (PoP), we hold ourselves accountable to a rigorous evaluation design and to the results of our statistical analyses, whether the findings flatter us or not. Part of being transparent means standing behind strong methods and the verdicts they deliver.
We also firmly believe that a student will perform better in a classroom if they’re led by an enthusiastic, inspired and well-trained teacher. Therefore, we measure students’ literacy ability before and after their teacher experiences PoP programming, and compare their performance with that of similar students whose teachers did not receive it. This gives us insight into the impact of our teacher training, through the lens of student literacy outcomes.
Laos is Unique
Our Teacher Support (TS) program in Laos looks fairly different from our TS programs elsewhere. We normally focus on literacy instruction in the students’ national language (English in Ghana, Spanish in Guatemala), but English is not a native subject in Laos. In fact, it’s typically the second, or even third, language that students learn after Lao and/or a local Lao dialect. What makes this program so exciting for us is that the students we’re working with are often encountering English for the first time.
So, our evaluation focuses primarily on the foundational aspects of literacy we standardize across assessments in Ghana, Guatemala, and Laos: the reading, writing, and comprehension of letters, words, and sentences. The test we’ve been using, which is based on the Early Grade Reading Assessment (EGRA), is well validated in the academic literature and carefully adapted to each local context.
However, the Lao literacy context specifically means that we look at another metric to measure literacy development: zero reduction, the change in the percentage of students who score zero points on any given section between the start and end of the year.
We still measure changes in literacy scores, but looking at zero reduction supplements those scores with additional meaningful information. It’s important to us that all students leave the school year further down the path of literacy than when they started, and zero reductions help us measure that without relying on somewhat sparse regression data.
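To make the arithmetic concrete, here’s a minimal sketch in Python of how a zero reduction could be computed for one test section. The scores and column names are invented for illustration; this isn’t our actual data or pipeline.

```python
import pandas as pd

# Invented scores for one EGRA section, one row per student.
# Column names are illustrative, not our actual schema.
scores = pd.DataFrame({
    "baseline_s1a": [0, 0, 3, 0, 7, 0, 2, 0],
    "endline_s1a":  [0, 4, 9, 2, 12, 0, 6, 1],
})

def pct_zero(series: pd.Series) -> float:
    """Percentage of students scoring exactly zero on a section."""
    return 100.0 * (series == 0).mean()

# Zero reduction: the drop in the share of students stuck at zero
# between the start and end of the year.
zero_reduction = pct_zero(scores["baseline_s1a"]) - pct_zero(scores["endline_s1a"])
print(f"Zero reduction for S1A: {zero_reduction:.1f} percentage points")  # 37.5
```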
Great, So… What Did We Do?
Overall, PoP’s 2017–2018 Teacher Support (TS) program in Laos supported English teachers teaching Grade 3 students in 60 schools.
Key program components of PoP’s Teacher Support program include:
- Group teacher workshops: Held at the beginning of the year, these workshops train teachers on PoP methodology and demonstrate a series of full lessons.
- Individualized coaching sessions: TS staff meet teachers one-on-one twice per month. After modeling, co-teaching, or observing a teacher’s lesson, PoP staff provide individualized feedback that is critical for improving the implementation of new methodologies and materials. These sessions aim to build teacher capacity, leading to improved teaching techniques and more engaging classrooms.
- Government alignment: Pre-meetings are held with representatives of the Lao Department of Education. These meetings help to align goals around the program’s objectives, share the names of teachers to be trained, and sign the school agreement that establishes the commitment of both PoP and the school.
Our commitment to program evaluation is baked in from the program’s start, so our design allows for fair comparisons and valid statistics: from a pool of schools meeting our inclusion and exclusion criteria, we randomize which schools participate in testing as Treatment and which as Control.
It is important to consider not only the changes in PoP schools, but how those changes compare with changes in non-PoP schools; this lets us examine the effectiveness of our work above and beyond the gains students make outside of PoP programming. So, PoP conducts baseline and endline testing in Control schools in order to observe learning gains in schools that did not receive PoP’s TS program. Both Treatment and Control schools conduct English lessons for the same amount of time each week (1.5 hours on one day per week), and we conducted balance testing on a variety of structural factors (e.g. access to electricity).
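To give a flavor of what one such balance check might look like, here’s a minimal sketch using a chi-square test on a single structural factor. The counts are hypothetical; only the group sizes match our sample.

```python
from scipy.stats import chi2_contingency

# Invented counts of schools with and without electricity access by group;
# only the group totals (26 Treatment, 21 Control) match our sample.
#                with    without
contingency = [[18,      8],   # Treatment
               [14,      7]]   # Control

chi2, p, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # a large p suggests the groups are balanced
```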
PoP tested a two-stage cluster sample (selecting schools, then students within those schools), averaging about 10 students per school across 26 Treatment schools and 21 Control schools. This gives a total of 474 students across 47 schools (197 Control and 277 Treatment) who participated in both baseline and endline tests. All students were in Primary 3.
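For readers who like to see the mechanics, here’s a minimal sketch of a two-stage cluster draw. The school identifiers, pool size, and rosters are all hypothetical.

```python
import random

random.seed(2018)  # reproducible illustration

# Stage 1: draw schools from the eligible pool (IDs and pool size are hypothetical).
eligible_schools = [f"school_{i:02d}" for i in range(1, 81)]
sampled_schools = random.sample(eligible_schools, 47)

# Stage 2: within each sampled school, draw about 10 Primary 3 students
# (rosters here are invented placeholders).
rosters = {s: [f"{s}_student_{j:02d}" for j in range(1, 31)] for s in sampled_schools}
sampled_students = {
    school: random.sample(roster, min(10, len(roster)))
    for school, roster in rosters.items()
}

total = sum(len(students) for students in sampled_students.values())
print(f"Sampled {total} students across {len(sampled_students)} schools")
```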
Okay. Math Me.
We’re using a multiple linear regression built through forward stepwise selection. This means we (carefully) built a mathematical model that predicts an outcome (for example, a student’s post-test score) using a handful of variables related to that outcome (for example, the student’s pre-test score and/or exposure to our programming).
How much does X affect Y? How much do A, B, and C affect Y? How about X’s effect on Y when controlling for A, B, and C? We can see how useful variables are in predicting the outcome. This is a particularly interesting space in evaluation right now, and I’m happy to speak more over email about this technical topic.
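For the technically curious, here’s a minimal sketch of forward stepwise selection using statsmodels. The variable names are hypothetical, and this illustrates the general technique rather than our production code.

```python
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(df, outcome, candidates, threshold=0.05):
    """Greedy forward selection: repeatedly add the candidate predictor
    with the smallest p-value, stopping when none clears the threshold."""
    selected, remaining = [], list(candidates)
    while remaining:
        # Fit one model per remaining candidate, alongside what's already selected.
        pvals = {}
        for var in remaining:
            model = sm.OLS(df[outcome], sm.add_constant(df[selected + [var]])).fit()
            pvals[var] = model.pvalues[var]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= threshold:
            break  # no remaining variable improves the model enough
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical usage: predict the post-test score from pre-test score,
# treatment status, and other covariates (column names invented).
# df = pd.read_csv("egra_scores.csv")
# chosen = forward_stepwise(df, "post_score", ["pre_score", "treatment", "age"])
```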
We supplement this regression method with the zero reductions discussed above, using the percent changes from Baseline to Endline in both groups. We also use a significance test (the Mann-Whitney U test) to examine whether the differences between Treatment and Control are statistically meaningful.
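And here’s a minimal sketch of that significance check using SciPy’s Mann-Whitney U implementation, with invented gain values:

```python
from scipy.stats import mannwhitneyu

# Invented score gains (endline minus baseline) for each group.
treatment_gains = [4, 7, 0, 3, 9, 5, 2, 6]
control_gains = [1, 0, 2, 0, 3, 1, 0, 2]

# Two-sided Mann-Whitney U test: compares the two distributions without
# assuming normality, which suits skewed, zero-heavy score data.
stat, p_value = mannwhitneyu(treatment_gains, control_gains, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.4f}")
```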
Results
We found that students in TS schools saw greater improvements on test scores than students in non-TS schools, but primarily for the fundamental literacy skills featured in the early sections of the EGRA test. Technically speaking, the Treatment variable is highly significant for Section 1A (Letter Identification). Section 1B (Letter Writing) reaches significance at a less conservative level, as do Sections 1C (Letter Cases) and 2 (Letter Sounds).
A substantially smaller proportion of students in TS schools scored 0 points on several sections (S1A Letter Identification, S1B Letter Writing, S1C Letter Cases, and S2 Letter Sounds) than in non-TS schools, although the fraction of students scoring 0 points on these sections remains too high (between 4.69% and 37.90%).
We take the first two sections for illustration.
Later literacy skills in Sections Two through Four (e.g. Reading, Writing Sentences) had a high proportion of zero scores and proved resistant to change through Teacher Support programming.
We conclude that, while effective for early literacy skills, our program covered too many English language skills in the first year and needs to account for the fact that students are learning English as their second, third, or fourth language. Moving forward, PoP will critically evaluate our approach to student evaluation, with the goal of focusing on the fundamental strategies that will set teachers and their students up for success.
Learnings
The importance of examining zero scores here implies that changes in our metrics are needed. Sensitizing our expectations and methods to the Lao context has been an ongoing conversation at PoP for some time; this analysis builds momentum for our new orientation to evaluation in Laos. With a measure that’s more sensitive to students’ abilities and their curriculum, PoP will be able to provide more insightful feedback on the effectiveness of programming and on how to adapt specific teacher strategies to help students. For now, it appears teaching early literacy skills should be a priority.
As always!