It is often possible for officials to focus more attention on a small number of schools during a pilot programme – although it may not be possible for them to continue to provide this much attention per school as a programme grows to scale. Teams should consider this when designing both the pilot and the scale-up. Piloting in a relatively small number of schools may allow local actors to work out the kinks and make better plans for the future, but teams should be realistic about how well pilot systems will work when programmes go to scale.
Choose appropriate data collection tools for the context.
Because many schools in the Catch Up pilot districts had unreliable cell phone signal and no Internet access, paper tools were used. The data were digitised when aggregated at the zone or district level, but this process proved challenging.
Collect and share assessment data as quickly as possible in order to allow officials to provide support to schools.
We recommend having data collected and shared within the first 10 days of the programme.
Consider whether data collected is useful for decision-making to improve teaching or the overall TaRL model.
An early version of the classroom observation tool required yes/no answers and, in use, showed little variation: teachers scored well in almost all areas. This made it difficult for ZICs and VVOB coordinators to know which schools to target with additional support. In response, the tool was adjusted to include more multiple-choice questions, resulting in a wider range of responses. When working with government monitoring systems, aim for the bare minimum of necessary questions: early versions of the Catch Up tool included several questions which were burdensome for the government system.
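As a purely illustrative sketch of the variation problem described above (school names, scores, and the 2.0 cut-off are invented, not taken from the Catch Up tools), a wider response scale spreads schools out and makes it easier to see where to target support:

```python
# Hypothetical illustration: binary vs. multiple-choice observation items.
# All school names and scores below are invented for this sketch.
from statistics import pstdev

# Yes/no items scored 0/1: most teachers score "yes" on everything,
# so schools are hard to tell apart.
binary_scores = {"School A": 1.0, "School B": 1.0, "School C": 0.9}

# The same practices rated on a 0-3 multiple-choice scale show a wider
# spread across schools, making gaps in practice visible.
scaled_scores = {"School A": 2.8, "School B": 1.4, "School C": 2.1}

print("binary spread:", round(pstdev(binary_scores.values()), 2))
print("scaled spread:", round(pstdev(scaled_scores.values()), 2))

# Flag schools below an assumed support threshold of 2.0.
flagged = [school for school, score in scaled_scores.items() if score < 2.0]
print("flag for extra support:", flagged)
```

The point is not the specific statistic: any measure of spread will be near zero when almost every answer is "yes", which is exactly what made targeting difficult in the pilot.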
Set mentoring and monitoring expectations early, particularly if mentor/monitors play a dual role.
Sometimes, mentor/monitors succeeded in their monitoring role (i.e. observing the teacher, filling in forms, etc.) but failed at their mentoring role (i.e. helping groups of children, demonstrating activities, and actively supporting the teacher in the classroom). This was a missed opportunity to use the mentors’ TaRL expertise to advise and support teachers in the classroom; supportive mentorship which teachers find helpful and empowering rather than punitive is a key principle of the TaRL approach. This was addressed in part by advising that mentors complete the forms after observing the class, so that their attention during the lesson stays on supporting the teacher.
Carefully consider possible scheduling conflicts and have a backup plan in place.
In one district, Catch Up took place during the school holidays, while some senior teachers were attending professional development courses. To compensate, ZICs and DRCCs were asked to make additional visits, which they did successfully. In some districts, sports and co-curricular activities prevented children from taking part in Catch Up activities on some days, reducing the number of classroom observations that mentors could make.
Guide mentors in using data to improve implementation.
In some cases, it proved difficult to use data to inform specific actions, indicating that more guidance is needed on using data effectively. In creating monitoring and aggregation tools, implementers should consider how the information will be used at every level and create tools and report templates which point towards concrete action steps.
Support mentors to improve the monitoring process.
Process monitoring of the Catch Up pilot found a lack of monitoring and reporting on the performance of mentors themselves. Having a monitoring and feedback system in place for mentors could help them to more effectively mentor and monitor teachers. For example, more senior mentors could observe classes with senior teachers and provide feedback after the visit.
Create user-friendly data aggregation tools and processes.
Paper monitoring tools were used correctly but were difficult and time-consuming to convert into an aggregated electronic form for analysis and review at more central levels of government. This was mitigated in part with the help of VVOB coordinators, who were responsible for supporting the aggregation process. In the scale-up, this challenge is being addressed by creating a simpler, paper-based aggregation process, whose results can then be entered into a simple spreadsheet. The broader lesson is to simplify data collection and aggregation as much as possible, so that accurate, useful data is collected efficiently.
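As a hedged sketch of what the spreadsheet step might involve (the zone names, learning-level labels, and counts below are invented for illustration, not drawn from the Catch Up data), school-level tallies typed in from paper forms can be rolled up to zone totals in a few lines:

```python
# Hypothetical aggregation sketch: school-level tallies entered from paper
# forms, rolled up to zone level. Column names and data are invented.
import csv
import io
from collections import defaultdict

# In practice this would be a CSV exported from the data-entry spreadsheet.
paper_entries = io.StringIO(
    "zone,school,beginner,letter,word,paragraph,story\n"
    "Zone 1,School A,10,14,9,5,2\n"
    "Zone 1,School B,7,11,12,6,4\n"
    "Zone 2,School C,12,9,8,7,4\n"
)

levels = ["beginner", "letter", "word", "paragraph", "story"]
zone_totals = defaultdict(lambda: {level: 0 for level in levels})

# Sum each learning level's counts across the schools in a zone.
for row in csv.DictReader(paper_entries):
    for level in levels:
        zone_totals[row["zone"]][level] += int(row[level])

for zone, counts in sorted(zone_totals.items()):
    print(zone, counts)
```

Keeping the electronic step this simple mirrors the lesson above: the less transformation required between the paper form and the aggregated sheet, the fewer opportunities for error at the zone and district levels.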
Set clear expectations for creating reports.
Although district- and zone-level officials were responsible for completing reports, in some cases the VVOB coordinator completed them instead. To ensure that DRCCs and ZICs complete reports themselves, this expectation could be made clearer at trainings and re-emphasised during implementation, so that mentors see reporting as a core responsibility.