Two key pieces of information were collected:

1.)  Information from schools about children’s learning levels and attendance

      School teachers recorded children’s attendance and assessment data. CRPs aggregated data from the schools. Block-level data entry personnel collected these sheets from all CRPs in the block and entered the data onto a data entry portal created by Pratham. District officials tracked data entry status on the portal and followed up with blocks to ensure timely data entry. Pratham created dynamic data visualizations, which were available to everyone on a website to facilitate timely decision-making.

2.)  Information from mentors about TaRL classroom visits

CRPs recorded their classroom observations in a sheet. These sheets remained with them to facilitate discussions during the review meetings at the block level. Even at higher levels, the focus was on discussing the observations in review meetings. The observation data was not entered.

The forms and processes are explained in detail below.

Information From Schools

The following processes were followed to ensure that the data was collected, entered, analysed and reported quickly. People were thoroughly trained on these forms, systems and processes.

a) Teachers used the Learning Progress Sheet to record child-wise assessment and attendance data. The sheet was designed such that teachers could fill it easily. After completing the one-to-one assessments, teachers aggregated the data, which gave them a clear picture of the learning levels of their class.

b) CRPs visited all schools within 3-4 days after the assessments to consolidate the data in the Cluster Consolidation Sheet. Note that the data was recorded grade-wise for every school. This sheet provided a summary of all schools a CRP was responsible for.

To increase engagement with data, the CRPs were also provided with a sheet to visualize the assessment data and to prioritize and plan their support to schools.

c) The Cluster Consolidation Sheets were submitted at the government block offices for data entry. Pratham designed an online data entry portal with separate login accounts for every block. The portal was simple to use and had strict data validations to ensure accurate data entry. Everyone involved in the program could track the status of data entry in real time. District officials followed up with blocks where data entry was lagging and ensured completion of data entry in the stipulated time.

d) An online dynamic dashboard, which was available to all on a website, was created to showcase data in an easy-to-understand manner. The dynamic dashboards allowed people to see the data that was relevant to them and compare their location’s data with other locations.

Information From Mentors

The main cadre responsible for regular school visits and mentoring of teachers was the cluster-level cadre of CRPs. Every CRP had 10-15 schools under their purview. It was critical that the CRPs conduct regular visits to schools to ensure that the program was being implemented as per plan, provide feedback to teachers, review the program periodically and take course-corrective measures. To prepare the CRPs for this crucial role, it was important that the CRPs understand the TaRL approach well. After the training, all CRPs conducted practice classes for 15-20 days. Having used the materials and methods themselves and seen for themselves how children’s learning levels improved, these CRPs were much better able to train and guide the teachers in their charge. 

The following measures were taken to set up a robust monitoring and review structure: 

  • Members of the block monitoring team were linked with specific CRPs, so that each member was accountable for those CRPs and their clusters. Every district nodal officer was also assigned a specific block.
  • Ideally, a CRP was asked to make at least 5 visits to each school under his or her charge. At each stage, the CRP provided support and guidance to teachers. Of the five visits, the first was made in the first two weeks at the beginning of the intervention, the second between the baseline and the midline, the third immediately after the midline, the fourth between the midline and the endline, and the last a few days before the endline. Note that this was a guideline; the CRPs had the freedom to make more visits to the schools that were struggling to make progress. In fact, based on the assessment data, each CRP identified 5 “least performing schools” to provide more support to.
  • Once in a class, every CRP first observed the classroom activities, interacted with children and then demonstrated effective learning activities. They used a School Observation Sheet to make note of important observations. The sheet also acted as a checklist to remind the CRPs of the various things they should be observing:
    • Attendance: overall and of the weakest children
    • Assessment data use and understanding
    • Grouping: appropriate and dynamic
    • Materials: available and being used
    • Activities: appropriate and participatory
    • Progress: in reading and math
    • Challenges: faced by teacher
  • Review meetings were scheduled twice a month at the block level and once a month at the district level. Nodal officers or BRLs led the meetings at the block level, and the DIET Principals led the district-level meetings. The focus of these meetings was on discussing the observations made by mentors and the activities they undertook in the class to support and improve the situation. The Pratham team members at the district level were present in these meetings to facilitate the discussions. The group also worked together in these meetings to plan guidelines for subsequent visits.
  • Review at the state level happened after every assessment cycle to compare progress across locations and discuss field-level challenges and strategies to overcome them. Participants in these meetings were government and Pratham state-level personnel and DIET Principals.