BetterData helps DFID-funded social accountability programme maintain data quality at scale

The Citizen Engagement Programme (CEP) supports citizens to monitor the quality of health and education service delivery in four of Mozambique’s eleven provinces, and to advocate at district, provincial and national levels for improvements to those services. The programme facilitates dialogue between citizens and communities on one side and service providers on the other to improve the quality of services. Kwantu is one of the consortium partners implementing the programme.

CEP plans to work at scale across 773 health facilities and schools in Mozambique. Last year the programme began working with local partners to run community scorecard (CSC) processes in many of these clinics, hospitals and schools. This is a participatory, citizen-led methodology for monitoring the quality of service delivery.

Logistical and data quality challenges

Implementing over 700 CSC processes presents significant logistical challenges for the partners and teams managing implementation, monitoring and evaluation, and internal learning and documentation. The teams need qualitative and quantitative data from the implementing partners at different stages of the process. Collecting, collating and processing the data for these three requirements using Excel or Word templates would overwhelm the partners and the CEP team as the programme scales.

Working at this scale often introduces data quality challenges too.  Key considerations were:

  • Ensuring the same operational definitions were shared across all sites in ten districts in four provinces
  • Clarifying responsibility for data collection and data review on each site
  • Standardising data collection forms across all sites and providing clear guidance
  • Agreeing on a consistent approach for aggregating data to report against indicators
  • Retaining a clear audit trail back to all source forms to support data quality checks and evaluation

Anticipating these challenges, we spent time during the programme inception period documenting the CEP CSC process and implementing a system that could help manage and monitor it.

Documenting community scorecards as a process

The CEP approach to CSC was developed by the consortium partners, many of whom (CESC, N'weti and IDS) already had considerable experience of using this tool. Once the approach had been refined and adapted, we ran a workshop with CEP programme staff to:

  • Review the CSC methodology and agree on ways in which a standard version could be implemented for the CEP context
  • Identify a series of steps to guide the implementation process and agree on points where review or sign-off is needed
  • Document the data requirements of the monitoring and evaluation and programme teams and ensure that data needed to measure indicators is included on a data collection form
  • Map data requirements to a set of simple forms that can be completed by the implementing partners at the appropriate stage of implementation
  • Map the forms to the steps in a standardised workflow for data collection
  • Define reports that can export this data, aggregating across all sites

We then produced paper versions of the forms in Portuguese and English for field-testing with partners. 

Implementing a BetterData app to manage and monitor the process

The configuration tools in BetterData enabled us to implement the monitoring and evaluation system without involving software developers.  This was important, as feedback during the rollout process identified a number of areas where small changes to the forms or workflow were needed.  We expect this to continue as the programme reaches scale.

To configure the Community Scorecard App we used the following BetterData components:

Forms

We created each of the following forms identified in the documentation process (a sketch of how one of these might be modelled follows the list):

  • Registration form (used to register a new facility)
  • Group form (used to register a community group taking part in the scorecard process for a facility)
  • Evidence form (used to register the scorecard completed by each group)
  • Action plan form (used to register the joint action plan agreed by the community and provider groups)
  • Meeting form (used to record engagement meetings)
  • Outcome form (used to track planned or unplanned outcomes resulting from the process)
  • Contact form (used to register key stakeholders)

Taxonomies

We defined taxonomies (standardised lists of terms) to help cross-reference data. For example, we have a taxonomy of the districts that CEP operates in and another of the types of groups recruited in the CSC process. These ensure that each site cross-references information in the same way, which also makes cross-cutting analysis easy.
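
As a rough illustration of how taxonomies keep data entry consistent, the sketch below checks a submitted entry against shared term lists. The terms and the helper function are hypothetical placeholders, not BetterData's built-in taxonomy component.

```python
# Shared, standardised term lists (placeholder values for illustration).
DISTRICTS = {"District A", "District B", "District C"}
GROUP_TYPES = {"community group", "service provider group"}

def check_terms(entry: dict) -> list:
    """Return a list of problems where an entry uses terms outside the taxonomies."""
    problems = []
    if entry.get("district") not in DISTRICTS:
        problems.append(f"Unknown district: {entry.get('district')!r}")
    if entry.get("group_type") not in GROUP_TYPES:
        problems.append(f"Unknown group type: {entry.get('group_type')!r}")
    return problems

print(check_terms({"district": "District A", "group_type": "community group"}))  # []
print(check_terms({"district": "district a", "group_type": "women's group"}))    # two problems
```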

Workflow

We configured a workflow to match each of the stages documented above. This helps to guide staff through the implementation process and ensures that facilitators on each site complete the same activities, using the same forms at each stage of implementation. The workflow links to system notifications that prompt users when a new step has started and tell them what specific actions to take. It also includes sign-off steps, where someone else must review the data entered. Building data quality checks in at the operational level is a powerful way to address problems while they are still easy to fix.
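
The sketch below illustrates the general idea of a staged workflow with sign-off before a site can move on. The stage names and sign-off rules are assumptions made for illustration, not the actual CEP workflow configuration.

```python
# Ordered workflow stages (illustrative names only).
STAGES = [
    "register facility",
    "register groups",
    "complete scorecards",
    "agree action plan",
    "hold engagement meetings",
    "record outcomes",
]

# Stages that require someone else to review the data before moving on.
REQUIRES_SIGN_OFF = {"complete scorecards", "agree action plan"}

def advance(site: dict) -> str:
    """Move a site to its next stage, blocking if the current stage awaits sign-off."""
    current = site["stage"]
    if current in REQUIRES_SIGN_OFF and not site.get("signed_off", False):
        return f"Cannot advance: '{current}' is awaiting review and sign-off."
    next_index = STAGES.index(current) + 1
    if next_index >= len(STAGES):
        return "Workflow complete."
    site["stage"] = STAGES[next_index]
    site["signed_off"] = False
    return f"Moved to '{site['stage']}' - notify assigned users of the new step."

site = {"stage": "complete scorecards", "signed_off": False}
print(advance(site))   # blocked until a reviewer signs off
site["signed_off"] = True
print(advance(site))   # advances and would trigger a notification
```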

Reports

The reports component makes it easy to see the full data model of each form. Fields from these data models can be mapped to column headings in a report. Since the Scorecard app includes a workflow, we can also include the workflow stage in the reports.

We used this to create reports that show the workflow status of each site in relation to the target dates agreed for that site.  This is an important management tool to track the implementation of all sites and see which are behind schedule.
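
As an illustration of this kind of report, the sketch below compares each site's current workflow stage against its agreed target date and flags sites that are behind schedule. The site names, stages and dates are invented placeholders.

```python
from datetime import date

# Placeholder site records: current workflow stage and the target date agreed for it.
sites = [
    {"site": "Facility A", "stage": "complete scorecards", "target": date(2017, 4, 30)},
    {"site": "Facility B", "stage": "agree action plan", "target": date(2017, 6, 15)},
]

def status_report(sites: list, today: date) -> list:
    """Build report rows showing each site's stage, target date and schedule status."""
    rows = []
    for s in sites:
        status = "behind schedule" if today > s["target"] else "on track"
        rows.append((s["site"], s["stage"], s["target"].isoformat(), status))
    return rows

for row in status_report(sites, today=date(2017, 5, 15)):
    print(" | ".join(row))
```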

Other reports track the issues raised by groups in their scorecards and those subsequently agreed across all groups in the action plans. This makes it easy to analyse which issues are being raised across several sites, and which are raised by groups but not included in action plans.

Since the full data model for all forms is accessible, more reports can be configured at any time as new questions arise.  Once configured, reports can be exported as XLS at any time.

Indicators

Finally, we configured the indicators used in the logframe and linked them to the forms that collect the relevant data. This ensures that the data used to monitor each indicator can be traced back to its source.
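
To illustrate the principle of traceability, the sketch below computes an indicator value from form submissions while keeping a reference to every source record behind it. The indicator definition and records are hypothetical examples, not the CEP logframe.

```python
# Placeholder form submissions (the source data).
submissions = [
    {"form": "action plan", "site": "Facility A", "actions_agreed": 4},
    {"form": "action plan", "site": "Facility B", "actions_agreed": 2},
    {"form": "meeting", "site": "Facility A", "attendees": 25},
]

# A hypothetical indicator linked to the form and field it is measured from.
indicator = {
    "name": "Number of actions agreed in joint action plans",
    "source_form": "action plan",
    "field": "actions_agreed",
}

def measure(indicator: dict, submissions: list) -> dict:
    """Aggregate the indicator while retaining references to every source form."""
    sources = [s for s in submissions if s["form"] == indicator["source_form"]]
    value = sum(s[indicator["field"]] for s in sources)
    return {"indicator": indicator["name"], "value": value, "sources": sources}

result = measure(indicator, submissions)
print(f"{result['indicator']}: {result['value']}")
print(f"Traceable to {len(result['sources'])} source forms")
```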

Profiles

All staff and partners can browse a list of scorecard profiles on BetterData.  Each profile has pages to show the workflow status, the forms submitted and the people working on that site.  This makes it easy for staff working on different sites to share information and track their progress.

Benefits

This approach brought the following benefits:

  • Time saved on data collection and data aggregation - Since BetterData aggregates activity-level data automatically, the time saved can be spent instead on data analysis to feed into advocacy and learning
  • All programme staff and partners have access to data - CEP includes partners running health and education learning hubs. These focus on networking and knowledge sharing in the health and education sectors. Partners running the learning hubs have instant access to reports highlighting issues raised by citizens. They can also generate their own reports to explore new questions.
  • Standardised methodology makes it easy to share and scale the programme - The programme approach, activities and process have been documented in detail in a process manual. This is encapsulated in a BetterData app that makes it easy for other partners to adopt the CEP approach to community scorecards.