Public service delivery in South Africa is in crisis. Service delivery protests are increasing dramatically and becoming more violent.
Government has tried many ways of engaging citizens, including public forums, presidential hotlines and facility-based complaints mechanisms, but it has not yet found a way to ensure that what citizens say becomes part of how government monitors and manages its performance.
In August 2013, the Cabinet approved a ‘Framework for Strengthening Citizen-Government Partnerships for Monitoring Frontline Service Delivery’. All government service departments are now required to find ways of including the views and experiences of citizens on service delivery into their M&E frameworks.
The Department of Performance Monitoring and Evaluation in the Presidency has contracted Keystone and a local NGO, the Seriti Institute, to design and test an approach to citizen-based monitoring (CBM) in which feedback from those who receive services (the citizens) and from those who have to deliver those services (the frontline facility staff) is heard and engaged with.
The pilot will run from October 2013 to March 2015, starting in Msinga in KZN and Phuthaditjhaba in the Free State, and then expanding to pilot sites in all nine provinces. It will be implemented in police stations, health facilities, and facilities of the Department of Social Development and SASSA, the agency responsible for social grant payments.
This is an ambitious pilot – and the first attempt by a government to systematically put feedback from citizens at the heart of its performance management in all departments.
Drawing on Keystone’s Constituent Voice method, the approach being piloted starts with developing simple survey tools to collect feedback in the two feedback loops that critically affect service quality: citizens will give feedback on their experience of the service facilities, while frontline staff will give feedback on how well senior department managers support and enable them to perform well.
Feedback will be collected independently through community organizations using traditional or mobile data collection methods. The feedback will then be analysed and comparative ‘performance reports’ created that the facility and senior management can use to affirm good practice, identify improvements needed and take action to implement them.
The simple graphic reports will be used to structure open dialogue among stakeholders to validate the feedback, to affirm strengths and establish agreed actions for improvement. Performance against these commitments will be monitored through subsequent feedback cycles.
The duly validated (or amended) performance data can be integrated into the formal reporting and performance management systems of the relevant department, thus giving meaningful voice to citizens and frontline staff, managing expectations and building mutual accountability for improved services.
We will be blogging regularly on the progress of this bold experiment as it unfolds over the next 18 months.
A remarkable development, with potential implications globally! I’m curious — how do you plan to assess its success and harvest lessons for replication elsewhere? Best of luck with this ambitious and important experiment.
Hi Alnoor,
We’re still in the process of defining our hypotheses, what ‘success’ will look like for whom in a pilot situation, and how we will measure it. Stuff for a further series of blogs in its own right, I think.
But DPME are taking a very open, learning-focused approach by planning three iterative pilot cycles over the next 18 months. The first, in only two sites, is being planned now; in it we will introduce the basic elements of what such a feedback cycle/system might look like (e.g. perhaps using paper surveys and manually created reports) to demonstrate the concept in practical terms that people can engage with. The experience will be documented and widely discussed in government and with civil society.
A second pilot cycle in 4-6 sites will build on this initial experience to improve the community training, survey content and collection methods, and introduce elements of the technological architecture such as mobile data collection, automated analysis and report generation, followed by a process of reflection and learning.
… and then a third cycle will begin (perhaps a year from now), in which a thoroughly revised approach, tools and methods will be applied in a further 4-6 sites. At least that is how it looks now in these very early days. Perhaps other players in this space will engage with the process and plan their own independent initiatives in SA or elsewhere, and a learning community will emerge.
Many thanks, Andre, for your detailed reply. Looking forward to following your experience and learning process. Carpe diem!