Towards cross-system outcomes measurement: Alison Ford discusses quarterly reporting in Tameside and Glossop
NHS Tameside & Glossop (T&G) CCG has been working towards cross-system outcomes measurement for children and young people’s emotional and mental health services over the last two years. The CCG’s aim was to collaboratively develop routes to outcomes-based commissioning, service improvement and meaningful measurement. CORC was asked to get involved, delivering practice-based support and providing data reports.
The project evolved in several phases:
Phase 1 – Setting the scene, fostering collaboration
CORC and the T&G commissioner met with commissioned provider organisations, including statutory CAMH services, voluntary sector providers and the local authority team, to agree the parameters of the project and arrive at a common understanding of its aims and everyone’s roles and responsibilities. Topics such as types of measures, reporting requirements and available support were discussed and debated. It was key that providers were present at this point, so that a spirit of working together was fostered from the outset.
Phase 2 – Scoping and agreeing measures
Facilitated by CORC, project representatives jointly determined which measures to use across the system, and were encouraged to consider what might work well in practice alongside the commissioning data requirements. CORC offered expert advice and guidance based on existing knowledge and learning from the field, drawing on other CORC members’ experience of using measures.
A suite of four measurement tools was agreed: the Outcome Rating Scale (ORS), the Session Rating Scale (SRS), Goals, and the CHI-ESQ. The commissioner and providers felt that this set of tools offered an appropriate range of data and, importantly, a personalised view from the child’s perspective. Both outcomes and feedback data were wanted, and a mix of standardised tools (ORS, SRS, CHI-ESQ) and individualised information (Goals) was important.
Scheduling of data submission and reports was also discussed at this early point; the commissioner’s ambition was to receive quarterly data reports, so that services could respond in a timely manner to feedback and outcomes information. This required some learning on CORC’s part (since we usually provide annual reports), and we developed the mechanisms by which more frequent reports could be made available in a way that wasn’t cost-prohibitive to the CCG.
Phase 3 – Training and discussion
CORC developed and delivered a whole-day training event for cross-sector practitioners, which focussed on ‘why’ we use measures, ‘what’ the selected measures were, ‘when’ measures should be used, and ‘how’ they should best be used in practice. The training was aimed at facilitating discussion between individual practitioners, organisations and the commissioning team and included points on data submission processes and timescales. Early consideration of potential challenges to embedding measurement into practice was encouraged.
Phase 4 – Implementation
A start date was agreed, and tools, resources and avenues of ongoing support were made available. Practitioners were encouraged to use the CORC support available to them and to raise queries with the identified contact within their organisation. Regular meetings between providers and commissioners took place, and CORC attended where the group thought it helpful.
Phase 5 – Reporting
During the first twelve months of the project CORC delivered quarterly reports to T&G CCG, based on the data submitted by providers. Early reports contained less data than was desirable, since practice was still being embedded; ongoing meetings between commissioners and providers, with support from CORC, enabled more data to flow. More recently, we have moved to six-monthly reporting to address the burden of data submission (particularly for some of the smaller voluntary organisations) and the cost of producing more frequent reports. As part of the support, CORC and the commissioning team met with each provider to discuss their reports and consider the learning from the data.
So what have we learnt?
Learning for CORC, T&G CCG and local providers during this project has been plentiful! At a grass-roots level, practitioners are seeing the benefit of using measures and have fed back that they value the information the tools gather in day-to-day practice. Provider organisations have found it useful to see the impact their provision is having at service level, and value the benchmarking information in the CORC reports. Commissioners are able to get a system-level view of impact, and this has helped guide further spend and, in some instances, increase funding. Across the board, the service feedback information has been well received (not least because the data is especially positive in T&G!).
However, cross-system outcomes measurement is not without its challenges. Changes in practice must always be adequately resourced and supported, and wider contextual changes for some providers in this project created particular difficulties in submitting data. The additional time that system-wide changes take, compared with single-provider practice changes, should not be underestimated; now, two years on, the T&G commissioners and providers are starting to reap the benefits of their data collection and submission, though there have been barriers to overcome along the way.
CORC support has been valued by both the commissioners and the providers during this project; having an external source of support and guidance proved particularly useful. We have certainly learned a lot at CORC, thanks to the dedication of all those involved! Collaboration has been absolutely critical to the success of this project.