Coronavirus is affecting the way we work and live, changing the demands placed on services and the needs and challenges we face. Many services are adapting to deliver support in new ways: here we share our guidance and advice on working with feedback and outcome measures for services transitioning to working remotely with children, young people and families, whether online or by phone.
We are keen to hear from you about learning, successes and challenges in working with questionnaires remotely - corc@annafreud.org. The advice and tips on this page reflect work in progress - we are actively gathering suggestions, tools, insights and experience that we can share in this area. Click below to download our advice and tips thus far in full, or see summary information on specific areas in the boxes below.
-
Measuring and monitoring the outcomes of care is just as important for services delivered remotely as for those delivered face to face. This is part of offering responsive and person-centred care, improving and developing services, and being accountable and transparent.
There is even more of a case for working with feedback where there has been a change in the way support is provided. However you approach this, good practice principles are:
- ‘no measurement without meaning’ – know why you are asking, what you are going to do with the information, and explain this clearly
- always acknowledge and respond to the feedback shared through questionnaires
- ask questions as they were designed to be asked – if you adapt an outcome measurement questionnaire, you can’t be confident that it will still be a good tool for measuring what it is supposed to measure.
-
We look at this in terms of three over-arching approaches:
- The questionnaire is completed at the service user’s end: the service user will download and complete a questionnaire and send it back to the service (e.g. by email)
- The service makes a shared electronic space available: this might be a portal or platform where measures can be accessed, completed and saved or submitted by the service user, without the service user needing to ‘send’ anything to the service
- The service user shares their responses to a questionnaire verbally, and these are recorded by the practitioner who stores them in the appropriate place in the service’s care record system.
Our full guidance includes a table (page 4) that breaks down how these three approaches can work in practice, along with some of the infrastructure, data protection, back office and practice considerations you might want to take into account when planning your approach.
-
If you have access to PDF versions of the questionnaire you want to use:
- The free Adobe Acrobat Reader has commenting tools that allow you to add text, or to circle or highlight existing text, and then save the PDF with these 'comments'
- If you are a practitioner going through the questionnaire verbally online and can share screen, we feel it works well for the child or young person to be able to see you edit the questionnaire as you go through it
- The questionnaire can also be edited this way at the service user’s end, although do experiment with this yourself to check its suitability for those you work with - not everyone will find this equally user-friendly and practice or advice may help.
Microsoft Office formats, e.g. Word, Excel:
If you are considering reproducing the questionnaire in another format, please bear in mind possible copyright considerations, and the research findings about how questionnaire responses might be influenced by particular contexts or formats.
-
If the copyright for a measure is held by a measure developer (or other party), specific terms of use may have been set, for example regarding the purposes for which the measure is used or the ways in which it is reproduced. If you are thinking about reproducing or modifying a questionnaire – for example so it can be completed in a particular electronic format – you should check whether this is compatible with the terms of use.
In response to the impact of coronavirus, some developers are taking a more flexible approach: where we have information about this we have highlighted it in our section on ‘specific measures’ below, as well as putting details on the page for the relevant measure in our measures hub, which signposts to information about terms of use where possible.
Please note that NHS Digital have negotiated licences for a wide range of measures (although there may still be restrictions e.g. to a particular geographic territory): to find out more about a specific measure you may wish to contact NHS Digital (clin.licences@nhs.net).
Modifying standardised measures might affect your ability to flow data from that measure to the Mental Health Services Dataset or to CORC. It may also have an impact on how far your data can be meaningfully compared with data captured using the unmodified measure.
-
Considerations around confidentiality, informed consent and data protection apply in the normal way. Furthermore, when completing and discussing questionnaires with service users remotely, it may be helpful to suggest going to an area where conversations are less likely to be overheard. You may wish to discuss with service users any new potential risks to the confidentiality of information which you as a service will not be able to control or mitigate – for example:
- If questionnaires need to be saved or printed on devices owned by or shared with others (e.g. parents, carers, siblings)
- If you will be corresponding about questionnaires with both parents and young people using the same email address.
In thinking about any online data transfer (e.g. email, secure messaging app) or online data collection platform, you should consider similar data security questions as for other systems: for example, how sensitive is the data being collected, where would password-protection of documents be appropriate, how are user accounts set up and secured, and who has access to what data? These aspects should be reviewed regularly as part of your service’s approach to data security. Useful resources include ICO guidance, Cyber Essentials, and the Data Security and Protection Toolkit.
We have received a number of queries relating to SurveyMonkey specifically: it may be useful to be aware that this platform stores data on servers in the U.S., which some Data Protection Officers may not be comfortable with, depending on the type of data being collected and local policies. However, SurveyMonkey does currently have certification for the Privacy Shield Framework, which is a scheme that places requirements on U.S. companies to process data in a way that is considered adequate by the EU Commission.
-
There are two aspects of research which may be helpful to bear in mind when you are working with outcome measures online.
If you are considering adapting questionnaires, or paraphrasing questions:
Developers put a lot of expertise into developing measures and validating them psychometrically – copyright terms aim to protect the validity and reliability of the questionnaire:
- validity is the extent to which something measures what it is intended to measure, or how accurate it is
- reliability is how consistent a measure is, for example, over time or between respondents.
If research shows the questionnaire is valid and reliable, we know it can be meaningfully used with multiple young people, and that we can compare responses across young people and across teams or services. Even small tweaks can mean the questionnaire is no longer a valid or reliable measurement tool. It would then no longer be sound to compare scores against clinical thresholds or norms, to benchmark the data, or to draw meaningful conclusions from multiple scores at an aggregate level.
When interpreting data from questionnaires completed online:
Even when the questions are not changed, research suggests different formats can have an impact on the way people respond to mental health and wellbeing questionnaires. The full version of our guidance discusses the published findings (page 7) and references (page 12) – however these are not clear-cut and there is a need for more research.
Because we do not fully understand how questionnaires function in different settings, we recommend that practitioners and services bear in mind, when interpreting questionnaire results, that young people (and others) may respond differently to measures completed remotely than to measures completed on paper.
When given a choice, the majority of adults and young people opt for electronic versions of measures over paper-based versions (Bushnell, Martin & Parasuraman, 2003).
-
Our full guidance (page 8-9) talks through some of the tips we have come up with when we have been using measures online.
Summary suggestions if you are filling in a questionnaire together, verbally:
- go straight through all of the questions in order without stopping to discuss individual items – explain you will do this and then discuss at the end
- if you can, it is best if both of you can see the questionnaire you are filling in, e.g. by sharing your screen or emailing the blank questionnaire in advance
- use the exact language in the questionnaire – this can feel clunky until you have worked out what is fluent for you and done a couple of practice runs
- When you reach the end of the questionnaire, talk together about the responses given.
-
A change in the way your service is delivered might cause you to review your choice of measure. If you are talking through a measure in a session you might want to consider:
- the length of the questionnaire: if you have been using longer measures, it is worth bearing in mind that research suggests briefer measures can be psychometrically robust as well as simple to use (e.g. the Child Outcomes Rating Scale (Casey et al., 2019) or the Young Person’s CORE (Twigg et al., 2009)).
- Goal-based measures may feel more natural to integrate into a remote session. Recent research on the Goals and Goal-based Outcome tool (GBO; Law, 2011) has found good levels of internal consistency between goals, which suggests that even though goals vary in content, the goal ratings work together in a more cohesive way than previously thought – similar to standardised measures of mental health and wellbeing outcomes (Edbrooke-Childs et al., 2015).
We encourage you to test out the questionnaires you use with a colleague and see if they still feel like a good fit in your new way of working. Choosing an outcome measurement questionnaire always involves balancing a number of considerations, and we welcome conversations with CORC members to review or talk through their outcome measure choices or options.
You can find any information we have about specific measures on the page for that measure in our measures hub, including any guidance we may have on using it remotely, or any current relaxation in its terms of use. It may be helpful to note:
- CORE Systems Trust (YP CORE and CORE 10) have shared versions of their measures to support practitioners working remotely at this time. They also have fillable PDF forms for the CORE measures available.
- There are developer-approved video guides available discussing how to administer the Outcome Rating Scale and Session Rating Scale in remote work.
-
To help with using the GBO in online work, Goals in Therapy have developed an editable PDF version that allows you or your client to rate the GBO electronically and share results. The interactive PDF rating sheets can be downloaded and used for free here.
-
The National House Project
Staff working for the National House Project are all working remotely and will continue to do so during the lockdown period; methods of working have needed to adapt as a result. As a team we are meeting virtually on a daily basis to review actions and workload and to consider solutions to some of the challenges experienced at this time. The time working remotely has also been used to further develop the House Project Programme and other NHP policies. More widely, we are liaising with Local House Projects via video call. We have increased the support offered to Local House Projects from monthly to fortnightly, and our Community of Practice (which brings together staff from Local House Projects nationally) has also happened virtually. This would normally involve all staff from projects but is currently taking place with Project Leads only.
These video calls provide insight into how young people are coping in the current pandemic, oversight of the progress young people are making on the House Project Programme, and ongoing support to Local House Project staff.
The Care Leavers National Movement (CLNM) has been campaigning for funding to support young people at risk of internet poverty, especially at this time of social isolation. This campaign (via Just Giving) has raised considerable funds (mainly generated via social media activity), which has allowed Local House Projects to ensure that every young person has unlimited data and a device from which to access it. It is increasingly important to us that young people remain connected at this difficult time.
The Local House Projects continue to use the outcomes and learning framework and hence outputs and outcomes continue to be measured as they would under normal conditions.
If any other services require further information about this area of work, our email address is enquiries@thehouseproject.org.
42nd Street
We have closed our face-to-face services to new referrals and have substantially extended our online delivery. All staff are working remotely.
All young people on our waiting list have received a phone call from a named worker to assess current need and to make the offer of online or phone support, or check-ins where the young person would like to pause therapeutic support until after lockdown.
For those young people awaiting assessment at the time of shut-down, we have continued to deliver assessments via phone (our assessments usually happen this way unless there are access needs). We already offered online therapeutic support, but we have extended this offer to all current young people and have also maintained an open referral offer to all young people. We have delivered training across our whole team to enable us to grow capacity. ROMs are collected via the online services site and we then also record all data on PCMIS.
We are also delivering the vast majority of our group work and social action programmes digitally. Our creative programme, The Horsfall, is also holding regular creative activities via social media. We are operating via Teams for internal/external professional meetings.
Barnet Integrated Clinical Services
Like many others, we have adapted our operations to the requirements of working remotely. This has included, among other things, consideration of which platforms to use for contact with colleagues and clients, increased staff support, pooling of resources related to the circumstances, and support lines for our referrers; and, in the longer term, planning for the recovery phase post-lockdown. We are using POD as well as trying to complete measures over e-mail.
Cornwall Music Service Trust
We are a peripatetic music therapy service working in schools, hospitals and other settings across Cornwall. We are currently on furlough: as peripatetic workers who mainly go in and out of other services, much of our work suddenly became impossible. This was the best option for the short term to enable our organisation to survive. Once off furlough... we are planning to upskill for online delivery, speak to funders about adapting delivery plans, create online resources and plan for when we can safely work face to face with clients again.
At present some of our team are voluntarily keeping in touch with the clients they were working with. They can't do any paid work until our furlough period ends, and we are very conscious of this while trying to be ethical in terms of not abandoning our clients at a difficult time.
We will ask if there is any way CORC can help us, especially when we are no longer furloughed and trying to do sessions in ways we are not used to.