APCRC Draft Toolkit
“A study in which research procedures are used in a systematic way to judge the quality or value of a service or intervention, providing evidence that can be used to improve it” (West of England Evaluation Strategy Groups, 2013)

Evaluation can range from the simple (such as a single survey) to the complex (using multiple methods), and it can be used in many different contexts. A central component of evaluation is comparison: comparing the service before and after a change, with another service, or with itself.
In health and care, evaluation tends to be distinguished as either research or service evaluation. The key distinction between the two depends on the purpose of the study: research aims to derive “generalisable” new knowledge that applies beyond the setting in which it took place, whereas a service evaluation is performed to meet a specific local need and is focused on what is (or will form) current care, with the evidence collected to inform recommendations not intended to be used beyond the setting in which it took place (although they may be transferable to other areas).

Monitoring and Evaluation
We routinely monitor the performance of services, but monitoring alone can only tell us so much: it shows that we are doing things, but it is unable to tell us whether we are doing the right things, what is working and what needs improving. However, monitoring data is a useful source of information to help you to evaluate your service. In turn, planning your evaluation alongside your monitoring framework can help to reduce duplication and unnecessary data collection, shape and improve the monitoring data you plan to collect, and identify appropriate baseline and benchmarking data.
This toolkit is focused on helping you to plan and conduct an
appropriate evaluation.
Evaluation: Extremely Valuable Information
Navigation
This toolkit provides you with a suite of resources (tips, tools, templates, signposting) to enable you to conduct a service evaluation.
You can use the tools and templates as a package or in isolation if you just need support with one aspect.
The questions below will help you to navigate through the toolkit or signpost you to specific areas you would like help with.
Plan • Have you developed your evaluation plan, including identifying your: (5)
• aims and objectives of the evaluation? (6)
• evaluation type and approach? (7)
• data to be collected, including baseline data? (8, 9 and 10)
• ethics and governance implications? (11)
Question (Yes / No / ?)
Do I know what I want to evaluate and why? Consider drivers and audience
Do I already know the answer to my evaluation question? Consider evidence
Will I be able to use the results of the evaluation? Consider context
Will these evaluation results provide me with useful information that will be considered “value for money”? Do the benefits of conducting the evaluation outweigh the costs and consequences?
Do I have the resources and/or skills available to undertake this evaluation? Consider how
Will the information I collect be able to answer my evaluation question (the evaluation's aims and objectives)? Consider quality and accessibility of data
Is it the right time to conduct an evaluation of the service? Consider:
• Stage of development and complexity of the service
• Timescales of the service
• Context the service is operating in
Are there other options to doing an evaluation? Consider:
• Are current project processes such as monitoring enough?
• Should I consider other options such as clinical audit, quality improvement or research?
For more information on “Evaluability Assessment” the following resources will help:
http://betterevaluation.org/themes/evaluability_assessment
Assess Plan Do Review
Purpose: What is the purpose of the service? What need is it addressing? What outcomes will it achieve? What is the purpose of the evaluation? How will it be used? Who is your audience?
It is important to understand the purpose of your service (i.e. what changes it intends to make – outcomes – and how it intends to do this) and of the evaluation (i.e. what question it needs to answer, as this will impact on the type of evaluation you need).
Evidence base: What is the evidence base for the planned service and associated outcomes? What is the strength of the evidence? How have similar services previously been evaluated?
Understanding the evidence base for your planned service can help inform your evaluation approach and methods: from understanding the strength of the evidence available (i.e. if well evidenced then you might focus on the implementation (process) evaluation), to identifying appropriate outcomes and outcome measures, to looking at how others have evaluated similar schemes before.
Stage of development: Is it new? Has it been in place for a while?
Understanding the stage of development of your service will also impact on the type of evaluation you need to conduct. If it is new and still being developed, you will want to take a more formative (improvement) approach.
Level of complexity and context: Is it a single change? Are there multiple components? What is the context/environment in which it is working?
Understanding how complex your service is, and the context in which it is operating, is also important. How many services do you know of that are trying to reduce unplanned admissions?
Timescales: What are the timescales? Is it a one-year or five-year pilot?
Timescales will impact on what you are able to evaluate and how; i.e. if it is a long-term investment you are likely to be able to look not only at implementation (formative/process evaluation) but also at outcomes.
Logic model: Inputs → Activities → Outputs → Intermediate Outcomes → Impact
To develop your “theory of change”, working with your key stakeholders, some flip chart paper, pens and post-it notes, consider the following questions:
1. Who is the service for? i.e. your case for change will usually set out the population group, their needs and characteristics, and the problem the service is trying to address
2. What are the long-term outcomes you want to achieve?
Then working backwards
3. What are the intermediate outcomes (short and medium term) that will lead to these long term outcomes?
4. What are the activities that the service will undertake to deliver these intermediate outcomes?
5. What evidence (from research and local learning) is available to inform and support the links between
activities and outcomes? i.e. what are your assumptions based on the evidence, expertise and learning?
6. What other factors need to be in place to enable this service to work? i.e. what are your enablers?
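If it helps to capture the workshop output, the resulting logic model can also be recorded as simple structured data so it can be shared and kept up to date. The sketch below is purely illustrative: the service, field names and entries are hypothetical assumptions, loosely echoing the fibromyalgia self-care example used later in this toolkit.

```python
# A minimal, hypothetical logic model for a self-care training service,
# recorded as plain data. Every entry below is an illustrative assumption.
logic_model = {
    "population": "Adults with fibromyalgia registered with local GP practices",
    "inputs": ["Trained facilitators", "Venue", "Course materials"],
    "activities": ["Weekly group self-management training sessions"],
    "outputs": ["Sessions delivered", "Participants completing the course"],
    "intermediate_outcomes": ["Improved knowledge and skills for self-care"],
    "long_term_outcomes": ["Reduced unplanned admissions and GP attendances"],
    "assumptions": ["Participants attend most sessions",
                    "Skills gained are applied at home"],
    "enablers": ["GP referral pathway", "Voluntary sector support"],
}

# Walk the chain backwards, mirroring questions 2-4 above.
for stage in ["long_term_outcomes", "intermediate_outcomes",
              "activities", "inputs"]:
    print(stage, "->", "; ".join(logic_model[stage]))
```

Keeping the model in one structured record like this makes it easy to check that every activity links to an intermediate outcome, and every intermediate outcome to a long-term outcome.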
Resource 2: Understanding your service – Accessing and reviewing the evidence base
It is important to consider what evidence is available to inform your planning and decision making in terms of the service as well as the
evaluation. You will be used to using a broad range of evidence from multiple sources including needs assessments, public health and
performance data, evidence from research and best practice as well as expertise and local learning. This evidence can be used to
inform your evaluation, help identify the outcomes you hope to achieve and the activities (processes) and outputs that will deliver
these outcomes.
Understand what evidence is already available to inform not only your service design, but the type and level of
evaluation you need.
Our evidence toolkit http://evidence.apcrc.nhs.uk can help you with accessing the best
available published evidence from research, evaluation and the grey literature.
Engage all key stakeholders, including the patients and their carers, in the design, delivery and dissemination of the
evaluation
Once you have conducted the stakeholder analysis this can then be used to inform:
• Who to involve in the evaluation
• What expertise and people are available to support the evaluation (including expertise in data collection, data analysis,
communication, patient and public involvement)
• Your project, evaluation and communication plans
N.B. Make sure you consider the needs of those that are seldom heard and are vulnerable. If you have not already, consider
completing an equalities impact assessment for the planned service or service change.
The NHS Institute's stakeholder analysis tool, part of its quality improvement tools, will help you to conduct a stakeholder analysis: http://www.institute.nhs.uk/quality_and_service_improvement_tools/quality_and_service_improvement_tools/stakeholder_analysis.html

For more information about involving patients and the public, INVOLVE (www.invo.org.uk), who are funded by the National Institute for Health Research (NIHR) to support public involvement in NHS, public health and social care research, are a useful resource.
Resource 4. Resources
It is important to consider what resources are required for the evaluation from the outset. This may need to be reviewed throughout
the planning process. The resource requirements will be informed by the type of evaluation you need (and quite often impact on the
design of your evaluation). This is an important step and should be considered early when planning and developing your service (i.e.
business case, QIPP scheme, service specification, intervention, programme or project).
Plan your evaluation early in the commissioning cycle. This will enable you to ensure that you use the most appropriate
methodology, collect the right data, allocate adequate resources and set appropriate timescales.
Who should be involved in the evaluation? Who can help you to deliver this evaluation?
Who are your key stakeholders? Are there local experts that can help? Discuss lay involvement with your PPI lead – service users and their families are often well placed to help evaluate the service.
What resources are available to support the evaluation? Include financial resources.
This could be money and/or staff time depending on the evaluation. If you are conducting an internal evaluation then you will need input from staff. If you need to commission an external evaluation then you will also need to identify a budget (approximately 10% of the project value). Utilise existing data and project resources where possible and appropriate.
Who will conduct the evaluation?
Evaluations can be conducted internally, externally by an independent organisation, or by a mixture of the two. What you decide will depend on the purpose of your evaluation (i.e. if it is to inform and improve then an internal evaluation may be adequate; if it is to demonstrate a level of accountability then you need to consider an external evaluation), the level of risk involved to the organisation and the participants, and the skills and resources available.
Our service evaluation guide and website (tbc) provide you with guidance on commissioning an independent external evaluation and a template to set out your invitation to tender.
Resource 5. Evaluation Planning Template
This template sets out the key elements to consider when planning your evaluation.
Don’t forget to build the evaluation into your project plans and processes as much as possible. Make sure it is feasible
and utilises appropriate existing data that is aligned to, and informs, your monitoring plans.
Design and methods (Resources 7-10)
There are multiple approaches to evaluation, which can be complex and confusing. A simple way to overcome this is to consider the types of information and data you need to answer your evaluation's aims and objectives. Consider both qualitative data (i.e. narrative data from interviews, focus groups) and quantitative data (i.e. numerical data from surveys, monitoring forms), and the sources of that data (does it already exist or do you need to collect it?). Ensure that you baseline your information where possible and consider utilising benchmarking data.
Ethics and Governance (Resource 9)
Ensure you have considered the ethical implications of the evaluation and how they will be mitigated and reviewed throughout the project lifecycle.
Resources (Resource 4)
Outline the resource requirements, including internal project resources to support the evaluation as well as any ring-fenced financial resources (and funding available). You will have already started to identify these when assessing whether to do an evaluation.
Outputs and impact
This is a description of what will be produced from the evaluation (e.g. the evaluation report) as well as its intended use and impact. Remember to build this into the service's communication plans.
References All sources quoted in the proposal should be acknowledged and correctly referenced.
The Lay Representative
• Question: How is the project improving the experience of living with the condition, in order to influence the NICE guideline review?
• Objectives: To evaluate the impact of the training on patient knowledge and skills for enhanced self-care of fibromyalgia; to measure stakeholder satisfaction with the approach.

The Commissioner
• Questions: Is the intervention cost-effective? Does it fit in with the CCG's strategic priorities?
• Objective: To understand the potential of the project to effect cost-savings through a reduction in unnecessary hospital admissions and unnecessary GP attendances.

The Service Provider
• Questions: Are the aims of the project met? (i.e. in this case, do patients' knowledge and skills to self-manage increase?) What do service-users think of the project?
• Objectives: To evaluate the impact of the training on patient knowledge and skills for enhanced self-care of fibromyalgia; to measure stakeholder satisfaction with the approach.

The Voluntary Sector Partner
• Questions: Did volunteers help meet the aims of the project? What was the experience of volunteers in the project? Do volunteers have the potential to deliver cost-savings?
• Objectives: To evaluate the impact of the training on patient knowledge and skills for enhanced self-care of fibromyalgia; to measure stakeholder satisfaction with the approach; to understand the potential of the project to effect cost-savings through a reduction in unnecessary hospital admissions and unnecessary GP attendances.

AIM: to assess the impact of the pilot in order to make recommendations to commissioners about future funding
Process
• Assesses implementation and delivery of the initiative, and whether this was carried out as planned
• Takes place throughout the project implementation and delivery phase
• Demonstrates to stakeholders that project objectives are being met
• Qualitative methods: interviews; focus groups; questionnaires
• Quantitative methods: metrics, benchmarking, quantifiable performance, financial and service usage data
• Service improvement methods: PDSA cycles; rapid evaluation cycles
Formative
• A type of process evaluation that assesses how the intervention/programme can be improved as it is being implemented
• Takes place before the project starts and throughout the pilot phase to inform implementation
• Shows what worked well and why, and identifies challenges to capture lessons learned (continual improvement)
• Utilises evaluation theory, e.g. Theory of Change, Realist Evaluation, Experience-Based Design, Critical Path, etc.
• Helps to assess needs, clarify the theory of change, and identify areas for shared learning and improvement
• Qualitative methods: literature review; interviews; focus groups; observation; diaries; questionnaires
• Service improvement methods: stakeholder analysis; logic modelling; PDSA cycles; rapid evaluation cycles
Summative
• Was the project successful? Has it met its predefined aims and objectives?
• Looks at the outcomes (i.e. measures the changes that have occurred as a result of the programme) and/or impact (i.e. the longer, deeper, systemic changes)
• Demonstrates to stakeholders whether the project aims were met
• Qualitative methods: interviews; focus groups; questionnaires
• Quantitative methods: metrics, benchmarking, quantifiable performance, financial and service usage data
• Can be experimental (e.g. a randomised controlled trial), quasi-experimental (e.g. comparing before and after) or use service improvement methods: PDSA cycles; rapid evaluation cycles
http://www.ihi.org/resources/Pages/HowtoImprove/default.aspx
http://www.civicpartnerships.org/#!quantitative--qualitative-eval-methods/c1bel
http://www.institute.nhs.uk/research_and_evaluation/general/useful_resources.html
Resource 8. Data Collection Planning Template
During the initial planning phase you will have identified the purpose of your evaluation (aims and objectives) and the type of evaluation you might need, as well as starting to consider the types of data and information you need to be able to answer your aims and objectives.
Use the information you have identified so far as well as resources 9 and 10 to complete the following table, setting out your objectives
against the data you plan to collect, the sources of that data and who is responsible for collecting and analysing the data.
Identify and utilise existing data and data collection methods, where it is available and of good quality, to avoid duplication
and unnecessary data collection. Make sure you collect baseline data, consider benchmarking and utilise validated tools
where they exist. Consider using multiple (audit, quality improvement) and mixed (qualitative and quantitative) methods.
The table has four columns:
• What are your evaluation objectives?
• What are the questions you need to answer to enable you to demonstrate whether you have achieved your objectives?
• What data do you need to be able to answer your specific questions/measures? Where can you source that data from, or what data collection tools do you need? What sample size will you need? What baseline and benchmarking data is available?
• Who will be doing what and when?
Objective 1
Objective 2
Explore your evidence review to see how others have evaluated similar initiatives, including whether there are any validated tools (for example surveys, patient-reported outcome measures, patient-reported experience measures) available that you can use.
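Where an objective involves detecting a change in a rate (for example, readmissions), a rough sample size estimate helps answer the "what sample size will you need?" question in the plan. The sketch below uses the standard normal-approximation formula for comparing two proportions; the baseline rate, target rate, power and significance level are all hypothetical assumptions, and you should seek statistical advice for a real evaluation.

```python
from statistics import NormalDist  # Python 3.8+ standard library

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.8):
    """Approximate sample size per group to detect a change from
    proportion p1 to p2 (two-sided test, normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)          # e.g. ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    n = ((z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
          + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p1 - p2) ** 2)
    return int(n) + 1  # round up to a whole number of participants

# Hypothetical example: detecting a drop in 30-day readmissions
# from 20% to 12% (made-up figures for illustration only).
print(sample_size_two_proportions(0.20, 0.12), "participants per group")
```

A calculation like this often reveals that a small pilot cannot demonstrate an outcome change, which is itself useful when deciding between a formative and a summative evaluation.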
This decision tree was developed in 2005 by Marsh and Glendenning, so remember that with advances in technology there are now other ways to collect the information, such as online surveys, text messaging, video messaging, social media, apps and other mobile technologies.
Resource 10: Types of data and information
Having an adequate baseline for your key metrics is extremely important, as a key aspect of evaluation is comparison, whether that is comparing the service with itself before and after the change or with another service. Benchmarking is also a useful tool to compare with other similar services or with national trends.
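To illustrate the comparison-against-baseline point, a minimal sketch might look like the following; the monthly admission counts are entirely made up for illustration.

```python
# Hypothetical monthly unplanned admissions, before and after a service change.
baseline = [42, 45, 39, 44, 41, 43]   # six months pre-change (made-up data)
follow_up = [38, 36, 40, 35, 37, 34]  # six months post-change (made-up data)

baseline_mean = sum(baseline) / len(baseline)
follow_up_mean = sum(follow_up) / len(follow_up)
percent_change = 100 * (follow_up_mean - baseline_mean) / baseline_mean

print(f"Baseline mean:  {baseline_mean:.1f} admissions/month")
print(f"Follow-up mean: {follow_up_mean:.1f} admissions/month")
print(f"Change: {percent_change:+.1f}%")

# Note: a raw before/after difference cannot prove cause and effect;
# benchmarking against comparable services or national trends helps
# put the observed change in context.
```

Even a simple summary like this is only meaningful if the baseline period is long enough to smooth out seasonal variation, which is why collecting baseline data early matters.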
Types of data include:
• QUALITATIVE: uses narrative or descriptive data rather than numbers; for example, a description of the views and attitudes of those receiving or providing a service.
• PRIMARY: additional new data collected for the specific purpose of the evaluation.
The Health Research Authority decision tool can help you to check whether your project counts as research: http://www.hra-decisiontools.org.uk/research/
We suggest that you keep a record of the result as a part of your evaluation plan as this acts as an audit trail. We also
recommend that you check with your own organisation to ensure that you comply with local arrangements when it
comes to the governance and ethics of your evaluation.
For a service evaluation you do not need formal approval from an NHS Research Ethics Committee; however, every service evaluation should apply best practice when it comes to ethics, ensuring that adequate safeguards are put in place and that the benefits of conducting the evaluation outweigh the risks. This involves reviewing your evaluation during design, delivery and dissemination to ensure any ethical issues are identified and actions are put in place to address them. It includes ensuring that appropriate informed consent is obtained from participants, that confidentiality and anonymity are maintained, and that the study complies with your organisation's information governance and data protection policies and procedures. Particular consideration needs to be made for any participants who may be considered vulnerable, such as children and people who lack capacity.
Resource 12. Reporting Template
The template below outlines the key information you need to provide within your evaluation report. Note: use this as a guide, as your own organisation may already have a reporting template which incorporates your organisation's house style.
Heading What to include
Executive summary Provide a high level summary of your evaluation on a page (max 2 pages)
Introduction Outline the purpose of the evaluation, project/evaluation team (who is involved in the evaluation),
structure of the report
Background Provide the National and Local context for the service and evaluation (why it is important, case for
change, evidence base) including information from your business case and evidence review
Aims and objectives Clearly set out the aims and objectives of the evaluation and how they were developed / identified
Methods Using the information in your evaluation plan set out the overall approach to the evaluation,
specific data collection methods/sources (include who, where, when, how and how many) and
approach to data analysis. Note here any limitations of the methods.
Results Present the key results and analysis from the data collection here (you may decide to include some results in the Appendix if you have a lot of data).
Findings Present an interpretation of your findings here, bringing together different data that corroborate one another (triangulation). Include here any limitations of the data and findings, and be careful with the language you use (i.e. you are unlikely to be able to show cause and effect).
Discussion Relate how the key findings fit with the local and national context, what the key learning is from the
evaluation.
Conclusion and Recommendations Provide evidence-based conclusions and make evidence-informed recommendations from the evaluation.
Appendices Include here your outputs from your worksheets plus any additional data or information to help the
reader understand your evaluation.
To maximise the impact of your evaluation, make sure you use your findings to inform your decision making going forward and to identify areas for improvement.
The action plan template has the following columns:
• Recommendation from the evaluation
• Specific action to help address the recommendation
• Explanation of why the action will lead to the desired change
• Lead responsible for delivery
• Timescale
• Risks: note any risks or issues associated with this activity, using your organisation's risk matrix
Use multiple formats to communicate your findings for the greatest impact, and where possible utilise existing forms and channels of communication (engage your communications experts!)

The communication plan template has the following columns:
• Who you are trying to reach (your audience), e.g. participants in the study
• How you are going to reach them, e.g. website, staff meeting, newsletter
• What your message will look like, e.g. report, leaflet, poster
• Lead responsible for delivery
• Timescale
• Risks: note any risks or issues associated with this activity, using your organisation's risk matrix
Evaluation Planning Checklist (1 of 2)

Project resources
• What resources are available to support this evaluation? What resources do you think you might need, and how much might it cost? Who might fund the evaluation?
• What level or type of evaluation do you need?

Support for the project: All Stakeholders
• Who are your key stakeholders?
• Who needs to be informed?
• Who do you need to involve in the evaluation planning, delivery and dissemination?
• Who has the skills, experience and expertise to support you with your project? e.g. patient and public involvement, equalities, communications and engagement, evaluation leads in your own or partner organisations

Support for the project: Service User Involvement
• How will you involve service users, patients, carers and the public in your evaluation? Consider this in terms of the design, delivery (data collection) and dissemination (communicating your findings).

Context: Evidence Base
• What is the evidence base for the planned service, service change or pilot?
• How have similar services been evaluated in the past?

Context: Understanding the Service
• Is it clear who the service is for? (population group, needs and characteristics)
• Is it clear what the desired intermediate and long-term outcomes are, and how the activities of the service or intervention will lead to these?

Scope of the project
• Have you agreed with your stakeholders the purpose of the evaluation?
• Are you clear what the evaluation will focus on?
• Is it clear why you are conducting an evaluation?
Evaluation Planning Checklist (2 of 2)

Aims and Objectives of the evaluation
• Have you engaged your stakeholders to help you identify your evaluation's aims (why you are doing this evaluation) and objectives (what you are trying to achieve)?
• Are your aims and objectives SMART?

Evaluation approach
• What evaluation approach or method are you planning to take?
• Do you need to commission an external evaluation?

Data requirements
• What information and data do you already have available to support your evaluation?
• What additional data collection do you need to undertake to be able to answer the aims and objectives of your evaluation?

Data Collection, Analysis and Reporting
• What data do you need to collect? Will your data collection tools work? Are there any validated tools that can help? Who will collect the data?
• How will you analyse the data? Who will analyse the data?
• How, and by whom, will the findings be written up?
• Have you identified any training needs to support these activities?

Ethical implications
• Have you considered the impact of your evaluation on the participants and the service?
• Have you put adequate safeguards in place to protect the participants in your study? Including gaining consent and feeding back findings.

Recommendations and action planning
• Have you agreed your recommendations and how you are going to implement them?
• Have you developed your action plan?

Sharing the findings
• How are you going to feed back the findings from your evaluation to your stakeholders?
• Have you developed a communication plan to share your findings and recommendations?

Are you still doing an evaluation?
• Once you have planned your evaluation, recheck whether you are doing a service evaluation or research, to ensure that you have the appropriate permissions and approvals before starting the project.
Additional Resources
There are a number of resources available to support your evaluation. Some are included within this toolkit to support specific aspects of your evaluation; this section provides other toolkits that may be of use to you.
Evaluation Toolkits
• CLAHRC Evaluation Guide: Developed by NIHR CLAHRC Leicestershire, Northamptonshire and Rutland in 2012 for clinicians and NHS
Managers to help guide them through the process of evaluation
– http://www.clahrc-cp.nihr.ac.uk/wp-content/uploads/2012/07/Evaluation_GUIDE.pdf
• NHS Cambridge Full Evaluation Toolkit: Adapted from the Primary Care Service Evaluation Toolkit (Version 1.5, Peter Marsh and Robert Glendenning) to support NHS Cambridge CCG with evaluation
– http://clahrc-cp.nihr.ac.uk/wp-content/uploads/2012/07/Full_Evaluation_Toolkit.pdf
Evaluation Guides
• Magenta Book: Developed by HM Treasury to support evaluation of policy
– http://www.hm-treasury.gov.uk/d/magenta_book_combined.pdf
• MRC Framework: Developed by the MRC to help with evaluating complex health interventions
– http://www.mrc.ac.uk/Utilities/Documentrecord/index.htm?d=MRC004871
• The Health Foundation's "Evaluation: What to Consider" guide provides insight into the things you need to consider when planning an evaluation, the different types of evaluation and different methodological approaches
– http://www.health.org.uk/publication/evaluation-what-consider
Other useful resources
• Charities Evaluation Service Evaluation Tools and Resources
– http://www.ces-vol.org.uk/tools-and-resources
• NPC Evaluation Tools and Resources
– http://www.thinknpc.org/
• Social Value UK (SROI network) guide to Social Return on Investment
– http://socialvalueuk.org/what-is-sroi/the-sroi-guide