PPL Perspectives: Why do evaluations go wrong (part 2)? Tips to help you succeed
posted 13 October 2017

By Vish Valivety, Consultant at PPL

In our previous blog we shared some common evaluation pitfalls. This article offers a few tips to help you design and deliver a fit-for-purpose, well-tailored and useful evaluation.

Tip 1: Establish what success looks like for everyone involved

Take time at the start of the project to be clear about the outcomes you expect from your initiative. For most models of care, it’s not just one organisation that delivers these outcomes, so it’s important to think about what success means from the point of view of every stakeholder involved, from the people who commission and provide care to the people who receive it. For example, commissioners might focus on reduced emergency admissions, while the people receiving care might value their experience and quality of life most. Once your outcomes are established, you can start identifying key performance indicators (KPIs) that will allow you to measure whether you’ve reached your goals.

These goals should be realistic, and you should have a clear plan for when you expect to reach them.

Tip 2: Design a detailed plan to monitor and measure progress

There are many ways to develop a plan, but some common elements should build on each other to get you to a robust design:

  • Evaluation questions: what success means for your service
  • KPIs: the measures you will use to answer your evaluation questions
  • Evidence base: the data and information sources that show whether your KPIs are met
  • Analytical method: how you draw useful insights from your evidence base
  • Project and resource plan: who will be responsible for evaluation tasks and when they will do them

Making sure you have a detailed design that takes all of these elements into account can save you a lot of trouble down the line, and help you run a smooth and useful evaluation.

Tip 3: Don’t just use numbers

The term ‘evaluation’ is often taken to mean a purely scientific, quantitative investigation, full of numbers and charts. While in-depth data analysis can help pull out key insights, it should by no means be your only form of evidence. Combining insights from numerical data with qualitative questions and methods will help you answer the ‘why’ and ‘how’ questions, letting you learn far more than numbers alone could. You’ll also make your evaluation accessible to everyone, not just ‘numbers’ people.

Tip 4: Work with what you’ve got

When designing an evaluation framework, it’s important to avoid reinventing the wheel. As a general rule, we’ve found that if something is important enough to determine whether a service is a success, it is probably being captured somewhere already. Engage with your data and business intelligence teams so you can uncover exactly what you are looking for without needing to redesign and recapture information.

And always remember the 80/20 rule: if broadly relevant data is already being captured in an existing process, it may be preferable to use it rather than designing a new collection method from scratch.

Tip 5: It’s not a one-time thing

Evaluations are often conducted at the end of a pilot period, or long after a service has started. Following this pattern severely limits what an evaluation framework can do for you. The best services are those that constantly improve, taking what they learn on a regular basis and evolving. Your evaluation framework can help with this: using what we call a ‘continuous improvement cycle’, regular evaluation points at different levels, from operational to strategic, will surface pain points, successes, opportunities and risks quickly enough for you to make changes and adapt.

Don’t leave your evaluation to the end; it can be the best tool for building a viable and successful service.

There are many tools and guides to help you think about how to design and run an evaluation. Some are generic, and others, like our How To guide on Measuring Success in Integrated Care, relate specifically to health and care and to measuring performance in complex programmes involving multiple agencies and stakeholders.

Here at PPL, we design and deliver a wide range of evaluations across health and care. We have evaluated several Vanguard programmes and delivered a range of other organisational, programme and project evaluations relating to new models of care. We also run evaluation training and openly share our tools, frameworks and techniques to help build evaluation capacity and skills in the sector.