Qualitative evidence for evaluations and impact assessments


For the last few months we have been working with SANDS Lothians, a local charity offering help and support for families who have lost a baby through miscarriage, stillbirth or soon after birth. They offer amazing services, including counselling, peer discussion groups and advice to health professionals, which can help ease the pain and isolation of a difficult journey.


We helped them put together a compilation of qualitative evidence in Quirkos. This has come from the many sources they already have, but bringing it together and pulling out some of the key themes means they now have a qualitative database they can use to quickly put together evaluations, reports and impact assessments. Many organisations already hold a lot of qualitative data, and it can easily become really valuable evidence.


First, try doing an ‘audit’ of the qualitative data you already have. Look through the potential sources listed below (and any other sources you can think of), and find historical evidence you can bring in. Secondly, keep these sources in mind in day-to-day work, and remember to flag them when you see them. If you get a nice e-mail from someone saying they liked an event you ran, or a service they use, save it! It’s all evidence, and can help make a convincing case to funders and other supporters in the future.


Here are a few potential sources of qualitative feedback (and even quantitative data) you can bring together as evidence for evaluations and future work:


1. Feedback from service users

Feedback from e-mails is probably the easiest to pull together, as it is already typed up. Whenever someone compliments your services, thank them and store the comments as feedback for another day. It is easy to build up a virtual ‘guest-book’ in this way, and soon you will have dozens of supportive comments you can use to show the difference your organisation makes. Even when you get phone calls, try to make notes of the important things people say. And it’s not just positive comments: note suggestions, and if people say there is something missing – this can be evidence to funders that you need extra resources. (For one simple way to keep such a guest-book in a single structured file, see the sketch after this list of sources.)

You can also specifically ask for stories from users you know well; these can form case studies to base a report around. If you have a specific project in mind, you can run a quick survey: ask former users to share their experience of an issue, either by contacting people directly or by asking for comments through social media. By collating these responses, you can quickly gauge support for the direction of a project or new service.



2. Social media

Comments and messages of support from your Facebook friends, Twitter followers, and pictures of people running marathons for you on Instagram are all evidence of support for the work you do. Pull out the nice messages, and don’t forget that the number of followers and likes you have is itself evidence of your impact and reach.



3. Local (and international) news

A lot of charities are good at running activities that end up in the local news, so keep clippings as evidence of the impact of your events and the exposure you get. Funders like to work with organisations that are visible, so collect and collate these. There may also be news stories about problems in the community related to the issues you work on; these can show the importance of the work you do.



4. Reports from local authorities and national organisations

Keep an eye out for reports from local council meetings and public sector organisations that might be relevant to your charity. If there are discussions of an area you work in, they are another source of evidence of the need for your interventions.


There may also be national organisations or local partners that work in similar areas – again, they are likely to write reports highlighting the significance of your field, often with great statistics and links to other evidence. Share evidence and collaborate, and together the impact will be stronger!


5. Academic evidence

One of the most powerful ways to add legitimacy to your impact assessment or funding applications is to link to research on the importance of the problems you are tackling, or the potential benefits of your style of intervention. A quick search in Google Scholar (scholar.google.com) for keywords like ‘obesity’ and ‘intervention’ can find dozens of articles that might be relevant. The journal articles themselves will often be behind ‘paywalls’ that mean you can’t read or download the whole paper. However, the abstract (summary) is free to read, and probably gives you enough information to support your argument one way or another. Just link to the paper, and refer to it by the author’s surname and year of publication, for example (Turner 2013).


It might also be worth seeking out a relationship with a friendly academic at a local university. Search Google (or ask through your networks) for someone who works in your area, and contact them to ask for help. Researchers have their own impact obligations, so they are sometimes interested in partnering with local charities to ensure their research is used more widely. It can be a mutually beneficial relationship…
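A quick practical aside on the first source above: if anyone on your team is comfortable with a little scripting, the feedback ‘guest-book’ can be kept as a single structured file that is easy to sort, search, and later import into a spreadsheet or a tool like Quirkos. The sketch below is just one way of doing this in Python – the file name, columns and example entries are all invented for illustration, not a required format.

```python
import csv
from datetime import date
from pathlib import Path

# Hypothetical guest-book file; the name and columns are only a suggestion.
GUESTBOOK = Path("feedback_guestbook.csv")
FIELDS = ["date", "source", "type", "comment"]

def log_feedback(source: str, kind: str, comment: str) -> None:
    """Append one piece of feedback, creating the file with a header row if needed."""
    new_file = not GUESTBOOK.exists()
    with GUESTBOOK.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "source": source,   # e.g. "email", "phone", "Facebook"
            "type": kind,       # e.g. "praise", "suggestion", "gap in service"
            "comment": comment,
        })

# Invented example entries of the kind discussed under source 1 above:
log_feedback("email", "praise", "The counselling sessions made a real difference to us.")
log_feedback("phone", "gap in service", "Caller asked whether we run evening groups.")
```

A plain spreadsheet with the same columns works just as well; the point is simply to record the date, where each comment came from and what kind of comment it was, so that themes are easy to pull out when you come to write a report.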


Hopefully these examples will help you think through all the different things you already have around you that can be turned into qualitative evidence, and some things you can seek out. We will have more blog posts soon on our work with local charities, and on how you can use Quirkos to collate and analyse this kind of qualitative evidence.


Qualitative evaluations: methods, data and analysis


Evaluating programmes and projects is an essential part of the feedback loop that should lead to better services. In fact, programmes should be designed with evaluation in mind, to make sure that there are defined and measurable outcomes.


While most evaluations include some numerical analysis, qualitative data is often used alongside the quantitative to show the richness of a project’s impact, and to put a human voice into the process. Especially when a project doesn’t meet its targets, or have the desired level of impact, comments from project managers and service users usually give the most insight into what went wrong (or right) and why.


For smaller pilot and feasibility projects, qualitative data is often the mainstay of the evaluation, when numerical data wouldn’t support statistical analysis, or when it is too early in a programme to measure the intended impact. For example, a programme aimed at reducing obesity might not be able to demonstrate a lower number of diabetes referrals at first, but qualitative insight in the first few months or year of the project can show how well its messages are being received, or whether targeted groups are talking about changing their behaviour. When goals like this are long term (and in public health and community interventions they often are), it’s important to continuously assess the precursors to impact – namely engagement – and this is usually best done qualitatively.


So, what is best practice for qualitative evaluations? Fortunately, there are some really good guides and overviews that can help teams choose the right qualitative approach. Vaterlaus and Higginbotham give a great overview of qualitative evaluation methods, while Professor Frank Vanclay talks at a wider level about qualitative evaluations and innovative ways to capture stories. There is also a nice ‘tick-box’ style guide produced by the old Public Health Resource Unit, which can still be found at this link. Essentially, the tool suggests 10 questions that can be used to assess the quality of a qualitative evaluation – really useful when looking at evaluations that come from other fields or departments.


But my contention is that the appraisal tool above is best used as a guide for producing qualitative evaluations. If you start by considering the best approach, how you are going to demonstrate rigour, and which methods and recruitment strategies are appropriate, you’ll get a better report at the end of it. I’d like to discuss and expand on some of the questions used to assess the rigour of qualitative work, because rigour is something that often worries people about qualitative research, and these steps help demystify good practice.


  1. The process: start by planning the whole evaluation from the outset – what do you plan to do? All the rest will then fall into place.
     
  2. The research questions: what are they and why were these chosen? Are the questions going to give the evaluation the data it needs, and will the methods capture that correctly?
     
  3. Recruitment: who did you choose, and why? Who didn’t take part, and how did you find people? What gaps are there likely to be in representing the target group, and how can you compensate for this? Were there any ethical considerations, how was consent gained, and what was the relationship between the participants and the researcher(s)? Did they have any reason to be biased or not truthful?
     
  4. The data: how did you know that enough had been collected? (Usually when you are starting to hear the same things over and over – saturation.) How was it recorded and transcribed, and was the recording of good quality? Were people willing to give detailed answers?
     
  5. Analysis: make sure you describe how it was done, and what techniques were used (such as discourse or thematic analysis). How were the quotes reproduced in the report chosen, and are contradictions in the data reported? What was the role of the researcher – should they declare a bias, and were multiple views sought in the interpretation of the data?
     
  6. Findings: do they meet the aims and research questions? If not, what needs to be done next time? Are there clear findings and action points, appropriate to improving the project?


Then the final step is, for me, the most important of all: SHARE! Don't let it end up on a dusty shelf! Evaluations are usually seen as a tedious but necessary internal process, but they can be really useful as case studies and learning tools in organisations and groups you might never have thought of. This is especially true if there are things that went wrong – help someone in another local authority avoid making the same mistakes!


At the moment the best UK repositories of evaluations are based around health and economic benefits, but that doesn’t stop you putting the report on your organisation’s website – if someone is looking for a similar project, search engines will do the legwork for you. That evaluation might save someone a lot of time and money. And it goes without saying: look for any similar work before you start a project – you might get some good ideas, and save yourself from falling into the same pitfalls!