Qualitative evidence for evaluations and impact assessments


For the last few months we have been working with SANDS Lothians, a local charity offering help and support to families who have lost a baby through miscarriage or stillbirth, or soon after birth. They offer amazing services, including counselling, peer discussion groups and advice to health professionals, which can help ease the pain and isolation of a difficult journey.

 

We helped them put together a compilation of qualitative evidence in Quirkos. This came from many sources they already had, but bringing it together and pulling out some of the key themes means they now have a qualitative database they can use to quickly put together evaluations, reports and impact assessments. Many organisations will already have a lot of qualitative data, and this can easily become really valuable evidence.

 

First, try doing an ‘audit’ of the qualitative data you already have. Look through the potential sources listed below (and any others you can think of), and find historical evidence you can bring in. Second, keep these sources in mind in day-to-day work, and remember to flag them when you see them. If you get a nice e-mail from someone saying they liked an event you ran, or a service they use, save it! It’s all evidence, and can help make a convincing case to funders and other supporters in the future.

 

Here are a few potential sources of qualitative feedback (and even quantitative data) you can bring together as evidence for evaluations and future work:

1. Feedback from service users

Feedback from e-mails is probably the easiest to pull together, as it is already typed up. Whenever someone compliments your services, thank them and store the comments as feedback for another day. It is easy to build up a virtual ‘guest-book’ in this way, and soon you will have dozens of supportive comments you can use to show the difference your organisation makes. Even when you get phone calls, try to make notes of the important things people say. It’s not just positive comments, either: note suggestions, and if people say there is something missing – this can be evidence to funders that you need extra resources.

You can also specifically ask for stories from users you know well; these can form case studies to base a report around. If you have a specific project in mind, you can do a quick survey. Ask former users to share their experience of an issue, either by contacting people directly, or by asking for comments through social media. By collating these responses, you can quickly gauge support for the direction of a project or new service.

 


2. Social media

Comments and messages of support from your Facebook friends, Twitter followers, and pictures of people running marathons for you on Instagram are all evidence of support for the work you do. Pull out the nice messages, and don’t forget: the number of followers and likes you have is evidence of your impact and reach.

 


3. Local (and international) news

A lot of charities are good at running activities that end up in the local news, so keep clippings as evidence of the impact of your events and the exposure you get. Funders like to work with organisations that are visible, so collect and collate these. There may also be news stories about problems in the community related to issues you work on; these can show the importance of the work you do.

 


4. Reports from local authority and national organisations

Keep an eye out for reports from local council meetings and public sector organisations that might be relevant to your charity. If there are discussions on an area you work on, it is another source of evidence about the need for your interventions.


There may also be national organisations or local partners working in similar areas – again, they are likely to write reports highlighting the significance of your area, often with great statistics and links to other evidence. Share evidence and collaborate, and together the impact will be stronger!

 

5. Academic evidence

One of the most powerful ways you can add legitimacy to your impact assessment or funding applications is by linking to research on the importance of the problems you are tackling, or the potential benefits of your style of intervention. A quick search in Google Scholar (scholar.google.com) for keywords like ‘obesity’ and ‘intervention’ can find dozens of articles that might be relevant. The journal articles themselves will often be behind ‘paywalls’, meaning you can’t read or download the whole paper. However, the abstract (summary) is free to read, and probably gives you enough information to support your argument one way or another. Just link to the paper, and cite it as (Author’s surname, Year of publication) – for example (Turner 2013).

 

It might also be worth seeking out a relationship with a friendly academic at a local university. Search Google (or ask through your networks) for someone who works in your area, and contact them to ask for help. Researchers have their own impact obligations, so they are sometimes interested in partnering with local charities to ensure their research is used more widely. It can be a mutually beneficial relationship…


Hopefully these examples will help you think through all the different things you already have around you that can be turned into qualitative evidence, and some things you can seek out. We will have more blog posts on our work with local charities soon, and how you can use Quirkos to collate and analyse this qualitative evidence.

Qualitative data in the UK Public Sector


 

The last research project I worked on with the NIHR was a close collaboration between several universities, local authorities and NHS trusts. We were looking at evidence use by managers in the NHS, and one of the common stories we heard was how valuable information often ended up on the shelf, never used to inform service provision or policy.


It was always a real challenge for local groups, researchers and academics to create research outputs in a digestible format that could be used by decision makers, who often had very short timescales and limited resources. We were also told of the importance of using examples and case studies of other trusts or departments that had had successes: it’s all very well making suggestions to improve services, but most of the time the battle is getting them into practice. It’s one of the reasons we created a mini case-study guide: short, one-page summaries of ‘best practice’ – places where a new approach had worked and made changes.


However, the biggest shock for me was how difficult it was to bring qualitative data into decision making. In many public sector organisations, qualitative data is seen as the poor cousin of quantitative statistics, only used when figures can’t be found, or when the interest group is too small for statistically significant findings.


So many wonderful sources of qualitative data seemed to be sitting around collecting dust, including research from community organisations, consultations, and feedback from service users – data that had already been collected and was awaiting analysis for a specific purpose. There was also a lack of confidence among some researchers about how to work with qualitative data, and an understandable sense that it was a very time-consuming process. At best, qualitative data provided quotes to illustrate reports like JSNAs (Joint Strategic Needs Assessments), which were mostly filled with quantitative data.

 

A big part of the problem seemed to be how decision makers, especially from a clinical background, were more comfortable with quantitative data. For managers used to dealing with financial information, RCTs, efficacy trials etc., this is again quite understandable, and they were used to seeing graphs and tests of statistical significance. But there was a real chicken-and-egg problem: because they rarely took into account qualitative data, it was rarely requested, and there was little incentive to improve qualitative analytical skills.


One group we spoke to had produced a lovely report on a health intervention for an ethnic minority group. Their in-depth qualitative interviews and focus groups had revealed exactly why the usual health promotion message wasn’t getting through, and suggested a better approach to engaging with this population. However, the first time they went to present their findings to a funding board, the members were confused. The presentation was too long, had too many words, and no graphs. As one of many items on the agenda, they had to make their case in five minutes and a few slides.


So that’s just what they did. They turned all their qualitative data into a few graphs, which supported their case for an intervention in this group. Personally, it was heart-breaking to see all that rich data end up on the cutting-room floor, but evidence is not useful unless it is acted upon. Besides, the knowledge the team had gained from the research meant that, with their budget now approved, they could craft the right messages for an effective campaign.


This story was often in my mind when we were designing Quirkos – what would the outputs look like that would have an impact on decision makers? It had to produce visual summaries, graphs and quotes that can be put into a PowerPoint presentation. And why couldn’t the interface itself be used to present the data? If the audience asked a question about a particular quote or group, couldn’t the presenter show that to them there and then?

 

Opening the door to make qualitative data easier to work with and visualise is one thing, but a whole culture change is needed in many organisations to improve the understanding and use of qualitative data. Until this happens, many evidence-based decisions are being made on the basis of a limited style and depth of data, and valuable insights are being missed.

 

With the prospect of sustained cuts to public services in the UK, there are fewer chances to get something right. Qualitative engagement here can tell us not only what needs to be done and how to learn from our mistakes, but how to get it right the first time.