Qualitative evidence for evaluations and impact assessments

Qualitative evidence for charities

For the last few months we have been working with SANDS Lothians, a local charity offering help and support for families who have lost a baby through miscarriage, stillbirth or soon after birth. They offer amazing services, including counselling, peer discussion groups and advice to health professionals, which can help ease the pain and isolation of a difficult journey.


We helped them put together a compilation of qualitative evidence in Quirkos. This has come from many sources they already have, but bringing it together and pulling out some of the key themes means they now have a qualitative database they can use to quickly put together evaluations, reports and impact assessments. Many organisations already hold a lot of qualitative data, and it can easily become really valuable evidence.


First, try doing an ‘audit’ of qualitative data you already have. Look through the potential sources listed below (and any others you can think of), and find historical evidence you can bring in. Secondly, keep these sources in mind in day-to-day work, and remember to flag them when you see them. If you get a nice e-mail from someone saying they liked an event you ran, or a service they use, save it! It’s all evidence, and can help make a convincing case to funders and other supporters in the future.


Here are a few potential sources of qualitative feedback (and even quantitative data) you can bring together as evidence for evaluations and future work:



1. Feedback from service users

Feedback from e-mails is probably the easiest to pull together, as it is already typed up. Whenever someone compliments your services, thank them and store the comments as feedback for another day. It is easy to build up a virtual ‘guest-book’ in this way, and soon you will have dozens of supportive comments that you can use to show the difference your organisation makes. Even when you get phone calls, try to make notes of important things that people say. Don’t just record positive comments: note suggestions, and if people say there is something missing – this can be evidence to funders that you need extra resources.

You can also specifically ask for stories from users you know well; these can form case studies to base a report around. If you have a specific project in mind, you can run a quick survey: ask former users to share their experience of an issue, either by contacting people directly or by asking for comments through social media. By collating these responses, you can quickly gauge support for the direction of a project or new service.


2. Social media

Comments and messages of support from your Facebook friends, Twitter followers, and pictures of people running marathons for you on Instagram are all evidence of support for the work you do. Pull out the nice messages, and don’t forget: the number of followers and likes you have is evidence of your impact and reach.


3. Local (and international) news

A lot of charities are good at running activities that end up in the local news, so keep clippings as evidence of the impact of your events, and the exposure you get. Funders like to work with organisations that are visible, so collect and collate these. There may also be news stories about problems in the community related to the issues you work on; these can show the importance of the work you do.


4. Reports from local authority and national organisations

Keep an eye out for reports from local council meetings and public sector organisations that might be relevant to your charity. If there are discussions on an area you work on, it is another source of evidence about the need for your interventions.

There may also be national organisations or local partners working in similar areas – again, they are likely to write reports highlighting the significance of your field, often with great statistics and links to other evidence. Share evidence and collaborate, and together the impact will be stronger!


5. Academic evidence

One of the most powerful ways you can add legitimacy to your impact assessment or funding applications is by linking to research on the importance of the problems you are tackling, or the potential benefits of your style of intervention. A quick search in Google Scholar (scholar.google.com) for keywords like ‘obesity’ and ‘intervention’ can find dozens of articles that might be relevant. The journal articles themselves will often be behind ‘paywalls’, meaning you can’t read or download the whole paper. However, the summary (abstract) is free to read, and probably gives you enough information to support your argument one way or another. Just link to the paper, and refer to it as (author’s surname, year of publication), for example (Turner 2013).


It might also be worth seeking out a relationship with a friendly academic at a local university. Look through Google (or ask through your networks) for someone that works in your area, and contact them to ask for help. Researchers have their own impact obligations, so are sometimes interested in partnering with local charities to ensure their research is used more widely. It can be a mutually beneficial relationship…




Hopefully these examples will help you think through all the different things you already have around you that can be turned into qualitative evidence, and some things you can seek out. We will have more blog posts on our work with local charities soon, and how you can use Quirkos to collate and analyse this qualitative evidence.



Using Quirkos for Systematic Reviews and Evidence Synthesis

Most of the examples the blog has covered so far have been about using Quirkos for research, especially with interview and participant text sources. However, Quirkos can take any text source you can open on your computer, including text PDFs (but not scanned PDFs where each page is effectively a photograph). So why not use Quirkos like a reference manager, to sort and analyse a large cohort of articles and research? The advantage is that you can not only keep track of references, but also cross-reference the content, analysing common themes across all the articles.

There are two ways to manage this. First, you can set the standard information for each source/article that is imported, such as author, year and journal. If you format these as you wish them to appear in the reference (by putting the commas and full stops in the value), and order them with the Properties and Value editor, you can create reports that churn out the references in whichever notation you need, such as Harvard or APA. But you can also add any extra values you like at the article level, so you could rank articles out of 10, add a comment property, or categorise them by methodology. This way, you can quickly see only text from articles rated 8/10 or above, or everything with a sample size between 50 and 100: whatever information you categorise.
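Quirkos does all of this through its interface, but the trick of embedding the punctuation in the property values can be sketched in a few lines of Python. The article and property names below are hypothetical examples, not real Quirkos data:

```python
# Hypothetical article properties: each value carries its own punctuation,
# and the properties are stored in the order they should appear.
article = {
    "Author": "Turner, D. ",
    "Year": "(2013). ",
    "Title": "Qualitative analysis in practice. ",
    "Journal": "Journal of Example Studies, ",
    "Issue": "4(2), 12-18.",
}

# Generating a Harvard-style reference is then just joining the values in order.
reference = "".join(article.values())
print(reference)
# Turner, D. (2013). Qualitative analysis in practice. Journal of Example Studies, 4(2), 12-18.
```

Because the punctuation lives in the values, switching to a different notation only means editing the property values, not the joining step.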

Secondly, you can categorise text in the article using Quirk bubbles. As you read through the articles, code sections in any way that is of interest to you: highlight sections on the methodology, bits you aren’t convinced about, or other references you want to check out. Highlight findings and conclusion sections (or just the interesting parts of them), and with the properties you can quickly look at all the findings from papers using a particular approach, and compare and contrast them. It’s obviously quite a bit of work to code all your articles, but since you would have to read through all the papers anyway, making your notes digital and searchable in this way makes pulling it all together much quicker and more flexible.

With qualitative synthesis you can combine multiple pieces of research, and see if there are common themes, or contradictions. Say you have found three articles on parenting, but they are all from different minority ethnic communities. Code them in Quirkos, and in a click you can see all the problems people are having with schools across all groups, or if one community describes more serious issues than another.
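The ‘in a click’ filtering described above amounts to pulling every coded excerpt for a theme, across all source groups. As a rough sketch in Python (the excerpts, themes and group names here are invented for illustration):

```python
from collections import Counter

# Hypothetical coded excerpts: each pairs a quotation with a theme
# (a Quirk bubble) and a source property such as community group.
excerpts = [
    {"theme": "schools", "group": "Community A", "text": "The school never returned our calls."},
    {"theme": "schools", "group": "Community B", "text": "Teachers were supportive once we asked."},
    {"theme": "housing", "group": "Community A", "text": "Our flat is too small for homework."},
]

# Pull every coded section on schools, across all groups.
school_excerpts = [e for e in excerpts if e["theme"] == "schools"]

# Compare how often each community raises the theme.
counts = Counter(e["group"] for e in school_excerpts)
print(counts)
```

Filtering by theme first and then splitting by a source property is exactly the comparison described above: the same coded data answers both “what did everyone say about schools?” and “does one group raise it more than another?”.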

Evidence synthesis and systematic reviews like this are often, and quite rightly, mandated by funders and departments before commissioning a new piece of research, to make sure the research questions add meaningfully to the existing canon. However, it’s also worth noting that, especially with qualitative synthesis taken from published articles, relying only on the quotations left in the final paper can introduce a publication bias: most of the data set is hidden from secondary researchers. Imagine you are looking at schooling and parenting, but are taking data from an article on the difficulties of parenting: it’s possible the researchers did not include quotations on the good aspects of school because they were outside the article’s focus. If possible, it’s always worth getting the full data set, but this can often throw up data protection and ethical issues. There’s no simple answer to these problems, except to make sure readers are aware of your sources, and to anticipate the likely limitations of your approach. Often with qualitative research, it feels like reflexivity and disclaimers go hand in hand!
