Qualitative analysis software for monitoring and evaluation


 

Developing systems for the monitoring and evaluation of services, interventions and programmes (or programs, in the American English spelling) is a particular skill that requires great flexibility. Because each intervention to be investigated is different, and the aims of the project, its funders and its service users vary, evaluations have to draw on a diverse toolkit of methods.


Qualitative methods are often an important part of this approach. While many evaluations (and service delivery partners) would prefer to demonstrate a quantitative impact such as cost-benefit, outcomes like user satisfaction, behaviour change and expected long-term impact can be difficult or costly to put a quantitative figure on. Investigating smaller demographic subsets can also be challenging, especially when key groups are represented in numbers too small to realistically sample to statistical significance. There are also situations where qualitative methods can give a depth of detail that is invaluable, especially when considering detailed suggestions for improvements.


Too often monitoring and evaluation is an overlooked part of service delivery, tacked on at the end with little budget or time to deliver it in. But a short qualitative evaluation can often provide useful insight even without the resources for a detailed assessment with a full quantitative sample and modelling.

 

But managing qualitative data comes with its own set of challenges. Monitoring will often require looking at data over a long time frame, and the end consumers of evaluations can be sceptical of the validity of qualitative data and need to be shown how it fits their deliverable criteria. Qualitative analysis software can help on both these points (and more). It can help you 'show your working' and demonstrate how particular statements in the qualitative data support conclusions, manage large amounts of longitudinal data of different types, and essentially ensure that evaluators can focus on what they do best – choosing the right approach to collecting monitoring data, and interpreting it for the end users.

 


Let's look at the first aspect here – managing and collating qualitative data. Regardless of the methodology chosen, such as focus groups, interviews or open-ended surveys, qualitative software can be used to keep all the different data together in one place, allowing for cross-referencing across sources as well as looking at results from a single method. It also makes it much easier to draw in other 'qualitative' documents to provide context, such as project specifications or policy documents, and can help collate informal sources of data, such as comments and feedback from service users that were collected outside the formal discovery process.


But my favourite thing that qualitative software helps facilitate is the development and re-application of assessment criteria. There will usually be fairly standard criteria for evaluation, such as impact, uptake and cost-effectiveness, but funders and commissioners of M&E may have their own special interests (such as engagement with hard-to-reach populations) which need to be demonstrated.


In qualitative software these become the framework for coding the qualitative data: assigning supporting statements or evidence to each criterion. In our qualitative analysis software, Quirkos, these are represented as bubbles you add data to, sometimes called nodes or themes in other software. Once you have developed and refined a framework that matches a set of evaluation criteria, you can reuse it in other projects – tweaking it slightly to match the specifications of each one.


That way, it is easy to show comments from users, service delivery staff, or other documentation that support each area. This not only helps the researcher in their work, but also in communicating the results to end users. You can show all (or some) of the data supporting conclusions in each area, as well as contrasts and differences between subsets. In Quirkos, you would use the Query view and the side-by-side comparison view to show how impact differed between groups, for example by gender or age. The visual overviews that software like Quirkos creates can help funders and budget holders get a quick insight into qualitative data that is usually time-consuming to digest in full (and they also make for great visual presentation slides or figures in reports).

 


Of course, all this can be done with more 'traditional' qualitative analysis approaches, such as paper and highlighters, or a huge Excel spreadsheet. But dedicated qualitative software makes sorting and storing data easier, and can save time in the long run by creating reports that help communicate your findings.


I know a lot of evaluators, especially for small projects, feel that learning qualitative analysis tools or setting up a project in qualitative software is not worth the time investment. But if it has been a while since you have tried qualitative software, especially 'new-kids-on-the-block' like Quirkos, it might be worth looking again. Since Quirkos was initially designed for participatory analysis, it can be learnt very quickly, with a visual interface that keeps you close to the data you are investigating.

 

It's also worth noting its limitations: Quirkos is still text only, so exploring multimedia data is not possible, and it takes a very pure-qual philosophy, so there are few built-in tools for quantitative analysis (although it does support exploration of mixed-method and discrete data). If you need these extra features, you should look at some of the more traditional packages such as NVivo and ATLAS.ti, bearing in mind the extra learning requirement that comes with more powerful tools.


We think that for most qualitative evaluations Quirkos will have more than enough functionality, with a good trade-off between power and ease of use. There's a free trial you can download, and our licences are some of the cheapest (and most flexible) around. If you have any specific questions about using Quirkos for monitoring and evaluation, we'd love to hear from you (support@quirkos.com) and are always happy to help you learn and use the software. For more on using Quirkos in this field, check out our M&E feature overview.

 

 

References and Resources:

 

The AEA (American Evaluation Association) has a rather outdated list of qualitative software packages:
http://www.eval.org/p/cm/ld/fid=81

 

The British CAQDAS Network has independent reviews of qualitative software and training courses:
https://www.surrey.ac.uk/computer-assisted-qualitative-data-analysis/support/choosing

 

Better Evaluation has just one link to a chapter giving a very general overview of choosing qualitative software:
http://www.betterevaluation.org/en/resources/choosing_qual_software

 

 

Tools for critical appraisal of qualitative research


I've mentioned before how the general public are very quantitatively literate: we are used to dealing with news containing graphs, percentages, growth rates and big numbers, and these are common enough that people rarely have trouble engaging with them.

 

In many fields of study this is also true for researchers and those who use evidence professionally. They become accustomed to p-values, common statistical tests and plot charts. A lot of research is based on quantitative data, and there is a training and familiarity with these methods and data presentation techniques which creates a lingua franca for researchers across disciplines and regions.

 

However, I've found in previous research that many evidence-based decision makers are not comfortable with qualitative research. There are many reasons for this, but I frequently hear people essentially say that they don't know how to appraise it. While they can look at a sample size, recruitment technique and r-squared value and get an idea of the limitations of a study, this is much harder for many practitioners to do with qualitative techniques they are less familiar with.

 

But this needn't be the case: qualitative research is not rocket science, and there are fundamental common values which can be used to assess the quality of a piece of research. This week, a discussion on appraisal of qualitative research was started on Twitter by the Mental Health group of the 'National Elf Service' (@Mental_Elf) – an organisation devoted to collating and summarising health evidence for practitioners.

 

People contributed many great suggestions of guides and toolkits that anyone can use to examine and critique a qualitative study, even if they are not familiar with qualitative methodologies. I frequently come across this barrier to promoting qualitative research in public sector organisations, so I was already halfway through putting together these resources when I realised they might be useful to others!

 

First of all, David Nunan (@dnunan79) based at the University of Oxford shared an appraisal tool developed at the Centre for Evidence-Based Medicine (@CebmOxford).

 

Lucy Terry (@LucyACTerry) offered specific guidelines for charities from New Philanthropy Capital, which give five key quality criteria: that the research should be Valid, Reliable, Confirmable, Reflexive and Responsible.

 

There’s also an article by Kuper et al. (2008) which offers guidance on assessing a study using qualitative evidence. As a starting point, they list six questions to ask:

  • Was the sample used in the study appropriate to its research question?
  • Were the data collected appropriately?
  • Were the data analysed appropriately?
  • Can I transfer the results of this study to my own setting?
  • Does the study adequately address potential ethical issues, including reflexivity?
  • Overall: is what the researchers did clear?
     

The International Centre for Allied Health Evidence at the University of South Australia has a list of critical appraisal tools, including ones specific to qualitative research. Of these, I quite like the checklist format of the one developed by the Critical Appraisal Skills Programme; I can imagine this going down well with health commissioners.

 

Another, from the Occupational Therapy Evidence-Based Practice Research Group at McMaster University in Canada, is more detailed, and is also available in multiple languages and as an editable Word document.

 

Finally, Margaret Roller and Paul Lavrakas have a recent textbook (Applied Qualitative Research Design: A Total Quality Framework Approach, 2015) that covers many of these issues, and details the Total Quality Framework that can be used for designing, discussing and evaluating qualitative research. The book contains specific chapters detailing the application of the framework to different projects and methodologies. Margaret Roller also has an article on her excellent blog on weighing the value of qualitative research, which gives an example of the Total Quality Framework in use.

 

In short, there are a lot of options to choose from, but the take-away message from them is that the questions are simple, short and largely common sense. However, the process of assessing even just a few pieces of qualitative research in this way will quickly get evidence-based practitioners into the habit of asking these questions of most projects they come across, hopefully increasing their comfort level in dealing with qualitative studies.

 

The tools are also useful for students, even those already familiar with qualitative methodologies, as they help facilitate a critical reading that can give focus to paper discussion groups or literature reviews. Adopting one of the appraisal techniques here (or modifying one) would also be a great start to a systematic review or meta-analysis.

 

Finally, there are a few sources from the Evidence and Ethnicity in Commissioning project I was involved with that might be useful, and if you have any suggestions please let me know, either in the forum or by e-mailing daniel@quirkos.com, and I will add them to the list. Don't forget to find out more about using Quirkos for your qualitative analysis and download the free trial.