Qualitative data in the UK public sector


The last research project I worked on with the NIHR was a close collaboration between several universities, local authorities and NHS trusts. We were looking at evidence use by managers in the NHS, and one of the common stories we heard was how valuable information often ended up on the shelf, never used to inform service provision or policy.


It was always a real challenge for local groups, researchers and academics to create research outputs in a digestible format that decision makers, who often had very short timescales and limited resources, could actually use. We were also told of the importance of using examples and case studies from other trusts or departments that had succeeded: it’s all very well making suggestions to improve services, but most of the time the battle is putting those suggestions into practice. It’s one of the reasons we created a mini case-study guide: short, one-page summaries of ‘best practice’ – places where a new approach had worked and made a real difference.


However, the biggest shock for me was how difficult it was to bring qualitative data into decision making. In many public sector organisations, qualitative data is seen as the poor cousin of quantitative statistics, used only when figures can’t be found, or when the interest group is too small to produce statistically significant findings.


So many wonderful sources of qualitative data seemed to be sitting around collecting dust: research from community organisations, consultations, and feedback from service users – data that had already been collected, and was just waiting to be analysed for a specific purpose. Some researchers also lacked confidence in how to work with qualitative data, and had an understandable sense that it was a very time-consuming process. At best, qualitative data provided a few quotes to illustrate reports like JSNAs (Joint Strategic Needs Assessments), which were otherwise mostly filled with quantitative data.

A big part of the problem seemed to be that decision makers, especially those from a clinical background, were more comfortable with quantitative data. For managers used to dealing with financial information, RCTs, efficacy trials and the like, this is again quite understandable: they were used to seeing graphs and tests of statistical significance. But there was a real chicken-and-egg problem: because qualitative data was rarely taken into account, it was rarely requested, and so there was little incentive to improve qualitative analytical skills.


One group we spoke to had produced a lovely report on a health intervention for an ethnic minority group. Their in-depth qualitative interviews and focus groups had revealed exactly why the usual health promotion message wasn’t getting through, and suggested a better approach for engaging this population. However, the first time they presented their findings to a funding board, the members were confused: the presentation was too long, had too many words, and no graphs. As one of many items on the agenda, they had five minutes and a few slides to make their case.


So that’s just what they did. They turned all their qualitative data into a few graphs, which supported their case for an intervention in this group. Personally, I found it heart-breaking to see all this rich data end up on the cutting-room floor, but evidence is not useful unless it is acted upon. Besides, the knowledge the team had gained from the research meant that, with their budget now approved, they could craft the right messages for an effective campaign.


This story was often in my mind when we were designing Quirkos – what would outputs that had a real impact on decision makers look like? It had to produce visual summaries, graphs and quotes that could be dropped straight into a PowerPoint presentation. And why couldn’t the interface itself be used to present the data? If the audience asked a question about a particular quote or group, couldn’t the presenter show it to them there and then?

Opening the door to make qualitative data easier to work with and visualise is one thing, but a whole change of culture is needed in many organisations to improve the understanding and use of qualitative data. Until this happens, many evidence-based decisions are being made on data of limited style and depth, and valuable insights are being missed.

With the prospect of sustained cuts to public services in the UK, there are fewer chances to get things right. Qualitative engagement can tell us not only what needs to be done and how to learn from our mistakes, but how to get it right the first time.

Don't share reports with clients, share your data!

When it comes to sharing findings and insight with colleagues and clients, the procedure is usually the same: create a written summary report, deliver the PowerPoint presentation, field any questions, and repeat until everyone is happy.

But this approach tends to produce very static, uninspiring reports, and presentations that lack interaction. It often necessitates further sessions if clients or colleagues have questions that weren't directly answered, want additional clarification, or want the data explored in a different way. And the final reports don't always have the life we'd want for them, ending up on a shelf or buried in a bulging inbox.

But what if, rather than sharing a static report, you could actually share the whole research project with your clients? If, rather than sending a PowerPoint deck, you could send them all of the data and let them explore it for themselves? That way, if one client is interested in the results from a particular demographic group, they can look for themselves, rather than asking for another report to be generated. If another client wants to see every instance of negative words being used to describe their brand, they can see all those quotes in one click – and all the positive ones in another.

In many situations this would seem like an ideal way to engage with clients, but it is rarely practical. Sending clients a copy of all the data in the project – transcripts, nodes, themes and all – would be a huge burden for them to process. Researchers also tend to assume that few clients would be sufficiently versed in qualitative analysis software to navigate the data themselves.

But Quirkos takes a different approach, which opens up new possibilities for sharing data with end users. Because it is designed to be usable by complete novices in qualitative research, your project file and the software interface itself can be used as a feedback tool. Send your clients the project data in a Quirkos file, with a copy of the software that runs live from a USB stick. You can even give them an Android tablet with the data on it, which they can explore through a touch interface. They can then quickly filter the data however they like, see all the responses you've coded, or even rearrange your themes and nodes in ways that make sense to them. The research team have collected, transcribed and coded the data, but clients can get a real sense of the findings, running searches and queries to explore anything of interest to them.

And even when you are giving a presentation, while Quirkos will generate visual graphs and overviews of the data to include as static image files in PowerPoint, why not bring up Quirkos itself and show the data live? You can show how themes are related, run queries for particular demographic segments, and start a genuinely interactive discussion about the data, fielding queries in real time and generating easy-to-understand graphical displays on the fly. Finally, you can generate those static PDF or Word reports to share and cement your insights, but they will have come as the result of the discussion and exploration of the project you did as collaborators.

Isn't it time you stopped sharing dry reports, and started sharing answers?