Analyzing Qualitative Data
March 23, 2015
The last research project I worked on with the NIHR was a close collaboration between several universities, local authorities and NHS trusts. We were looking at evidence use by managers in the NHS, and one of the common stories we heard was how valuable information often ended up on the shelf, never used to inform service provision or policy.
It was always a real challenge for local groups, researchers and academics to create research outputs in a digestible format so that they could be used by decision makers, who often had very short timescales and limited resources. We were also told of the importance of using examples and case studies of other trusts or departments that had succeeded: it’s all very well making suggestions to improve services, but most of the time, the battle is getting those suggestions into practice. It’s one of the reasons we created a mini-case study guide: short, one-page summaries of ‘best practice’ – places where a new approach had worked and made changes.
However, the biggest shock for me was how difficult it was to bring qualitative data into decision making. In many public sector organisations, qualitative data is seen as the poor cousin of quantitative statistics, used only when figures can’t be found, or the interest group is too small for statistically significant findings.
So many wonderful sources of qualitative data seemed to be sitting around collecting dust, including research from community organisations, consultations, and feedback from service users – data that had already been collected and was awaiting analysis for a specific purpose. There was also a lack of confidence among some researchers about how to work with qualitative data, and an understandable sense that it was a very time-consuming process. At best, qualitative data was just providing quotes to illustrate reports like JSNAs, which were mostly filled with quantitative data.
A big part of the problem seemed to be that decision makers, especially those from a clinical background, were more comfortable with quantitative data. For managers used to dealing with financial information, RCTs, efficacy trials and the like, this is again quite understandable: they were used to seeing graphs and tests of statistical significance. But there was a real chicken-and-egg problem: because they rarely took qualitative data into account, it was rarely requested, and there was little incentive to improve qualitative analytical skills.
One group we spoke to had produced a lovely report on a health intervention for an ethnic minority group. Their in-depth qualitative interviews and focus groups had revealed exactly why the usual health promotion message wasn’t getting through, and pointed to a better approach for engaging with this population. However, the first time they went to present their findings to a funding board, the members were confused. The presentation was too long, had too many words, and no graphs. As one of many items on the agenda, they had to make their case in five minutes and a few slides.
So that’s just what they did. They turned all their qualitative data into a few graphs, which supported their case for an intervention in this group. Personally, it was heart-breaking to see all this rich data end up on the cutting-room floor, but evidence is not useful unless it is acted upon. Besides, with their budget now approved, the knowledge the team had gained from the research meant they could craft the right messages for an effective campaign.
This story was often in my mind when we were designing Quirkos – what would outputs that have an impact on decision makers look like? It had to produce visual summaries, graphs and quotes that can be put into a PowerPoint presentation. And why couldn’t the interface itself be used to present the data? If the audience asked a question about a particular quote or group, couldn’t the presenter show it to them there and then?
Opening the door to make qualitative data easier to work with and visualise is one thing, but a change of culture is needed in many organisations to improve the understanding and use of qualitative data. Until this happens, many evidence-based decisions are being made on the basis of a limited style and depth of data, and valuable insights are being missed.
With the prospect of sustained and continued cuts to public services in the UK, there are fewer chances to get something right. Qualitative engagement here can tell us not only what needs to be done and how to learn from our mistakes, but how to get it right the first time.