Handle with Care

I haven’t blogged for a while as I have been immersed in the process of analysing my qualitative data, and trying to pull this together into something that might just be clever enough to call “findings”. However, I thought the process of dealing with qualitative data would be a useful topic to blog about.

I have quite a large data set, 25 interview transcripts in all, each about one hour in length, so the logistics of handling such a data set required some consideration. The first decision to make was around transcription of the data itself. Now, many authors suggest that transcribing interview data yourself helps “immerse” you in the data. I did do this for my master’s research, which only had seven interviews, but studying for a doctorate part-time whilst working full-time and having a young family meant that, for me, this was not an option.

You see, I was a bit of a tomboy at school, preferring to spend my time drawing side elevations of houses to scale and making a garden trowel in the technology department rather than learning to type with all my girlie peers who aspired to be secretaries. Oh, how I now lament my lack of typing skills. I knew that typing out 25-plus hours’ worth of interview data would have taken up an enormous amount of my time, and that I would only be able to concentrate on the accuracy of the data rather than its content. So I chose to pay a transcription company to do the transcription for me. Having someone else do the transcription does not remove the need to listen to the audio alongside the files. I still had to listen to each recording again, going through the transcript and making changes for accuracy, but I found this much more time-efficient and a more productive way of immersing myself in my data.

Once I had the transcripts, I had to analyse the data. Here came another decision point, and one which has raised quite a bit of discussion during my supervision sessions: should I use a computer-assisted analysis product such as NVivo, or process my data manually? Since I teach technology-enhanced learning, you might assume this is a no-brainer; NVivo all the way! I did use NVivo but didn’t like working with my data in it. I found I was jumping ahead, not quite getting the depth of analysis I felt I wanted or needed, and I could not easily visualise my data. So instead I reverted to MS Word and developed my own system. I even used highlighter pens, paper and flip charts!

I have been challenged by my supervisors to justify my choice not to use NVivo, yet try as I might, I have found no PhD regulation that states I must show that I have used it. Equally, in the theses I have read, unless the author specifically states they have used it, I would not know whether they had used NVivo or crayons. It was good to be challenged in this way though, as it did make me think through exactly what the issues were and why I chose to do things in a certain way. I am happy working with my system and feel confident I can defend my decisions around my analysis, yet I do feel as though I have failed slightly in not using NVivo.

My data is organised into theme tables for each participant, detailing the in vivo codes from each interview, then interim themes, then master themes. These are then brought together into superordinate themes that span the entire data set. It is detailed, and perhaps a little laborious, but for me it works. I consider the analytical process the priority, and I have used interpretative phenomenological analysis (IPA) as my approach. The authors of this approach encourage a more manual style of analysis, much like the process I have followed, so I am reasonably happy that I have been true to the theory.
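For anyone who likes to see the structure made explicit, here is a minimal sketch, in Python purely for illustration (my actual tables live in MS Word, not code); the participant labels, codes and themes below are invented examples, not my data.

    from collections import defaultdict

    # One "theme table" per participant: each in vivo code (the
    # participant's own words) sits under an interim theme, and each
    # interim theme under a master theme. All names are invented.
    theme_tables = {
        "Participant 01": {
            "Professional identity": {          # master theme
                "Feeling like an outsider": [   # interim theme
                    '"I\'m not a proper academic"',
                    '"they speak a different language"',
                ],
            },
        },
        "Participant 02": {
            "Professional identity": {
                "Learning the rules": [
                    '"you pick it up as you go"',
                ],
            },
        },
    }

    # Pool the master themes across the whole data set: each
    # superordinate theme records which participants contributed to it.
    superordinate = defaultdict(list)
    for participant, masters in theme_tables.items():
        for master_theme in masters:
            superordinate[master_theme].append(participant)

    for theme, participants in superordinate.items():
        print(f"{theme}: {', '.join(participants)}")

The point is simply the shape of the hierarchy: participant-level in vivo codes feed interim themes, interim themes feed master themes, and the master themes are pooled into superordinate themes for the data set as a whole.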

I would be interested to hear from readers of this blog what challenges they have faced when working with qualitative data, and whether they have come across any requirement to use computer-aided analysis. Equally, I would be keen to hear from those who have used it successfully.