Maximizing Insights: Leveraging Big Data in Qualitative Research

The use of big data in qualitative research has become increasingly popular and relevant in recent years. Big data refers to large and complex data sets that can be difficult to process and analyse using traditional research methods. Qualitative research provides an opportunity to make sense of this data through the application of rigorous and systematic research techniques. By synthesizing disparate sources of data, qualitative research enables researchers to identify meaningful patterns, emerging trends, and insights that can help to drive business decisions. In this way, the combination of big data and qualitative research can offer a powerful toolset for understanding customer behaviour, improving products and services, and driving business success.

With “big qual” also being a hotly debated topic at our Qual360 conferences, we asked our presenters how they are tackling the challenge of big data to generate new, qualitative insights.

“Constant iteration, rigorous testing, and quality control are our cornerstones,” says Marissa Bell from Reddit. “We discern the current capabilities of models versus our envisioned future outcomes. The key is forward-thinking: envisioning the conclusions we desire in 2-5 years and crafting models to actualize these results. Concurrently, we adapt our human-made methods to synergize with today’s and future computer-generated results, amplifying the impact and influence of our insights.”

For Yas Parchizadeh at eBay, defining research objectives clearly and identifying which data sources contain the most valuable qualitative information is a key prerequisite. Ensuring data is cleaned and pre-processed to remove noise and guarantee high quality is one of the Foundational Insights Manager’s core recommendations. Parchizadeh uses NLP techniques such as sentiment extraction and topic modelling to draw qualitative insights from big data. A further recommendation is to add contextual analysis, elevating insights into actionable recommendations for stakeholders.
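Parchizadeh’s actual pipeline is not described in detail, but to make the idea concrete, a lexicon-based sentiment pass is one of the simplest NLP techniques of this kind. The sketch below is a minimal, illustrative example in plain Python (the word lists are invented for the example, not a real sentiment resource): it scores a batch of free-text comments and ranks the most negative ones first, so a researcher can pull them out for closer qualitative review.

```python
from collections import Counter

# Toy sentiment lexicon -- purely illustrative. A production pipeline
# would use a curated lexicon or a trained sentiment model instead.
POSITIVE = {"great", "love", "easy", "fast", "helpful"}
NEGATIVE = {"slow", "broken", "confusing", "hate", "expensive"}

def sentiment_score(text: str) -> int:
    """Crude polarity score: +1 per positive word, -1 per negative word."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    counts = Counter(tokens)
    return (sum(counts[w] for w in POSITIVE)
            - sum(counts[w] for w in NEGATIVE))

comments = [
    "Love how fast and easy checkout is!",
    "Shipping was slow and the tracking page is confusing.",
    "Great selection, but returns are expensive.",
]

# Surface the most negative comments first for manual qualitative review.
for score, comment in sorted((sentiment_score(c), c) for c in comments):
    print(score, comment)
```

In practice this kind of automated first pass only triages the data; as the article’s contributors stress, the qualitative interpretation of the flagged comments still has to be done by a researcher.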

“Constant iteration, rigorous testing, and quality control are our cornerstones.” – Marissa Bell, Reddit

Google’s Vanessa Bruns recommends not relying solely on big data analysis but using a mixed-methods approach. She suggests integrating insights from smaller-scale qualitative tools, such as focus groups or observations, to add depth and context to quantitative findings and to understand the “why” behind the data.

“For example, we see trends in search, such as people searching more for terms like ‘best’ when they are shopping and less for terms like ‘cheap’. But it was only via qualitative research (observing shopping behaviour) that we were able to understand the motivations and behaviours underlying the trend in the data,” says Bruns. Another recommendation from Google’s Insight Manager is to communicate findings in a human-centered way: translate complex quantitative findings into clear and compelling narratives and storylines that resonate with non-technical audiences, ensuring the insights have real-world impact.

Jennifer Dorman at Babbel recommends a number of tools she uses for different parts of automation and analysis. Dorman breaks her toolbox down into transcription tools, online survey and interview tools, and communication and sharing platforms. “Efficient transcription is key to making sense of qualitative data, especially at scale or in multiple languages. Platforms offer accurate and fast transcription services, enabling researchers to convert audio and video data into searchable, analyzable text,” says Dorman.

Online platforms for either conducting research or communicating results are also crucial for her. Research platforms help to conduct moderated and unmoderated research both ethically and efficiently, while communication platforms can serve as a centralized repository or library where research data, reports, and findings are stored and accessed by all stakeholders.

In conclusion, the successful implementation of big data in qualitative research requires clear goal setting and thorough preparation of both the data used and the desired outcomes. A mixed-methods approach is currently the most widely favoured solution and likely will continue to be for some time. While a number of tools are available to help with everything from transcription to interviewing and the dissemination of results, a human-centered way of translating complex quantitative data into compelling narratives will remain crucial.

If you would like to hear more about the use of big data in qualitative research, check out the upcoming Qual360 Europe and North America editions. All quoted speakers will present live at the Qual360 Europe edition on February 22 & 23 in Berlin.

Big Data at Qual360 2024

Jens Cornelissen

Jens Cornelissen has been writing for over two decades – initially for general newspapers in his home country Germany. After receiving an MA degree in Communications, he joined a new media start-up in Amsterdam as consultant on new media technologies and country editor for two daily newsletters. In his current day job, Jens runs the global conference division for Merlien’s dedicated marketing research events. Jens is a trained journalist with a BA in Journalism from Westminster University in London and has authored several media industry reports and articles on mobile and media technology.
