The benefits and drawbacks of integrating generative AI into your UX research practice.
Over the past year, generative AI has been all anyone can talk about in the digital product world. But the buzz has focused on what it can do and tips for using it, with little (if anything) said about the human implications, motivations, and decision-making happening between people's ears when it comes to AI.
We wanted to lend our knowledge to this space by learning about consumer behaviors and opinions. We're experts in generative research and in directing our clients on what to do based on what we learn. So, we conducted a generative consumer study with a follow-up quantitative survey to learn what people think about AI and help product teams better introduce their own AI features. You can read that report here.
The kicker? We used AI to conduct some of the work. Not the actual research—but we employed AI to try its hand at distillation and analysis, survey writing, summarizing, and drafting. There are so many speculative articles out there about how AI is going to impact UX research and our careers—we wanted to put it into practice to see what happened.
Rather than prioritizing questions such as, “Do we need AI?” or “Is it going to take our jobs?” we wanted to understand: “Can we work alongside it?” and, if so, “What does that relationship look like?” And ultimately, “Does it help us work more efficiently?”
Is it worth implementing AI in your research process and, if so, where? And where better to use AI than in an experiment about AI?
Let's dig in.
We used Pollfish AI and ChatGPT to draft interview and survey questions for both our qualitative and quantitative efforts.
Typically, we’d write our surveys (and other research materials) using just our brains, what we’ve learned about our client’s objectives, and research best practices.
These tools provided a plethora of question ideas and got us 60% of the way there. While some of the questions were great, most were starter-level questions that any researcher would include. There were a lot of redundant questions and, in the survey especially, some that made no sense at all.
If you’re having a hard time getting started or would rather start with a big list and narrow it down, AI can help. But it’s not going to write the whole thing yet. If you’re a seasoned researcher and can churn out a great discussion guide or survey quickly, this is unlikely to save you much time.
We used Dovetail to transcribe our qualitative interviews, which is not new to our process.
We’ve used Dovetail for several years for transcription, and it has made a huge difference in our speed of delivery and quality of work. Before Dovetail, we were hand-writing or typing notes and needed two researchers in each interview session: one interviewer and one notetaker. With Dovetail, we just upload a video of the interview and it transcribes everything for us, so everything that was said is captured in exact detail and we save time in the session itself.
Sometimes the transcript is incorrect, but it’s nothing more than a quick fix here and there.
Having a tool for transcribing interviews is somewhat of an industry standard and something we will continue to use.
We used ChatGPT to summarize qualitative interview transcripts and then tagged the summaries.
This is something we would normally do manually, taking the time to read all of the raw data and tag it according to important themes and research objectives. Then we'd use affinity diagramming within each theme to define it.
Typically we read the transcript and tag it linearly, but with AI we had to copy and paste specific pieces of the transcript into ChatGPT and ask it to summarize the answer to a specific question. If we pasted the whole transcript in at once, the AI couldn't parse out the answers to all of our questions. ChatGPT's character limit is also a big barrier to this process; it simply can't accept the full transcript of a 45-minute interview.
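If you'd rather script that chunk-and-summarize workflow than copy and paste by hand, here's a minimal sketch using the OpenAI Python client. To be clear, we used the ChatGPT web interface, not the API; the model name, chunk size, and prompt wording below are placeholder assumptions for illustration.

```python
# Minimal sketch of a chunk-and-summarize workflow, assuming the openai
# Python package (v1+) and an OPENAI_API_KEY in the environment. The model
# name, chunk size, and prompt wording are placeholders, not what we used.
from openai import OpenAI

client = OpenAI()

def chunk_text(text, max_chars=8000):
    """Split a long transcript into pieces small enough to fit in one prompt."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def summarize_answer(transcript, question, model="gpt-4o-mini"):
    """Summarize how a participant answered one question, one chunk at a time."""
    partial_summaries = []
    for chunk in chunk_text(transcript):
        response = client.chat.completions.create(
            model=model,
            messages=[
                {"role": "system",
                 "content": "You summarize UX research interview transcripts."},
                {"role": "user",
                 "content": (f"Interview question: {question}\n\n"
                             f"Transcript excerpt:\n{chunk}\n\n"
                             "Summarize how the participant answered this question. "
                             "If the excerpt doesn't address it, say so.")},
            ],
        )
        partial_summaries.append(response.choices[0].message.content)
    return "\n".join(partial_summaries)
```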
We also tried ChatGPT to summarize the responses to our open-ended screener and survey questions.
Normally, we would save this process for last, reading through each answer and manually tagging it in Google Sheets, then analyzing the counts of each answer.
This was way faster than reading 400+ open-ended survey responses manually, though pulling out specific, interesting quotes was more of a challenge down the line.
This was worth it given the effort-to-value ratio of the responses: the open-ended questions didn't end up adding any new insight to the research, so we were glad we hadn't spent the time tagging them manually.
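The same idea works for a column of short responses, batched rather than chunked. Here's a rough sketch, assuming the responses live in a CSV export with a "response" column; the file name, batch size, model, and prompt are again our own placeholders.

```python
# Minimal sketch: batch-summarize open-ended survey responses from a CSV export.
# Assumes a "response" column; file name, batch size, and model are placeholders.
import csv
from openai import OpenAI

client = OpenAI()

def load_responses(path="survey_open_ends.csv"):
    """Read the non-empty open-ended responses out of the survey export."""
    with open(path, newline="") as f:
        return [row["response"].strip() for row in csv.DictReader(f)
                if row["response"].strip()]

def summarize_themes(responses, batch_size=100, model="gpt-4o-mini"):
    """Ask for recurring themes and rough counts, one batch of responses at a time."""
    summaries = []
    for i in range(0, len(responses), batch_size):
        batch = "\n".join(f"- {r}" for r in responses[i:i + batch_size])
        result = client.chat.completions.create(
            model=model,
            messages=[{"role": "user",
                       "content": ("Here are open-ended survey responses about AI:\n"
                                   f"{batch}\n\n"
                                   "List the recurring themes and roughly how many "
                                   "responses mention each.")}],
        )
        summaries.append(result.choices[0].message.content)
    return summaries
```

Keeping the raw rows next to the summaries (for example, asking for one verbatim quote per theme) would have made it easier to pull those specific, interesting quotes later.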
We used BigML to analyze our survey data to form persona clusters.
This is something we’ve been doing for years, and it works great: the tool can look at thousands of data points and find the groups within them, which would be almost impossible for a human to do with any sort of accuracy.
There can be some trial and error with the clustering: figuring out how many clusters to create or which data to use.
We will continue to use tools like BigML to process large amounts of data, specifically for persona creation.
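For the curious, here's roughly what that step looks like as a script, using the BigML Python bindings. The CSV name, the value of k, and the output options are assumptions for illustration, not our actual configuration; in practice we iterate on k and on which survey fields to include.

```python
# Minimal sketch of the persona-clustering step with the BigML Python bindings.
# Assumes BIGML_USERNAME / BIGML_API_KEY are set in the environment; the file
# name and k value are placeholders chosen for illustration.
from bigml.api import BigML

api = BigML()

# Upload the survey export and turn it into a dataset BigML can model.
source = api.create_source("survey_responses.csv")
api.ok(source)
dataset = api.create_dataset(source)
api.ok(dataset)

# Build a cluster model; picking k (and which fields to feed it) usually
# takes a few rounds of trial and error before the groups are interpretable.
cluster = api.create_cluster(dataset, {"name": "persona clusters", "k": 5})
api.ok(cluster)

# Assign every respondent to a cluster and download the result for review.
batch_centroid = api.create_batch_centroid(cluster, dataset, {"all_fields": True})
api.ok(batch_centroid)
api.download_batch_centroid(batch_centroid, filename="persona_clusters.csv")
```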
While the industry is embracing AI in a big way, it’s important to recognize where it’s not the same as a human doing the work.
From this experiment, we learned that AI tools will (if they haven't already) take over the manual labor of research, such as editing for bias and grammatical errors and transcribing interviews, alleviating the workload of UX researchers. Still, we don't ultimately see AI replacing us; instead, it will augment our skills and make space for what we're truly good at: strategy, creativity, and empathy.
In some ways, AI made this project faster and showed us where we could be more efficient by using it. It’s imperative to our success and growth as a company—and UX researchers as a profession—to continue to experiment with AI and figure out how to adopt it in a meaningful way. As the quote from Curtis Langlotz, MD, PhD goes:
Artificial intelligence will not replace radiologists … but radiologists who use AI will replace radiologists who don’t.
The tools in this space are always evolving, so we will continue to experiment as they roll out. For us, it's not a "never"; it's a "not yet."
So, what do you think? How have you used AI in your research work and what did you learn from that experiment? Share your thoughts with us at hello@zocodesign.com or on our social media!
Have a project you’re working on and need some support? Reach out to us.
Do you just want to chat about product, UX, research, process, and methodologies? We’re down for that too. Let's chat.
Do you want to avoid talking to another human being right now? We get it. Sign up for our Curious Communications newsletter to stay up to date on all things UX and other curiosities. We’ll hit your inbox every few weeks.