How to Speed Up Analysis of Solution Test Interviews

The emotional high of conducting user interviews is often met with the spiritual low of having to analyze the results.

User testing creates a lot of unstructured data: interview notes, video recordings, screener questionnaires and so on.

Teams can easily get overwhelmed with the amount of data they need to analyze.

By using the tips below, teams can spend less time on analysis and still gain valuable insights.

Start by being intentional about what you’re testing.

  1. Focus your user test on a single concept. Don’t test everything at once. Remind colleagues that there will be more user tests.

  2. Create hypotheses as you design the solutions. Finish them before the first user interview.

  3. Conduct the interview around your hypotheses. Don’t just ask a fixed list of questions. Use this script to guide you.

  4. Be sure to collect a verdict from the user. Which solution, if any, would they use?

This analysis-friendly methodology will make it a joy to understand the results.

The raw results of each user interview are the notes taken by the team, the recorded video of the interview, and the insights you’ve gained.

It’s easier to do a final analysis if you organize the user feedback as you conduct each interview, not at the end of the interviews.

To start, have everyone take notes in a shared location. Not on paper. Not on their own computer. A shared virtual whiteboard works best: Miro, Mural, FigJam, or a similar tool will do just fine. Right now, I recommend Miro for teams that haven’t used one before.

Avoid using shared docs, spreadsheets, wikis or Notion. These options are clunkier for group note-taking. Shared docs tend to bump users around when multiple folks are adding to them at the same time.

Before the interview, create a space on the board for each user so everyone knows where to write notes for each interview. Here’s an example of a block of notes for one user interview. Feel free to copy and paste from my Product Discovery Miro template.

Another benefit of team note-taking on a shared whiteboard is that team members will see each other’s notes and avoid typing the same note. So you get some real-time consolidation, which also saves on analysis time.

In a solution test, we’re looking to see how each user responds to our solutions. Then we look across the users to see any trends.

I originally created the User Analysis Grid on paper as a way to keep track of in-person user tests. It fits on one sheet of paper, so you can easily see user feedback trends, and it’s a great at-a-glance artifact that you can fill in after each interview.

These days, with most interviews being virtual, we use a shared spreadsheet. Google Sheets even allows us to insert a user photo into a cell to give us a visual reminder of the user.

The User Analysis Grid has two sections:

Section 1. Self-reported data

  • User tester info

  • Relevant demographics

  • Relevant screener questionnaire answers

Section 2. Interview-derived feedback

  • Background data from the beginning of the interview

  • Results of the Primary Hypothesis and Secondary Hypotheses (one hypothesis per row in the grid)

  • Final survey answer

The grid shouldn’t hold all the information and notes. That’s for the virtual whiteboard. The User Analysis Grid holds only the summarized information. This way, you can quickly analyze user feedback in a side-by-side comparison.
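To make that split concrete, here is a minimal sketch of what one user’s column in the grid might hold, written as a Python dictionary. The field names, hypothesis labels, and values are hypothetical placeholders; your own grid will use the hypotheses you wrote before the interviews.

```python
# A minimal sketch of one user's column in a User Analysis Grid.
# All field names, hypothesis labels, and values are hypothetical placeholders.
user_column = {
    # Section 1: Self-reported data (filled in before the interview)
    "user": "User 3",
    "demographics": "Ops manager, 50-person team",
    "screener_answers": "Reviews invoices weekly",
    # Section 2: Interview-derived feedback (filled in after the interview)
    "background": "Currently tracks approvals over email",
    "primary_hypothesis": "Confirmed",       # one hypothesis per row
    "secondary_hypothesis_1": "Rejected",
    "final_survey_answer": "Concept 2",      # the user's verdict
}
```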

Before the interview, fill out Section 1 (Self-reported data), since you already have that data.

After each interview, mark down the user’s reaction to each hypothesis and their answer to the final survey question in Section 2 (Interview-derived feedback).

How many users preferred Concept 1? How many preferred Concept 2? Why? It’s all there in one place.
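If your grid lives in a spreadsheet, the final tally is just a count over the verdict row. Here is a minimal sketch in Python, assuming you have copied the verdicts out as a simple list; the concept names are placeholders.

```python
from collections import Counter

# Hypothetical verdicts copied from the "Final survey answer" row of the grid,
# one entry per user interviewed.
verdicts = ["Concept 1", "Concept 2", "Concept 2", "Neither", "Concept 2"]

tally = Counter(verdicts)
for concept, count in tally.most_common():
    print(f"{concept}: {count} of {len(verdicts)} users")
```

You can get the same count directly in the sheet with a COUNTIF formula; the point is that the verdicts are already summarized in a single row.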

With the User Analysis Grid filled in, my teams often complete their final analysis in 30 minutes or less.

These tips reduce the toll of talking to customers and increase the likelihood that Product teams will do continuous Product Discovery. Try them next time you talk to customers.


Jim is a coach for Product Management leaders and teams in early stage startups, tech companies and Fortune 100 corporations.

Jim co-founded PowerReviews, which grew to 1,200+ clients and sold for $168 million. He product-managed and architected one of the Internet's first ecommerce systems at Fogdog.com, which went public at a $450 million valuation.

These days, he coaches companies to find product-market fit and accelerate growth in digital health, financial services, ecommerce, internal platforms, machine learning, computer vision, energy infrastructure and more.

He graduated from Stanford University with a BS in Computer Science. He lectures in Product Management at the University of California at Berkeley.
