When to be Proactive in a Solution Test Interview
For example, if a user doesn’t understand a key phrase in a prototype, then change it…right then and there.
This is a chance to capture another data point.
With a real-time editing tool like Figma, the user doesn't have to reload their screen to see the change.
One time, my team was testing how to enable a user to manage a subscription shipment via text message.
The initial text message included this:
Type SNOOZE to receive this shipment at a later date
We will contact you again before we send it
The user didn’t understand “SNOOZE” so we edited the prototype to say “PAUSE”. Still didn’t work.
Someone on the team in our private chat suggested “DELAY”. We tried that and it worked better.
We repeated this with later users, trying SNOOZE, PAUSE, and then DELAY in turn, and they all preferred DELAY.
We learned more quickly by making real-time changes to our prototype.
You need to focus your time on likely users, not professional testers.
Some testers who are not in your target audience may slip through the screening process and start to make things up during user interviews.
This problem is not common: fewer than 2% of all the users I've ever interviewed fell into this category.
Be sure to avoid Yes/No questions, which are easy for anyone to answer.
Here are a few telltale signs:
They can't credibly respond to the prompt "Think back to a time..." with a detailed story of their own
They can’t give specific feedback at all (lots of generalities and generic comments such as "that’s great", "that’s interesting")
They lack experience with the problem or opportunity you're trying to solve
They are purposely vague and don’t show an opinion or point of view
You just don't feel like you're learning anything
When I encounter the problems above, I politely end the interview to save everyone’s time. We do not pay users the incentive in these situations.
If I’m using a recruiting service then I will report the user to that service so they can double check whether this user should have been scheduled in the first place.
And I will re-examine my screener questionnaire to see how a user might have bluffed their way through it.
Some interview topics are personally sensitive and require extra care.
This happens when the use of a product connects with race, ethnicity, health conditions, religious beliefs, life events or other personally sensitive matters.
If the interview is about a product to help settle the affairs of a deceased loved one, the research question might be about how to advertise and onboard users to such a product.
Or it could be a scenario to see if someone who is looking for an academic tutor wants to use filters and sorts that select for race, ethnicity, or gender.
Asking these questions might be uncomfortable for the interviewer and answering them might be uncomfortable for the interviewee.
I wasn't good at this at first; I've improved by carefully observing others who are.
Terry Gross of NPR does this well. Study her approach.
Once you've asked the question, be respectful: listen intently to the user's response, keep an open mind, honor the user's right not to answer, and go easy on the follow-up questions. Not every line of inquiry needs the 5 Whys.
If you're nervous about asking a deeper question, find an empathetic colleague and rehearse potential approaches with them.
Keep the focus on the user’s initial reactions and thoughts.
If the prototype is confusing or the research question is complex, wait until the user understands what's going on, then focus on their reaction right after that understanding clicks.
Initial user reactions are more trustworthy because users (and interviewers) haven't yet had time to rationalize.
Long conversations can introduce confirmation bias: given extra time, the user and the interviewer may stretch their thoughts into agreement that wasn't originally there.
…if the user says or does something uniquely interesting.
…if the user takes an action that represents the common reactions of many users.
These timestamps allow you to more easily go back and review a snippet of the interview.
You can also create a highlight video to demonstrate concepts to colleagues who could not attend the interviews.
This applies only when you’ve gotten agreement from the user to be recorded.
Seeing is believing.
This is why Product teams need to do user research themselves. This is why engineers need to see users interact with their creations.
When we see events with our own eyes, we take them more seriously.
That said, not everyone can attend user interviews, so taking a screenshot of the user's face makes the user and their opinions more concrete for those who couldn't be there.
Then make a montage of the users’ faces to create a sense of the volume and diversity of your research panel for your colleagues.
Showing users’ faces helps spread their message.
This applies only when you’ve gotten agreement from the user to be recorded.
Jim is a coach for Product Management leaders and teams in early stage startups, tech companies and Fortune 100 corporations.
Previously, Jim was an engineer, product manager and leader at startups where he developed raw ideas into successful products several times. He co-founded PowerReviews which grew to 1,200+ clients and sold for $168 million. He product-managed and architected one of the first ecommerce systems at Fogdog.com which had a $450 million IPO.
Jim is based in San Francisco and helps clients engage their customers to test and validate ideas in ecommerce, machine learning, reporting/analysis, API development, computer vision, online payments, digital health, marketplaces, and more.
He graduated from Stanford University with a BS in Computer Science.