How to Recruit Users for a Successful Solution Test Interview


600+ Users Interviewed

6 Years

80+ Product Teams


"I don't know how to recruit users"

"It seems like such a hassle"

Read these quick tips for recruiting and screening users


Recruiting sources

Recruit users yourself from these sources:

  • Customers (free)
  • Online interview panels such as UserInterviews.com, Ethn.io, UserZoom.com ($)
  • Recruiting agencies ($$)

Advanced Product teams automatically source users via website or app solicitations that link to a self-signup calendar

Scheduling logistics

Using a recruiting agency is the easiest option: the agency handles all of the scheduling logistics and more or less guarantees that users will show up. ($$)

With a self-service online recruiting setup, you link your calendar (and perhaps other teammates' calendars) to the recruiting system. You open up possible time slots, users choose from the available options, and the system sends out calendar invites and reminders. ($)

When you do everything yourself, you'll be emailing back and forth with potential user testers trying to find open time slots. Using the Bookings feature in Microsoft 365 or the Appointment Scheduling feature in Google Workspace will alleviate some of this scheduling burden. You will still want to send reminders a week before, the day before, the morning of, an hour before the interview, and then right at the interview start time. Users are busy; this is best practice for getting them to show up to interviews. ("free", just takes your time)
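If you're handling scheduling yourself, even a small script can lay out that reminder cadence for each interview. Here's a minimal sketch in Python (the 9am time for the "morning of" reminder is an assumption, not a rule):

```python
from datetime import datetime, timedelta

# Reminder cadence from above: a week before, the day before, the morning of,
# an hour before, and right at the interview start time.
def reminder_times(interview_start: datetime) -> list[tuple[str, datetime]]:
    """Return (label, send_time) pairs for a single interview."""
    return [
        ("one week before", interview_start - timedelta(days=7)),
        ("one day before", interview_start - timedelta(days=1)),
        ("morning of (9am)", interview_start.replace(hour=9, minute=0)),  # assumed 9am
        ("one hour before", interview_start - timedelta(hours=1)),
        ("at start time", interview_start),
    ]

# Example: a 2pm interview
for label, when in reminder_times(datetime(2025, 5, 15, 14, 0)):
    print(f"{label:>17}: {when:%a %b %d, %I:%M %p}")
```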

Recruit from your own customers

These users are known to be authentic

You can cross-reference prototype feedback with actual user behavior

Link to your screener form from your application

Keep it open at all times for people to sign up (Google, Salesforce, and others do this)

Incentive payouts to user testers

Paying users for interviews is standard practice and does not taint the results as long as you conduct high-quality interviews with high-quality experiments/prototypes.

You will need budget to pay users. Your manager will likely need to figure out how user incentives will be budgeted and paid out. Many employees are not allowed to buy Amazon gift cards (the most common incentive) and give them out. I've personally had trouble just buying multiple Amazon gift cards since it seems to trigger some fraud logic and the purchase never completes. In any case, you will need to find a way to pay users.

That said, many teams aren't allowed to pay users for company policy or regulatory reasons, in which case they rely on the goodwill of users to sign up, show up, and participate.

There are also situations where companies have great relationships with users and there's no need to pay them for 30 to 60 minutes of their time.

The sign that you have to increase incentives is when users do NOT respond to your outreach. So reach out to just a subset of users in the beginning so that you can learn which price point will bring them to the interview.

Common rates for 60 minute interviews are $35-$50 for consumers and $70-$200 for business users.


Finding the right users

Use more than demographics to find users

To validate a solution, find users that have (or have had) the problem or are eligible for the opportunity you're solving for

By recalling current or previous experience, the user can get into the right mindset for the interview

In addition to demographics, screen users based on:

  • behavior
  • experience
  • other factors ... e.g., Apple or Android owner
  • (business users) job role
  • (business users) type of company they work at

Screen for experience in specific situations relevant to your offering:

  • Consumer example → Hiring a handyman in the past 6 months
  • Business example → Analyzing regression data sets in the past 3 months
  • Narrow criteria example → What tech stack someone currently uses

Cut down on your time spent reviewing responses

Create a series of multiple choice questions where certain answers will automatically qualify users for a final review

Multiple choice questions are preferred over "Yes/No" questions since users won't be able to just answer "Yes" to everything
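If your form tool can export responses, the auto-qualification step can be a few lines of code. A minimal sketch with made-up question IDs and answer choices; the real qualifying answers depend on your screener:

```python
# Hypothetical screener: each question maps to the answers that keep a
# respondent in the running. Only respondents who qualify on every question
# move on to a final human review of their written story.
QUALIFYING_ANSWERS = {
    "q1_recent_experience": {"Hired a handyman in the past 6 months"},
    "q2_device": {"iPhone", "Android phone"},
    "q3_decision_role": {"I make or share these decisions"},
}

def shortlist(response: dict[str, str]) -> bool:
    """True if every screener question was answered with a qualifying choice."""
    return all(
        response.get(question) in answers
        for question, answers in QUALIFYING_ANSWERS.items()
    )

responses = [
    {"q1_recent_experience": "Hired a handyman in the past 6 months",
     "q2_device": "Android phone",
     "q3_decision_role": "I make or share these decisions"},
    {"q1_recent_experience": "No home projects recently",
     "q2_device": "iPhone",
     "q3_decision_role": "Someone else decides"},
]
finalists = [r for r in responses if shortlist(r)]
print(f"{len(finalists)} of {len(responses)} responses move on to story review")
```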

Describing their story

In the questionnaire, ask users to briefly write out the situation or experience that relates to the pain point or opportunity that you're interested in

Reading the user's story is usually the final step that I use to accept or reject a candidate


Believability

Recruiting high quality users improves the BELIEVABILITY of your research

If colleagues don't agree with your results, they will immediately doubt WHO you interviewed

Make the content of the prototype relatable to users

For example, if your prototype heavily uses a map, then choose a location on the map where you can recruit users from

One of my teams has several million customers in Los Angeles

Customizing the prototype to be in LA helped users envision our concept as a working application

This increases quality and believability

Find users with RECENT experience in the pain point or opportunity that you're solving for

"Recent" can vary from 30 days to 12 months depending on how memorable and common the experience is that you're solving for


The Effect of Demographics

Balance users you recruit based on demographics

We'll balance by gender in week 1...

...then track who we've interviewed and eventually balance the panel by adjusting other factors (see below)

My most recent 100 user interviews by location

I strive to balance user panels by:

  • gender
  • race/ethnicity
  • age (within the range appropriate for the solution)
  • income bracket

I sometimes balance by:

  • geography
  • urban/suburban/rural
  • education level
  • experts vs beginners
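To see where the panel stands each week, a running tally over your interview log does the job. A minimal sketch using Python's Counter; the field names and records are hypothetical:

```python
from collections import Counter

# Hypothetical interview log: one record per completed interview.
interviews = [
    {"gender": "female", "age_band": "25-34", "income": "$50-100k", "geo": "urban"},
    {"gender": "male",   "age_band": "35-44", "income": "$50-100k", "geo": "suburban"},
    {"gender": "female", "age_band": "45-54", "income": "$100k+",   "geo": "urban"},
]

# Factors to balance on (the always-balanced and sometimes-balanced lists above).
FACTORS = ["gender", "age_band", "income", "geo"]

def panel_breakdown(log: list[dict]) -> dict[str, Counter]:
    """Count interviews per value for each balancing factor."""
    return {factor: Counter(rec[factor] for rec in log) for factor in FACTORS}

for factor, counts in panel_breakdown(interviews).items():
    print(factor, dict(counts))
# Recruit the next round from whichever buckets are underrepresented.
```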

Which employment status fits your target users best?

Most teams will be able to select users from:
  • Full-time employed
  • Homemakers
When appropriate, include these users:
  • Students
  • Part-time employed
  • Retired

How deeply do demographics affect a concept's validation or invalidation?

Not as much as I used to think, since most software solutions target a specific behavior...not a whole person

Since people in different demographics can do the same behavior or share a pain point, demographics don't often dominate


How many users to interview

User interviewing gives us evidence but not proof

Negative reactions (invalidation) from users are easier to spot and feel confident about. I make changes quickly when I see users rejecting concepts.

Positive evidence (validation) takes more time. I prefer to be cautiously optimistic and develop more evidence before declaring validation

recruiting-users.png

Recruit more users in the first week

A larger group of users in the first week helps you quickly understand WHO you are building a solution for...in addition to the solution itself

I like to recruit 6 users so I get at least 5 interviews in the first week

Often one doesn't show or is not quite the right user

Iteration is more important than the actual number of users

However, teams new to interviewing often ask me for a framework to get them started

If you're new to user interviewing, use my 6-3-3-3 framework

First test with 6 users
then 3 users
then 3 users
then 3 users

Speaking with 15 users develops confidence for medium-sized solutions (2 to 4 weeks to build)
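If it helps to see the plan laid out, here's a tiny sketch of the rounds. I'm treating 6-3-3-3 as interview targets and padding each round with one extra recruit to absorb the usual no-show; that padding rule is an assumption you should adjust to your own show rates:

```python
# 6-3-3-3 framework: target interviews per weekly round (15 total).
ROUNDS = [6, 3, 3, 3]

def recruiting_plan(rounds: list[int], padding: int = 1) -> list[tuple[int, int]]:
    """Return (users_to_invite, interviews_planned) per round."""
    return [(wanted + padding, wanted) for wanted in rounds]

for week, (invite, wanted) in enumerate(recruiting_plan(ROUNDS), start=1):
    print(f"Week {week}: invite {invite} users, plan on {wanted} interviews")
print("Total planned interviews:", sum(ROUNDS))
```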

Make sure to tweak the concept as you gather evidence. You do not need to keep the concept exactly the same for every interview.

The 6-3-3-3 framework acknowledges that recruiting mistakes happen

...teams make recruiting mistakes and can adjust and improve their recruiting criteria after the first 6 users

Finding the right user can take a couple rounds of screening and interviewing


Filtering out users

Weeding out users...I'm often asked about users who lie

This is not common but it happens enough that you need steps in your process to ensure that you're getting the right users...read on...

First step… HIDE the type of user you're looking for

To find users that have the pain point that your solution targets, ask a general question where only ONE of the answers is your targeted criteria

Avoid a Yes/No question as your screening criteria. It's too easy for potential user testers to just click Yes.

Reject users who select all answers

See image for sample question

Go deeper with this userinterviews.com article

sample-screener-question.png
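If your screener tool exports multi-select responses, this check can be automated too. A minimal sketch with made-up options: only one option is the real criterion, and anyone who checks every box is rejected:

```python
# Hypothetical multi-select screener question. Only one option is the
# criterion we actually care about; the rest are plausible decoys.
OPTIONS = {
    "Booked a flight online",
    "Hired a handyman",          # the hidden target criterion
    "Renewed a passport",
    "Sold a car privately",
}
TARGET = "Hired a handyman"

def passes_screener(selected: set[str]) -> bool:
    """Reject respondents who check every box; otherwise require the target answer."""
    if selected >= OPTIONS:   # selected everything: likely gaming the screener
        return False
    return TARGET in selected

print(passes_screener({"Hired a handyman", "Renewed a passport"}))  # True
print(passes_screener(OPTIONS))                                     # False
```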

Second step...ask user to turn on their video

Users might disguise their voice or other traits and participate in a study multiple times (happened to me)

I usually end remote interviews if the user refuses to turn on their camera

Facial expressions help determine a solution's value

Third step...trust but verify

At the start of the interview, ask the user to recount a recent experience with the pain point or opportunity that you're solving for

Don't accept generalities

The user must be able to tell their story with enough detail that you believe them

Fourth step...recap the user panel each week

Each week, briefly review all users interviewed with your core Product Discovery team

Determine which gaps in user demographics or user experience you need to recruit for in the remaining rounds of the 6-3-3-3 method


When to revisit users

Should you talk to users a second time?

Yes. A user is a valid test subject as long as they have the pain point or are eligible for the opportunity that you are solving for.

In my experience, repeat interviewing with target users usually leads to deeper insights.

The exception is when you are testing growth concepts like these:

  • landing pages for new users
  • Google/Facebook ads for new users
  • most things related to attracting new users

...then you'll need to find new users each time

Note the users that you'd like to talk to AGAIN

Repeatable users:

  • have relevant experience to the pain point your solution targets
  • represent a group of users important to you
  • effectively think out loud
  • grasp new concepts easily
  • agree to being contacted again
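A lightweight way to note this is to tag each interviewed user against those criteria as you go. A minimal sketch; the record fields simply mirror the list above:

```python
from dataclasses import dataclass

# Hypothetical per-user record mirroring the "repeatable user" criteria above.
@dataclass
class InterviewedUser:
    name: str
    relevant_experience: bool
    key_segment: bool
    thinks_out_loud: bool
    grasps_new_concepts: bool
    ok_to_recontact: bool

    @property
    def repeatable(self) -> bool:
        return all([
            self.relevant_experience, self.key_segment, self.thinks_out_loud,
            self.grasps_new_concepts, self.ok_to_recontact,
        ])

panel = [
    InterviewedUser("User A", True, True, True, True, True),
    InterviewedUser("User B", True, False, True, True, True),
]
print("Talk to again:", [u.name for u in panel if u.repeatable])
```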

Experts vs. Beginners

Always determine if the user is an expert or a beginner

Sometimes I even ask the user to self-evaluate

The user's feedback will reflect their expertise

After the interviews, you'll see which level of experience your solution works best for (if at all)


Remember...ANY USER is better than no user

Just the act of showing your idea to others causes you to think more deeply about your concept


Jim knows how to build and scale successful products.

He co-founded PowerReviews, which grew to 1,200+ clients and sold for $168 million. He product-managed and architected one of the first ecommerce engines at online retailer Fogdog.com, which had a $450 million IPO.

These days, he coaches Product teams and leaders at startups and corporations to use Product Discovery to validate and test their ideas before building them. He’s created a custom curriculum and training program that pulls from his 25 years of experience and the best minds in Product Management. He graduated from Stanford University with a BS in Computer Science.

Jim is based in San Francisco and helps clients engage their customers to test and validate ideas in ecommerce, machine learning, reporting/analysis, API development, computer vision, online payments, digital health, marketplaces, and more.
