BAistanbul Conference (Istanbul, Turkey): Customer Interviewing for Product Teams

Speaking at the BAistanbul Conference:

Learn about customer interviewing. What’s a “bad” interview? What’s a “good” interview? Get actual resources to make your own great interviews.

Don’t wait for a UX researcher. You can successfully interview customers…now. Learn when you should speak to customers and how you can get the most out of that precious time.

Speaker page: Jim Morris

Agenda page: BAistanbul 2023 Agenda

Conference page: BAistanbul Conference

Hi, everyone. Welcome to Customer Interviewing for Product Teams. I'm Jim Morris.

I started out as a software engineer. I graduated from Stanford University and immediately jumped into startups.

In my first startup, I built an ecommerce engine with my colleagues and we sold sporting goods back when Amazon only sold books.

We had a successful exit in a $450,000,000 IPO.

After that, I started a company called PowerReviews with a couple of colleagues. It was all about creating a product reviews platform. We grew from zero to 1,200 clients in seven years and then sold to our nearest competitor. The federal government in the United States thought we were a monopoly, so it split the companies apart, and I again worked in an independent company on a product reviews platform.

It was during this time that I really grew from software engineer into an executive, and into somebody focused on the product management side of the business. How could we find the right idea? What was the right way to bring that idea to market, and how could we test it with customers? So I set out on my own to be a product discovery coach. In this time, I've also started lecturing in product management to graduate students at the University of California, Berkeley, and I participated in a startup focused on helping you have better posture every day.

So I've practiced this product discovery coaching across a variety of industries: consumer-focused industries, digital health, and then tech, finance, and energy.

These practices that I'll share with you today about customer interviewing will work in a variety of fields.

Let's start with a cautionary tale. I worked with an entrepreneur who had built a website. He had spent $50,000 on market research and then $150,000 on actually building the website. The problem was that there was zero traffic to this website, and he hadn't really learned why there wasn't any traffic. So he came to me and we started out on a product discovery journey. We created prototypes, we talked to users, and we slowly built up what the most important part of this application would be for these folks. As it turns out, we could build that on a low-code/no-code website for about $150 a month. We launched it, and we went to the places we learned about in the user interviews where people wanted this service. They might have been gathering on Facebook or on Nextdoor.com, and we started to advertise and talk about this website and this service in those areas and draw people to the site. These folks created content, that content was picked up by Google, and from these sources and from Google we started to drive traffic. Then we started to learn from that traffic, and the entrepreneur started to pivot the idea, rename the idea, and eventually grow it into a subscription service focused on this particular area.

The reason I'm telling you this is that through customer interviewing we were able to very quickly and very cheaply figure out the main purpose of this website, and to really check this idea before having to build it and spend that money. Unfortunately, in this situation, the entrepreneur could have done this discovery before building the website and saved quite a bit of money. That's what we'll talk about today: how users, through interviewing, can save you time, save you money, and get you to market faster with the right product.

But what is success? Many people feel that success is launching the product. In reality, I want you to shift your mindset: get obsessive about adoption of your product and then about commercial goals. Are you making money? Are you saving money? What is the return on that investment? Getting adoption and achieving commercial goals, that's what I want you to focus on. That is closer to success than launching your product. And in discovery we can learn about this without actually building the product. So really what I want you to do is to start in product discovery, and that's where we'll do our customer interviewing. Most of the clients I work with can do product discovery and learn quite a bit without ever writing a line of code. We've got all these users, of course, so we're going to help you learn from them.

So why do we talk to users? Here's a great case study from Eric Ries and The Lean Startup. This is a story about how they wanted to build a chat client so that folks could interact with each other, and they thought the best way to do this was to integrate into existing chat systems and existing networks of users. The problem was that instead of doing some discovery and talking to people, they thought hard as a group, came up with some great ideas, started to build them, spent six months building the product, and then had a very hard time getting adoption. They fixed all the bugs, but it still wasn't the right product.

After a while they finally started talking to users. What could they have done? Well, of course, talk to users sooner. They ended up trying to integrate with six or more of these popular chat systems; instead, they could have built just one integration. Or instead of integrating with one chat system, they could have just provided a link to that chat system to see if anybody would even click on it. And even before that sort of fake-door test, they could have just interviewed folks with a fake prototype, a design prototype, to gauge their interest.

So as you start with "well, I can build this great idea," let's back into the easiest ways to do it. Customer interviewing is one of the easiest, fastest ways to get a bit of evidence before you go down a six-month journey.

There's a great book that talks about how, when you start your company, you can get some great input from users. Alberto Savoia reminds us that skin in the game is something you should ask for, and it gives you confidence that people will really adopt your product and maybe deliver commercial goals such as revenue or saved money. Because, again, opinions are free. You can get a reaction from users. You can get an opinion. You can get a thoughtful response. What if you decided to ask for something that actually costs something, like getting the user to give you their real email address, their cell phone number, their home address, their credit card information? And really, what if they were to pay you a financial deposit? This is a level of increasing risk for the user, but again, it's skin in the game. How badly do they want access to your product?

There's a story about Tesla in 2016 asking users if they'd like to put down $1,000 in return for waiting on a waitlist for an electric car that was more affordable. 325,000 users put down a preorder at the cost of $1,000 each. These folks waited three years for this product to be built. Tesla could feel very confident that there was going to be a market for this product because of this demand, the skin in the game that people had put in. So, as you're thinking about getting opinions from users, also think about what that skin in the game is.

Now, who should we interview? Someone you don't know who also fits the description of your target customer. Not your friends, not your family, not your coworkers. If you've already built something, you could interview somebody who has recently used that product or service. In a situation where you haven't built it, you can look for the target customer who might use the product you would build. It is okay to interview users again if they've got constructive feedback, or if it's a first-time user experience or onboarding; sometimes interviewing the same person over and over again does dilute the feedback. And it's okay to create a user group as long as they're aligned with your vision. When I had 1,200 clients at PowerReviews, I had to find the clients that were aligned with my vision, not the clients who wanted a custom software solution. You'll find users like this when you look for users to give you feedback. I do a lot of interviewing over Zoom, and you find a lot of interesting folks who are able to give you that feedback.

Now, when should we talk to users? Well, of course, in traditional product development, people don't talk to users. They have a roadmap item, they have a solution, they don't do experiments, they don't really expand their set of ideas, and they just get it done. Don't do this. In modern product development, we want to find some problems and opportunities, and we want to act on those solutions with experiments. So we'll start on the left, we'll expand our set of problems and opportunities, we'll narrow down and find a top pain point, we'll sketch and brainstorm solutions, and we'll work with users to validate a solution and build. We might have to iterate, we might have to discard ideas, but again: expand, contract, expand, contract, a very common model. And we're going to use customer interviewing during these various phases.

In the first part, we're going to gather and understand problems using a user experience map. In the second part, we're going to verify a top problem and a success metric with an opportunity assessment. And in the third way we talk to users, we're going to create prototypes and do user solution test interviews.

Great. So let's talk about using a user experience map. A user experience map is like a journey map, but simpler. This is a form I use in Miro. Again, here is the entrepreneur's concept: the idea was to make it easier to hire a plumber. This is a map of the existing process, and we use it to identify problems and opportunities. In here we figured out: how might we ask neighbors for recommendations on hiring a plumber? That's an interesting idea; what service could we provide to get people in touch with their neighbors? There were other solutions on this map, other ideas I should say: how might we know which friends have used a plumber recently? How might we collect recommendations from family members? How might we make this easier for the homeowner?

Second, we take this map and our potential future solutions, and we interview experts to validate our intuition and to learn more about problems and opportunities. So let's take this to the experts. You can ask them about the challenge: where are we wrong? Where do we need additions? Ask the expert to explain the decision processes in the map. Ask why as your expert explains a certain area, and ask them to tell you more about it. It's a simple interview style, really checking your assumptions, and it's a great way to get to know a part of your market, your target customer, or a subject matter expert in your market.

There are other times and touch points where you can find problems and opportunities. One is doing win/loss analysis: when your business customers sign up, you can talk to them; if they decline, you have to find them and see why they declined. You can go visit your customers; a day in the life is a powerful way to see what's going on. You can read your reviews in your app store, if you have them. You can put out quantitative surveys, but be careful with them, because they can carry some bias if you're paying users who aren't your customers to fill them out, and you may word the questions in a way that introduces bias. A great way to gather problems and opportunities is to sit in on calls that are already happening at your company with sales, customer success, and support. You could also go on a client listening tour; with my 1,200 clients, at one point I visited 100 of them in one year, two a week, to hear what was going on with them with regard to our product.

You can learn more about this process of using an experience map and talking to experts in the Sprint book. It's a great resource, and I've got a link there to my templates that you saw for user experience maps.

So again, we talked about user experience maps as a place to interview customers. Now let's talk about the pain point. You think you've got the right pain point? Well, let's go talk to customers about it.

Here's another cautionary tale. I told you about visiting customers two per week. Well, this was my roadmap; of course it's sort of blurred out, but you get the sense of it. I went to my customers and I said, hey, we are doing these great things. They proceeded to explain to me that they were having trouble with mobile, that our system was not compatible with mobile, and that we needed to fix that. In fact, their ecommerce websites were not that compatible with mobile, and more and more people were using mobile. After several of these customer interviews and meeting lots of these customers, we decided to pivot, and we pivoted our entire roadmap. We threw it away, and we used these customer interactions about pain points to alter our focus and just focus on mobile. And this was tremendous. Our users loved it, they really thanked us for listening to them, and we actually became a resource for them for mobile excellence. So the cautionary tale is: don't proceed with solutions until you select a problem and a success metric and talk to your customers about it.

And that is the opportunity assessment. It's very simple. You can think of a business objective here: achieve mobile compatibility. A key result: product review completion on mobile improves from 10% to 50%. Who is my customer? The website customer. And what's their problem? It's too difficult to write a review on a mobile device.
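To make that concrete, here is a minimal sketch of that one-page opportunity assessment as a small data structure, filled in with the mobile-review example from this talk. The field names and the Python form are my own illustration, not a format prescribed by the talk or by the Inspired book.

    from dataclasses import dataclass

    @dataclass
    class OpportunityAssessment:
        """One-page summary used to verify a problem and a success metric."""
        business_objective: str   # the objective the work serves
        key_result: str           # the measurable success metric
        target_customer: str      # who has the problem (user, not necessarily buyer)
        problem: str              # the top pain point, in the customer's words

    mobile_reviews = OpportunityAssessment(
        business_objective="Achieve mobile compatibility",
        key_result="Product review completion on mobile improves from 10% to 50%",
        target_customer="Website customer writing a review on a phone",
        problem="It's too difficult to write a review on a mobile device",
    )

The point is simply that all four answers fit on one page and can be checked with customers before any solution work starts.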

So as you're talking to folks and listening, you're going to be generating that top pain point and that metric for success, and you can talk to your clients, your target customers, about this. It works quite well in business. You just want to make sure that, when you're talking to folks about business software, you're clear on who the buyer is and who the user is. We don't want to mistake the buyer for the user. We might have product marketing for our buyers, but we want to make sure that product management is for our end users. So in B2B product development, we will be talking to our buyers quite a bit, but we may be solving problems for our end users. This is a great technique, and it's a great place to interview your business buyers and users. With an opportunity assessment, you can verify all of those aspects: who is the target customer? What is the problem? When you're doing those opportunity assessments, you can ask: what's wrong? What's incomplete? Again, I've got a link for creating these opportunity assessments. The concept of the opportunity assessment is quite simple, and I grabbed it from Marty Cagan's Inspired book.

So again, we've talked about the first part of interviewing, gathering problems, and the second part, verifying a problem and a metric. One of the most important aspects is to user test your solutions with prototypes, because really you're trying to avoid the product discovery valley of death. What is that valley of death? Well, you might go and do a lot of discovery, you might learn about problems, you might go back to your desk and your team, you might create these solutions and not test them at all with users, and then come out the other side and do a big reveal with your users. Don't do this. It's very scary, and it's very risky to spend months on a solution without testing with people. It's one thing to talk about a solution, but you want people to interact with it, and when they interact with it, you can reduce that risk. Here in that orange line, you'll see the waterfall method, where we don't test solutions, we just keep building and hope it's going to work. Week after week after week, you build up the risk that people may not use it, may not want it; there may be all other kinds of risk. So create a build-measure-learn loop. In fact, I would prefer you to learn before you build and measure, because we can actually test with users without ever writing code.

So, solution tests. Making a great solution test is probably more powerful than being a great interviewer. That's why I teach everybody on my product teams how to interview: product manager, designer, tech lead, other folks, subject matter experts, data scientists. You can do it, because really it's about creating a great experiment, not about being a perfect interviewer. And when you think about a prototype interview, it's quicker than writing code. Just remember that cautionary tale: you can learn a lot before you spend that $200,000.

Okay, so let's think about a very quick way to test something: testing multiple ways to improve Zoom breakout rooms. This is the example. Let's say you've got Zoom on your phone, Zoom decides to advertise a new feature, and you want to test multiple ideas quickly. So let's look at a list of many ideas, and we're going to use rapid discovery to find out which items will produce the most value. You've got Zoom on your phone, and they send you a push message advertising a new feature. Would you swipe to learn more about this feature, or would you just dismiss it? "Zoom offers advanced features to manage breakout rooms." As I teach a class at Berkeley, I get a sense that I have to go in and out of breakout rooms, and I want to make sure there's good conversation going on there. Here's another feature, a second idea that I'm going to send to the user as a push message. If they don't have Zoom on their phone, I can just provide a phone in the interview, or I can provide a prototype that has these push messages on it. Here's another feature, and again, I can test a fourth feature. I can go through my backlog and just list these out. For people who have Zoom on their phone, maybe you're targeting people who have created breakout rooms in the past, or in your interview screening you found lecturers, professors, people who run workshops who really focus on breakout rooms. When you look at these four areas, you'll get a sense of which feature people wanted. So that's a great way to use customer interviews in a very quick solution test. I call those micro prototypes.

Here's a video example, and this is about testing multiple ways to improve classroom technology. We're in a classroom here in San Francisco, and we're interviewing teachers. What we're going to learn in the upcoming interview is which feature the user is actually excited about. When we set up the interview, you'll notice that the user is controlling the prototype, just like that picture you saw, and the prototype is very simple, like these text messages.

Okay, let me make sure that this is sharing correctly. Here we go.

These are new apps or features we're going to ask you to go through, and then, like I said, give us your feedback. We are not testing you; we are testing the experience with the app. So it's nothing but the app that we're interested in.

Got it.

So let's say it's after school, just like this, school is out, and you're using Dojo and you've gotten this message, or you're just looking at your phone. Go ahead and read that.

"Try setting up an exit ticket for tomorrow in Toolkit. Slide to reply." Go ahead and tap. Okay.

Do you know what exit tickets are?

I do.

Okay, so let's say that it's 4:00. You're done with all the immediate stuff, and you're thinking about maybe planning and whatnot. Then you get this message from Dojo. Go ahead and read that and tell me what you think.

"Try out Class Pals, a new way to connect your students with other classrooms around the world." Yes, I would do that. I would love to do that. I actually am in the process of doing that over Skype with another classroom for social studies. Yes, but if it's already on ClassDojo, I would definitely use this.

Why is it interesting to you?

Because I want my kids to realize that it's not just them and their little San Francisco bubble, and there's a lot of things happening in the world around them. And to see a different first grade classroom with different first grade kids, I think it's a way to really awesomely connect the curriculum, because in first grade they're learning about the continents, and we're learning about countries and oceans, and to be able to see that, maybe in a different country.

All right, so as we think about that video, I want you to think about which of those prototype ideas, which of those concepts, she really loved. You can tell from the emotional reaction, from the tone of her voice, that it was really the second one. So if you're wondering which it was, it's that second concept that she could identify with; she could tell us an example of how she was already doing it. It was really interesting to see how the first one was like, "okay, I get it," and the second one got a real emotional, "I want to do that" reaction.

After testing with lots of users, I can tell you that when people tell you "I like this," "that's interesting," "I'd use this," that is not the best feedback. Really, people might even be lying to you or being nice to you. So I call this "meh" feedback. What you're really looking for is what that teacher was saying: "Ooh," "oh wow," "is this available?" These are the types of things that give us confidence that people are interested in our services or products. You can see that tone of voice change for her and that excitement about the second idea, and that's what you're looking for.

So let's think about another solution test example here. Go back to the breakout rooms. I'm going to do a bad interview here, and this should be entertaining. Well, I'd like you to get in touch with folks, and in any case, if you're talking to folks, it's not necessarily bad. But here's what I would consider a not-so-good interview, and I'll be using you, the audience, as my customer.

Hey, audience, let's say that you want to improve Zoom breakout rooms. Here's a screen that shows an overview of three breakout rooms. The goal here is to find out which room I should enter as the professor to make the conversation better. Which of the rooms would you enter? What do you think about these screens? What do you see here? You, the audience, would be giving me some feedback. We don't have a live interview now, but you get the idea. Okay, well, let me show you another idea. Here's a second screen we have. What do you see here? What do you think of these screens? And you would be giving me a response, and I'd say, well, here, look at this room, look at this concept. It has buttons that say "Enter room," "Listen in," "Chat with the room." There are words spoken, there's a grammar level, there's some indication of a hand raise. This screen has got more to it. What do you think?

So there were some concepts in there that I want you to avoid, and that was the bad interview. I don't want you to talk so much. I don't want you to narrate the options, like I was talking about those buttons. I don't want you to fill the silence, and I want you to avoid helping users. I want you to wait. Don't say "what do you see?", "describe the screen," or "what do you think about the screen?" It's really not about the screen. It's not about what they're seeing; it's about the experience. What would they do, what is the benefit to them, what's the value to them? And avoid asking "what's missing here?" It's a very complicated question for users to take your prototype, take their problem, and figure out what's missing, so you're not going to get very good answers. Also, on a separate prototype, don't ask people to read an email; you can often just show a document to them for five seconds, have them scan it, and then get a sense of it. So these are some common things I see in customer interviewing that I want you to avoid.

Let's do a good interview. Of course, there's no one here to interview, so we'll have to mock this up.

Everyone here can open this interview script; there's the link down below (the PDG info script), and it'll be repeated so you can open it while you're listening.

The first thing you want to do in an interview is make the user comfortable. I just kind of jumped into those mockups and started talking to you. Here's a variety of things; in fact, I usually just read this as is, and it's about soothing the user, putting them at ease, and making them comfortable. Then you really want to get the user to control the prototype themselves. So you want to give them the link to the prototype, you can see it here, and you want them to share their screen back to you. This might take a couple of minutes in Zoom or Teams, but it's worth it, because when the user controls it, you'll see if they have any confusion moments and you'll get a sense of whether they've got value: can they, and would they, use it?

Let me show you what the prototype is. Here's the home screen in the prototype; again, you can go to this link. And if you were to zoom out, I built this in Figma myself, just to show you that you can do this yourself. This is kind of what it looks like. We're not going to go through the whole prototype; I just want you to get a sense of it.

Once you have the user sharing their screen with you, I want you to collect a relevant user story. This is about them actually making breakout rooms, trying to monitor the conversation, and making and improving the breakout rooms. If they can tell you a story about doing that, then they are an authentic user and you can continue the interview. If they've never opened a breakout room, this may not be the user for you to talk to. So collecting a story is a way to authenticate the user, and it also gets them into the mindset of the last time they actually did this breakout room activity. Now they're ready for an improvement on that breakout room activity.

So we'll prompt the user to start: go ahead and click into that option. Think back to a time. Again, we're trying to get them to go back in time; we don't want them to imagine a future state. We want them to think back to a time. Then we give them this phrase that we might repeat during the interview: you're a facilitator, go and create the rooms, figure out which breakout room needs you the most. That's our prompt, and we just wait. Usually through that prototype they can find the breakout rooms link, they can open the breakout rooms. It's a mockup; it's not the real Zoom, just mockups of screens.

And then how do you actually ask the questions? Well, on each screen you've got some hypotheses, and really it's that last screen I showed you with those three rooms where I've got some hypotheses about what they're going to do. This is a hypothesis-driven interview, not a list of questions. It's a big difference, because if you ask question after question after question, it's a little tedious, and you want to follow the user; you want to see how they interact with your prototype. So after you do option one, return to the beginning, guide the user through option two, option three, and then a compare and contrast. You want to find the user's preference, and a compare-and-contrast moment is a great way to do that: they can look at these different options and decide, oh yeah, I like this part of this one, I like that.

Then you can ask them a qualitative question such as: how would you feel if you never had access to this advanced way to manage Zoom breakout rooms? This disappointment question was pioneered by Sean Ellis. He's got a book called Hacking Growth where he describes this question; you can look it up online. It's a pretty useful way to get feedback about a user's preference. Then you can also, like I mentioned before, ask for skin in the game. Would you join a waitlist? Would you take a discount? Would you put down a deposit? Here's a link to enter your credit card information. They don't actually have to enter the credit card information; you can watch them start to enter it and stop them at that point. You don't really need their credit card, since we're just talking about a prototype, but how strongly do they feel about this topic, this concept you've shown them during the interview?

If the user becomes silent, say, "go ahead and think out loud." If the user gets stuck, remind them of the prompt, that sentence about which room needs you the most, or ask them, "what would you do next?" That's much better than "what do you see?": so, what would you do? If they ask you how something works, ask them how they think it should work. And, of course, discourage users from talking about others; that's called hearsay. We want to hear their opinion.

So what is the difference between good and bad? In the good interview, the user controls the prototype. The user talks more, I talk less. There are setup screens to give the user context; you'll see that they kind of work their way into those breakout room summary screens. We have the user tell us a story relevant to that pain point; the user won't have experienced our solutions, but they should have that pain point, and this is authenticating the user. We're hypothesis focused, not just asking question after question after question. And at the end we want to ask a survey question about their preference, and we want to ask for skin in the game.

So: use a script, use a clickable prototype, make the user comfortable. There are obviously experiments and prototypes out there that are not clickable prototypes; you might have explainer videos, you might have physical objects, so not everything is a clickable prototype. Have the user control the concept. Collect that story. End the interview if you don't think the user is relevant. Collect the background information as you go; you don't have to get it all up front. Let the hypotheses guide your questions. Remind users about the prompt, and then follow the user as they explore your concept. And at the end, get that user preference.

While you're watching the interview, take notes together on a shared board, like Miro or Mural or FigJam or something. This is a template I use in Miro. It makes it easier to gather those notes in the end; in fact, when you see other people writing a note, maybe you don't write that note, and so it collates the notes while the interview is happening.

How many users should you talk to? There's a lot of advice out there, so let's start with five. Why do you only need to test with five? Well, 20 years ago, Jakob Nielsen wrote the article "Why You Only Need to Test with 5 Users." It's fairly appropriate to what we do, and you can learn quite a bit from five users in a usability test. You'll find that you learn quite a bit from the first three or four users, and then it starts to tail off as you get to the fifth and further users; this is called diminishing marginal returns. As you add more users, you learn less and less, because you see the same things again and again. It's better to distribute your budget for user testing across many small tests instead of blowing everything on a single study. And of course, if you don't do a study, the striking truth of the curve is that with zero users you get zero insight. So it is worthwhile to test with folks, because you need that customer feedback to really check your assumptions.
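As a point of reference for that curve (the formula comes from the Nielsen and Landauer research behind the article, not from this talk), the share of usability problems found after n users is often modeled as 1 − (1 − L)^n, with L around 31% in Nielsen's data. A quick sketch:

    # Sketch of the diminishing-returns curve from Nielsen & Landauer:
    # problems_found(n) = 1 - (1 - L)**n, with L ~= 0.31 in Nielsen's data.
    L = 0.31  # average share of usability problems a single test user reveals
    for n in [0, 1, 3, 5, 10, 15]:
        share_found = 1 - (1 - L) ** n
        print(f"{n:2d} users -> ~{share_found:.0%} of usability problems found")

With L = 0.31, five users surface roughly 85% of the usability problems, which is why the curve flattens so quickly and why spreading your budget across several small rounds pays off.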

So why are we working with users? Why do these interviews happen? It's about these risks. If I take the risks from Marty Cagan of Silicon Valley Product Group, I'll start with usability, the common one: are we confusing people? Can they use it? Then business viability: should we build it? Is it ethical? Is it legal? Does it tie into business goals? Are we going to make money? Then technical feasibility: can we actually build this at all, or in the time frame we need? And of course, the most important part of this whole set of risks is value. Meaning, let's say we solve every other problem: does anybody want to use it? Will they use it?

Nielsen is clearly in the usability area, where you do see repeated feedback after a couple of users. In the value area that we're in, sometimes you don't get that repetition as fast, so I actually recommend a few more users. Value testing is not as simple as usability testing; we have to really find the users who have the problem or opportunity. We can't just go test with people in the hallway. Negative feedback is typically easier to believe early in testing: if you get five people and they all hate it, and they are part of your target customer, your target demographic, that's a problem. If all those people really like it, I'm still going to do some reinforcement and double-checking.

So for my busy teams, most of my teams, I recommend this sort of regimen for testing: five, then three, then three, then three, and you're tweaking and changing the concept between week one, week two, week three, and week four. Each time you go through a set of users, in fact with the first set of users, you might realize you have the wrong users, so you might have to recruit different types of users. And then you want to do a demographic check: the testers should represent the gender, race, age, income, employment status, the factors that are important to you, the factors that exist in your target customer group. A health insurance company client of mine has customers in all of these areas, so we actually need to test with more people before we validate an idea, because we want to make sure we cover ranges of gender, age, race, income, employment status, and other factors. So start with five, change the prototype, change the set of users, go to three, then three, then three. This is a common methodology for me and my teams. Keep in mind, you have to apply common sense. You might need more than these users, or you might have a product or feature that has zero risk, where you know it's a great idea because it's been shown in other ways, and you may not need to do testing. So always apply common sense as to how many people you need to test with.

So how do we find that user preference? Again, there aren't a lot of these qualitative questions that we can use, but I really like the disappointment question that I mentioned earlier. Sean Ellis has done a lot of research and found that successful products typically have 40% of respondents in the "very disappointed" category; that's strong demand. "Somewhat disappointed" is weaker demand, and usually you believe you might be able to change the product a little bit to move that group into the "very disappointed" bucket. "Not disappointed" might be people who will never come around, so you don't necessarily spend much time trying to convince them; if you have a high "not disappointed" share, that could be a problem.
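As a rough illustration of how that benchmark gets applied (my own sketch, not a tool from the talk), you can tally the answers and compare the "very disappointed" share against 40%:

    # Minimal sketch: tally answers to "How would you feel if you could no
    # longer use this?" and compare against Sean Ellis's 40% benchmark.
    from collections import Counter

    answers = [  # hypothetical responses from a round of interviews
        "very disappointed", "somewhat disappointed", "very disappointed",
        "not disappointed", "very disappointed", "somewhat disappointed",
        "very disappointed", "very disappointed", "not disappointed",
        "very disappointed",
    ]

    counts = Counter(answers)
    very = counts["very disappointed"] / len(answers)
    print(f"very disappointed: {very:.0%}")          # 60% in this made-up sample
    print("strong demand signal" if very >= 0.40 else "keep iterating")

In practice you'd want more than a handful of responses before trusting the percentage, which is why the regimen above keeps adding small rounds of users.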

Net Promoter Score is another very commonly used guide, and it's based on whether people would recommend the product to friends or colleagues. Some people like it, some people don't, but it's useful in the sense that, for products that are recommendable, higher scores do indicate that people like the product. After tests like this, or after a webinar presentation, you'll see that you'll get an NPS request from me. It works for things that are recommendable; if it's a health condition like diabetes or cardiac rehab and you don't have that condition, it may not be that useful for someone to recommend an app that helps with one of those conditions. So this doesn't necessarily apply all the time.
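For completeness, the standard NPS arithmetic (not spelled out in the talk) subtracts the share of detractors (scores 0–6) from the share of promoters (scores 9–10); here is a minimal sketch with made-up scores:

    # Standard Net Promoter Score arithmetic: % promoters (9-10) minus % detractors (0-6).
    scores = [10, 9, 8, 7, 10, 6, 9, 3, 10, 8]  # hypothetical 0-10 survey answers

    promoters = sum(s >= 9 for s in scores) / len(scores)
    detractors = sum(s <= 6 for s in scores) / len(scores)
    nps = round((promoters - detractors) * 100)
    print(f"NPS = {nps}")  # 5 promoters, 2 detractors -> NPS of 30 here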

When you're looking at smaller parts of the application, you're not going to ask that disappointment question about every piece of it. You're just looking for some consensus. You will often just look at the user's activities and listen to their words to determine whether they prefer this part of the app or that part of the prototype. In this situation, we're really looking for four out of five, eight out of ten, some kind of consensus. Again, you've got to use your common sense here, but you want a high number in this consensus area.

Now, the caveat is you don't always find a winner, but you always learn. So what do you do with this information? You're going to iterate, meaning "I'm going to fix it and keep going at it." Maybe after a while I feel comfortable and confident enough to spend engineering time on it and build it. And a great sign that you're stretching your imagination and creativity and really exploring your users' preferences is that you are discarding things: you found things that your users don't like. It's really important. High-functioning, advanced product teams find ideas that they discard, and they publish these failed ideas alongside their iterations and their validations.

So you pull this together, the hypotheses, the qualitative question, the consensus, and this iterate, validate, or discard concept, into a grid, and this is a very simple way to analyze user interviews. You've done the hard work of interviewing folks; here are five interviewees plus our hypotheses, and let's say we fill this out as a group. This probably takes about half an hour to do if you've done the hypotheses in advance and you've built a good solution prototype and a good test.

So for this primary hypothesis, 40% are very disappointed, which is good, but I want to get more than just five people, so I want to continue to iterate here. "People prefer the map view": people often prefer the map view over the list view, and here it's 80%, so this is pretty well validated; it's a small part of the app and we feel pretty confident about it. This one is about people looking for cheap parking in this app; we didn't quite get four out of five, so we really want to iterate: is this really that valuable? Are there other things that might be more important? And then here, the free filter: for some reason, people didn't really like this, so we're going to discard this idea and move our efforts onto something else. One of the things we learned about this parking app is that people were really concerned about the security of their car and of themselves where they were parking. So while we were testing all these hypotheses, we learned other things.
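Here is a rough sketch of that analysis grid as data plus a decision helper, using the parking-app hypotheses above. The counts and the four-out-of-five threshold are illustrative, echoing the consensus idea rather than a hard rule from the talk.

    # Sketch of the interview analysis grid: hypothesis, consensus count, decision.
    def decide(positive: int, total: int) -> str:
        if positive == 0:
            return "discard"
        return "validate" if positive / total >= 0.8 else "iterate"

    grid = [  # (hypothesis, users who reacted positively, users interviewed)
        ("Primary: users would be very disappointed without the app", 2, 5),
        ("People prefer the map view over the list view",             4, 5),
        ("People are looking for cheap parking",                      3, 5),
        ("People would use the 'free parking' filter",                0, 5),
    ]

    for hypothesis, positive, total in grid:
        print(f"{hypothesis}: {positive}/{total} -> {decide(positive, total)}")

Filling a grid like this out together, right after the interviews, is what turns raw notes into iterate, validate, or discard decisions.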

Learning things you weren't testing for happens all the time in customer interviewing. So when you're putting all of this together and you're trying to do interviews, I want you to be sure you don't gather insights by yourself. The anti-pattern is to do a lot of these handoffs between folks, the waterfall. Maybe you're doing some things together, but you're not including the engineers. Don't just throw it over the wall. A better process is to learn these insights together: conduct the interviews with everyone there on the call, just with videos off and muted if you're not the interviewer. Learn the insights together, take notes together, do the analysis together. This is the core product discovery team. You don't necessarily have to involve everybody on the engineering team, but find three to five people who will go through this discovery process together. Do collaborative solutioning, and really make sure there's an engineer involved so that you can get that innovation concept into your ideas.

So, as a recap: we're going to interview experts with our user experience map, we're going to interview business buyers and users with our opportunity assessment, and we're going to conduct solution tests with users. We're going to cover these three areas in our sort of double-diamond product management process. There are a lot more resources here; I've written a lot of articles about solution test interviewing to really cover that end of the process where you're testing solutions. And I just want to remind you: do you need a professional UX researcher? No. And how many customers should you be talking to a week? Well, I'll tell you: three to five. It doesn't have to be with a solution test; you can just get on the phone or sit in on tech support. You need to be getting that context, and the various people on your team should be listening to customers on a regular basis.

These are my favorite tools; you've seen some of them here. They're not the only tools that do this, but people always ask me about my favorites. And definitely stay in touch, reach out to me. I love meeting new people who are interested in these topics and sharing ideas. And finally, please leave some feedback about this presentation. I'd love to get feedback; I make changes, and I hear from people about what they learned and what I could improve.

So thank you very much.

