
The Top Strategies For Doing Customer Interviews That Get You Real Insights

We all know a core responsibility of a product manager is to constantly and consistently bring the customer perspective into the business.

That means customer interviewing is a critical skill for every product manager.

I must admit, earlier in my career I totally sucked at it.

I had no plan. I often just winged it. I asked the wrong questions. I was a poor listener. I talked more and listened less. I tended to get too excited about my product idea and go into pitch mode instead of focusing on the customer’s problems. And I was very guilty of confirmation bias.

Fortunately, with lots of practice, over time I got better. As I did, I accumulated a list of the best strategies that I use even today to conduct an interview that helps me extract real customer insights.

Get the Entire List of 25 Customer Interview Strategies >>

So in this video I share 5 of the best customer interview strategies. Use these 5 powerful tips to supercharge your customer interviews and extract every ounce of goodness from them:

  1. Set an objective. Know what you want to get out of the interview — that is, what is it you want to learn and come away with?
  2. Have a script. A script will guide your interview with the customer, help you be better prepared and purposeful in your interviewing, and make sure you achieve the objective you’ve set out.
  3. Ask “Why?” — a lot. This is by far the most important question in your arsenal. It allows you to go deep, get inside the mind of your customer, understand their motivations, get to the true nature of the problem they’re trying to solve, and how they actually value their solutions.
  4. Ask for examples. Asking for an example forces your customer to get specific rather than speak in generalities, giving you tangible evidence of a customer’s problem, and allowing you to momentarily “live in their world”.
  5. Focus on listening. Cannot stress this enough! The biggest thing you can do is to TALK LESS AND LISTEN MORE. Your purpose is to get THEIR perspective — not promote your point of view, explain yourself, defend your idea, or debate. The only times you should speak are when you need to clarify something or need to ask a follow-up question.


Follow these strategies and you’ll become a pro at conducting truly insightful customer interviews.

Watch the video above, download the full list of customer interview strategies, and please share in the comments below what strategies you’ve developed to conduct truly insightful customer interviews.

How To Get Quality Customer Insights (Spoiler: It’s Not About Time)

A Sales Exec and I were debating whether the amount of time spent with a customer is a measure of the quality of the customer interaction.

His argument:

Just curious — do you see 23 touches for 5 mins the same as 23 touches for 1 hour each — I think it is different and material.

From my perspective, I need to allocate a certain number of hours per week with customers, and believe that number in a normal week is likely 5 hours. I am not sure if that is right, but 5 hours out of 40 per se seems about right. To me, a 3-hour dinner with a customer will be very revealing.

Time matters… Speaks to quality, don’t you think?

His assumption here is that there’s no meaningful insight to be gained in a 5-minute conversation. He’s also arbitrarily picked 5 hours per week. (Why not 4? Or 2? Or 8?)

My response:

Here’s how we should think about it.

Why are we seeking VOC? What’s the purpose?

At the most fundamental level, in any situation, there are three kinds of customer insights we’re trying to gain:

  • Problem Discovery
  • Problem Hypothesis Testing
  • Solution Hypothesis Testing

There are many techniques for accomplishing these. The specific ones we use depend on the situation.

For problem discovery, a series of 1-on-1 interviews is best. These take time: 45-60 minutes per interview, across 10, 20, or 50 customers, for example.

For hypothesis testing a broad problem domain, again, 1:1 interviews may be best to start with.

So in this case, I’ll opt for 23 phone conversations of 30-60 mins each, because I know neither email nor a survey nor a 5-min phone conversation will provide me the richness of insight I need.

For solution hypothesis testing, it can vary.

If I’m validating a mockup of a potential solution the customer has never seen before, a 20-60 min interview may be needed.

In all the above situations, I want the richness and fluidity of a live conversation, the opportunity to ask follow-up questions, and the ability to feel the customer’s emotional responses.

But let’s say I’m trying to validate the need or usefulness of a specific feature — i.e., will it or does it solve a specific, previously identified customer problem.

In this case, using an email or an automated tool would enable 23 touches of 5 minutes each, and work perfectly. I could do it via phone calls, which would take longer but would ultimately give me the same quality of insight.

Another example is having weekly meetings with the Sales team.

The meeting doesn’t preclude the opportunity for a Product Manager to get on sales calls and demos. However, there’s great efficiency and value to these meetings, because in 30 mins I get buyer feedback across all our products from 9-10 folks who are making a large number of customer calls every day.

Which technique to use in any given situation depends on the type of VOC one is seeking, the purpose of seeking it, the ability to access customers quickly, the time available to seek and act on it, and of course reasonable judgment.

With respect to allocating time for yourself to obtain VOC, that’s a different thing.

You’re a busy person, so it’s perfectly reasonable you need to figure out how much time to allocate for VOC vs. other tasks.

But that’s solving your problem — your problem of time allocation.

It’s different than the how, what and why of obtaining the VOC itself.

And it provides no guarantee of the quality of customer insight you may gain.

Why It’s Better To Be Smaller When Implementing Agile In A Large Company

Having done many waterfall projects, I was recently part of an effort to move a large organization to an agile software delivery process after years of following waterfall. I’ll be blunt: it was downright painful. That said, I’d still pick agile over the multi-staged, paper-intensive, meeting-heavy, PMO-driven waterfall process I encountered when I joined the organization.

Although the shift was painful, it was a terrific educational experience. Based on lessons learned, we adopted certain principles to guide our approach to implementing agile in the organization.

Dream big. Think smaller.

This means having a vision for what the solution will look like and the benefits it will provide customers, but then boiling it down to specifics to be able to execute. For example, at one of my former gigs, we had identified the need to make improvements to our online payments process, and captured over 20 different enhancements on a single slide under the title of “Payment Enhancements”. (Yes, in very tiny font, like 8-point.) Those enhancements were beyond simple things like improving copy or the layout of elements. Each enhancement would have involved material impacts to back-end processes.

As such, “Payment Enhancements” is not an epic, or at least it’s a super big and super nebulous one that cannot be measured. Rather, I argued that each bullet on that 1-pager could be considered an epic in and of itself that could be placed on the roadmap and would need to be further broken down into stories for execution purposes.

Thinking smaller also means considering launching the capability to a smaller subset of customers. Even when pursuing an enhancement to an existing product, it’s important to ask whether the enhancement will truly benefit all customers using your product or whether it needs to be made available to all customers on day 1. Benefits of identifying an early adopter segment: (1) get code out faster, (2) lower customer impact, (3) get customer feedback sooner that can be acted on.

Be sharp and ruthless about defining the MVP.

Lean Startup defines MVP (Minimum Viable Product) as “that version of the product that allows the team to collect the maximum amount of validated learning from customers”.

(We think) we know the problem. We don’t know the solution for certain. We have only a vision and a point of view on what it could be. We will only know for certain that we have a viable solution when customers tell us so by using it. So identify the top customer problems we’re trying to solve, the underlying assumptions in our proposed solution, and what we really need to learn from our customers. Then formulate testable hypotheses and use them to define our MVP.

Make validated learning the measure

In the war of SDLCs, I’m no blanket waterfall basher, nor a true believer in agile. But having done a number of waterfall projects, I’ve observed that they’re typically managed by what I call “management by date”, or more often than not, a make-believe date.

As human beings, we like certainty. A date is certain. Setting a date gives us something we feel can be measured, partly because a date feels real and gives us a target, and partly because over decades we’ve become so accustomed to date-driven project management driving our product development efforts. The problem is that this gets us into the classic scope-time-budget headache, which means we’re now using those elements as the measure of our progress.

The thing is, scope, time and budget mean absolutely nothing to the customer. What really matters is whether customers find value in the solution we are trying to provide them. Traditional product development and project management practices don’t allow us to measure that until product launch, by which time it may be too late.

So we need to make learning the primary goal, not simply hitting a release date, which is really a check-the-box exercise and means nothing. Nothing beats direct customer feedback. We don’t know what the solution is until customers can get their hands on it. So instead of working like crazy to hit a release date, work like crazy to get customer validation. That allows us to validate our solution (MVP) and pivot as necessary.

Focus always, always on delivering a great user experience

Better to have less functionality that delivers a resonating experience than more that compromises usability. A poor UX directly impacts the value proposition of our solution. We need look no further than Apple’s stumble on the iPhone 5 Maps app. (Ironic.)

Continuous deployment applies not just to agile delivery, but also the roadmap

Over four years ago, Saeed Khan posted a nice piece on roadmaps where he said:

A roadmap is a planned future, laid out in broad strokes — i.e. planned or proposed product releases, listing high level functionality or release themes, laid out in rough timeframes — usually the target calendar or fiscal quarter — for a period usually extending for 2 or 3 significant feature releases into the future.

The roadmap is just that: a high-level map to achieve a vision. Not a calendar of arbitrary dates to hit. Too many roadmaps seem to suffer from the same date-driven project management approach.

For most established software products, I typically advocate having at least a 12-month roadmap that communicates the direction to be taken to achieve the vision and big business goals. It identifies targeted epics to achieve that vision. The vision is boiled down to a more tangible 3-month roadmap. That’s the stuff we want to get done in the next 3 months and what the agile teams need to work on.

Designate an accountable person or body that actively reviews the roadmap on a monthly and quarterly basis. Monthly, this body helps the agile Product Owner(s) prioritize the backlog against the 3-month roadmap. Quarterly, it evaluates overall progress against the 12-month roadmap. As such, 12 months is a rolling period, not an annual calendar of unsubstantiated promises of delivery.

What has your experience been implementing agile in your organization? What principles does your organization follow in executing an agile process?