
No, I Can’t Give You A Roadmap For Our New Product (Yet)

Cartoon by Roger Latham

A fellow product manager who’s working on a new product idea recently wrote to me:

“Common feedback I receive from our engineers and executives is that they don’t have a good grasp of the product vision. They say, ‘OK, that’s great, we can build that. But where are we going with this if we find the hypothesis to be true? What’s the long-term vision for the product?’ In essence, they’re asking what’s the end goal in 2-5 years, and if you show me that I’ll have a better sense of the architecture and tools I need to account for.”

This product manager is at the very earliest stage of his product idea, where he still needs to test his problem and solution hypotheses. But he’s already being asked for a long-term product roadmap! Sound familiar?

While the request may seem perfectly reasonable, it’s misplaced at such an early stage. The question about architecture and tools may also seem reasonable on the surface, but it’s a scale question, and not the right one to focus on before you know whether you’ve identified the right customer problem and have proof that your solution approach can actually solve it.

Execs are trying to assess the potential market opportunity, the investment that will be needed, and how quickly it will achieve ROI. So naturally, they want to see the long-term roadmap. But at such an early stage, you’re likely in no position to answer the question.

Even at the conceptual stage, you may have a list of potential features in your mind. You could prioritize them using one of the many scorecarding techniques written about by seasoned product practitioners; those techniques are all perfectly valid and come from product folks who really know their stuff.

Doing that so early is a waste of time, though. Creating a product roadmap is predicated on having a coherent product strategy, which is predicated on having a validated understanding of who your customers are, what their pain points are, and whether they’ll find your solution valuable. If you don’t even know whether customers will buy your solution, what’s the point in having a roadmap?

So when do you develop a roadmap for a new product?

For a startup product, the first step is always to identify the customer segment and customer problem. Quickly capture your product vision, formulate your customer, problem and solution hypotheses, and systematically test them. As you go along, you need to identify early adopters to whom you can deliver your solution — typically you build the product for these folks first. If practicable, test pricing at this stage as well.

Figure out what you absolutely must deliver to these folks to solve their #1 problem, and work like hell to deliver it as quickly as possible. All other features get cut from scope and sit in the backlog.

After delivering this minimum viable product (MVP), you need to actively gather feedback from these early customers. You’re using your delivered product to gain deeper insights into the customer’s problem, and you’re trying to understand what you need to improve in the product to (1) get these customers to stick, and (2) attract more new customers.

In addition, now that you have an initial set of engaged customers, you can also probe their next level of problems or discover new ones. Understanding those problems may surface new enhancements and features. You’ll now be armed with a set of improvements, fixes, and new ideas that you can put into the backlog.

If you have a sales force and have armed them to sell your MVP, make sure you’re actively gathering feedback from them as well. You may uncover opportunities to evolve your sales messaging and positioning. You may also uncover feature gaps. If so, put them into the backlog as well and earmark them for further validation. Pay particular attention to any feedback about what’s preventing a sale.

You’ll have a pretty good backlog at this point, so you can now start building an initial roadmap. Start by prioritizing the backlog based on a reasonable, customer-centric set of criteria. I typically skew my priorities heavily toward voice-of-customer (VOC) feedback. While features should solve tangible customer problems at any stage of the product lifecycle, it’s even more important at this early stage.

Also factor in the company’s strategic goals. For example, if the company’s focus is retention, features that create stickiness may carry more weight; if the focus is growth through customer acquisition, then sellable features may be more important; if it’s expansion revenue — i.e., greater revenue from existing customers — features that drive engagement and up-sells may take priority.

Make some allowance for operational issues. You may not necessarily have a scale problem yet, so these types of issues should not take precedence over VOC or driving revenue; however, you don’t want to completely ignore technical debt or reasonable operational fixes.
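To make that weighting concrete, here’s a minimal sketch of a first-pass, customer-centric scorecard, assuming a simple weighted sum across the three criteria above. The criteria names, weights, backlog items, and scores are all hypothetical; the only thing the numbers are meant to encode is the deliberate skew toward VOC at this stage.

```python
# Hypothetical weighted scorecard for a first-pass backlog ranking.
# The weights encode the skew toward voice-of-customer feedback at this stage;
# tune them to your own strategic focus (retention, acquisition, expansion, ...).
WEIGHTS = {
    "voc": 0.5,            # voice-of-customer feedback weighs heaviest
    "strategic_fit": 0.3,  # alignment with the company's current strategic goal
    "operational": 0.2,    # tech debt / operational fixes get some, but less, weight
}

# Each backlog item is scored 1-5 per criterion (illustrative values only).
backlog = [
    {"name": "Fix confusing checkout errors", "voc": 5, "strategic_fit": 4, "operational": 2},
    {"name": "Add saved payment methods",     "voc": 4, "strategic_fit": 5, "operational": 1},
    {"name": "Refactor billing job queue",    "voc": 1, "strategic_fit": 2, "operational": 5},
]

def score(item):
    """Weighted sum of the item's criterion scores."""
    return sum(weight * item[criterion] for criterion, weight in WEIGHTS.items())

# Highest score first: a starting point for the roadmap conversation, not the final answer.
for item in sorted(backlog, key=score, reverse=True):
    print(f"{item['name']:<32} {score(item):.2f}")
```

The exact numbers matter far less than being explicit about the weighting; the output is simply an input to the socializing and t-shirt sizing described next, not a machine-made roadmap.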

Once you have a prioritized list, socialize it. (Read this post by Bruce McCarthy on using “shuttle diplomacy” to get buy-in.) For the top priority items on the list, get t-shirt sizing from Engineering, and make a final call to sequence out the items based on customer and business value vs. feasibility. Now you’ve got a validated product in the marketplace with a decent first-pass roadmap that you can build upon. Go forth and conquer!

Roadmaps Are Not A Popularity Contest

By Bruce McCarthy

I write a blog called User>Driven and my Twitter handle is @d8a_driven, so you would think I like the idea of setting roadmap priorities based primarily on customer requests. You would be wrong.

Dell Idea Storm

Back in 2007, Dell created an online customer forum where people could post “innovative” ideas for how Dell could improve their offering and others could vote on those ideas. I wrote at the time that I thought it was a very bad idea.

In my opinion, Dell had fundamentally misunderstood the uses of customer data and the best ways to collect it. The proof of that was that the voting had been hijacked by a small number of very vocal open source advocates. Checking back in this week, I see it mostly consists of complaints and suggestions for minor tweaks to design and packaging. Interestingly, they have made it impossible to see which ideas have the most votes.

Tracking Customer Requests

Over 20 years as a product person, I’ve developed an objective scoring methodology for prioritizing ideas. (See my presentation on Slideshare.) It focuses on ranking proposed improvements or changes by how much they contribute to strategic goals relative to their implementation cost.
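The actual methodology is laid out in the presentation; purely as an illustration of that shape of calculation, here’s a minimal sketch that weighs an idea’s contribution to strategic goals against its implementation cost. The goal names, weights, scores, and cost units below are hypothetical.

```python
# Hypothetical goal-contribution-vs-cost scoring (an illustration only; see the
# Slideshare presentation for the actual methodology).
goal_weights = {"retention": 0.5, "new_bookings": 0.3, "expansion": 0.2}

# Contribution to each goal scored 0-3; cost in rough estimation points (illustrative).
ideas = [
    {"name": "Usage dashboard", "retention": 3, "new_bookings": 1, "expansion": 2, "cost": 5},
    {"name": "SSO integration", "retention": 1, "new_bookings": 3, "expansion": 1, "cost": 8},
    {"name": "Bulk export",     "retention": 2, "new_bookings": 0, "expansion": 1, "cost": 2},
]

def priority(idea):
    """Benefit toward strategic goals per unit of implementation cost."""
    benefit = sum(weight * idea[goal] for goal, weight in goal_weights.items())
    return benefit / idea["cost"]

for idea in sorted(ideas, key=priority, reverse=True):
    print(f"{idea['name']:<16} {priority(idea):.2f}")
```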

So am I arguing for ignoring customer input? Not at all. If improving your customer retention rate is one of your strategic goals, it might make sense to include the number of customer requests in your scoring. I know more than one product person who has tracked popularity as an input to their planning, and tools like UserVoice make this easier than ever.

Your Customers Can Hold You Back

Tracking customer requests can easily create a false sense of security for a product person, though. You’re collecting real, live market data, right? It’s all neatly tallied and summarized in your PowerPoint, right? Ok, fine, but bear these very real limitations in mind.

  • As Dell Idea Storm showed, you have to recognize that not all votes are equal. Vocal customers are not necessarily paying customers. And even if you could get evenly distributed input, most customers are small. Should you be focused on their needs or on the needs of your largest customers? Tailor your data collection to your strategic goals and weight the input accordingly.
  • Expect incremental input only. Henry Ford famously said that if he’d asked customers what they wanted, they would have answered: “a faster horse.” Customers will give you ‘better, faster, cheaper’ advice, but are unlikely to propose a breakthrough idea that uses technology to solve problems in new ways.
  • Beware “featuritis.” We all complain about the lengthy list of features in Microsoft products that we never use but make things complex, crash-prone and slow. Where does all of this bloat come from? Customer requests. Beware of adding features that only a few will use but many will have to put up with.
  • Existing customers tell you nothing about non-buyers. People who are in your target market but don’t respond to your messaging or solution are your best source for information on how to expand your appeal. Reach out to people who evaluated your solution but decided to go with the competition or do nothing (this is called “win/loss analysis”) and ask them about their business, their problems, and why they went a different direction.
  • Existing customers won’t help you penetrate new markets. If you want to grow beyond your current market niche into other verticals or up to larger customers, engage with those people directly and, rather than talk about your solution, ask them about their needs and wants. Develop an understanding of the personas in that space and what they need and you will be on your way to designing a solution that will capture that market.

One Way To Use Popularity To Prioritize

Ok, yes, when setting out to design Reqqs, my forthcoming roadmapping tool for product people, I did create a forum where people could vote on what such a tool should do. But that was different. Really. Let me tell you why.

Before putting up the forum, I spent a lot of time interviewing other product people, asking them about their jobs, what they were trying to achieve, and what their main obstacles and frustrations were. I did this until it was pretty easy to guess what the next person would say. That kind of qualitative data helped me define the essence of the product as a roadmapping tool for product people.

The next step was to figure out which of the 10 or so problems I had uncovered were the most common and most painful. UserVoice’s voting feature is perfect for that. I set up the forum, pre-populated it with my 10 things, set up each user with 10 votes to allocate however they liked, and began promoting it on product discussion boards like those on LinkedIn. This limited my audience to qualified prospects, all of whom were equal in my eyes since none were yet a customer.

With the hard work of using interview data to develop innovative ideas done, a survey within my target market provided the quantitative data needed to determine the top few of those ideas that would form the MVP. As it happens, potential customers highlighted prioritization as the number one thing they wanted help with, and that will be the core feature of the initial release of Reqqs.

Customer Requests Are But One Input

So, yes, customer input can be a powerful tool to aid in decision-making. But like all superpowers, this one must be used responsibly. Set your strategic goals first, then decide which items from your utility belt will help you toward those goals.

Bruce

Bruce McCarthy is a serial entrepreneur, 20-year product person, and sought-after speaker on roadmapping and prioritization. By day, he is VP of Product for NetProspex, but only you know his secret identity as Chief Product Person for Reqqs, the smart roadmap tool for product people (forthcoming). He is available for advice on product management topics of all kinds.

Why It’s Better To Be Smaller When Implementing Agile In A Large Company

Having done many waterfall projects, I was recently part of an effort to move a large organization to an agile software delivery process after years of following waterfall. I’ll be blunt: it was downright painful. That said, I’d still pick agile over the multi-staged, paper-intensive, meeting-heavy, PMO-driven waterfall process I encountered when I joined the organization.

Although the shift was painful, it was a terrific educational experience. Based on lessons learned, we adopted certain principles to guide our approach to implementing agile in the organization.

Dream big. Think smaller.

This means having a vision for what the solution will look like and the benefits it will provide customers, but then boiling it down to specifics so you can execute. For example, at one of my former gigs, we had identified the need to make improvements to our online payments process and captured over 20 different enhancements on a single slide under the title “Payment Enhancements”. (Yes, in very tiny font, like 8-point.) Those enhancements went beyond simple things like improving copy or the layout of elements; each would have involved material impacts to back-end processes. As such, “Payment Enhancements” is not an epic, or at least it’s a super big and super nebulous one that cannot be measured. Rather, I argued that each bullet on that 1-pager could be considered an epic in and of itself: something that could be placed on the roadmap and then broken down into stories for execution.

Thinking smaller also means considering a launch to a smaller subset of customers. Even when pursuing an enhancement to an existing product, it’s important to ask whether the enhancement will truly benefit all customers using your product, or whether it really needs to be made available to all of them on day 1. The benefits of identifying an early-adopter segment: (1) you get code out faster, (2) you lower the customer impact, and (3) you get actionable customer feedback sooner.
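The post doesn’t prescribe a mechanism for this, but one common way to ship to an early-adopter segment first is a simple feature flag keyed to a pilot list. Here’s a minimal sketch; the account IDs, flag, and function names are hypothetical.

```python
# Hypothetical feature-flag gate for an early-adopter rollout.
EARLY_ADOPTERS = {"acct_1001", "acct_1042", "acct_2117"}  # pilot segment (illustrative)

def payment_enhancements_enabled(account_id: str) -> bool:
    """Only accounts in the pilot segment see the new capability."""
    return account_id in EARLY_ADOPTERS

def render_payment_page(account_id: str) -> str:
    # Ship the enhancement to the pilot first; everyone else keeps the current flow.
    if payment_enhancements_enabled(account_id):
        return "new payment flow"
    return "existing payment flow"

print(render_payment_page("acct_1001"))  # -> new payment flow
print(render_payment_page("acct_9999"))  # -> existing payment flow
```

Gating the change this way is what lets you get code out faster and gather feedback from a small group before deciding whether, and how, to roll it out to everyone.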

Be sharp and ruthless about defining the MVP.

Lean Startup defines MVP (Minimum Viable Product) as “that version of the product that allows the team to collect the maximum amount of validated learning from customers”.

(We think) we know the problem. We don’t know the solution for certain; we have only a vision and a point of view on what it could be. We will only know for certain that we have a viable solution when customers tell us so by using it. So identify the top customer problems we’re trying to solve, the underlying assumptions in our proposed solution, and what we really need to learn from our customers. Then formulate testable hypotheses and use them to define our MVP.

Make validated learning the measure

In the war of SDLCs, I’m neither a blanket waterfall basher nor a true believer in agile. But having done a number of waterfall projects, I’ve observed that they’re typically managed by what I call “management by date”, or, more often than not, management by make-believe date.

As human beings, we like certainty. A date is certain. So setting a date feels like something that can be measured: a date feels real, it gives us a target, and over the decades we’ve become accustomed to using date-driven project management to drive our product development efforts. The problem is that this gets us into the classic scope-time-budget headache, which means we’re now using those elements as the measure of our progress.

The thing is, scope, time and budget mean absolutely nothing to the customer. What really matters is whether customers find value in the solution we are trying to provide them. Traditional product development and project management practices don’t allow us to measure that until product launch, by which time it may be too late.

So we need to make learning the primary goal, not simply hitting a release date, which is really a check-the-box exercise and means nothing. Nothing beats direct customer feedback. We don’t know what the solution is until customers can get their hands on it. So instead of working like crazy to hit a release date, work like crazy to get customer validation. That allows us to validate our solution (MVP) and pivot as necessary.

Focus always, always on delivering a great user experience

Better to have less functionality that delivers a resonating experience than more that compromises usability. A poor UX directly impacts the value proposition of our solution. We need look no further than Apple’s stumble on the iPhone 5 Maps app. (Ironic.)

Continuous deployment applies not just to agile delivery, but also to the roadmap

Over four years ago, Saeed Khan posted a nice piece on roadmaps where he said:

A roadmap is a planned future, laid out in broad strokes — i.e. planned or proposed product releases, listing high level functionality or release themes, laid out in rough timeframes — usually the target calendar or fiscal quarter — for a period usually extending for 2 or 3 significant feature releases into the future.

The roadmap is just that: a high-level map to achieve a vision. Not a calendar of arbitrary dates to hit. Too many roadmaps seem to suffer from the same date-driven project management approach.

For most established software products, I typically advocate having at least a 12-month roadmap that communicates the direction to be taken to achieve the vision and the big business goals. It identifies the targeted epics to achieve that vision. That longer-term view is then boiled down to a more tangible 3-month roadmap: the stuff we want to get done in the next 3 months and what the agile teams need to work on.

Designate an accountable person or body that actively reviews the roadmap on a monthly and quarterly basis. Monthly, this body helps the agile Product Owner(s) prioritize the backlog against the 3-month roadmap. Quarterly, it evaluates overall progress against the 12-month roadmap. This way, the 12 months is a rolling period, not an annual calendar of unsubstantiated promises of delivery.

What has your experience been implementing agile in your organization? What principles does your organization follow in executing an agile process?