The Lean Product Playbook with Dan Olsen

Join Murray Robinson and Shane Gibson in a conversation with Dan Olsen, author of The Lean Product Playbook, a practical guide to building products that customers love. Most new products fail, and most new features are rarely used. Dan describes a practical six-step process to improve your chances of building successful products.

Recommended Books

Podcast Transcript


In this episode, we talk to Dan Olsen, author of The Lean Product Playbook, a practical guide to building products that customers love. Most new products fail, and most new features are rarely used. Dan describes a practical six-step process to improve your chances of building successful products.

Shane: Welcome to the No Nonsense Agile podcast. I’m Shane Gibson. 

Murray: And I’m Murray Robinson. 

Dan: And I'm Dan Olsen.

Murray: Hi Dan. Thanks for coming on today. 

Dan: My pleasure. Nice to be here. 

Murray: So we want to talk to you about your book, the Lean Product Playbook. But why don’t you start by telling us a bit about who you are and what your experience is. 

Dan: Sure. Yeah. So I've been working in product management for quite some time now. I started my career at Intuit, which is a US company with successful products like QuickBooks for small business accounting, TurboTax for people to do their taxes, and Quicken for personal finance. When I was coming out of business school, product management wasn't as well known as it is today, and people told me that Intuit was a great place to learn it, and I'd never done it. So I started there, really enjoyed it, and learned a lot. After that, I went to be a product leader at several startups, and I was also the co-founder of my own startup. Then I stumbled into being a product management consultant, helping companies, usually post-Series A startups, as an interim VP of product. I was seeing so many different teams and products and what was working well and what wasn't working well, and it naturally led to my book, The Lean Product Playbook, which was published in 2015. Since the book has been out, I've transitioned to less hands-on consulting and more training product teams, both public workshops as well as private workshops. A lot of private workshops, and speaking at events.

And then I also have a community of over 11,000 product people called Lean Product Meetup, where each month, for almost nine years now, I've been hosting a top speaker so we can get together and share best practices. We put those videos up on our YouTube channel. So that's what I'm up to these days. Luckily, in the last three-plus years, product management has really exploded, so there's a big demand for training all these new product people. It's been a lot of fun.

Murray: And how is this related to Eric Ries's The Lean Startup?

Dan: Yeah. So if you read the reviews of my book on Amazon, people say that The Lean Startup is a good book to express high-level principles and to get you started and excited about this. My book is really a double-click on that same topic area in the world of products. If you actually wanna know, okay, great, I wanna apply Lean Startup principles, how do I do that? That's why it's called a playbook. It's a six-step process for how to go from an idea that you have for a new product or a new feature to actually validating it, creating it, and getting it launched. So my book's more of a how-to book, for when you actually wanna roll up your sleeves and do it. That's where my book fits in.

One thing I left out is that I have a technical background. I studied electrical engineering as an undergrad, and I got a master's in industrial engineering while I was doing my first job out of college, where I studied lean manufacturing. So when Lean Startup came out, it made a lot of sense to me, basically in how it's helped product development and product management advance the state of the art.

Murray: Great. Why don’t you give us an overview of the steps in the process, and then we’ll walk through them one by one.

Dan: Sure. So basically one of the terms that the Lean Startup movement made popular was product-market fit. It was actually coined by Marc Andreessen back in 2007 in a blog post that he wrote, but it didn't really take off until the Lean Startup movement. And what I found is a lot of people were talking about it, but there wasn't really a good definition of it. So the foundation of everything in the book is a framework called the Product-Market Fit Pyramid. It's basically a five-layer pyramid, and the bottom two layers are the market.

So the first layer is defining who your target customer is. And the second layer up from that is, for that customer, what are their underserved needs? Those two together are the market. If you look in an economics or marketing textbook, it'll tell you a market is a group of people that share a set of common needs. And like a real pyramid, the idea is that each layer builds on the layers beneath it.

And then there are three product layers, the first of which is your value proposition: what benefits are you gonna offer, and how are you gonna do so in a way that's better than the competition? That's where product strategy lives.

The second product layer is your feature set: what functionality should you build to support those benefits? And then the final layer is user experience. The way to think about those three product layers is that the customer interacts with the user experience to use the features in your feature set to get the benefits in your value proposition. And so product-market fit is just: hey, we're going after this market with this value prop, feature set, and UX, and how well does it resonate? Once I had the Product-Market Fit Pyramid, I just created the Lean Product Process to guide people through working from the bottom of the pyramid up to the top.

So step one of the Lean Product Process is defining who your target customer is. Step two is defining what their underserved needs are. Step three is determining what your value proposition should be, which is, again, how are you gonna meet those needs in a way that's better than the competition? The next step is your feature set, and this is where the concept of minimum viable product comes in. We don't wanna over-scope our functionality before we confirm whether we're heading in the right direction or not. Step five is user experience. And then there's just one final step in the process, where we go from the top of the pyramid back to the base and we test our MVP user experience with customers to see where we're at with product-market fit. Those are the six steps of the Lean Product Process.
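To keep the structure handy as Dan walks through it, here is a minimal Python sketch of the pyramid layers and the six steps as plain data. The step wording is paraphrased from this conversation rather than quoted from the book.

```python
# Illustrative sketch only: the Product-Market Fit Pyramid layers and the six
# Lean Product Process steps as plain data, from the bottom of the pyramid
# (the market) up to the top (the product).

PYRAMID_LAYERS = [            # bottom -> top
    "target customer",        # market
    "underserved needs",      # market
    "value proposition",      # product
    "feature set",            # product
    "user experience",        # product
]

LEAN_PRODUCT_PROCESS_STEPS = [
    "1. Determine your target customer",
    "2. Identify underserved customer needs",
    "3. Define your value proposition",
    "4. Specify your minimum viable product (MVP) feature set",
    "5. Create your MVP user experience",
    "6. Test your MVP with customers",
]

if __name__ == "__main__":
    # Steps 1-5 each work one layer of the pyramid; step 6 closes the loop by
    # going back to the market with the prototype.
    for step, layer in zip(LEAN_PRODUCT_PROCESS_STEPS,
                           PYRAMID_LAYERS + ["back to the target customer"]):
        print(f"{step:55s} -> {layer}")
```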

Murray: Is this an iterative process, or do you just do it once for each product?

Dan: It's definitely iterative. So another way to think about the Product-Market Fit Pyramid is as the five hypotheses that you need to get right enough in order to achieve product-market fit. And nobody gets it right the first time. Nobody nails it the first time, not even the big tech companies. So it is iterative. From Lean Startup there's an iterative learning loop called build, measure, learn, which was good cuz it got people recognizing the importance of iteration. But I have my own version that I prefer, which is hypothesize, design, test, and learn. The reason why is that people see build, measure, learn and they start off by building. Sometimes they misunderstand it; that wasn't the intent. But sometimes people just go by the shorthand, and building is actually one of the most expensive things you can do. Myself and a lot of other experts advocate trying to do as much testing and validation as you can with prototypes before you do any building. So that's what I advocate. So certainly it is iterative. The whole point is to get to that initial prototype as quickly as you can and get it in front of customers, cuz only then can you take your learning and refinement to the next level. Once you get that first prototype in front of customers, you're gonna realize what they like, what they don't like, and which assumptions you got wrong. And then you can start iterating from there. And in the book I have several examples of how you iterate different things by using that feedback from customers.

Murray: So there’s a core problem here, isn’t there? We know that most products fail in the market. 

Dan: That's right.

Murray: And also research from Pendo and the Standish Group's CHAOS reports shows that something like 70% of product features are rarely or never used.

Dan: That’s right. 

Murray: So we're putting an enormous amount of investment into building things that people don't use.

Dan: That's right. And that's why in the very first section, before we even get into the process, we cover a really important concept, which is problem space versus solution space. I think that's a big part of why those statistics are so high. It's natural human behavior to just jump to solutions and say, hey, I think we should build feature X, or I think we should build Y. When you do that, you run the risk of not being clear on, well, what problem does it solve? Is it actually solving a real customer problem? So one of the best things you can do to improve your odds of product success is to actually make sure you start in the problem space before you go to solution space.

Problem space is where you're thinking cleanly about what customer benefit or problem we're trying to address, independent of any solution at all. And then later on you can brainstorm solutions for how to meet those needs. But many teams go ready, fire, aim: they just build something that they think is gonna be useful, but they haven't articulated their hypotheses about the problem, who the customer is and what problem they have, or done any validation at all. That's where the risk comes in. You spend all this time building a product, you launch it, and then the odds of it succeeding are very, very low if you haven't taken those steps.

Murray: Yeah, and I think it’s also very common for product managers to be order takers for senior management. 

Dan: Yeah, a lot of times we say: you're not a short-order cook. There's a couple of things there. One is, yes, the key stakeholders are dictating or determining what is going to be built. There are a couple of problems with that. One is that usually they're doing what I call solution-space thinking: what they're dictating is solutions. For example, one of the solutions du jour is blockchain. A senior executive may say, we need a blockchain feature. It used to be AI. If you take those stats and think of all the money and time that have been spent on AI, what do we have to show for it? We're starting to see some interesting things come up recently, but before that it was chatbots, and before that it was VR. There's always this hot new wave of technology, and people get so excited that they fixate on the technology and the solutions, and they don't start with, well, what's the customer problem we're trying to solve? So that's one thing.

So key stakeholders usually dictate solutions when they're placing their orders. I work with a lot of product people and they're like, well, I'm just a product manager and that's a VP. They must know; they must have done all the research; they must know what they're talking about. And unfortunately, the emperor often has no clothes. They just listened to some podcast on the way to work that talked about how cool blockchain was, and they came in and said, we need blockchain, all of a sudden.

The other thing is that sometimes product people get so pulled into the day-to-day of Agile Scrum execution that they don't have the bandwidth to actually go and talk to customers. So who in the organization is doing that discovery work? There's a framework we can use to divide up the work; I call it the four Ds: discovery, definition, design, and delivery. In most companies, most people's time is spent on delivery: running the agile process and building and shipping the thing. If you've got UX, you might be spending some time on design. But a lot of times the discovery and the definition get skipped or ignored. Discovery basically means: do we really understand who our customers are, and do we really understand what their needs are? So whose job is it to really make sure we know who the customer is and what their needs are? It falls on product management. So if product managers aren't doing that, then they're not providing their unique contribution to that cross-functional team.

Shane: I get the impression that a large number of the product teams aren’t following recognized agile practices. What are you seeing? Are you seeing that a lot of product teams are following those agile practices, or are you seeing they’re using different ways of working? 

Dan: No, I think a lot of people are practicing Scrum. Some people are very dogmatic about it: if you're not fully complying with all the Scrum guidelines, then you're not doing Scrum. I think that's a little excessive, so I'm always very pragmatic about it. I think part of why Scrum is the most popular agile methodology is that it is prescriptive. If I just give you a set of five principles and say, go for it, you're not gonna know what to do. That's part of why people like my book: it's a playbook. It tells you the six steps to follow. Obviously the details of what you're gonna do are different, but it's a playbook, and if you're trying to learn something, having a playbook is great. You do see some variance. Different teams will pick different sprint lengths. I was just teaching a workshop the other day to a startup that had one-week sprints, and that was pretty cool. It was pretty fast. Two weeks is by far the most common, and then three weeks is probably the second most common.

Some large companies that are coming off waterfall and quarterly releases are doing maybe four-week sprints. So it's always a journey, as we all know. They may not do a demo every sprint or a retro every sprint, but for the most part they're doing the standups. They may not do the standup every day; maybe it's three or four days instead of five. People say we're doing Scrum, and they're doing it. What happens, though, just to be clear, is the product people may be staffed against three or four different Scrum teams. So you've got unique developers on the three or four teams, but you've got one product manager. Well, that poor product manager's day is just going to be busy keeping up with all the questions that come up, the backlog refinement, writing the stories. They're not gonna have any time to get out of the building and talk to customers.

So that's what I see. They're following Scrum. They just don't have time to go and interact with customers to do the discovery work that we're talking about. That's a common symptom that I see.

Murray: I wanted to dig into your playbook a bit further. So why don't we go through each of the six steps, starting with determining your target customers. How do you do that?

Dan: Yeah. Well, it's funny, cuz if you ask most companies, hey, are you customer-centric? They'll say yeah. But the rubber meets the road when you say, okay, let's go through each person on your product team and ask when the last time was that they actually spoke firsthand to a customer. It can be a long time, again, because we're so busy with the day-to-day inside our company that we don't actually talk to customers. The other thing is, a lot of times teams will be like, oh, we know who our customer is, and they give you some high-level answer on who the customer is. Yeah, we're going after small business owners.

Well, the reality is, if you think about products that create a lot of value and are very successful, the teams have gotten way more specific than that in defining their target customers. Not all small business owners are the same. One of the examples I give is a team I was working with. I asked, who's your target customer? And they said millennials. At first it sounds specific, but then you think about it and it's like, gosh, there are millions of millennials, and they're not all the same. So what you wanna do is what I call peel the onion. And again, it's iterative, back to what we were saying earlier. You're not gonna know your detailed target customer right away.

You start out with an initial hypothesis: hey, we think it's small business owners. You go out and you do some discovery research, and you start to realize how to segment that market. How do we divide up small business owners into the ones that are in our target market and the ones that aren't?

And so that really involves market segmentation techniques. There are four main market segmentation techniques: demographic, psychographic, behavioral, and needs-based. The point is you wanna use these different techniques and build up over time a persona of who that target customer is.

And a lot of these artifacts you wanna have in writing. We don't wanna write a long research report; I'm a big advocate of a one-page document that captures who your target market is. Usually that's a persona, with the salient attributes listed there so that we're all on the same page. Because if we're in a scrum meeting and we're debating which feature is more important or which design is better, and we don't have the same target customer in mind, that's gonna be part of why we have different opinions. So part of it is getting aligned on the layers of the Product-Market Fit Pyramid, which also helps a team make better decisions. You form an initial hypothesis about who your customer is, hopefully you do some discovery research, and you improve over time. One of the things is, if you're doing customer research, say you show your prototype to 10 target customers and five like it and five don't. That is a hint. That's a clue that you probably have not refined your target market enough. You still have to answer the question: why do these five like it and these five not like it? You're missing some salient segmentation attribute. And as you improve that target market definition, your success rate with those prototypes is gonna go up. Cuz there's two parts of product-market fit: there's the product side, but there's also the market side. Who are the right type of people that are gonna value what we're building?
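As an illustration only, a one-page persona like the one Dan describes could be captured as simple structured data, grouped by the four segmentation techniques; the persona and all of its attributes below are invented.

```python
# A one-page persona captured as plain data, with salient attributes grouped
# by the four segmentation techniques. All values below are invented.
persona = {
    "name": "Small-business Sam",
    "demographic":   {"age_range": "35-50", "role": "owner, 2-10 employees"},
    "psychographic": {"attitude": "hates admin work, values peace of mind"},
    "behavioral":    {"tools_today": "spreadsheets plus a shoebox of receipts",
                      "frequency": "does the books monthly, taxes yearly"},
    "needs_based":   {"top_underserved_need": "spend less time on bookkeeping"},
}

# Keep it to one page: a short, shared artifact the whole team can align on.
for section, attributes in persona.items():
    print(f"{section}: {attributes}")
```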

Shane: Do you differentiate between the buyer and the user?

Dan: Well, we do. Basically, what happens in the product world is sometimes people use specific terms, like client versus user versus this, and the unfortunate thing is that because people have worked at different companies, they don't all agree on what those terms mean. So in general, what I like to do is use high-level, all-encompassing terms. When I use the word customer, it includes both the people using your product as well as the people that are making the purchase decisions. Now, the reality is those are very different people. You actually need a pyramid for each one. The foundation of each pyramid is the target customer, so you need to have a pyramid for the buyer and you need to have a pyramid for the user. Some of their needs may overlap, but they're probably gonna have distinct needs. And it's not just B2B where you've got buyer versus end user.

This comes up in two-sided markets too. If you think about Uber or Lyft, they've got riders on the one hand and drivers on the other, so they've got two different pyramids. Sometimes the pyramids go all the way up to the top separately; sometimes they combine at some point.

But if you think about Uber or Lyft, if you go to the app store, they have separate apps for the rider and the driver, all the way to the UX layer: the features, the UX, all of that is distinct. Obviously, on the backend database, when you hail a car it connects you with the driver, and it all makes sense.

Another example would be Salesforce. Who's buying it? Probably the head of sales. Who's using it? The salespeople on that person's team are using it way more, and in different ways, than the head of sales will. And then there's actually a third important user for Salesforce, which is the admin. They're a super user, so they need all the things that the user has, but they also need to be able to set up and configure things as well.

And so I define a customer as anyone you need to provide value for, even if you're building tools for internal employees. In a lot of the private workshops I give, someone will say, well, my job is to build tools for internal employees. Well, they're employees, but they're your customers. You're on the hook for creating value for them. So I like to use the term in the broadest sense, and if you've got multiple customers, you've gotta have multiple pyramids. One of the books that I reference in my book is called The Inmates Are Running the Asylum by Alan Cooper. He's a big proponent of personas, and he has some advice on how to prioritize among multiple personas.

Shane: Yeah, I think with the Salesforce one I'd add in one more persona, or pyramid, which is the consulting companies, cuz you know it is a two-sided marketplace: they're the ones configuring Salesforce so that it actually does what you want it to do. So probably another pyramid for them.

Dan: Totally agree. Good point. A lot of times in B2B you can have a complex ecosystem with multiple actors like that. For sure.

Murray: Okay, so how do we test our target market? How do we find out whether our hypotheses about our target market are correct or not?

Dan: Yeah, well, there's two ways to do it. The first is you articulate your hypotheses, then you go and try to find those people, and you sit down and talk to 'em. And I will say that steps one and two are very related. Step two is: what problems do they have, what underserved needs do they have? So when you go and talk to 10 small business people, you're gonna have some hypotheses about what needs you think are underserved for them, and you're gonna ask 'em about it. You may talk to 10 small business people and say, how is your inventory management going for you? Are you happy with it or not? We think there's a problem where inventory management isn't as easy as it should be, and we wanna make that better. You may go out and talk to 10 small business or e-commerce people and find that five of 'em say, yeah, I'm using this system and here's the problems with it, it doesn't meet my needs. And the other five may say, nah, that's not really a big important problem for me. Your goal now is to figure out what is different about the people that agree with your hypothesis and the people that don't. So you can add some salient attributes, but you can only go so far at this level.

You can do some of it at this level, but where it really comes in is when you get to step six and you have a prototype. When we close the loop from the top of the pyramid to the bottom, we have to recruit people in our target market, and when you show 'em the prototype, that's when the conversations with the customers can go to a much higher level of depth, because now they can interact. Instead of just talking in the abstract about problems, they can see what you're proposing and they can tell you, oh yeah, this makes sense, this is gonna make this easier.

Or, no, that's not it. So there's two main points. One is you can do discovery research before you have a prototype, to refine your target market description as much as you can. But then it really goes to the next level once you have your prototype and you can show it to people; that's when you're gonna be able to really refine your target market.

Murray: So how do we identify those underserved needs? Are we using jobs to be done or something like that?

Dan: Well, it's very similar. There's all kinds of different methodologies out there: design thinking, jobs to be done. I think most of them have more in common than they have different; sometimes it's just semantics. My approach is very consistent with jobs to be done.

In fact, just yesterday I did a webinar with Tony Ulwick, who is a friend of mine. He's one of the pioneers of jobs to be done. The key thing, no matter what the methodology is, is getting clear on the customer need or problem instead of the solution. And since we're talking about Agile, I always bring up the agile user story template: as a blank type of user, I want to be able to do blank, so that I can enjoy blank benefit. Everyone knows that. That is a good template to follow to stay in the problem space, and it helps you be customer-centric, cuz you start off with "as a blank type of customer."

That "so that I can" part is really the essence of the problem space. Why is it gonna be valuable to a user? Why do they care about this? How's it gonna benefit them? That's the essence of it. So what we do is we brainstorm different problem-based statements, and Tony and I both agree that when we do problem-based statements, we don't write out a full-on agile user story. We don't say "as a blank, I want a blank so I can blank"; that's a good format to follow later on when you're getting more specific. The "as a blank customer" part is implied, since there's only one customer for all the problems we're writing for a given pyramid. So we both like to start with a verb, like: save me time on this task, save me money, whatever. The verb makes it active. We basically do what I call exploring the problem space. Let's say we were Airbnb: what are all the different ways we can help people with our accommodations? We would just brainstorm all the different ways that we could. It would be a little messy. After we brainstormed, we would go back, de-dupe it, and clean it up. Then you can organize it. You'll find that certain benefits are actually related, and you can organize them into a hierarchy, what I call benefit ladders. Once you've done that, you have what I call the problem-based definition: your atomic-level benefits organized by high-level benefits.

The example that I give a lot is TurboTax. In the United States, every person that makes above a certain minimal amount of money is responsible for filing their income taxes each year. You can pay someone to do it, but a lot of people do it themselves, and TurboTax is the software solution to help people do that. Well, the high-level benefits that TurboTax provides are: it saves you time, because it's gonna take you less time to do and file your taxes; it saves you money, because instead of having to pay an accountant it's gonna be cheaper, or it may even reduce your tax bill by finding things you didn't know you could deduct; and the third one is a different kind of benefit, an emotional one, which is confidence: I feel more confident doing my taxes, because I didn't really know what I was doing before.

So those are the three high-level benefits of TurboTax, and beneath those three you'll have more detailed benefits. That's what I call the problem-based definition. You can do a lot of that without talking to customers; it's all hypothesis. Then, when you do the discovery interviews, that's when you'll talk to people and see if they have problems that you didn't think of. Eventually it comes time to prioritize these problems: okay, out of all these opportunities, which ones are gonna create the most value if we can solve them adequately? That's where the underserved part comes in, and that's where I have a framework of importance versus satisfaction.

For each of the customer needs, you get clarity on how important it is to the customer (the higher the importance, the higher the value) and how satisfied they are with how they're getting it met today, with whatever solution they're using today. If they're highly satisfied with what they're using today, there's not as big an opportunity; there isn't really a problem. You may be able to do disruptive innovation and try to disrupt that, but that's a lot riskier. If you can find things on my importance versus satisfaction chart that are in the upper left quadrant, which is high importance (it's really important to the customer) but low satisfaction, that's where the really big opportunities lie. And to tie it back to what you said about the horrible success rates of new products and features, I think if teams just, one, got clear on the problems before they worked on the solutions and validated those problems with customers, and two, made sure that those problems were high-importance, low-satisfaction situations, that would increase the odds significantly.

Those are the top two pieces of advice I have. If you do nothing else from my book, do those two things and your odds of success are gonna go up a lot. So that helps you clarify which customer problems are the ones worth solving.
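Here is a minimal sketch of how a team might rank needs using that importance-versus-satisfaction framing. The example needs, the 0-to-10 scales, and the simple gap-based opportunity score are illustrative assumptions, not a formula from Dan's book.

```python
from dataclasses import dataclass

@dataclass
class Need:
    name: str
    importance: float    # 0-10: how much the target customer cares about this
    satisfaction: float  # 0-10: how well today's solutions meet it

    @property
    def opportunity(self) -> float:
        # Bigger gap between importance and satisfaction = bigger opportunity.
        return self.importance * (10 - self.satisfaction)

needs = [
    Need("file taxes in less time",              importance=9, satisfaction=4),
    Need("feel confident the return is correct", importance=8, satisfaction=3),
    Need("print a paper copy of the return",     importance=3, satisfaction=8),
]

# Needs in the "upper left quadrant" (high importance, low satisfaction) rank first.
for need in sorted(needs, key=lambda n: n.opportunity, reverse=True):
    print(f"{need.name:40s} opportunity score {need.opportunity:5.1f}")
```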

Murray: Yeah, we interviewed Tony Ulwick a while ago about jobs to be done, and I think the message I got was that a lot of companies go in with a me-too product idea, and when they test it, they find that there's usually some much narrower problem space which is underserved.

Dan: Yeah, I think that's the thing. In fact, I was just teaching a workshop the other day, and someone asked, well, why don't you just copy the competition? I can understand how some teams might do that if they don't have the skills in-house to really determine customer needs and come up with unique solutions and nice designs; I guess you fall back to just copying. But there's no way you're gonna have a differentiated product, and you're also making the assumption, which is probably not right, that they've done all this awesome legwork and research to validate things.

No one's really doing that. The percentage of companies and teams that are following best practices is pretty low. So everyone's just trying to ship stuff as quick as they can, and they're doing it with risky assumptions or hypotheses without checking them. And that's why you have a lot of products out there that aren't making people happy.

So there's a lot of opportunity out there. One of the things I did too, back when I was doing more hands-on consulting: say someone wanted to enter a certain space, I would analyze the existing products. When you do a detailed analysis of the existing products, say there's two or three of them, you'll find, hey, this product does this particular part better than the other two, but they're really horrible at these other parts.

And you can create a superset. It's kind of like a Frankenstein, where you basically look across them all. And I wouldn't even stop there. Sometimes there's something that none of them do well, and that's a really big opportunity. But at a minimum, you can find what their weaknesses are and do better than that. I've done that a lot of times, on the UX front especially.

Shane: In the B2B software space, we seem to re-engineer these horrible systems that are highly complex, not that easy to use, and require lots of training and all that stuff. Are you seeing a lot of adoption of consumer patterns in the B2B space, because they have value?

Dan: Luckily we're seeing more of that. One of my early clients was Box, and Box was known for having a much better UI. In general, historically, B2B had embarrassing UI and usability. It was just: hey, salespeople are selling it, executives are buying it, people gotta use it to get the key business tasks done. Tough. And it wasn't like they knew what good UI was and were deliberately not doing it; the bar was just super low. I think that with the iPhone and the iOS App Store, the bar has been slowly rising in B2B software.

I still use some systems that my clients have for accounting or invoicing or something, and I'm like, oh man, you can tell when this was built. This was clearly built back in 2010 or something. If you're really successful in the B2B space, then you're around for a long time. Well, then you're a legacy system, your code base is old, and you can see the UX fashions frozen in time. But they've been successful, so it's still there.

Shane: We see that in the organizational structures as well, though. We can look at the hierarchy and the job titles in an organization and have a guess at when that organization was founded based on the fashion of their team topologies. 

Dan: Yeah, definitely. So it really comes down to how much the CEO and executives value the importance of UX design and usability, and how they express that: what's the headcount and budget for UX? You can just cut through the baloney and be like, hey, how many designers do you have? And that'll tell you how important UX is to that company.

Murray: All right, so you've done your market segmentation and identified your underserved customer needs. What's the next step?

Dan: The next step is to say, okay, let's craft our value proposition. We've been talking about these different needs that we could address; let's actually commit to which customer problems we plan to solve with our product or new feature, and let's get really clear on how we're gonna solve those in a way that's better than the competition. This is where we get clear on how we're gonna create more value, and so this is your value proposition. When I did that master's program in industrial engineering, I learned a really helpful model called the Kano model. It talks about customer needs and customer satisfaction, and it's really a categorization scheme for customer needs. First, there are performance benefits and features, where more is better and less is worse.

The example I give there is: if we were in the microprocessor chip business and everybody else's microprocessor was operating at a two-gigahertz clock speed and ours was three gigahertz, then we'd be outperforming everybody by one gigahertz. Most of the competition happens on the performance dimensions: hey, we're 10% faster, we're 5% cheaper, we're 15% higher quality, whatever it is, something you can quantify. With performance benefits, the higher the performance, the higher the customer satisfaction; the lower the performance, the lower the customer satisfaction.

The second category is must-haves. You don't actually make anybody happy with a must-have. If you fully meet the must-have need, it doesn't make anybody happy; it's only when you fail to fully meet it that you make people unhappy. So if we were doing a healthcare product in the US, we'd have to comply with HIPAA regulations, but if you had HIPAA compliance in your product, it's not like a customer would be like, oh my gosh, they have HIPAA, sign up for it today. You just need to have it. Whatever it is, two-factor authentication, these are things that you just have to have. Now, I will say the term must-have sometimes gets thrown around pretty loosely in a company, with key stakeholders saying we must have this feature. But this is a very specific Kano model definition, which is: if you do not have the must-have feature, people will not buy your product.

Murray: Yeah, I've used the MoSCoW prioritization method before with people, and I found my stakeholders made 80% of the requirements must-haves. So it's not a real must-have.

Dan: Exactly. Yeah. And then the third category is delighters. Not having a delighter doesn't cause a problem, cuz people aren't expecting it; it's kind of like a bonus. But having a delighter can create a lot of positive value. Usually when teams brainstorm delighters, they think of these huge-scope, automagical things like, oh, what if it automatically knew what you wanted to order and ordered it for you? Things that may not even be feasible. But it can be little things. The other day I was on Spotify, and when you listen to music on Spotify or any other music player, there's always a progress bar to show you how much of the song has played: it's a three-minute-long song and you're one minute into it. Well, if you listen to a Star Wars song on Spotify, that progress bar gets replaced with a lightsaber, and the lightsaber blade grows along the progress bar as the song plays. That's an example of a delighter that didn't take a lot of time to build.

So anyway, you wanna use these three categories, must-have, performance, and delighter, and you create a simple grid, a table where you list one benefit per row. You start out with the must-haves: what are the must-have benefits for our product category, one per row. Then the performance benefits, one per row. And then the delighters, one per row. Then you create a column for each of your key competitors, and the final column is for your product. The next thing you do is score how good a job each competing product does on each of these benefits, and honestly, high, medium, low is adequate. If you did have some numerical metric, like gigahertz or something, you could put the numbers in there if you wanted to, but high, medium, low usually works. And then you take a step back and look at where you can be higher. Where can we deliver higher satisfaction? Which of these rows can we outperform these people on?

And that's where you have to have debates inside your company about what's feasible within the timeframe, the planning horizon. Usually we're talking about a two- or three-year planning horizon for this value prop. If there's a row where nobody else is high, then there's a chance to potentially dominate that.

Or if there's a row where someone's high but you think you can be higher, then you can go for that. What you're trying to do here is find the one performance benefit where you're convinced that your team can deliver higher satisfaction. And then, if you're fortunate, you can also identify at least one delighter. Delighters tend to be unique, like Spotify's lightsaber progress bar: anybody else could copy it, but they haven't yet. So that's really the secret: getting really clear on the performance benefit where you're gonna be the best.

Those, plus any delighters you have, are called your unique differentiators, and we wanna know what those are. Cuz the very next step of the process is getting clear on our feature set and functionality. Well, if we're betting that we're gonna be the best at this and we're gonna delight with this, then our MVP better have features to back up those benefits. That's what we're actually testing.
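A small sketch of the competitive grid Dan describes: one benefit per row grouped by Kano category, a column per competitor plus your own planned product, and rough high/medium/low scores. The benefits, competitors, and ratings here are invented for illustration.

```python
SCORE = {"low": 1, "medium": 2, "high": 3}

# (Kano category, benefit) -> how well each product delivers it.
grid = {
    ("must-have",   "complies with tax regulations"):   {"Competitor A": "high",   "Competitor B": "high", "Us": "high"},
    ("performance", "time to finish a tax return"):     {"Competitor A": "medium", "Competitor B": "low",  "Us": "high"},
    ("performance", "size of refund found"):            {"Competitor A": "high",   "Competitor B": "high", "Us": "medium"},
    ("delighter",   "auto-imports last year's return"): {"Competitor A": "low",    "Competitor B": "low",  "Us": "high"},
}

# Rows where our planned product beats every competitor are candidate
# unique differentiators for the value proposition.
for (category, benefit), ratings in grid.items():
    ours = SCORE[ratings["Us"]]
    best_rival = max(SCORE[rating] for product, rating in ratings.items() if product != "Us")
    if ours > best_rival:
        print(f"{category}: {benefit} -> potential unique differentiator")
```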

Shane: And I suppose the key there is to be honest with yourself. We've all seen the vendors that publish the matrix where they're the only ones with all the green ticks and everybody else has got the red crosses.

Dan: Yeah, exactly.

Murray: So is that your product strategy? That we’re going to be better on these performance factors and these delighters, and we’re going to serve this unserved market niche. 

Dan: That's exactly it. And usually it's just one performance row where you're gonna be the best; it's enough to win on just one. If you can be the best on one performance benefit and also have a unique delighter that nobody else has, that usually creates so much more value that you'll usually be number one in that market. There are examples that I give about Instagram, Airbnb, and Uber as well. Sometimes you may think, ah, they just got lucky; all these companies are throwing spaghetti at the wall and these guys just happened to get lucky. But I've curated documents from some of these companies, and you can tell from their very first pitch deck, where they raised their first round of funding, that they knew exactly what they were doing and how they were gonna be different and better than the competition. Uber and Instagram are two examples of that.

Murray: Okay. So given you’ve done that work, Dan, let’s talk about the minimum viable product. How do you decide what that is? 

Dan: Yeah. So when we go from value proposition to feature set, we're finally transitioning from problem space to solution space. But we have a solid foundation. We have a story of: hey, we're going to try to deliver value for this customer, we're gonna solve this need, and we think we can do it better than the competition by outperforming them and out-delighting them.

Now let's take those key unique differentiating benefits from our value prop and brainstorm all the solution ideas we have for how we can deliver those benefits. So now is when you map from the problems that you've come up with to solution ideas, and you brainstorm again. If we wanna save people time on their taxes, what are all the different solutions and features we have for how we can save people time on their taxes? So you brainstorm all the features, and at this point we haven't yet taken the engineering effort into account. So now is where we do a second type of prioritization. We did a first prioritization in the problem space with importance and satisfaction, to make sure that any of the problems we plan to address are gonna create enough value.

Now we do an ROI prioritization where we say, okay, how much customer value is this feature gonna deliver? And how much effort is it? And by the way, I also like the agile concept of breaking things down. Usually when you first brainstorm a feature idea, it’s bigger than it needs to be.

You can usually circle back and find ways to break it down. So we wanna break things down, and we wanna run 'em through ROI. Then I have a roadmap visualization where we basically take the benefits in our value proposition; those are the swim lanes for our roadmap. And we start building out our MVP with the highest-priority features: the ones that address the must-haves, that address the performance benefit we plan to be the best on, and that address the unique delighters that we have.

And then we have a discussion about where we draw the cut line. Do we need the first feature, or the first two features, in each swim lane? Where do we draw the cut line, basically? That one-page roadmap visualization is a great way for people to get on the same page. Unfortunately, one of the top mistakes I see organizations make, and it's ironic because the whole point of MVP thinking is to not over-scope your MVP, is that they over-scope their MVP.

The way that usually works is a team may propose a lean MVP that ticks the boxes: hey, it's got the must-haves, it's got our top feature idea for how we're gonna outperform, and it's got our top delighter in there, but we're waiting on the other stuff and gonna do that in version 1.1 or 1.5 or whatever.

What often happens is some key stakeholder catches wind of this and goes: wait a minute, you're gonna defer feature X until post-launch? Oh my gosh, I think our MVP's gonna be horrible if it doesn't have that feature. We're gonna be the laughingstock. Big client X is gonna throw a hissy fit. A lot of hand-waving and noise, and now you're in this political battle. So usually the team just takes the hit and they put it in, but nobody talks about, well, we just took an eight-week hit to our launch date because of that. Nobody ever talks about the scope-versus-time trade-off. So in a lot of my workshops I'll talk about the fundamental trade-offs of time, resources, scope, and quality, and how in the short run, scope and time are really the only trade-offs you have. You can't suddenly snap your fingers and have a bunch of developers ramped up and ready to go, and no one's gonna cut quality on purpose.
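To make the ROI prioritization and cut-line idea concrete, here is a rough sketch that scores candidate features by estimated customer value over engineering effort and fills an effort budget before the launch date. The features, estimates, swim lanes, and budget are all invented, and the greedy cut is just one simple way to draw the line.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    swim_lane: str      # which value-prop benefit this feature supports
    value: float        # estimated customer value, say on a 1-10 scale
    effort: float       # estimated engineering effort, e.g. story points

    @property
    def roi(self) -> float:
        # Rough return on investment: value delivered per unit of effort.
        return self.value / self.effort

backlog = [
    Feature("import W-2 by photo", "save time",  value=9, effort=5),
    Feature("auto-save progress",  "save time",  value=6, effort=2),
    Feature("deduction finder",    "save money", value=8, effort=8),
    Feature("audit-risk meter",    "confidence", value=7, effort=3),
    Feature("animated mascot",     "confidence", value=2, effort=4),
]

effort_budget = 15   # capacity available before the planned launch date

# Draw the cut line: take features in descending ROI order until the budget runs out.
spent = 0.0
print("Above the cut line for the MVP:")
for feature in sorted(backlog, key=lambda f: f.roi, reverse=True):
    if spent + feature.effort > effort_budget:
        break        # everything from here down waits for a later release
    spent += feature.effort
    print(f"  [{feature.swim_lane}] {feature.name} (ROI {feature.roi:.1f})")
```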

Murray: What is the purpose of an MVP?

Dan: Ah, I'm glad you asked that, because out of all the terms, I think MVP is the most divisive, and people have the biggest debates about it. If I ask a group of people, what's an MVP, what's the purpose of an MVP? Someone will say, well, it's basically the first version of your product; it's got enough functionality so you can test it with customers.

And then someone in the audience will say, no, it's just a way to test your hypotheses; you don't actually have to build anything, it can just be a prototype or something. And then usually the former group goes, no, no, no, it's gotta be a product. Don't you know what the P in MVP means? It means product. And that's the source of the debate. There are two very different fundamental interpretations of it. One is: hey, anything you learn from can be an MVP. And the other extreme is: no, it's gotta actually be a product, like live working code. So I'll have people vote, and they're never gonna agree. The latest thing I have is this spectrum. On the far end is: hey, is an MVP something a customer can learn about your product from, like a landing page? That usually gets the lowest percentage of people agreeing that it's an MVP. Then I go up to: is it something where a user can see what your product's gonna look like, like a set of mock-ups? And then I'll get a few more stragglers. The next one up is: is it something they can interact with to see how your product's gonna work, like an interactive prototype? And then I'll get some more people at that point.

And then the next one is: is it something that a customer can actually use, like an alpha or a proof of concept? Usually that's where more of the hardcore people jump on. But not to be outdone, there is one more point on that spectrum, which is: does it have to be something a customer can pay you money for?

And especially in B2B companies, a lot of people hold out for that final end of the spectrum. So that's the difference that you see. To clean up the terminology in my book, I mentioned trying to use the term MVP test, or experiment, as an umbrella term that can include not only the true MVPs that the hardcore people insist are MVPs, but also the experimental ones. But lately I've been using the terms MVP product and MVP prototype to keep them distinct. The real thinking work is: what is the scope of functionality that needs to be in the MVP?

Whether we prototype it or we build it, the real discussion or debate is what absolutely needs to be in there and what can wait. It's a mindset, and it's not just when you're building a V1 product: at any product milestone, you've gotta decide what absolutely has to be in scope and what can wait. In any sprint that you're planning, you have to decide which stories absolutely need to be in scope and which ones can wait. What can we say no to, or not now to? This is something a lot of product teams struggle with. Why is that stakeholder saying, oh my gosh, I think you gotta put feature X in? It comes from a place of: well, I'm worried our product's gonna be inadequate, and the more stuff we pack in there, the higher the odds something's gonna stick. It's like throwing spaghetti at the wall: the more spaghetti you throw, the better the chance something sticks. That is not a very high-skill argument or mindset. Instead, let's be precise: no, we've done the discovery research, we know this is the customer, we know these are underserved problems, we know we can beat the competition this way, and we think this is an adequate MVP. So then we have those debates with key stakeholders pounding on the table saying, we gotta have this feature in there.

This often comes up when you're re-platforming, when you have a legacy platform that's been around for 15 years, it's on Fortran and COBOL, it's falling down, and it's a pain in the butt to maintain.

Everyone's saying, okay, it's time to bite the bullet and rewrite the stack in a modern language. How long is it gonna take to build the new platform? Well, how long did it take to build the old one? It's probably gonna take about the same amount of time; maybe we know a little bit more than we did. If the new platform has to do everything the old one did and then some, you're never gonna finish building that thing. So the only way out of this trap is MVP thinking, where you say, okay, we're gonna support 30% of the functionality at first with a limited audience, and then we'll increase from there.

Well, in that context, somebody proposed what that limited MVP of the new platform would be. Someone from sales was pounding on the table saying, I can't believe you're not gonna have this feature; IBM depends on this feature; this is a critical feature for IBM. One of the engineers went off and checked the logs and saw that IBM had never used that feature, ever.

So a lot of the time it's just noise and concern and opinions, and what I like to ask is: how would we test that? Let's say we had a legitimate disagreement; how might we test it? If you just suck it up and put the feature in, you've taken the eight-week hit on your launch date and you haven't tested anything.

So what I like to do, cuz usually the stakeholder is so certain and feels so strongly about it, is say: okay, how might we test this? Hey, how about this? We'll do a quick wireframe. We're not gonna spend a lot of time making this prototype look perfect or anything.

We'll do a quick wireframe without the feature. You are so convinced this is gonna be horrible and every customer's gonna complain. Let's go talk to 10 customers, and I'll tell you what: if eight or more proactively complain about this feature missing, without us bringing it up, we'll be the first to say mea culpa and we'll put it in. But can you agree that if fewer than eight people complain, we leave it out? You set up some rules and you do a prototype test. That's the way out of that trap, cuz it's gonna be really hard to win that political debate otherwise.
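That agreement can be written down as a tiny decision rule. The eight-out-of-ten threshold comes from Dan's example; the function and the sample numbers are otherwise invented for illustration.

```python
def should_add_feature(unprompted_complaints: int, threshold: int = 8) -> bool:
    """Agreed rule: add the contested feature to the MVP only if at least
    `threshold` of the customers shown the wireframe complain about its
    absence without being prompted."""
    return unprompted_complaints >= threshold

# Example round: we showed the wireframe to 10 customers and only 2 of them
# noticed the feature was missing.
if should_add_feature(unprompted_complaints=2):
    print("Mea culpa: put the feature back into the MVP scope.")
else:
    print("Leave it out of the MVP and revisit after launch.")
```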

Shane: So with the MVP, does it have to have some must-haves, some performance benefits, and some delighters? Because there's a natural reaction to go just to the must-haves: you're trying to scope it down to as small as possible, get it out there, get some feedback. The must-haves are what you focus on, but often it's the performance benefits or the delighters where the gold is, where you find that niche that nobody else is hitting.

Dan: I totally agree. If you could see the Kano model visualized, you would see that must-haves don't provide any positive customer value; it's the absence of must-haves that creates dissatisfaction. So by definition, launching a must-have-only product is not gonna create customer value. It can't. Now what happens is people start out with a roadmap and a plan to launch: the must-haves have to be in there, and they have a little bit of performance and delighter. But usually what happens is they pick a launch date, then stuff happens and they're running behind, and we're back to that software development triangle.

If you're not willing to flex on the deadline, then the scope's gonna give, and so they rationalize it. I mentioned the first biggest mistake is that people over-scope their MVPs, which is ironic. The second biggest mistake is what you touched on here, Shane, which is basically that we launch an MVP but use the MVP as an excuse to cut corners. There's another pyramid that I mention, which comes from Aarron Walter at MailChimp, that asks, for a given product: how functional is it? How reliable is it? How usable is it? How delightful is it? And there's a great visualization where he asks: how do you slice that pyramid when you launch your MVP?

If all you do is slice off a little functionality but you ignore reliability, usability, and delight, that's not gonna do well. So you wanna slice it more like the edge of the pyramid, where, for whatever subset of functionality you are biting off, it's reliable enough (it's not gonna be bug-free), usable enough, and delightful. Cuz the whole point of the MVP is to test your value proposition. What happens in a lot of organizations that are trying to become agile or embrace lean is they say, okay, we're gonna switch from waterfall, and they stick to fixed-date development. They underestimate, stuff comes up, and so to hit the date they launch it with just the must-haves. They test it with customers, and then, not surprisingly, it doesn't do well, cuz it doesn't actually do anything. Hey look, you can log in. Yeah, that's great. Hey, you can change your profile. That's great.

Murray: We've gone from talking about deciding on your MVP, I think, to talking about designing and testing it. Is there anything specific about the designing and testing that we need to draw out?

Dan: Oh, definitely. When we agree on the MVP feature set, it's just words. We've brainstormed the features, and here's the list of features we plan to do. This is where I like to use user stories. Once we've got a high-level roadmap, we might have a box that says "load inventory"; that's two words. So part of the definition step is that we define each feature with a set of user stories, and within that set of user stories we identify which are in scope and which are out of scope. That's the define part: even after we write a bunch of user stories, we further define what's in scope and out of scope.

Once we have that high-level roadmap of what big rocks are in this MVP, and we have the set of user stories that are in scope for each of those big rocks, then we go to UX design, and that's where the designers design what that user experience will be. For a given set of features, for a given set of user stories, you could have a million different UX designs, so that is definitely an important step to work through. The goal is to create something that's usable, obviously, and hopefully delightful, and to have an artifact to work out the design, cuz we have to tell the front-end developers what to build, and it's that artifact we can then use to go and close the loop with customers.

Cuz that's where, again, the next real learning happens. We've had all this internal discussion about the value prop and features, then we create a prototype, and then we test that, and that's when the real learning happens and people go, ah, that's not really so important, this is important. So there's definitely a specific UX design step. And then on the testing step, there is an art to running good user sessions and not asking leading questions. Like if I show you the prototype and go, hey, that was easy to use, wasn't it, Murray?

That's a leading question. Or closed-ended questions: did you like that feature, yes or no? You may feel good if they say yes, but you didn't really learn anything. So you want to ask open-ended questions like: how did you like that? What did you think of that? You're gonna get a lot more information. In the book, I go into the details of how to categorize the feedback across functionality, user experience, and messaging, and how to pattern-match and decide what you need to change.

Because again, with the first prototype there are gonna be things you have to change and iterate. So we want to pattern-match. Talk to small batches, like five to eight customers at a time, with the prototype. Stop, pattern-match the top feedback, go back through that hypothesize, design, test, learn loop, revise our hypotheses, tweak the prototype, and go back with a fresh batch of five to eight. Rinse and repeat until you get to the point where no major concerns or questions are coming up.

And I've done this dozens of times. Your first prototype's gonna have a lot of rough edges. And then, doing this process, you're gonna smooth out those rough edges and iterate, and you get to the point where the problems or concerns that people were bringing up in previous rounds are no longer coming up, cuz you've adequately solved them. And then people start going, wow, this is actually pretty useful, I could use it. Then you can confidently proceed to building it at that point. And sometimes you have to pivot. Hopefully you can iterate your way to good product market fit, but sometimes you have to pivot. A pivot basically means you're changing one of those key assumptions, a key hypothesis from the pyramid. You're either changing what your value prop's all about, you're changing which need you're trying to solve, or you're changing the target customer that you're going after. And the further down the pyramid you have to pivot, the more it disrupts everything that you've done.

But pivots happen all the time. You just gotta get out there with the prototype. And the longer I do this, the more I feel like if you do this well, if you use your prototype and you listen to what customers tell you, they will direct you. They will pull you in the direction of creating more customer value and higher product market fit.

Murray: Yeah, I've done quite a bit of prototype testing with customers in a digital agency. And what we found is it actually doesn't take many customers to get this feedback. The guideline I think is five to seven one-hour user interviews, and by the time you get to number five, people are just saying the same things.

Dan: Exactly. That's exactly it. Yeah, there's that well-known number five. I recommend like five to eight. There's a big difference between qualitative research and quantitative, and some people are so quantitative they'll say it's never gonna be statistically significant. But as I like to jokingly say, if we show our new login flow, our new user flow, to 10 users and eight of 'em can't get through it, do we go, hold on, we need to run a thousand more tests to see if this is significant? Or do we just use common sense and go, gosh, 80% of people couldn't figure it out, we've got a problem here. So it's really that latter kind of math that we're doing with this pattern matching. And to your point, the biggest rough patches are at the very beginning, so everyone's gonna get tripped up and hit those speed bumps. If you're testing with five or six people, all of them, or all of them but one, may hit that speed bump.

And it's funny, until you remove that speed bump, you can't find the smaller speed bumps. So you're polishing this thing, you're getting rid of the rough patches, and then you find the next set of rough patches. I think of it very much like whittling a toy out of a gnarly piece of wood. At first it's all bumpy and it's got bark on it, and then you work off those rough edges, you refine it, you polish it, and then you just don't see people bringing up those issues anymore.

Murray: I found this very helpful to do with a cross-functional team. So you've got the product manager, a user experience designer, and a prototype developer. You work together, you have a couple of interviews in the morning, then you make changes in the afternoon, then you have more interviews the next morning, changes in the afternoon, and so on. You can make a lot of changes quite quickly in an afternoon if your customers have said to you, I don't understand this, what's happening here?

Dan: I totally agree. I think that speed is amazing. Even if you've got a day in between the tests to have a little more breathing room, if the team isn't used to that speed. But a couple of things. Having a prototyper is key to this. If you're all down for doing what I'm saying, but you're like, that's cool, but we don't have anybody who can knock out those UX prototypes, you're dead in the water and you have to go find someone. So sometimes people wanna do it, but they're missing that key skill set. When I did my startup, a lot of the feedback we got on the first MVP was actually about wording. They didn't understand the wording we were using, like, oh, you're saying this, but I don't really know what that means. That's the easiest thing to change. It's just text. If it doesn't resonate, or people are like, hey, I need some examples here, you can see what the top questions are and then proactively address them with little tips on the side or whatever. Some of these things are really quick and easy to change, like, oh, we just had the button in the wrong place, people didn't see this or that.

Some other things may be more fundamental. Like, gosh, the way we sequenced the steps in this workflow isn't resonating with people. That might take more than an hour or two to rethink how you're gonna do it. But there's definitely a lot of low hanging fruit in the early days that you can change quickly. I recently did one of these for an e-commerce browser plugin that would pop out while you're shopping. And there were all these different ways to do it. Do we show you similar products organized this way? Do we sort it by price?

What do we do? So instead of just prototyping one, we ended up prototyping like five different options along the spectrum. Here's a lightweight one with minimal information where you click through to see the rest, and here's one where we pull more information into that browser side panel. You don't wanna go crazy and have 20 different directions, but if a team is thinking, hey, there are really two or three major UX design directions we could go, that's what I call exploring the solution space. So you can test those three, and sometimes what you find is, hey, this idea from this one really worked, and this one from over here did too, and then you combine them into some mega one for the next round. It can be really interesting when you do that.

Murray: Now people often ask at this point, what tools should I be using? Is it Mural? Something else?

Dan: So the first tool that's great is a whiteboard, mainly for the internal team to get aligned. And I wanna echo what you said about having a cross-functional team do this together. It's way better than just having one team member do it, because it's great for the product developer to see what's going on, not only the designer. The designer who designed it should be there, obviously, and the PM, so we can all see what we see, because there's big value in seeing it firsthand. Think of the number of PMs who are the only ones in the user test, and then they go to engineering and go, hey Dave, people were having trouble using your thing.

Dave, the skeptical engineer, is just going, that's what you're saying, but how do I know? It's like hearsay in a court of law. When you have the video clip, or they can see it for themselves, then it's not like, oh, Dan's just making this stuff up, or Dan's interpreting it a certain way. So that's very powerful. So a whiteboard, Mural as a digital tool, and pen and paper are all great.

Then the next level up, and I like to categorize these by fidelity, is actually a wireframing tool like Balsamiq. Balsamiq has always deliberately stayed low fidelity. So it's a great way to bang out a quick and dirty thing just to get what the flow is gonna be, what the high level UX is gonna be, and what the key features are gonna be or not. So, like I mentioned, if we're debating about some big feature, whether it needs to be in scope or not, I will do a Balsamiq wireframe without it. And Balsamiq supports interactivity, where you can click here and it'll go to a different wireframe, so you can string together a set of wireframes. Before Figma was out, your designer would use a tool like Illustrator, Photoshop or Sketch and create the high fidelity mockups, but those wouldn't be intrinsically interactive. So best practice was to export the images and upload them to a tool like InVision or something similar, where a non-technical person, a non-designer, could go in and say, okay, I'm gonna create a little invisible rectangle hotspot around this button, and when they click that, it's gonna go here.

So Figma is basically like the Balsamiq for high fidelity now, where it has intrinsic clickability and tappability in the objects. You're doing the creation of the assets and the interactive prototype screens together, all in the same tool, in the cloud, so it's collaborative. The older tools were desktop tools, so if you wanted to try to edit my thing, we'd get into version control issues with the software.

If you're wondering why Figma is so popular, it's because it combines the high fidelity design tools that the UX designers need, the ability to string screens together with interactivity (when you tap here, it goes here and does this thing), and collaboration in the cloud. There are some other tools that are more specific to mobile, although Figma does a pretty good job on mobile as well. But yeah, that's pretty much what you wanna be able to do. By the time you get to that high fidelity, where it's a pretty good design, pretty representative of what it's gonna be with the colors and images, and you've got that interactivity of clicking and tapping around, at that point there's enough suspension of disbelief that people get into it and start using it and interacting with it.

Murray: The other question that people ask a lot is, how do I recruit customers? How do I get people to come in for these tests? 

Dan: Yeah. Back in the old days there were brick and mortar agencies that had panels of people in the local area who had opted in to be called for research. We would write a list of criteria, and then they would call all these people on the phone and recruit 'em. You'd need to pay the agency a certain amount of money to recruit them, and then you'd pay the person for their time. The good news is there are digital equivalents of that now, where you can just say what criteria of people you want and recruit them. UserInterviews.com is a tool that I like to use. UserTesting is another. They basically not only have the panels, but they also have a bunch of software downstream to set up your tests, record your tests, all that jazz. So the good news is there are a lot of solutions out there these days. Historically it's always been easier to do B2C, like consumer recruiting, but if you wanted to find a director of marketing, it was harder. Increasingly, over the last few years, all these tools have adapted to that need, where you can search by job title, for example, and put in criteria like that. Luckily, it's never been easier to find people, and you usually just have to give them an incentive for their time.

Murray: I think we've talked through the whole process from beginning to end. We haven't really talked about launching it into the market, but I don't think you really cover product marketing in your book, do you?

Dan: I certainly realize the importance of marketing, and I have separate talks about that. But yeah, basically the idea is that after we've worked off all the rough spots with the prototypes, and we have this nice shiny prototype that everyone loves, then we proceed to development.

At this point, we have the high fidelity mockup prototype that's been validated with customers. We make sure we have a corresponding set of user stories that goes with it. That's what we give to dev. Dev builds it in an agile manner, and then we launch it. I don't talk about the marketing aspects of launch in my book, but I do cover what happens post-launch: the importance of closing the feedback loop with analytics and with qualitative feedback. You wanna test with prototypes before you build, and that will work off most of the rough edges, but once you launch your real product in the wild, the level of validation you can do goes even higher. You see who actually uses your product and how they use it now that it's actually live. So it's really important post-launch to get that feedback, both analytics and qualitative.

Murray: So most companies at this point hand it over to a development team and the UX designers go off to do something else and the customer testing stops. But that’s really not what we should be doing at all, is it? How do we do this discovery work during development? 

Dan: There's two ways to interpret that. One is what I just covered, which is post-launch analytics. But I think what you're asking is, if the dev team's gonna be heads down building what we just validated, how do we go on and do discovery about the next thing? If we ask who's responsible for discovery, it's not development, it's not design, it's product management. So it's a balancing act that the product managers have to do: what's the next set of customer problems we need to be addressing, and what's the next set of functionality we need to be building to address those problems, while we're managing the development of what we previously specced out? The number one thing that affects this is staffing ratios. In the extreme case, your PMs are stretched so thin that they don't have any time to do discovery. So it really comes down to the ratio of PMs to devs. If your average PM has like 15 or 20 developers that they're trying to work with, that's a really high number, and it's probably gonna prevent them from doing any discovery.
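As a back-of-the-envelope illustration of that staffing point, here's a minimal sketch that flags teams where the developer-to-PM ratio looks too high to leave room for discovery. The threshold and team names are assumptions for illustration, not numbers Dan gives here:

```python
# Illustrative only: flag teams whose dev-to-PM ratio is likely too high.
# The cutoff of 10 devs per PM is an assumed heuristic, not a rule from the book.
teams = {"checkout": (1, 9), "search": (1, 15), "platform": (2, 22)}  # (PMs, devs)

for name, (pms, devs) in teams.items():
    ratio = devs / pms
    flag = "likely no time for discovery" if ratio > 10 else "ok"
    print(f"{name}: {ratio:.0f} devs per PM -> {flag}")
```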

Murray: Well, this is where product owners probably come in quite handy because if you have a product owner in each team, they can be almost like a junior product manager. 

Dan: Yeah. Well, and that's one of the big debates out there: should you have separate product owners and product managers, or should they be one and the same person? There are pros and cons. Companies that are struggling with what I just described say, okay, why don't we split the role. We'll have POs just manage the scrum process, and we'll have PMs out there figuring out what we should build next.

That sounds great. The way that falls down is when the PO and the PM are not in sync. What happens is the PM gets decoupled from the dev team, the PM comes up with some ideas, and then it's like they're throwing them over the wall, and the PO and the dev team are like, where'd this come from?

What's this? We weren't involved, we're not bought in. So theoretically it sounds good, and I'm not saying it can't work, but it really relies on that PO and PM relationship. What works best is what you're saying, where the PO is more of a junior PM or something and there's a reporting relationship that helps ensure they stay in sync.

And a lot of times, regardless of whether you split PM and PO or keep it the same, it tends to be the more senior PMs and group PMs who do more of the higher level strategic discovery work. So there's a natural moving up: okay, you're splitting your time between delivery and discovery.

I would argue that even at the junior PM level, I wouldn't want 'em to be spending zero time on discovery. But as you move up, you're gonna be spending more time on discovery. Some teams will say, when there's a big new investment area they want to go into, they will create a separate team just to pursue that. It may have just a few dev resources assigned so that they're in the loop and can weigh in on what's feasible and what's not, but we know we need to do a bunch of upfront definition and discovery before we even develop this thing. So that's a way to splinter off a parallel set of resources and go do that upfront research to figure out a new idea, so that your ongoing teams don't have to worry about it.

Murray: Yeah, we're gonna be talking to Teresa Torres about continuous discovery soon, and I think she's got a whole process to make sure that you keep doing that.

Dan: So there's no reason that when dev is heads down building the thing you just validated, you can't start working on the next one. It really comes down to how the teams are being tasked and whether they have longer term objectives and goals. If the durability of the team, as I call it, is long, then they can keep doing that discovery within their own team. But if everyone's getting re-tasked to different objectives all the time, then that's really hard.

Murray: I think the assumption here is that we have an ongoing product team made up of a product manager, a user experience designer, a very senior engineer, then engineering team members and testers and whoever else needs to be in there, that just works together for a long period of time.

Dan: That is the goal, but there are a couple of extremes. One is, back to what we were saying, they're pretty much told what to build, or they're told, for the next two sprints, work on this, but then we're gonna switch you to this, and then we're gonna switch you to this. That's what I mean by re-tasking. Even if it's the same people, if the problem they're trying to solve gets changed very frequently, they're not gonna be able to do the continuous discovery work. So the best thing you can do is give them a long-term objective or problem to be solved.

And that's where Marty Cagan's concept of empowered product teams comes in. If they're more empowered and they have steady assignments and long-term objectives, then they're gonna be able to do this on an ongoing basis.

Murray: Yep. So Shane, why don’t we go to summary since you have to go. 

Shane: Okay. Summaries. So for me this whole episode was just full of patterns that I need to go and learn more about.

I like the idea of the pyramid. I like the idea of: what's the group of customers we're going after? What are the problems they have or benefits they need? What are the features we think will help them achieve those benefits or solve those problems? What's the user experience we can wrap around that to make it delightful? And then let's go out and test it.

I'm interested in your use of the Kano model. I found it useful from a product point of view, but I tend to coach data and analytics teams, and I found it incredibly difficult to use from a data point of view: what's a must-have versus a delighter? I love the examples of Uber riders and drivers. They're different pyramids, different apps, different conversations. And again, Salesforce: buyers, salespeople, admins, consultants. Four different pyramids, because it's a different thing we're focusing on.

I also love the idea that if you talk to 10 people and five people think your product's great and five people think it sucks, is it that the five who like it are actually the ones you should be talking to? Or is there something you're just missing that would win all of them over?

I love the idea of doing value prioritization for the customer first. So this idea of focusing on things that are high importance and low satisfaction, those are the ones that you want to go after first. And then a second prioritization, which is around return on investment: effort versus benefit.
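One illustrative way to score that two-step prioritization is sketched below. The importance-minus-satisfaction gap and the benefit-over-effort ratio are a made-up heuristic for illustration, not a formula from the book, and the candidate problems and numbers are invented:

```python
# Illustrative only: rank candidate problems first by the gap between
# importance and current satisfaction, then by benefit relative to effort.
candidates = [
    # (name, importance 1-10, satisfaction 1-10, benefit, effort)
    ("faster reconciliation", 9, 3, 8, 5),
    ("custom report themes", 4, 6, 3, 2),
    ("bulk import", 8, 4, 7, 8),
]

def gap(importance, satisfaction):
    return importance - satisfaction  # big gap = underserved need

def roi(benefit, effort):
    return benefit / effort

ranked = sorted(candidates, key=lambda c: (gap(c[1], c[2]), roi(c[3], c[4])), reverse=True)
for name, imp, sat, ben, eff in ranked:
    print(f"{name}: gap={gap(imp, sat)}, roi={roi(ben, eff):.1f}")
```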

I love the idea of analyzing existing products in the marketplace with that must-have, performance and delighter lens, then comparing them versus us to find the gaps, which is the niche you think you're gonna go after. And then obviously going back and doing it time and time again, as those other products or competitors encroach into areas that you've got. Then you've gotta re-evaluate where you sit and where your value prop fits.

And then the last one is that MVP discussion. I'm gonna have to go and listen to this one again and make some more notes, because I didn't make enough and there's so much for me to unpack. So that was me. Murray, what have you got?

Murray: I found a lot of what you're saying very practical and really useful. My only concern is the concept of a pyramid, and I would like you to change it to the concept of a circle. Because just about every company is gonna see this and go, oh great, we only need to do this once. We'll spend 12 months going through the pyramid, hand it over to the engineering team, and go on to whatever we're gonna do next year, 12 months from now. Whereas the reality is, this stuff still needs to be tested. You still need to learn. And I would really prefer if we took the idea of, let's just spend one month, maybe even two weeks, or do two design sprints, go through the process, and then build, measure, learn, and continue to do your whole discovery and assumption testing iteratively and incrementally. I just think too many people are gonna see this and go, yeah, that's a top-down, managers tell people what to do type of approach.

Dan: Too bad we're on a podcast, cuz we can't see the actual image. When you work your way from the bottom to the top, there is a circle there. There's a loop that connects the top to the bottom. So there's no exit path, actually. If you take it literally, you're just gonna keep looping around forever and ever. So yeah.

Murray: Yeah. Okay, good. Well, cuz you did talk about a pyramid.

Dan: The pyramid explains product market fit, but then the process adds that loop from the top of the pyramid back to the bottom, basically, is what it does.

Murray: Yeah, I think it's very important for us all to understand that a lot of these things we're doing are hypotheses, and really we need to just pick off the most important hypotheses and run them through a test, measure, learn cycle. We'll learn a lot from the prototype, but then we'll also learn a lot when people are actually using it and buying it. People liking it in the prototype but not buying it is quite a common issue, for example. Or they use it in a different way than they said they would.

Dan: Yeah. Well, one thing I'll jump in on there, for what it's worth, is that you wanna watch out for false positives when you test, because most people are nice. Oh, this is a lovely product. Oh yeah, this is great. My favorite is, well, I wouldn't use it, but I'm sure someone else would really like this. I won't get into it here, but there's a piece of advice called skin in the game: how do you put skin in the game to minimize those false positives when you're testing with people?

Murray: Yeah. I think as long as people come away from this thinking, this is a cycle we go through continuously. We have a combined product team. It's not product on one side and engineering on the other side where we just hand it over. We work together, and we're doing this at the same time as we're developing. Building is very expensive. It's not the best way to learn. So let's go through the whole research, interviewing, getting out of the building, testing prototypes thing, but let's combine it with the building, so it becomes a circular thing that just keeps going on.

Dan: Yeah. That's the intent. If you look at the logo for my meetup, it's a Venn diagram of product management, dev and UX, and that's how I was trained. The best product companies in the world have that strong collaboration across the different groups. They don't treat it as silos, and they don't treat it as two completely different phases or two different processes. And that's the balancing act. So many companies have the delivery part down; where most companies are weak is the discovery part. Some of it is not having the resources, but it's about balancing those two. Cuz at the end of the day, in most companies the mentality is that shipping trumps anything else, even if we're shipping the wrong stuff or stuff we haven't validated.

Murray: Yeah. And most companies don't even measure outcomes or what customers want. All right. How can people find you? Obviously there's your book, the Lean Product Playbook, which is available on Amazon. You also mentioned a meetup that you do once a month.

Dan: Yeah, it's just meetup.com/lean-product. You can join the group for free on meetup.com. Hopefully a lot of people are familiar with Meetup. It used to be in person here in Silicon Valley where I live, but with covid we took it online, and it's been great. We actually have a lot of people from New Zealand and Australia dial in because the time zone works out really well, plus people all over the States, Central and South America, and even people in Europe. And then we put those videos on the YouTube channel, youtube.com/danolsen. We have over a hundred videos there. We just hit 12,000 subscribers, so it's one of the best product channels out there.

My videos are there, and way more than my videos, all the speakers that I host at the meetup are there too. And the main place people can find me is at my name, dan-olsen.com, that's d-a-n hyphen o-l-s-e-n dot com, and that's got links to the meetup, the YouTube channel and my book, which is available on Kindle, in hardcover and on Audible.

Murray: And you run public training courses.

Dan: I run public training courses. I try to do 'em like four or five times a year, but what I mainly spend my time doing is private training courses, for companies that wanna apply the things that we just talked about.

Usually it's after they've gone through their agile transformation that they ping me. They're like, okay, we've got our agile transformation going. Now we realize it really matters what we ask the agile teams to build, and that's when they bring me in. Sometimes I've trained just the product managers, but what I really enjoy is when we do cross-functional training. So it'll be the product managers plus the designers plus the developers, and we do all the exercises with the scrum teams, working together as a team. I do speaking at private events also. With covid ending, a lot of people are doing company get-togethers and offsites and strategy sessions, so a lot of times I'll speak at those or teach workshops at those. So yeah, private and public training for product teams. That's what I love to do.

Murray: All right, Dan. That’s awesome. Thanks for coming on and talking to us.

Dan: Thanks a lot. It was great talking with you all.

Murray: That was the No Nonsense Agile podcast from Murray Robinson and Shane Gibson. If you'd like help to build great teams that create high value digital products and services, contact Murray at evolve.co. That's evolve with a zero. Thanks for listening.