Continuous discovery with Teresa Torres
Join Murray Robinson and Shane Gibson as they converse with special guest, Teresa Torres, a renowned product coach, speaker, and author of “Continuous Discovery Habits.” In this episode:
- Torres elaborates on continuous discovery, a crucial method for product managers, designers, and software engineers, ensuring the development of the right product with continuous improvement.
- Gain insight into how to strike the balance between action and research – taking effective steps forward without getting sidetracked by possible mistakes.
Stay tuned to this conversation and learn how to adopt continuous discovery habits to consistently develop and enhance your product, ensuring its alignment with market needs.
Resources
Subscribe
| Spotify | Apple Podcasts | Google Podcasts | iHeart Radio | PlayerFM | Amazon Music | Listen Notes | TuneIn | Audible | Podchaser |
Podcast Transcript
Read along as you listen.
In this episode, we talked to Teresa Torres, a product coach, speaker, and author of Continuous Discovery Habits. She explains how product managers, designers, and software engineers can use continuous discovery to ensure that they’re building the right product and continuously improving it over time. In this interview, you’ll learn to balance action with research so that you can get started without being blindsided by what you don’t get right. If you want to discover products that customers love and that deliver valuable business results, then this interview is for you.
Shane: Welcome to the No Nonsense Agile Podcast. I’m Shane Gibson.
Murray: And I’m Murray Robinson.
Teresa: And I’m Teresa Torres.
Murray: Hi Teresa. Thanks for coming on.
Teresa: Thanks for having me. I’m excited to be here.
Murray: We wanna talk to you about your book, Continuous Discovery Habits, today. But first we’d like to know a bit more about you and how you got to this point in your life and your career.
Teresa: Yeah. So I work as a product discovery coach. It basically means I help teams figure out if they’re building the right stuff. Since this is an Agile podcast, I’ll connect it to the Agile Manifesto. One of the things in the manifesto is this idea of frequent small batches and frequent customer feedback. And I think that’s really at the heart of discovery: how do we make sure that as we decide what to build, we’re including good feedback loops with the customer?
Murray: But what has your career been? How did you get into this?
Teresa: So before coaching, I worked in a variety of product management and design roles, mostly at early stage startups. That was early in my career. I went the typical management path, ran product and design teams, and was a startup CEO for a few years. I don’t really recommend that. Eventually I saw the same problems everywhere and decided I wanted to focus on helping teams spend more time with their customers instead of building products directly.
Murray: What was the startup you did?
Teresa: I actually didn’t found a startup. I became the CEO of a company that I joined as an employee. That company was called Affinity Circles. We ran online communities for university alumni associations and then eventually got into the recruiting space and helped companies recruit from those communities.
Murray: Okay, so you pivoted.
Teresa: Actually, it was just sort of an add-on. It was a way to have a more reliable business model than relying on universities as our revenue source.
Murray: To start with, what is continuous discovery?
Teresa: So I’ll start with just a simple answer, which is that discovery just represents the work we’re doing when we’re deciding what to build. So everybody on the planet that builds a product is doing some discovery. Good discovery includes the customer in that decision-making process. And then continuous discovery is just this idea that digital products are never done. We’re continuously evolving them. We’re continuously developing them. And so I think we should continuously do discovery and make good continuous decisions about what to build with the customer involved. So it’s really just about continuously engaging with customers in a way that helps us make better decisions about what to build.
Murray: So there’s plenty of research showing that a large amount of the features we build in products are never used, something like 70%, according to research from Pendo and the Standish Chaos Report. So why is that?
Teresa: I think it’s that we hold really strong beliefs in things that aren’t true. When an idea feels like a good idea, we tend to suffer from confirmation bias. We see all the evidence that says our idea is fantastic and we miss the evidence that says our idea is flawed. Some of it is just business culture. We’re used to working inside of our business, inside the building with our coworkers, and we don’t spend as much time outside the building, as Steve Blank likes to say, getting feedback from customers. And some of it is because it’s easier. It’s easier to just build what’s in our head. A lot of people that build software like to spend all day at home on our computers; it’s not necessarily our favorite thing to get out and talk with people. We’re in this industry because we like building stuff, and so we sometimes forget we also have to take the time to talk with folks.
Murray: I think it is very common for executives to tell product managers this is what we’re gonna build, and then it just goes onto this assembly line and eventually you find out whether people like it or not.
Teresa: Some of this is rooted, I think, in business culture. If we look at before the internet, how did companies make products? They went out and did market research. They came up with their best guess. They built a product. And really what determined whether that product sold or not was whether they could get it on store shelves at eye level. It was all about distribution. The internet has changed that a little bit. Distribution obviously still matters, but we see a lot of companies get traction by having a better product. And to get to a better product, we have to have different methods for how we determine what’s better and how we test what’s better. But a lot of the executives in our world grew up in this old business model, and I think all of us are really slow to change. We’re still in the middle of this glacial change of how business might work in an internet era.
Murray: I think also that people who are very sure of themselves are more likely to get promoted.
Teresa: I think there’s definitely some of that. A lot of executives got to where they are because they were right in the past. So for them to say, I’m not sure, or I don’t know, or let’s test it, it’s not always safe. It’s not always the most savvy career path. I think we’re starting to see that change. Silicon Valley has long beaten this drum of celebrating failure. I don’t love that framing. I’ve been at a startup that mostly failed and it was not fun at all, and I don’t wanna celebrate it. But I think the underlying ethos is good, which is that we have to try a lot of things. It’s better to learn that something’s gonna fail much earlier in the process than much later in the process. And the fact that this is becoming part of our language in business, I think, is really helpful. And we’re now in an era where I think even traditional companies are starting to act like internet companies. So we see banks and insurance companies and airlines and healthcare companies trying to act like the Googles and Facebooks and Netflixes of the world.
Murray: I agree. I think that most of what product managers get told or come up with should be treated as hypotheses that need to be tested before we build them, because we’re wrong so often, even when we’re sure we’re right.
Teresa: You know, I think most things in business should be treated as hypotheses, even things like: who’s our customer? What should our goal this quarter be? What should we build? What impact will what we build have? All of these things are guesses. It’s every person in business: a salesperson wonders, will this prospect close? A marketer wonders, will our campaign work? There’s so much uncertainty in all of the things that we do. And I think historically the business environment was simpler and it was easier to make predictions. Maybe we were never good at it, but at least now we can instrument things and see in black and white that we aren’t good at it. So I think people are starting to recognize that, hey, we gotta change our methods.
Shane: In the past, I don’t think we were any better at guesstimating or estimating or predicting. I think the feedback loop was just slower.
Teresa: A lot slower.
Shane: It was harder to realize we got it wrong, whereas now the feedback loops are far more dynamic. They’re far more visible. So we are getting that feedback a lot earlier that what we estimated or predicted didn’t work.
And I’m moving away from using the word hypothesis to using the word bet. Bet has a flavor to it, a feel that we are gonna throw some money out there and we’ve got a 50/50 chance of winning, which is really what we are doing, because we are having a bet that this thing’s gonna work and we may lose the money and get some learnings. But it truly is a bet. And if we think of it that way, I think culturally we’re gonna approach it differently than treating it as a research project with a hypothesis, which, yes, gives us some additional reduction in uncertainty, but we’re still making a bet. We’re still not confident that it’s gonna work. So let’s get that bet out there. Let’s spend as little as possible.
Teresa: I love that you said that. I also have moved away from the hypothesis language. It sort of sets the expectation that we’re scientists and that we’re gonna run these double-blind, randomized controlled studies. And if you are not doing that, people are gonna pick apart your results. I like to talk about assumption tests: what assumptions are we making, and how can we test those? And then I also like the bet language as well. One thing I tell teams is that every solution idea you have is dependent upon a set of assumptions. How do we surface those and test those? But even after doing a lot of assumption testing, you’re still making a bet. There’s still risk in the idea. The goal of assumption testing is to remove some of that risk, but at the end of the day, we still have to make a bet.
Murray: So what’s the best way to include customers in decision making?
Teresa: I think about this in a couple different ways. There’s the generative side, which is how do we just go out and learn about our customers and what’s going on in their world? What are they trying to do? What’s the environment they operate in? I like customer interviews for that: how do I get out and talk to customers on a regular basis and learn about what’s going on in their lives as it relates to what I’m building? Then there’s an evaluative part of it, which is: I need to build a thing, and I need to know if the thing I’m building is gonna work or not. That’s where I like to use assumption testing. So as we have ideas, as we consider different options, how do we get really clear about what assumptions they depend upon?
And then how do we rapidly get feedback on those assumptions? It sounds simple, but there’s this hidden power behind it. When we test our ideas, we tend to wanna build everything. In The Lean Startup there’s this build-measure-learn loop, and everybody thinks, I have to build the whole solution before I can learn. Whereas when we move to thinking about assumptions, assumptions tend to be smaller atomic bits that you can test faster, so you build less while you’re learning and you ramp up on learning just a ton.
Murray: So we were talking to Annabela Cesraio at OutSystems about continuous discovery, because her team had been on your course, and she was talking quite a bit about the opportunity solution tree. Can you explain what that is?
Teresa: So, an opportunity solution tree is just a visual that helps a team externalize their thinking in a way that helps them consider the best options for how they could drive an outcome. There’s some jargon in there, so let’s break it down. A lot of product teams now are being asked to move some metric.
So in the old days, they were asked to deliver a list of features: here’s a roadmap, deliver these features. Now companies are starting to recognize that we can’t predict the future. We don’t know what you should build, but whatever you build should have this impact, and we’re gonna measure that impact with an outcome. It’s things like increasing the number of subscribers, increasing subscriber retention, or, for someone like Netflix, increasing viewing engagement. For B2B products, it’s often something like improving the productivity of your end users: some number that they’re trying to move that indicates they’re having success with their customers, which should be tied to business success. The opportunity solution tree helps you start with an outcome, then look at the customer needs, pain points, and desires that we could address to help us reach that outcome. I call those opportunities. And then, what are the solutions that we can align with those opportunities? I really believe discovery is a team sport. We should be doing it together: product managers, designers, software engineers. But it’s hard to work on a messy, complex problem with lots of people. And so the opportunity solution tree helps you externalize and visualize some of that mess so you can stay aligned and make better decisions as a team.
Murray: Can you give us a practical example of an opportunity solution tree?
Teresa: It’s a little bit hard to verbalize because it is a tree structure. It can branch out pretty quickly, but I’ll give an idea of it as an example. Let’s say we’re a product team at Netflix and we’re trying to increase viewing engagement. There are a lot of ways we could do this. I think we should do it in a customer-centric way. So where I wanna start is interviewing some customers and learning about what motivates them to watch Netflix and what’s preventing them from watching Netflix. And as I do that, I might hear people say things like: I can’t find something to watch. I can’t find the movie I wanna watch. I can’t get back to the show I was watching. My internet connection is too slow and it buffers forever. These are all customer needs and pain points that I can start to map out. And what that does is it helps me get a big picture of how I’m gonna reach my outcome, how I’m gonna get more people to engage. Well, there are all these pain points, needs, and desires I can address. Mapping them on this visual helps you take an inventory of everything rather than overreacting to the last thing you heard. But the tree structure in particular also helps you break down these big, hard, evergreen challenges, like “I can’t find something to watch,” into easier and easier opportunities that you can deliver value on in a short timeframe. So it also helps unlock this continuous cadence.
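Since the tree is hard to verbalize, here is one way to picture Teresa’s Netflix example as a small data structure. This is just an illustrative sketch (the node shape and field names are our own, not something Torres prescribes):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node in a sketched opportunity solution tree."""
    label: str
    kind: str  # "outcome", "opportunity", or "solution"
    children: list["Node"] = field(default_factory=list)

def collect(node: Node, kind: str) -> list[str]:
    """Depth-first list of labels of a given kind."""
    out = [node.label] if node.kind == kind else []
    for child in node.children:
        out.extend(collect(child, kind))
    return out

# The Netflix example from the conversation, roughly mapped out:
# one outcome at the root, opportunities branching beneath it.
tree = Node("Increase viewing engagement", "outcome", [
    Node("I can't find something to watch", "opportunity", [
        Node("I can't find the movie I wanna watch", "opportunity"),
        Node("I can't get back to the show I was watching", "opportunity"),
    ]),
    Node("My internet connection is too slow and it buffers", "opportunity"),
])
```

Note how the big evergreen opportunity (“I can’t find something to watch”) breaks down into smaller child opportunities, which is what makes short delivery cycles possible.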
Murray: So you are doing this through interviews, and I know you’ve talked about an interview snapshot, so what’s that?
Teresa: So historically, when we’ve done qualitative research, we’ve interviewed like a dozen customers, and then we synthesize everything we’re learning, create a research deck, and share it with our teams. Nobody has time for that anymore. Nobody can take two weeks, really a month, off to recruit, interview, and create this shiny research deck.
So what continuous discovery teams are doing is talking to a customer or two every week. And since we’re doing that continuously, there’s never this stopping point where we’re gonna stop and synthesize what we’re learning. They have to synthesize as they go. So an interview snapshot is a one-page template that helps you synthesize what you heard in your last interview in a way that’s really actionable.
So it includes the opportunities you heard in the interview that will later end up on your opportunity solution tree. And then it includes a lot of other things to help you remember what you heard in that interview: a photo of the participant, a key emotional quote, other insights that came out of that interview, and an experience map of their story. But it’s just a way to quickly synthesize what you heard, so that as you continuously interview, you don’t accrue this enormous research debt where you gotta stop and synthesize what you’re learning.
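As a rough sketch, a snapshot’s contents could be captured in a simple record like the one below. The field names are our own guess at a reasonable structure based on what Teresa lists, not her actual template:

```python
from dataclasses import dataclass, field

@dataclass
class InterviewSnapshot:
    """One-page synthesis of a single interview (illustrative fields only)."""
    participant: str                # who you spoke to (a photo in the real template)
    memorable_quote: str            # a key emotional quote
    opportunities: list[str] = field(default_factory=list)   # feed the tree later
    insights: list[str] = field(default_factory=list)
    experience_map: list[str] = field(default_factory=list)  # the story, step by step

snap = InterviewSnapshot(
    participant="P12",
    memorable_quote="We argued for twenty minutes and then watched nothing.",
    opportunities=["I can't agree with my partner on what to watch"],
    experience_map=["Coworker recommended a movie", "Couldn't find the service",
                    "Negotiated with spouse", "Gave up"],
)
```

Filling one of these in right after each weekly interview is what keeps the research debt from piling up.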
Murray: Can you just run us through what you mean by an experience map of their story? Is that like a user story map?
Teresa: It’s a little different. User story maps are usually maps of a solution: how is a customer getting value from a solution? One of the things I teach in my interviewing class is, when you’re interviewing a customer, to ask for a real-world story of something they did recently.
So again, if I stick with my Netflix example, I might ask you, tell me about the last time you watched Netflix, or Tell me about the last time you engaged in some home entertainment. And then I’m collecting that story. There’s a lot of reasons why I wanna do that. It’s gonna help avoid cognitive biases.
I’m gonna get actual behavior. But also, opportunities, needs, pain points, and desires start to emerge from those stories. So one of the things I like to capture on the interview snapshot is almost like a flow diagram of what happened in that story. The goal of it is not the messy thinking, feeling, doing experience map from design thinking.
It’s more of a structural experience map: first this happened, then this happened, then this happened. So I’m trying to pull out the underlying spine of the story. And what I see in practice is that it helps teams find commonalities across what seem like unique stories.
Murray: Okay, so is this like a customer had this goal and they took these steps and they had these pain points?
Teresa: Yeah. So it could be an experience map for that. Maybe I heard a story about somebody who was at work. They had a coworker tell ’em about a really compelling movie. They went home and talked to their spouse about it. They had a negotiation about whether they were gonna watch that movie or something else. They had a hard time finding it. They didn’t know what streaming service it was on. They had to try a few different places. They finally found it, they watched the movie, and then neither one of them liked it. Really simple story. The experience map would just be those discrete moments.
Murray: So if you are interviewing people every week, how do you find these people? How do you get them to come in for interviews?
Teresa: One of the most effective ways is to recruit people while they’re using your product or service. I’ve actually worked with teams to fully automate this so that the interview ends up on their calendar without them having to do anything, which is pretty great. It makes interviewing just like any other meeting that recurs on your calendar every week. The idea is, let’s say you go to a website. We often see these embedded NPS surveys or little one-question surveys that pop up. It’s the same idea, but you can just ask: hey, do you have 20 minutes to talk to us? You can offer an incentive, and then when people say yes, you can pair it with scheduling software where the participant can select a time on your calendar. The goal with this is to make it easier to interview than to not interview.
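The automated recruiting flow Teresa describes (prompt in product, offer an incentive, hand off to scheduling software) might be sketched like this. The engagement threshold and the link format are invented for illustration; a real team would wire this to their own survey and calendar tools:

```python
import urllib.parse

def should_prompt(sessions_this_week: int, prompted_recently: bool,
                  opted_out: bool) -> bool:
    """Decide whether to show the 'do you have 20 minutes?' prompt.
    The engagement threshold here is an arbitrary example value."""
    return sessions_this_week >= 2 and not prompted_recently and not opted_out

def scheduling_link(base_url: str, participant_id: str) -> str:
    """Build a link to scheduling software so a willing participant
    can pick an interview slot on the team's calendar themselves."""
    query = urllib.parse.urlencode({"ref": participant_id, "length": "20min"})
    return f"{base_url}?{query}"
```

Pairing a yes answer with a self-service scheduling link is what makes the interview land on the team’s calendar with no manual work.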
Murray: Yeah. I’ve done this in the past with teams, where we set up Thursday mornings to do three interviews, booked in ahead of time. I did that as part of Design Sprints, so we didn’t do it for a long period of time, but I know it’s part of the Design Sprint methodology as well.
Teresa: Yeah. There’s a lot of overlap.
Murray: All right. So you’ve got them in. How do you structure the interview to get good results?
Teresa: Yeah. So we’ve already touched on the key idea, which is to keep the participant grounded in a specific story about their past behavior. That sounds so simple, but it’s actually hard in practice. If I ask you, tell me about the last time you watched Netflix, you’re probably gonna say, I watched a movie last night after dinner. That’s not a very good story. So the interviewer has to actually do the work to excavate the story. I actually stole this from the Pixar folks. They talk about, when developing a movie storyline, excavating the story. And I think it’s a great analogy.
So I gotta help situate you back in that moment. Like, where were you when you decided to watch something? Tell me about that. What was that like? What happened next? I can ask you to set the scene so I can get a sense for, are you in your living room? Are you on your iPad? Are you on a computer? And then I really just wanna go step by step and help you tell every little last detail of the story.
And the reason for this is that most product teams wanna take shortcuts. They wanna just say: well, what do you like to watch? What device do you watch on? Who do you watch with? The challenge is your brain is gonna give you a fast answer to all of those questions, but those answers aren’t necessarily gonna reflect your actual behavior.
And again, that’s just because of the way our brains are wired, right? We have all these cognitive biases that interfere. We think we are better humans than we are. We suffer from recency bias. If you like action movies, comedies, and documentaries, you’re probably gonna tell me about the documentaries because you aspire to watch more of them.
There’s just a lot of reasons why our answers to direct questions out of context are not very good. So we really wanna learn how to collect a good story.
Murray: What about showing customers prototypes of potential product features?
Teresa: Yeah. I don’t consider that interviewing, I put that more on the assumption testing side. So I do think we should test prototypes and we should get feedback from customers, but I think it’s a fundamentally different activity. It’s more evaluative. Whereas collecting customer stories is more generative. In the interviewing case, we’re looking for insights, we’re looking for opportunities, we’re looking for problems to solve. On the assumption testing side, we’re evaluating which ideas might work.
Murray: So you wouldn’t mix them up and spend the first half talking about the problem to be solved and the second half talking about potential solutions?
Teresa: I think you certainly could do both in the same session, and I might be splitting hairs here trying to distinguish it from an interview. But the skills to do each are very different, and that’s one of the things I’ve been really trying to differentiate: these two activities. Because I meet a lot of teams that say, oh, we interview customers every week, whereas really what they’re doing every week is usability testing, and they’re missing out on the benefit of interviewing.
Shane: And I think if we bundle the two together, we’re gonna ask the person we’re interviewing to context switch. We’re gonna go from “tell us a story about what you do, because I’m trying to figure out where the problems are” to “here’s a solution, does this work for you?” And that switch between those mental states would be quite difficult for somebody, I would assume.
Teresa: When you’re collecting the story, we really want the interview to be all about the customer and their experience. And as soon as we switch to testing solutions, now the interview is all about our product. So there’s a little bit of a mismatch there. You can do both in the same session. And actually, where it can be really helpful to do both in the same session is to say: first, tell me a story, and then test your prototype in the context of the story. So, in this specific instance that you were telling me about, now imagine you had this prototype. What would you have done differently? That’s really powerful, but it takes skill, and you’re probably not gonna do that until you’re pretty close to evaluating a solution, because you probably need a fairly functional prototype for that to be an effective prototype test.
Shane: How often do you find that the person you’re interviewing is the wrong person? I’ll give you an example of buyer versus user. Talking to somebody about our data platform, our data product, I tend to ask them for real-life use cases where they’re struggling with data. This one was a financial services insurance company, and the person I was talking to was a buyer, not a domain expert, not a user; they hadn’t felt the pain. So when I said, okay, tell me who does what in the organization from an insurance point of view, they couldn’t articulate the core process.
So it was hard to figure out where the real pain was, because they were sitting above it; they weren’t actually involved in experiencing it. You must find that a lot in that iterative discovery process: you have the wrong person in the virtual room to achieve what you need to achieve.
Teresa: Yeah. So a lot of this comes down to your recruiting strategy, how you’re screening people, and what you’re trying to learn that week, right? So if you’re really trying to understand who in the organization is using data, and you’re talking about end user needs, you wanna make sure in your recruiting strategy and your screening that you’re going after end users and not buyers. Sometimes we get it wrong. Sometimes we think we recruited the right people, and we’re in the room and we realize this person doesn’t have a relevant story. We need to be able to adapt. So if I was in that situation, I would quickly change the scope of my interview and just learn what I could from that buyer about how they make buying decisions. I would say: tell me about the last data product you bought. What was that experience like? And one of the things I often say is that the golden rule in interviewing is that your customer’s story trumps everything else. What they wanna share is what’s most important. And if you need a story that’s different from what they wanna share or are able to share, tough luck. The good news is, if you’re interviewing continuously, your next interview is right around the corner. So if you miss the mark on one, it’s not that big of a deal.
Murray: So let’s talk about the continuous part of this, because a lot of the stuff we’ve been talking about is usually done upfront by a market research company or your user experience design team, if it’s done at all. We’re very into Agile. We wanna know how you do it continuously so you can learn things as you go.
Teresa: Let’s imagine you’re an engineer on a team. You’re in the middle of a sprint and you’ve got a set of user stories you’re working on. You pull the next story off the top of the backlog and you start to look at it. Maybe there was a ton of market research and you’ve got a really good product requirements document that supports that user story, so you’ve got a lot of context. You know what problem it’s designed to solve. You’ve got some customer quotes. You know exactly what the goal is. That’s like a best case scenario. But let’s just imagine that happened and you start building. What’s inevitably gonna happen? You’re gonna run into a design decision. Maybe it’s a data model decision. Maybe it’s how the workflow of the feature should work. Something that you’re not gonna be sure how to answer. Maybe, if you’re a really diligent engineer, you’re gonna go look through the product requirements document or ask your product manager, and your question is so specific the research didn’t cover it. So what do you do? What most people do is just make their best guess and hope it works. What a continuous discovery team does is quickly test an assumption.
And that moment that I described where an engineer or a designer has to make a decision in a sprint with very little time happens continuously. And so if we always have an interview coming up, if we have really good tools that allow us to run assumption tests in a day or two, no matter how much market research we have, we’re equipped to get fast answers to our daily questions.
And I think we underestimate how often those situations come up, but we see it in our products: mental models that don’t quite work, data models that restrict what we’re able to do, features that hit the conceptual mark but just don’t work.
Murray: It happens all the time with engineers. They’re always facing choices about how to implement things. They’ll go and talk to the designer, but I think having customers available would really help as well. Maybe being able to do some online testing where you are putting things in front of people. There are some tools where you can digitally watch people use your product and even intervene and ask them questions.
Teresa: Yeah, so we have amazing discovery tools now. If we think about the assumption testing side, we’re talking about usability testing prototypes. We have unmoderated testing platforms where we can upload a prototype, walk away, and come back the next day to a dozen videos of people using our prototype. We have what I call one-question survey tools, where we can rapidly collect feedback from lots of customers in as little as an hour, because we’re running the one-question survey within our product or service. We have really good data mining and analytics tools, where we can watch in real time what our customers are doing on our sites, which means, from a live production prototype test standpoint, we can minimize who we expose it to and get real-time data in an hour or two. I remember the days where, to test a prototype, we had to spend weeks recruiting people. People even rented out facilities with two-way mirrors. It cost tens of thousands of dollars, and we did it maybe once a quarter if we worked at a really customer-centric company. But now we live in a world where we can do this literally every single day for 50 bucks a test at most.
Murray: What are the best tools?
Teresa: They’re always changing. I can tell you that among the unmoderated testing platforms, UserTesting.com really innovated in this space. Maze is a fast follower; I think it’s maze.co. We have tools like Intercom and Pendo that are really great for customer communication and allow you to do some of those in-product surveys and on-site customer interactions. Qualaroo, Ethnio, Hotjar. Validately and Lookback have added unmoderated testing. On the analytics side, we have companies like Mixpanel, Heap, and Amplitude. Tableau is making it easier for people that don’t know SQL to work with data. There are no excuses anymore. The tools are pretty spectacular.
Shane: I think tools are also part of the problem. If you look at the data world, to put together what we call the modern data stack, which is the latest wave, some people say there are five to 23 different products you’ve gotta integrate together to make it work. And you have to have domain knowledge about which one works with which and which is the best. It’s the same thing I found when I entered the product world: there are so many different products that do one little thing really well. We started off with Sketch and then we moved to Figma, and that was a big change that gave us massive value. But when you start adding all these other products together, it’s a complex environment to come into as a novice and actually understand where to start.
Teresa: I think there are three main tools that a product team needs. They need access to an unmoderated testing platform, they need access to a one-question survey tool, and they need access to a behavioral analytics tool. There are plenty of other things that’ll help you do your job, but you can start there. You can go a long way with even just one of those tools. You don’t need all three tomorrow. You don’t have to do everything perfectly tomorrow. Pick one thing to adopt, get good at it, and then iterate from there. Take a continuous improvement mindset to the way you adopt this way of working.
Murray: So who should be in your product team then? What sort of skills should they have?
Teresa: I advocate for a cross-functional approach. I know we historically have worked in silos where product managers handed off requirements to designers, and then both the design and the requirements got handed off to engineers. But I really think we build better products when those three roles collaborate from the very beginning. I know at a lot of companies we’re seeing a lot of diversity in titles, and so when I talk about those three roles, I’m not trying to leave people out. If you work at a company that has UX writers and user researchers and data analysts, and they’re all relevant to your products, then include them. It really is about how do we get better at working together cross-functionally, and how do we learn to truly collaborate? What’s funny is humans know how to do this. It’s just that business has taught us how to avoid it at all costs. And so some of it is a little bit of unlearning. How do we get everyone in a room and leverage each other’s strengths and come up with fun things to build?
Murray: All right, so how would you go about implementing continuous discovery? What’s your change management process?
Teresa: Yeah. So it really does start with getting that cross-functional collaborative team working together. I often refer to it as the product trio, though it doesn’t literally need to be a trio. The concept is very flexible based on the roles on your team. But it’s just how do you change the culture in your organization so that more of your decisions about what to build are cross-functional? I think that’s the place to start, because even if you never talk to a customer, your products will get better just by including your engineers in the conversation. From there I would look at interviewing. I think interviewing is the highest-return activity. I’m amazed at how many people work on products and have never talked to a customer. I will say for most teams I would start with interviewing, but if you’re in a really quant-focused organization where, no matter what you hear from a customer, people will push back and say, yeah, but that’s not a representative sample, for those companies I would start with assumption testing, because you can do quantitative assumption testing. My book is Continuous Discovery Habits, and I put habits in the title because I really do think this way of working is a collection of habits. I think you could adopt any of the habits in any order, and the key is: in your unique organizational context, what is the easiest habit to start with? Then start there.
Murray: Yeah. We are big believers in that approach as well. We don’t really believe in installing something like Agile. It’s more about finding out what the problem is and coming up with the solution for you, and then iterating on it.
Teresa: Yeah. I think this is more a mindset than it is a process. I mean, I definitely teach a ton of tactics and I definitely teach process, but I think the nugget is the mindset, and how do you develop that mindset? I’ve worked with enough teams to know every organizational context is different. What works here is not gonna work there. So it really is about how do you start with an outline and then fill in the sentences based on your organizational context.
Murray: Do you advocate for discovery stories and a discovery funnel? So the team has two types of work coming through: discovery work and building work.
Teresa: I don’t have a strong point of view on this. I will share that I’ve worked with teams that have included all of their discovery work in their sprints, mixed in with their delivery work, and I’ve worked with teams that keep it separate. Some patterns I’ve observed: if you’re a Scrum team and you’re doing two-week sprints, it’s a little bit hard to include discovery in there, because discovery is unpredictable. You can’t size it, you don’t know how long it’s gonna take. If you’re a Kanban team, it’s a lot easier to mix in discovery with delivery, because you’re just working on the next thing in the queue and whatever comes next, comes next. Most teams are somewhere in the middle doing a mix of both. So on this one, I really think what’s most important is that the team experiments and finds what works best for them.
Shane: I think about patterns a lot. That’s how my brain works, and from a lot of the people we’ve had on the podcast from the product world, I’ve seen two patterns. There’s the research-heavy pattern, the way you described it: give me two to four weeks, leave me alone, let me go talk to a bunch of people, and I’ll come back and tell you what the answer is. And the pattern for that one is: we don’t know what the problem is. We want to go out and discover what the problems in the world are, and then we can look at how we might solve them.
And then the alternative pattern is the MVP, the lean product pattern, which is: we have a theory or a bet, let’s quickly get it out there and see if it solves that problem. So we have a bet on what the problem is, and then we wanna see whether it’s a problem that needs to be solved and whether this product solves it. What I like from what you are saying is you sit in the middle. You are bringing, I almost wanna call it lean research, because we love to put the lean word on there. You’re saying make that discovery process consistent and just in time, but keep doing it. It’s not: I think I know what the problem is, here’s my solution, let’s go test it. But it’s also not a big drawn-out period of months to find that problem space.
Teresa: You know, I read something several years ago that fundamentally changed the way I look at this stuff, that I think is what helped lead me to a continuous mindset. There was a design researcher, an academic, who spent his career looking at what sets apart expert designers from novice designers, and one of his major takeaways was this idea that for expert designers, the problem space and the solution space evolve together. We think we know what the problem is, and then we try to develop a solution, and we get feedback on the solution, and we realize we kind of misunderstood the problem. And so as we test our solutions, we learn more about the problem, and then we design another solution, and then we test that and we learn more about the problem. So there’s this back-and-forth movement between problem framing and solution development. It’s not a one-time activity. When I read that, I was like, wow, we’re doing this wrong. We can’t start with research and then move to solutioning. They both have to happen continuously.
Murray: I agree.
Shane: But we also can’t start with the solution and then go do the research, because then we are handing off. Instead of an iterative loop where we are constantly going through the cycle quickly, we’re mini-waterfalling, right? We’re taking a step and then handing off to the next step, rather than iterating those steps as fast as we can to get that feedback loop.
Teresa: You’re hitting on something that I think people, even though it’s been 20 years, are still fundamentally misunderstanding about Agile. I don’t think most people understand small batches, and small batches of value, not just small batches of code. If we think about The Lean Startup, Eric Ries talks about IMVU, and the very first thing they did was stub out the feature, and nobody even clicked on the thing. It’s a really great example of running a demand test with as little work as possible. Whereas what’s happened in the intervening years is we think about the MVP as the version 1.0 of the product. He wasn’t even talking about a product.
I think what’s missing is how do we look at these first-step, teeny-tiny increments of delivering value and then iterate from there. Stubbing out a feature that doesn’t exist isn’t actually delivering any value. It’s a great demand test, but it’s not delivering any value. And so we do have to get to: what’s that small increment of value? But I think this is the heart of so much of what we’re trying to get at, and it’s still very broadly misunderstood.
Murray: So let’s say somebody wants to start implementing continuous discovery. They’ve listened to us and had a look through your book. What are the common mistakes or anti-patterns that people make?
Teresa: Yeah, you know, I see people get really excited about frameworks. So they read about the Opportunity Solution Tree, and they go and sit in a room by themselves and create their first Opportunity Solution Tree, and they’re missing the point entirely. The Opportunity Solution Tree is a way to synthesize what you’re learning from your research, and a way to do it as a team. So if you’re just sitting in a room by yourself making up opportunities, you’re not really getting the value from the activity. And if you’re doing it with your team and you’ve never talked to a customer, and all your opportunities are just based on organizational truths, you’re not really getting value from the activity either.
It’s always really fun when you see a new framework that helps you think about something in a new way. But the purpose of an opportunity solution tree is not the tree itself. The purpose is that it helps you do the discovery work. And I think a lot of folks are missing that. They want the framework to stand on its own. They don’t wanna put in the time to do the discovery work. And I think some of that is just time. I talk to product people that have eight hours of their day double- and triple-booked. I get the constraints. But if you can’t find 30 minutes in your week to talk to a customer, I don’t know how you’re gonna build a product that anybody cares about. I just don’t think it’s possible. So I think the biggest mistake people make is they try to do this by going through the motions and not really putting in the time to build the customer feedback loops. And I think it’s all about the customer feedback loops.
Murray: Yeah, I agree with what you’re saying about Agile. Too many people are focused on output, whereas Agile is really supposed to be about continuously delivering value to your customers. That’s what it says in the first principle. A lot of people just treat it as a process that they can follow.
Teresa: This is the fault of business culture. Most teams are shifting from a waterfall mindset to a more agile mindset, and what that really means is minimized waterfall. I mean, look at SAFe. It’s like the least agile thing on the planet. But that’s not all these individual contributors’ fault. We’re swimming upstream against a hundred years of business culture. And unfortunately it’s gonna happen slower than we all want. Don’t wait for your company to change. You’re gonna die with a lot of regret if you do that. Your company’s not gonna change as fast as you want. The good news is a lot of people can adopt these habits individually and on their own teams without the rest of their organization changing.
Murray: The vast majority of organizations are very bureaucratic, and Agile is anti-bureaucratic, and so is continuous discovery. I think that what we’re doing is agile product development. Lean startup, agile, design thinking, they all mix together for me.
Teresa: I agree. I think they’re different flavors of the exact same nugget. And this is why I’m not even dogmatic about my own content. People always ask me, what’s the difference between Jobs to be Done and opportunities? I don’t think there is a difference. I think opportunities are jobs and jobs are opportunities. For some, the Jobs to be Done framework resonates, and for some, my opportunities framework resonates, and I think each team should use what works for them. I think there are lots of people trying to make the same ideas accessible to more people.
Murray: Yeah. I feel like it’s fundamentally about accepting that there’s inevitable uncertainty about the problem and the solution, and so therefore we have to allow time for research and learning and exploring and trying different things. How long ago did you write your book?
Teresa: I wrote it over the course of 2020, and it came out in May of 2021.
Murray: Ah, so it’s pretty new. I was gonna ask you, what have you learnt since then?
Teresa: I’ve learned a lot since then. I’m both amazed by how many people are sharing how they’re putting the habits into practice, and I’m also a little overwhelmed by how many people are only engaging at the most surface level. The biggest thing I’ve learned is that there is a subset of people for whom a book will be plenty, and they will take it and read it and put it into practice, and there’s a much larger percentage of people that need a lot more help and a lot more support and a lot more training than just reading a book.
Murray: So is there anything else that’s really important that we haven’t asked you about?
Teresa: I can’t tell you how many times I talk to people who are waiting for permission to work this way. They’re waiting for somebody to say, yes, you’re allowed to talk to a customer. Yes, you’re allowed to test an assumption. I really wanna encourage people to stop waiting. First of all, you don’t need permission. In fact, I think Steve Blank wrote a great blog post with that same exact title: you don’t need permission. And I think if we don’t start working this way, we’re gonna keep building the wrong stuff. I think it’s just a shame that so many people-hours are being wasted on the wrong stuff. So I really wanna encourage everybody to find the smallest, simplest, easiest habit to start with and just iterate from there.
Murray: Have you got any stories of people using your approach who have made a real difference to their product?
Teresa: So I shared several in the book. We also share stories of real product teams on the Product Talk blog. One that really stands out to me is when my partner, Hope Gurion, was coaching a travel company, like a Booking.com, based in the UAE, and their target market was Saudi Arabia. We were working with them at the start of Covid, when Saudi Arabia, like most of the world, literally shut down all travel, and their business went almost to zero overnight. We helped this team find a new market, interview customers, and quickly pivot in about a three-to-six-week period.
What they found was that there’s this idea where people will rent out houses or hotels to do a one-day retreat. It’s kind of like a staycation, where you have a little mini vacation at home. And what they found was that at the start of Covid, families were getting together and doing this on a regular basis, because it was their way of creating little Covid-safe bubbles and having some normalcy in their life. And so they started collecting inventory and helping people book venues for day-long events. And that was enough to fill the gap for the first few terrible months of Covid, to give them a lifeline until some of their travel started to return.
Murray: That’s great. All right. Shall we go to summaries, Shane?
Shane: Let’s do it. All right, so you started out by talking about how you’re a product discovery coach. You help by simplifying and organizing the noise. And then throughout the whole conversation, you’ve given us a bunch of patterns: techniques that have value, given a certain context, that you could apply. And you talked about this idea of adopting habits in any order. So this idea of pick a path: it depends on the context of your organization. Each of these patterns has value, and you can put them in certain orders depending on your organization and what you’re trying to achieve.
You talked about discovery, and you said discovery should involve our customers in the conversation around what we’re gonna decide to build. And when we’re doing it, we’ve gotta be really careful of confirmation bias, because we tend to influence what we do to help support the great idea we think we had. So we need some techniques around that: customer interviews and assumption testing. Are we talking to somebody to discover what they’re doing and get those stories, or are we actually testing an assumption we have to see whether it has value?
You talked about how teams started off where the key metric was how many features have we shipped, and then we got much better: now it’s have we moved that North Star metric? But we’re still sometimes in a feature factory, still focusing on features and moving the metric, not changing the behavior of our customers. So this idea of talking to a few customers every week, I think that’s, for me, the crux of it. We often talk about the cost of change. If the cost of change is small, we are willing to make bets, because the impact of that bet going wrong is small. If the cost of change is large, then we are less likely to iterate. If the cost to deploy something into production as part of our product is a two-week cycle and involves three people, we are less likely to risk deployments that fail, because of that cost.
And so this idea that we’ve already booked continuous conversations with customers in the future means that our cost to test a bet by talking to them is really low, because they’re just there, and it’s the next most valuable conversation that we need to have, because that’s the problem we’re trying to solve. So I think for me that was the crux. I love the idea that we can use this and we don’t have to spend two to four weeks of downtime to do research. But we are still doing research.
I love the idea that when we’re interviewing people, we are talking about real-world stories. So we’re getting their view of the things they do, not forcing on them the questions that we want them to answer.
I’m gonna go and investigate this idea of an interview snapshot. I do like pattern templates because I find having a one pager where I’m forced to put things in certain segments helps me think about those segments.
I’ve always struggled with this idea of UX research being this big upfront piece of work where we’re trying to understand the problem and then we’ll figure out the solution and then the MVP approach or the lean approach, which is, okay, we think we know what the problem and the solution is, let’s go out and test that really quickly. This seems to be a nice place where we’re in the middle.
I like that when Murray asked you that question, what three tools do I need to buy, you gave us three categories. To remind ourselves, they were unmoderated testing tools, one-question survey tools, and behavioral analytics tools. And you don’t have to have all three, but maybe start with one. I need to start with at least one of those.
You talked about cross-functional teams and the idea of T-shaped skills. We should have a team that can do all the work as much as possible themselves, without a handoff. I heard on the podcast that a handoff between teams increases the time to delivery by a factor of twelve. So it takes 12 times longer just because it’s going outside the team.
And then I like this idea of adopting habits. This idea that we have ways of working, we have playbooks, whatever term we want to use, but they are a set of patterns that we daisy-chain together given our context, because they have value.
Things like the opportunity solution tree: the artifact itself is not the value, it’s the conversation we have with our teams to create the artifact. It’s the same as sprint planning. It’s not the allocation of the tasks in that stupid Jira tool, and it’s not the sizing of them so we can know what our commitment was. It’s the conversation the team has about the work they’re about to do that has the value for us. Conversations over documentation, hey.
Somebody should have written that down. They did. So for me, great podcast, it was a great chat: a set of patterns from a coach who’s got stories about how they’ve used this in the real world with real customers and added real value. That was me. Murray, what have you got?
Murray: I was thinking, Teresa, about how foreign these ideas we’re talking about are to a lot of companies, because a lot of it’s about mindset, and habits are a part of implementing a new way of thinking. The idea that there is a lot of uncertainty about what the customer’s problem is and what the solution might be is still a foreign concept to most of the companies I deal with.
I don’t see many companies at all doing regular customer testing, not even upfront, not even research. Mainly what they do is go out and try to sell things and get feedback from people that way. This is all part of a big movement. We touched on it at one point: there’s Marty Cagan’s stuff, Lean UX, agile, lean startup, continuous delivery, continuous discovery. It’s all part of the same new way of doing things. It’s much more streamlined, much less bureaucratic, much more customer-focused, product-focused, outcome-focused, and people-focused. So I really like it. I don’t see many people doing it.
I want more people to do it. I’m gonna be strongly recommending this to my next client if I can work with the product people or next company I work for.
Shane, why on earth are you spending so much time and money building your product without asking customers what they really want?
Shane: Because we have to be at a stage where we can actually make the changes those customers need after we’ve asked. Because we’re not VC-funded, we have to have enough money to pay the team to actually make that change and go after it hard. And right now, if I actually get customers onto that platform, we can’t iterate fast enough, because we just don’t have enough time to do it. So that’s what we’re doing right now: we’re banking the customer cash by doing it for them.
The other thing is really interesting. We thought we were building a product for analysts. That was our goal, that analysts do the data work, and that’s our vision for where we wanna be at the end of it. But what we found is that there are a hell of a lot of customers out there who just want the work done, and they pay us to get the work done. It’s a product for us, because we don’t want to be a service company. So how do we deliver what they want without us actually having to touch it? That’s what we’re focused on right now. But what they’re buying is a service. They’ve got this problem, they tell us about it, it goes away. So it’s really interesting where we’ve ended up. The problem we’re solving for a customer is that they want the data work done and they don’t care how it’s done.
Teresa: It sounds like you’re running a pretty classic concierge test. Your team is doing the work that your product will eventually do. That’s a great way to learn what you should build.
Shane: And we may end up with a gig economy. We may end up with that marketplace of producers and consumers and the gig economy. But what we’ve found is that people will just pay to make the problem go away. They don’t care how you do it. So we know the problem’s here. We know they’ll give us money to solve it. We are not yet hitting the model we think is the right model to solve that problem. But hey, we may get to our end goal, or we may go, actually, that wasn’t the right market. We’ll see.
Teresa: Shane, that was a very thorough summary. I appreciate that. There’s one thing I wanted to comment on. When you were talking about lean, there’s this idea that if the bet is small, we can just build it, but as the bet gets bigger, we need to remove more of the risk.
I think that’s right, but only to a point, because what I like to remind people is: what if you made 30 small bets and all 30 of them failed? You still wasted a month of your life. We forget to take into account the opportunity cost of what else we could have been building. And I find that even a single customer conversation can often throw out 15 of those 30 bets. I love The Lean Startup. When the book first came out, it resonated with me so much. I was like, wow, this describes exactly what I’ve been doing for the last decade, and I know many people had that reaction. But the idea of building being the only way to learn is very wasteful. And I see you nodding along, so it doesn’t sound like I said anything that you greatly disagree with.
Shane: No. So again, this comes back to that problem I’ve had over the last six months of podcasts, where there’s a chasm between people who are really research-heavy and people who just build the solution, get it out there, and do testing. And I like to be in the middle ground, because I think that’s where the value is.
Teresa: And then, Murray, I definitely heard some enthusiasm, but I also heard a little bit of skepticism of do people actually work this way? And I think that’s fair, because the majority of people don’t work this way. But there’s a William Gibson quote that I think captures exactly where we are: the future is already here, it’s just unevenly distributed. We just did a survey where we asked people about their discovery habits, because I’m really sick of Twitter trolls telling me that nobody works this way, and this is idealistic and it’s impossible, and I clearly don’t get it and I’ve clearly never built a product. So we ran a survey and we got responses from 2,000 people in a wide variety of roles. I’m gonna be writing up the results and sharing them on Product Talk after the new year. But I was blown away by the results. We’re seeing over 50% of our respondents are moving towards outcomes, 30% are already outcome-driven, and 20% are still in our feature-factory world. Now, don’t get me wrong, our sample is biased. They’re people who follow me and are familiar with my work. But 2,000 people, right? So even if they were the only people who worked this way, it’s a lot bigger than nobody, which is really exciting to me. We’re also seeing 25% of our survey respondents say they’ve interviewed a customer every week for the last quarter. A small number, but we’re seeing teams starting to build patterns and habits. I’ve often said maybe two to five percent of product teams work this way. I think the survey is convincing me that it’s bigger than that. It really is spreading. People are getting excited about this. We have the tools to work this way. We’re finally getting to a point where there are leaders out there who wanna work this way. So I share your skepticism, and I’m starting to turn the corner into, holy crap, we might be having an impact.
Murray: I think what’s happening is that there’s an evolutionary process going on in the marketplace, where companies who do this, who do lean startup and Lean UX, who do agile properly, continuous delivery, continuous discovery, these companies are all producing much better products much quicker that deliver better value, and they’re much more responsive to what’s going on in the environment and to changes. And so those companies are just going to outcompete the other ones. And we see that so many of the good companies out of Silicon Valley are doing all of this stuff. There’s a lot of hit and miss, but a lot of them are becoming huge value drivers. So I think it’s just the way of the future, and as you said, the old dinosaurs are not doing it, and we have to try and help them if we can.
Shane: So the old dinosaurs are profitable. They may not change fast and they may get wiped out like Kodak and Blockbuster, but they’ve got a big long tail of survival before they completely get taken out. And then there’s a whole lot of lazy startups that got given a hundred million to play with some ideas that were profitable, and they didn’t need to follow practices like this, because they could just hire people and make themselves look busy, and that money’s dried up. So I think what we’re gonna see over the next couple of years is the true agile, product-focused companies survive. The big boys will still be there because they’ve still got that long-tail revenue, and everybody else who was pretending they knew how to build a product company is gonna disappear.
Teresa: I think that’s spot on. First of all, I think venture capital creates a lot of waste. It’s designed to create a lot of waste. So we can’t really paint those companies as the pinnacle of how we should work. Sure, if you can afford to throw away a hundred million dollars, feel free to work the way they work. We train a lot of product people at those old dinosaur companies. Something is changing. We work with banks and insurance companies. We’re working with airlines, healthcare companies, pharmaceutical companies. It really is spanning all industries, not just the sort of sexy internet companies. It’s gonna be really hard and really slow for those companies to change. But the first step is desire and interest, and that’s there. We’re seeing it every day. So that’s really exciting to me.
Murray: Yeah, that’s a good point. There are always pockets within those companies where we can help, and they really want to take a very forward-looking view and do these new things, so that’s great. Now, you talked before about a website where you had stories and tips and so on. What was that?
Teresa: Yeah, so my website is producttalk.org. We release two long-form articles about continuous discovery every month. Some people are gonna read the book and put it into practice and be great on their own. I wrote the book with that in mind. It’s called Continuous Discovery Habits, and you can also find it at producttalk.org. But for those of you who need a little bit more support putting the book into practice, we also have a Slack community for practitioners who wanna work on developing their habits, and we have a variety of online courses to help you build skill in the different areas. You can find all of that at producttalk.org.
Murray: Awesome. Thanks very much for coming on, Teresa. We really appreciate it.
Teresa: Thanks for having me. This was a lot of fun.
Murray: That was the No Nonsense Agile Podcast from Murray Robinson and Shane Gibson. If you’d like help to build great teams that create high-value digital products and services, contact Murray at evolve.co. That’s evolve with a zero. Thanks for listening.