Delta-v Roundtable on Conducting Win/Loss Analysis

February 22, 2024

EVENT RECAP

Win/loss analysis allows you to rely on data rather than preconceived theories or hypotheses to understand the drivers of GTM success and failure in your company. Willem Maas has led dozens of win/loss analysis projects and debriefed more than 500 win/loss buyers for companies including Reltio, ServiceChannel, and Lightbend. In this event, he walks through building a win/loss analysis program, including outreach methods to increase participation rates, conducting illuminating interviews, and producing actionable analysis.

Ideal for Marketing, Product, & Sales Leaders and CEOs/Founders

Join to discuss:

  • The five phases of a win/loss analysis project to get from planning to action 
  • Why only talking to losses can create blind spots and lead to misguided changes
  • Win/loss analysis deliverables to create post-call and post-analysis
  • Historical participation rate benchmarks for wins and losses 
  • Outreach cadences and tips to increase your participation rate

Video

Unnamed Speaker

My name is Colin Barkley. I’m a partner at Delta V. I help lead our vertical SaaS and CXO software practices. Nice to see everyone, for those that I haven’t met. And thanks for being here. I’m excited for the session today. As an investor and board member, we’re always trying to understand the why of what’s working in our companies and also what’s not, so that we can do more of the former and address the latter.

Unnamed Speaker

And one way we get at the why is through these win-loss analyses. And so I’m excited to learn from Willem today what’s the best way for us to do these, and how do you apply a consistent framework and rigor to extracting this kind of knowledge from your customer base. So Mary, I’ll turn it over to you to introduce Willem. And Willem, thanks for being here.

Unnamed Speaker

win-loss analysis consultancy for complex B2B sales. He’s led dozens of these win-loss analysis projects and debriefed over 500 win-loss buyers for companies including Reltio, ServiceChannel, and Lightbend. So now I’ll turn it over to Willem to kick off the session.

Unnamed Speaker

Thank you, Mary. Thanks, Colin. We thought it would be good, everybody, to spend just a couple of minutes making some introductions. Maybe you saw that in the slides. If each of you could just take a turn saying your name, your role, your company, experience that you have with win- loss analysis. I’m particularly interested in where you get your data, right? You could be getting it from your CRM, you could be doing interviews, you could be doing both. And then last, what you’re hoping to learn from the session today. Who wants to jump in and start off?

Unnamed Speaker

I’m happy to jump in.

Unnamed Speaker

Brandon, I was like, I’m glad, me and Kirsten were both like, all right, let’s go.

Unnamed Speaker

Thanks, Brandon.

Unnamed Speaker

So my name is Brandon Riggs. I’m a senior product marketing manager at Perceptix. You might see behind me, I have a three-year-old daughter; here’s her bike. It’s my wife’s birthday today; here’s a present. My experience with win-loss research: I’ve been doing win-loss interviews for several years, even before I came to Perceptix.

Unnamed Speaker

I conducted and managed our win-loss program for about three years here, but we’ve recently switched over to working closely with a vendor on our win-loss. And I’m working right now on setting up integrations with Gong and Salesforce.

Unnamed Speaker

So basically just creating a much more robust methodology around the way that we do win and loss with them. What I’m hoping to learn from the session today is some additional considerations around how we can continue to step up and progressively add incremental value to the things that we’re doing with win-loss.

Unnamed Speaker

And so you’re doing interviews, is that right?

Unnamed Speaker

I was, yeah, I was for a long time, but now we’re working with another vendor to do those.

Unnamed Speaker

Got it. Yep.

Unnamed Speaker

And I’m Kirsten Helvey. I’m the Chief Customer Officer at Perceptix. I work closely, obviously, with Brandon. I utilize the data from all the work that Brandon does to reflect back on what changes we need to make, either in our customer journey, our services, et cetera. And I am trying to determine how do we get to leading indicators, not lagging. I think a lot of what we do is focused on the retro, and I am trying to put processes in place and analytics in place for us to start to understand how to predict. As Brandon said, we use all sources of data.

Unnamed Speaker

We’ve done deep dive analysis on churn. So we know some of the drivers and I’m trying to connect all of our customer feedback to our win loss, to kind of all of our churn data points we’re looking at. So I’m just here because I want to benchmark what we’re doing against what others are doing, what’s best practice. And as Brandon said, what are the things we should be considering?

Unnamed Speaker

Great, thank you.

Unnamed Speaker

I can go next here. John Lovis, Senior Product Marketing Manager at LogRocket. I’ve been in and out of win-loss research for a few years now. I previously worked at Pegasystems, where I managed the third-party vendor we used for win-loss and ran that program. And more recently I’ve been doing win-loss interviews here at LogRocket, where we use those interviews along with a combination of Salesforce data and Gong info to compile our win-loss picture.

Unnamed Speaker

In terms of kind of what I’m hoping to get out of today’s session, I think thoughts on what other people are doing, what’s worked well, ways to be more efficient in win loss. And then, you know, any tips on, you know, ideas, I guess, for extracting more value out of these interviews, tactics that kind of lead to more informative responses, I guess, would be interesting too.

Unnamed Speaker

Okay, thank you.

Unnamed Speaker

Thanks, John.

Unnamed Speaker

I can go next. Hi, everyone. My name is Vicky Giles. I’m Senior Director of Product Marketing at Edited. I would say we’re kind of at the very beginning of a revamping of our process of win loss interviews. I think we’ve kind of done it ad hoc in the past and it’s kind of been led a little bit more by our kind of customer success team. So just a bit of context, product marketing is a fairly new function here at Edited. We’ve been going for just about a year now.

Unnamed Speaker

So we’re kind of getting ourselves off the ground, and win-loss is part of the process that we’re trying to introduce. At the moment, we’re utilizing Gong and HubSpot, our CRM, to get the initial data and conducting win-loss interviews. My hope for today is to hear from all of these people that have more experience with this process and what’s worked for them. I think I’d love to understand how we can increase participation.

Unnamed Speaker

And something that happens quite often with us is that just through the natural process of kind of having conversations with customers that are thinking of churning from our product, our account executives go through a number of these conversations.

Unnamed Speaker

And I think it feels a little bit tedious to people to then take part in a win-loss interview following that process. So I’d love to hear how we should differentiate those conversations, or even use the conversations that our account executives are having in a better way, to help contribute towards this process rather than detract from it. And that’s me.

Unnamed Speaker

Thank you, Victoria.

Unnamed Speaker

Do we have everybody?

Unnamed Speaker

I’m the founder of Delta V. I’ve got a bunch of companies that have board meetings right now. So I’m listening in to then try and direct them to this recording later. So I don’t do any win loss ratio analysis.

Unnamed Speaker

And no doubt Rand and Connor will be coming to Brandon and myself. So we will pay close attention to anything going on.

Unnamed Speaker

I think what I saw in the last board meeting, Kirsten, tells me you guys are on the front end of this, not the back end of this.

Unnamed Speaker

I’m always looking for more.

Unnamed Speaker

That’s kind, yes. And that’s, I’m the incremental, we just want to get better. That’s what we want to do.

Unnamed Speaker

Operational excellence is our focus, as you know.

Unnamed Speaker

That’s right.

Unnamed Speaker

All right, I’m going to jump in. Everybody see my screen here, title screen? Yep. All right. Just to kick off real quick, I think it’s a really interesting moment to take in the fact that win-loss, this segment, has really taken off in the last few years. This session and your participation is one sign of that. Some other signs: there are now more than two dozen, I would venture to say, service providers offering win-loss analysis.

Unnamed Speaker

There’s a market guide that just came out from Gartner about a month ago, mid-January. And there are literally dozens or more of SaaS offerings for transcribing, summarizing, analyzing interviews, and reporting findings. When we got started seven years ago, little of that was true. There might’ve been a handful of companies doing this. All the software was still laptop software, software that you installed locally. There wasn’t any SaaS. There wasn’t really any Gartner coverage.

Unnamed Speaker

So it’s a pretty cool moment to be involved with this and for you all to get in. So I wanted to start off briefly with an overview, just six slides, literally agreeing on what win-loss analysis is and how it’s used, and then we can jump into Q&A. But of course, if anything comes up as we’re going along, just let me know. Any questions you have, I’m glad to take those on.

Unnamed Speaker

So from my perspective, our perspective, win-loss analysis is a methodology for making data-driven go-to-market decisions, rather than, or in addition to, what’s typically done, which is using past experience or tribal wisdom inside the company, customer anecdotes, those sorts of things. The data that we use for win-loss analysis is primary data. It’s data that comes from buyers that we get either through interviewing them or surveying them, wins and losses. But win-loss analysis doesn’t require doing that. It doesn’t require primary data.

Unnamed Speaker

And the most common source, the most common starting point for win-loss programs, is the closed-reasons and primary-competitor fields in a CRM that have been recorded by your sellers. You can also get secondary data from your sellers by surveying them, interviewing them, or through the loss reviews that they run. Those are all pretty common sources of win-loss data as well.

Unnamed Speaker

We see go-to-market teams using win-loss analysis, particularly product marketing, sales, and product management, to get more credible CI and to enable sales, particularly with product marketing. In many cases, the focus is on product capabilities: which ones buyers assess as stronger than competitors, which ones they assess as weaker, and why exactly. Sales and product management want that same CI, that same intelligence.

Unnamed Speaker

Sales, they find, also wants to know when and how competitors are providing a better buying experience, whether that’s holistically across the whole team or at the level of individual sellers. I think you can get an initial answer to these questions with your CRM, but this sentiment from a CMO that we work with is quite common: there’s really nothing substantial to act on in the CRM. This is a report of closed reasons from the CRM. And so what do we learn from a report like this?

Unnamed Speaker

Well, we learn what the top loss drivers are: price in blue, product didn’t meet requirements in orange, too complicated in yellow. And we can track the trend, right? We can track from quarter one to quarter two, quarter three. But to act on those insights, we need to understand why. What’s wrong with price? Are buyers objecting to licensing costs, to implementation costs? Do they find the pricing model risky, hard to understand? What does good pricing look like to your buyer?

Unnamed Speaker

Those are details that you’re rarely gonna find in a CRM. And so let’s compare what we get here with the CRM with what you would get with buyer interviews.

Unnamed Speaker

This is a screen grab of a report from our win-loss portal. In the background, there’s a donut chart of loss reasons from a set of buyer interviews, and that chart shows the key criteria that are driving losses for this vendor. In the foreground, we’ve drilled down to excerpts from the interviews, details of their assessments of this vendor’s extended capabilities.

Unnamed Speaker

Now these are not actual interviews, for obvious reasons, but you can see that each of these has been tagged with extended capabilities. And so what we would get, if these were actual interviews, is an understanding of how they assessed us on this attribute, on this criterion, and why it was even a consideration for them, why they felt one vendor might be better or not as good. So there are two tips I want to share here, one about time management during these interviews, and the second about the donut chart.

Unnamed Speaker

The first relates to time management. If you like details like I do, and you want to dig deep during these interviews, the challenge is that you get about 25 minutes. You ask for 30 minutes, but once you net out some short introductions and some rigmarole at the get-go, you realistically get about 25 minutes in each of these conversations. You’re not going to have time to dig deep into every single factor and decision.

Unnamed Speaker

And so the way we handle this is by first asking which vendors the buyer eliminated as they went from an initial consideration set, to their shortlist, to their winner. And once that’s fresh in the buyer’s mind, I ask them what was missing from the vendors that they eliminated. I find this question very quickly reveals the key factors that drove their decision making. If instead I only asked them why they chose a vendor, I find I get a longer answer to that question.

Unnamed Speaker

It doesn’t really hit on the details or the criteria that really made the difference. You get extraneous details, softer, edge-case kinds of considerations. So that’s what you see in this donut chart: these are the key factors that were driving their decision. Once we’ve got those key factors, then we shift into the details. Which vendor had the best solution on each of those criteria? Which had the worst? Why? Now, going back to the donut chart, as I said, this chart is showing the key factors that influenced their losses.

Unnamed Speaker

But what if extended capabilities or data security was also influencing our wins? And that’s not unusual, right? These criteria are not binary. They’re not going to only influence one side or the other. They’re going to come up as both win and loss influences. And the question really is, where’s the preponderance? Are they more strongly influencing us to lose or to win? So a chart like this I like a lot. I like the donut chart. I think visually it’s very nice and easy to understand.

Unnamed Speaker

But for that reason, I felt like I really needed a better way to represent win-loss drivers. What you get with a chart like this is the loss drivers by prevalence only: how many lost opportunities they influenced. It’s not factoring in the influence that they had on won deals. And so it doesn’t give you a blended view of how they’re really impacting outcomes.

Unnamed Speaker

So if you were to produce a similar donut chart of wins alongside this donut chart of losses, you might very well find that you have one or two criteria that show up on both of those charts, which is kind of confusing. I think we think of win drivers and loss drivers as being distinct and separate, but they’re really not. They often blend across the spectrum. So that’s why we developed this quadrant.

Unnamed Speaker

This quadrant, I think, is one of the key outputs that we provide, whether we’re doing interviews or surveys. What we have here on the y-axis is prevalence, just as you saw with the donut chart: how many deals each of these was a key criterion in. And then we have win rate on the x-axis. So we’ve calculated for each of these criteria what their win rate was.

Unnamed Speaker

For the deals where each of these was a key criterion, what was the outcome? What are they more strongly associated with, winning or losing? So the ones on the right here are influencing wins, the ones on the left influencing losses. And in terms of the strongest criteria, the ones driving or steering the ship, it’s going to be the ones at the top: top right driving wins, top left driving losses. In this case, we’ve circled three that had a shared influence.

Unnamed Speaker

There was an interrelationship between time to value, implementation speed, and industry experience. So that’s another important, if subtle, point: you should not think of these criteria as operating in isolation. There can be interdependencies between them.

Unnamed Speaker

And as you analyze or think about them, it’s important to understand holistically, which an interview allows you to do, how they interact, as opposed to treating them as solo operators, let’s say, or independent influences on the outcomes. Any questions here, comments? All right, well, let’s jump into discussion. We have more slides; I think Mary distributed them, so there’s more there. If there’s one that you want to dive into or ask some questions about now, we can do that. I also got a few questions here.
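
The prevalence and win-rate axes of the quadrant can be made concrete with a small sketch. The deal records, tagging scheme, and criterion names below are hypothetical, invented for illustration; they are not taken from the portal shown in the session.

```python
from collections import defaultdict

# Hypothetical deals: (outcome, criteria tagged as key in that buyer's interview)
deals = [
    ("win",  {"time to value", "industry experience"}),
    ("loss", {"price", "extended capabilities"}),
    ("win",  {"implementation speed", "time to value"}),
    ("loss", {"price", "data security"}),
    ("loss", {"extended capabilities", "time to value"}),
]

# Tally wins and losses per criterion across all tagged deals
counts = defaultdict(lambda: {"win": 0, "loss": 0})
for outcome, criteria in deals:
    for criterion in criteria:
        counts[criterion][outcome] += 1

for criterion, tally in sorted(counts.items()):
    prevalence = tally["win"] + tally["loss"]  # y-axis: deals where it was key
    win_rate = tally["win"] / prevalence       # x-axis: share of those deals won
    print(f"{criterion}: prevalence={prevalence}, win rate={win_rate:.0%}")
```

A criterion with high prevalence and a low win rate would land in the top left of the quadrant (driving losses); high prevalence with a high win rate lands top right (driving wins).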

Unnamed Speaker

Should I jump into these, or does anybody have something you want to ask right now? All right, we’ll take number one: how to most effectively scale a win-loss program? I think one really important thing to consider with this question is that scaling implicitly means we’re going to get more data, more interviews, potentially more surveys. With interviews in particular, I very often get this question. I think we’re all trained to sniff out or focus on sample size.

Unnamed Speaker

What’s been found when this has been studied by professional researchers is that representativeness is much more important than sample size in the case of interviews. So in the case of win-loss, where we’re talking about your pipeline, you really couldn’t have a participant, an interview respondent, who’s more representative than a buyer who came out of your pipeline. So I would say that’s a highly representative sample.

Unnamed Speaker

And therefore, we don’t need to be as focused on, or emphasize, sample size as much in this case where we’re doing interviews. Now, another way that this has been assessed, a mathematical way, is with what’s called the binomial probability theorem. The idea is that an issue that is less common is going to take a larger sample to identify.

Unnamed Speaker

So with win-loss, I’ve talked a bit about prevalence: the number of times an issue has come up, a criterion, a weakness in the buying experience, whatever it may be. The more often we’ve heard it, the more confidence I think we can have that it’s a big issue. And it’s important, I think, for us to have a large enough sample to find the edge cases. With the binomial probability theorem, I’ll give you an example of what it’s saying.

Unnamed Speaker

One very common way that we size studies is to catch issues that come up in at least 20% of the interview pool. If you only wanted to find things that came up in 90% of the interview pool, you could have a much smaller sample. But we size studies to catch things that are occurring in just 20% of the interviews, for the reason I just said: we want a very representative sample, and we want to see how common these things are. So with 20% occurrence, to have a 95% chance of seeing those things, we only need nine interviews.

Unnamed Speaker

Nine interviews on the win side, nine interviews on the loss side. That gives us 18 in total, and we round up to 20. So for a baseline study, we typically recommend 20 interviews for that reason.
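
The binomial sizing logic can be sketched as below. Note this uses the simplest "at least one occurrence" framing and is illustrative only; the exact numbers a practitioner quotes (such as nine per side) depend on how the requirement is framed and rounded, so treat the function as a way to see the shape of the math rather than a reproduction of the speaker's sizing rule.

```python
import math

def min_sample(prevalence: float, confidence: float) -> int:
    """Smallest n such that an issue present in a `prevalence` fraction of
    deals appears at least once among n interviews with probability
    >= `confidence`:  1 - (1 - prevalence)**n >= confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

# Rarer issues require larger samples; very common ones surface almost immediately.
for p in (0.9, 0.5, 0.2):
    print(f"prevalence {p:.0%}: {min_sample(p, 0.95)} interviews")
```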

Unnamed Speaker

So that’s not really answering the question of how you scale up, but hopefully I’m giving you a little context for how to think about it. A better reason than sample size for scaling up, I think, is to include other use cases.

Unnamed Speaker

And this is pretty common: once buyer data, buyer feedback, has been seen, it’s like Kool-Aid, people get very excited. They say, you know, I’m interested in how buyers think about this or that. The use cases tend to grow, and that’s going to grow the amount of data and conversations that you need to have with your buyers to support all those use cases. So one way to handle that, to scale up, I think, is automation. We often use software that you already have to support that.

Unnamed Speaker

We’re able to use the CRM that you have in place. We’re able to automate with sales development or sales engagement software that you have. The pinch points that you’re going to find as you scale up, I think, are around the buyer outreach and around the analysis. If you think about the volume of data that’s generated in an interview, 10 to 14 pages of transcript, that’s a lot of data, and Gen AI has opened up the gates. There are many, many tools now coming out to support automating the analysis process.

Unnamed Speaker

So in terms of the outreach, there are built-in capabilities with most CRMs to handle this: to trigger either a task or an email directly to the buyer in your sales engagement software, in your Outreach or your Salesloft, and schedule the interview. On the analysis side, we can talk about that; I can show you some tools for that. There are quite a few great tools now to support the analysis. Another important way to scale up, I think, is a qual-quant or mixed-methods design.

Unnamed Speaker

Surveys are a more efficient way to gather a lot of data. What we typically do is use surveys to efficiently get key data about decision criteria and the competitors that were in the deal. And then as we identify issues, new concerns that are driving losses or what have you, we set up a smaller sample of qual interviews to dig in and understand what the issues were. Those are a few thoughts as far as scaling up. And then I see we got a few other questions.

Unnamed Speaker

Oh, Willem, I know you mentioned there were a lot of great tools out there to automate this. I know slide 23 had some examples of different software. I wasn’t sure if you could share what some of those tools were.

Unnamed Speaker

Sure. I turned off the filmstrip. So is this 23, or are we looking at a different one? Yeah, you know, actually, I’m going to stick with this one. So this is what I was describing before in terms of your CRM triggering an email. I mean, I’d be hard pressed, I think, to find a company that doesn’t have some sales engagement software, your Salesloft or your Outreach. As I said, these reports can be configured to automatically spit out a list of wins and losses from HubSpot or Salesforce,

Unnamed Speaker

triggering a task or an email directly to that buyer. It can alternatively generate a list of those deals and send it to you, if you want to review them before there’s any outreach. Then on the back side here, the analysis. One other minor point as far as summarization and transcription of the interviews: for a long time, we used an onshore transcription service. We’ve recently switched over to using Sonix. That’s not Gen AI, but it’s another branch of AI, and those models have gotten a lot better.

Unnamed Speaker

We had previously found, given the lexicon and accents that we were dealing with, that the AI transcription tools couldn’t handle it very well. But we did a test of about a half dozen tools, Rev and other transcription services, and Sonix was by far the best. That’s really becoming our go-to tool for transcription. Then, as I mentioned before, around the analysis, with Gen AI tools we’re typically integrating with Slack or Teams to distribute those summaries back out to the core stakeholders in your group.
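
The CRM-to-outreach trigger described here can be illustrated with a toy script. Every field name, stage label, and record below is hypothetical; in practice this logic lives in the CRM's own workflow automation or the sales engagement tool, not a standalone script.

```python
# Hypothetical closed-deal records, as might come from a CRM report export
deals = [
    {"company": "Acme",  "stage": "Closed Won",  "buyer_email": "pat@acme.example",  "ae": "dana"},
    {"company": "Beta",  "stage": "Closed Lost", "buyer_email": "lee@beta.example",  "ae": "sam"},
    {"company": "Gamma", "stage": "Negotiation", "buyer_email": "kim@gamma.example", "ae": "dana"},
]

def outreach_tasks(deals):
    """Turn newly closed deals into interview-outreach tasks attributed to the
    owning AE, so the invitation comes from the rep the buyer already knows."""
    tasks = []
    for deal in deals:
        if deal["stage"] in ("Closed Won", "Closed Lost"):  # skip open pipeline
            tasks.append({
                "from_ae": deal["ae"],
                "to": deal["buyer_email"],
                "subject": "Quick feedback on your recent evaluation?",
            })
    return tasks

for task in outreach_tasks(deals):
    print(task)
```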

Unnamed Speaker

Does that spark any questions for anybody as far as tooling?

Unnamed Speaker

This isn’t really a tooling- related question, but one thing that I’ve noticed in interviews that I’ve done is you can sometimes tell when the person you’re talking to is trying to be nice to you. They’re like, oh, well, you had great things and this, this, and this, and you feel like you’re not kind of… You want them to give you the hard truth, but you get the sense that they’re holding back because they want to be nice. Do you have any tactics for kind of getting around that?

Unnamed Speaker

I think just asking pointed questions. I find it’s not so much an issue for us as an independent that people are pulling their punches, but ask very direct questions. I described a process earlier by which we’re first talking with them, and I find it’s helpful to start with a little bit of the context, why they were switching, so on and so forth, and then just walk through their process, right? Talking about the consideration set.

Unnamed Speaker

And then, I mean, it’s just a very short way of putting it, but just guiding them through and bringing back to mind the experience that they had and the process that they went through. So from consideration set down to shortlist, down to a winner, and who they eliminated as they went along. In a loss, obviously, we got eliminated somewhere along the way. Either way, we’re going to talk about the losses; I’m asking about the losses. And I want to understand, you know, what were the key factors, and why?

Unnamed Speaker

And I find people have an easier time describing it to me that way: it was this or that, you know? They can give me a couple of very pithy key reasons why vendors were eliminated. And then drilling down. So it’s an iterative process that I use, John, where I’m iteratively getting to the details, rather than just jumping to why did we win, why did we lose. I walk through it and bring it back into their mind. They’re, you know, reliving it a little bit.

Unnamed Speaker

And then iteratively drilling into that, rather than asking very hard questions right on the face of it. So, you know, I don’t know how you do it, but that’s how I do it. And I’ve found that to work really well, as I said, in terms of bringing it back and making it salient in their minds again.

Unnamed Speaker

Yeah, no, I think that makes sense. And, yeah, your point about having them walk you through why they got rid of one competitor or another, or even you, right? I think you pick up on trends and what their priorities are, even if they aren’t explicitly stating things to you, or they’re trying to hold back a little bit. So that makes sense. Thank you.

Unnamed Speaker

Yeah, and I think, you know, by walking through it like that, you get them to a place where there aren’t a lot of options to pull back, right? Because we started out talking about the vendors that they eliminated. And I wouldn’t necessarily ask about us directly, right? I mean, there are typically going to be whatever, two, three, four other vendors that were eliminated.

Unnamed Speaker

So why were those groups, why was that set of vendors eliminated from the consideration as you got down to your shortlist or your winner? And so it’s not controversial. It’s not really in their face. And it doesn’t give them much space, I guess, to sort of back out of it in a way.

Unnamed Speaker

I’m trying to just be very factual. And I mean, I do start out, as I mentioned before, you end up with about 25 minutes, and part of that is because I have some disclaimers at the get-go. And one of those is simply asking them to be as forthright as possible, telling them I’m going to ask stupid questions. I think that’s a really important thing as an interviewer: to question your own assumptions.

Unnamed Speaker

And so when you hear things, ask follow-up questions, ask things that might seem obvious on the surface. Well, yeah, we’re a SaaS, we all accept SaaS as a good thing, so why would I question why somebody said that choosing a SaaS was important to them? We actually found subtleties that were highly valuable to one particular client: buyers said SaaS, but what they really meant was in the cloud.

Unnamed Speaker

And so people were blending cloud-hosted, PaaS, SaaS, and that provided an opportunity for competitors who didn’t have a SaaS to fudge it. So I think asking the stupid questions, asking follow-up questions, is super important. Yeah, thanks. Anyone else?

Unnamed Speaker

I just had a quick question, and I’m not sure if we’ll cover this later, but you talked briefly about surveys. I was just interested to hear your perspective on the structure of those surveys. How in-depth are they? To guarantee some sort of participation, are we talking short, sharp, a couple of questions, or are they a little more in-depth?

Unnamed Speaker

Yeah, you know, it depends on how you design the study. For us, what we’re typically doing is keeping them really short. I think that is a best practice: you try to keep the survey under 10 minutes. We target five minutes in our common configuration. And so we’re just asking about very simple things. One client that we’re doing this with had some issues with the buying experience, so we ask people to rate their satisfaction around responsiveness, the demo or any POC that they did, leadership involvement.

Unnamed Speaker

So there’s just a simple seven-point scale, and we ask them to rate us on those things so we can track the trend. And I think that’s a really valuable thing that surveys allow you to do: because they’re so efficient, you can track trends over time. In that case, we knew from the interviews that those were problem areas. And so the survey allowed us to see, are we getting better, right? Are the changes that we’ve made making things any better?

Unnamed Speaker

But then we'll ask about the consideration set, and we'll have a list of 10 or 15 vendors, same thing, you know, who did they shortlist? As I described with the interviews, we're kind of walking them through that process. And then we ask about criteria: which were the most important criteria they used as they went from a larger consideration set down to a shortlist, or to choosing a winner. Some open-ended questions, but not a lot. Again, I think time is the concern.

Unnamed Speaker

And so when we find something popping out, you know, a new issue as we're trending these things, then we're able to very tactically set up five or 10 interviews to understand what's going on there. Does that answer the question?

Unnamed Speaker

Yeah, thank you.

Unnamed Speaker

My pleasure. Okay, let's see what else we had. Optimizing response rate. You know, I find the best practices here are just like any kind of outreach that you may do in marketing or in sales development, right? So you've got to leverage your strengths, you've got to find an angle to make this interesting and appealing to those buyers. One strength that you have, one lever you have, particularly with the late-stage loss and certainly with the win, is the relationship of your rep or your AE with that buyer. So you want the email to come from that rep.

Unnamed Speaker

Now, none of these are, you know, gonna immediately lead to a slam dunk, but this could be a difference of 10 points, let's say, in your response rate, having the AE send it versus, let's say, our CRO or our CMO. So in total, you know, these things can really start to make a difference. You want to make that email as short as possible. And, you know, everyone's getting a lot of these requests. It's not just the credit card company and the bank anymore. Win-loss has grown, as I started out saying.

Unnamed Speaker

I think, you know, many people now get these requests for feedback and they can all sort of blur together. So it helps to say why it's important, you know, for your company, for you, how it's going to be used. The counterpoint to that is you want to keep it as short as possible. You want to make it easy for them, right? So we use Calendly, you know, use some kind of link to easily schedule the call rather than going back and forth. With losses, we do offer an honorarium, between $100 and $200.

Unnamed Speaker

The rule of thumb there is you want to offer two to three times what you'd expect their hourly salary would be, just to make it as eye-catching as possible. But again, I've had clients who sort of expected, well, we'll just raise the incentive and we'll get more, but it is not that simple. I mean, if it was, you know, everybody would just offer 500 bucks and be done. It really isn't. And very often, you know, their time is much more valuable to people than the money, at least for the ones that we're interviewing.

Unnamed Speaker

So those are some thoughts as far as response rates. I mean, the last thing I would say on that is just, as a best practice, you know, iterate. So try different things: try different subject lines, different body content, and follow up. I find about 50% of these respondents will respond to just the first email. The other 50%, in round numbers, require a call. They're not gonna answer the call.

Unnamed Speaker

I'd say maybe 10% or fewer will answer, but they'll see that you called and it triggers them to respond in some way, to send an email back, to click on the link and schedule the call. So it is important to follow up by email, by call, or both. Increasing participation, I think that's about the same question. Unless, I don't know, if the person that asked that is here, is there a subtlety to it that I'm missing?

Unnamed Speaker

He was on the line before, I’m not sure anymore.

Unnamed Speaker

All right. And then extracting more from interviews. You know, I think we're all familiar with best practices as far as how we're gonna ask questions in an interview. You know, why questions, open-ended questions, those sorts of things are really valuable, good practices as far as interviewing. I think maybe what's more valuable, in my experience, is to focus on what you're gonna ask.

Unnamed Speaker

Because again, you've got that 25 minutes, so align what you're gonna ask with your goal. So being really clear about what our goal is and using that to fine-tune the discussion guide to get just the data that you need, and to provide enough room for you to ask follow-up questions. I think there's so much that we'd all like to understand and know from these buyers, and it's easy to get pulled off track.

Unnamed Speaker

And this is probably one of the hardest things I find with interviewing: it's easy to get into details, to start asking questions that are not really directly related to your goal. And so you need to keep that goal in mind always, and decide as you're interviewing somebody, do I follow up on this or do I keep moving along? Because there are gonna be very interesting things that come up, but you wanna have consistency in your data set from one interview to the next.

Unnamed Speaker

And so for that reason, it is important to hit on most of the same points, but you wanna be fluid, follow up, and provide room to learn. I mean, there can often be things that you never anticipated that come up, and you wanna have some space to dig into those.

Unnamed Speaker

So those are some thoughts as far as interviewing, just being comfortable asking the stupid question. It can feel uncomfortable, and I actually started adding, as I mentioned, that disclaimer about asking stupid questions because I found that some buyers would sort of look at me sideways like I was a lunatic, or an idiot, maybe, is the better word, to be asking, you know, why SaaS or what have you. But I explain to them that I assume nothing. Now, my only assumption is that I know nothing, and I find that that actually unloads it.

Unnamed Speaker

You know, it takes away the sort of discomfort that can come from asking these sort of goofy questions, and they can be very, very revealing, very valuable. Anyone else, anything I’ve missed?

Unnamed Speaker

This isn't directly related to any of these discussion points, but I'm curious to get your thoughts on what an appropriate time commitment to win-loss would be, either on a monthly basis or quarterly basis or whatever. Just because, at least speaking for myself, but I think probably for others as well, we're at smaller companies, right, where time and resources are constrained, and win-loss is just a sliver of what any one of us is doing.

Unnamed Speaker

So, like, you know, what's an appropriate amount of time to be dedicating to this to, you know, make sure we're seeing value without kind of over-indexing on it and neglecting other things?

Unnamed Speaker

You know, Gartner, in that market guide that I mentioned, estimated that an in-house win-loss program can require a quarter of an FTE's time. If one person is managing that program, across a whole year it can be a quarter of that person's time. So I think, you know, these are reasons why you look to surveys as a complement, why they add so much value as a more efficient way of getting data, and then use interviews more tactically or maybe more strategically, because interviews are very time-consuming, right?

Unnamed Speaker

Yep.

Unnamed Speaker

And John, we do a quarterly look-back, right, just a postmortem looking at the data for that quarter. So we produce a summary every quarter, but the interviews are ongoing, so whenever they come in, you know, I look at them on a real-time basis. But we will only summarize and have anything formal on a quarterly basis. There's just not more time, and, depending on how many responses you're getting, you may not see a shift within a quarter. So that's what we do.

Unnamed Speaker

Yeah, that’s a solid approach.

Unnamed Speaker

That's a good point, Kirsten. I think it's, you know, thinking about the outputs, right? I mean, I find the heaviest lift is around the buyer outreach, and then the analysis and reporting at the back end. On the analysis and reporting, one possible approach for managing the amount of time is to back off and report less frequently, semi-annually, for example. And, you know, obviously efficiency is one reason to do that.

Unnamed Speaker

Another reason is you do need to be conscious and align your share-outs with the ability of the downstream teams to absorb this feedback, you know, and act on it. And if you're coming back to them quarterly with new data, it may be too much, right? They may not really be able to absorb it and act on it, as opposed to getting it, beginning to absorb it, making some changes, and then seeing what happened. Are we getting any better before I give you more data and more potential changes that we need to make?

Unnamed Speaker

Well, I think one of the key things you said that's important is having the hypothesis, looking at the trends in the data coming in, and then testing on an ongoing basis, because the why might be different. And so that for me is a key takeaway: we might have that hypothesis, test it. Does it change over time? And then there might be something new we have to test. So that iteration on an ongoing basis around, you know, what we're looking for, either to confirm, right?

Unnamed Speaker

Or dispel is something we have to review on an ongoing basis.

Unnamed Speaker

Yep, absolutely. You can make changes. This is the great thing that win-loss as a continuous program allows you to do: to see, you know, are the changes that we made having an impact? Are buyers even noticing them? Those sorts of things. And obviously, you know, we live in highly dynamic environments. We make changes to our go-to-market, to our product. All our competitors are doing the same things. So you can't stop and make win-loss a once-a-year type thing. And in my earlier comment, I wasn't advocating that.

Unnamed Speaker

I think you do want to collect data continuously and share the summaries. But in terms of a full-blown analysis, and trying to manage the level of work to do that, it can be useful to back off the frequency at which you analyze and share out reports.

Unnamed Speaker

Thanks, Willem. Well, we have a little over five minutes. Were there any other questions that you guys wanted to dig into, whether it's for Willem or for each other on the call?

Unnamed Speaker

I had a question. When I was reading the slides before, there was one slide that talked about setting a goal for a win-loss program that was kind of above and beyond just having a win-loss program. And I think that's where we are at the moment. It's like, let's establish a win-loss program.

Unnamed Speaker

And I think I’d be really interested to hear from other people about the kind of, I guess, the goals that people are setting for themselves over and above that to kind of help me understand how we shape what our goal needs to be.

Unnamed Speaker

So I'll jump in and say something. I think this is really one of the most fundamental, critical things that you can do: being clear about your goal and what the problem is. What are we trying to improve? It affects the program from beginning to end. Obviously, trying to understand no-decisions or early-stage losses, and maybe it's not obvious, but that's a very different recruit and a very different interview than late-stage win-loss, which is what I think most people typically think of, right?

Unnamed Speaker

So with late-stage win-loss, we're going to intercede with losses at stage four or five in your sales process, and with your wins. With no-decisions, we're talking maybe qualification or discovery, stages that are much earlier in the process. You can't mix those two things together. They've had different experiences with you. They've had different experiences with your competitors. The interviews are totally different, right? And in the case of a no-decision, we're going to talk more about the early stages of that life cycle, right?

Unnamed Speaker

Why didn’t it get funded, right? Who was most influential in not funding it or making that decision? How large was the buying team? Who was on the buying team? Which roles were on the buying team? What was the trigger? What was the status quo solution? What were your initial impressions of us and other vendors? Whereas late stage, obviously, they’ve gone through potentially a POC, definitely demos. They have a much more in- depth assessment of your offering, and that’s what you want to talk with them about when you’re doing late stage.

Unnamed Speaker

We're trying to optimize and improve based on their more detailed assessments of what we're doing in go-to-market and product. Does that make sense, Victoria?

Unnamed Speaker

Absolutely. Thank you.

Unnamed Speaker

Unless there are any other questions, Willem, thanks for the time and the insight. I appreciate all the attendees and the engaged questions. This session has been recorded. We'll post it on Delta V Flight Plan, our portal. Let us know if you have any follow-up questions. I'd be happy to send around the recording or facilitate one-on-one time with Willem, but again, appreciate everyone making time today.

Unnamed Speaker

Thanks, everyone. Appreciate it. Thank you.

Unnamed Speaker

Thanks, everybody, for joining.


Recap

  1. Utilize Automation Tools: Sales engagement software and automation tools are crucial for streamlining buyer outreach and analysis processes, enhancing efficiency as you scale up.
  2. Manage Data Volume Effectively: With the abundance of data generated from interviews, leverage AI-powered analysis tools to automate and streamline the analysis process efficiently.
  3. Qual Quant Approach: Adopt a mixed methods design combining quantitative surveys with qualitative interviews to efficiently gather data and gain deeper insights into decision-making processes.
  4. Tailor Outreach Efforts: Personalize outreach efforts by leveraging relationships between sales representatives and buyers, keeping emails concise and focused on why feedback is important.
  5. Optimize Response Rates: Offer incentives for participation, keep survey lengths short, and utilize tools like Calendly for easy scheduling to increase response rates.
  6. Continuous Improvement: Implement a continuous feedback loop, iterating on outreach strategies, survey design, and interview processes to improve participation rates and data quality over time.
  7. Clear Goal Setting: Define clear goals for the Win-Loss analysis program beyond just its establishment, aligning with broader organizational objectives to drive strategic insights and improvements.
  8. Segment Interviews Strategically: Segment interviews based on the stage of the buyer’s journey, tailoring questions and approaches to gather relevant insights effectively.
  9. Efficient Time Management: Allocate resources effectively by balancing the time commitment needed for Win-Loss analysis with other organizational priorities, considering both the frequency of analysis and reporting.
  10. Stay Goal-Oriented: Maintain a clear focus on the program’s goals throughout the process, aligning interview questions, analysis methods, and reporting frequency with the overarching objectives of the initiative.
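The outreach numbers Willem cites (roughly half of eventual respondents reply to the first email alone, while the rest require a follow-up call) can be turned into a rough capacity-planning sketch for your own program. This is purely illustrative: the function name and the 25% participation rate are assumptions you would replace with your own historical benchmarks, not figures from the talk.

```python
# Illustrative sketch only: estimate how many completed interviews an
# outreach wave might yield, using the rough split Willem mentions
# (~50% of eventual respondents reply to the first email; the rest
# come in only after a follow-up call).

def estimate_interviews(contacts: int, participation_rate: float) -> dict:
    """Rough yield model. `participation_rate` is the overall share of
    contacted buyers who ultimately schedule a call -- an assumed input,
    to be replaced with your own win/loss benchmark."""
    respondents = round(contacts * participation_rate)
    from_first_email = round(respondents * 0.5)      # ~50% reply to email alone
    from_follow_up = respondents - from_first_email  # remainder need a call
    return {
        "respondents": respondents,
        "from_first_email": from_first_email,
        "from_follow_up_calls": from_follow_up,
    }

# Example: 40 late-stage losses contacted at an assumed 25% participation rate.
print(estimate_interviews(40, 0.25))
# → {'respondents': 10, 'from_first_email': 5, 'from_follow_up_calls': 5}
```

Even a crude model like this makes the staffing trade-off concrete: doubling the incentive won't double the yield (as Willem notes), but adding a call step to the cadence roughly doubles the respondents you actually capture.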

Slides

Conducting Win/Loss Analysis

Willem Maas is the Founder and President of Growth Velocity Inc., a rigorous, tech-enabled Win/Loss Analysis consultancy for complex sales. He’s led dozens of win/loss analysis projects and debriefed more than 500 win/loss buyers for companies including Reltio, ServiceChannel, and Lightbend. In this guide, he walks through building a Win/Loss Analysis program, including outreach methods to increase participation rates, conducting illuminating interviews, and producing actionable analysis.
Samantha Torres is the Chief Data Officer of The Gray Dot Company, an SEO and Data Consulting Company. Samantha has more than 10 years of SEO experience optimizing SEO at 90+ B2B and B2C companies. In this guide, Samantha walks through how to leverage AI to improve your SEO and content strategy, and the implications of generative AI tools on SEO.