Producing Successful Customers at Scale with Dan Ennis from Monday.com
In a time where software is eating the world and companies are frantically trying to figure out how to grow bigger and scale faster, it’s easy to forget that at the core of every business are people like you and me trying their best to help other people. But people don’t scale. We heard those exact words in our last episode with Maranda Dziekonski. How do you ensure that the people are still being successful even as the business explodes?
Today’s guest, Dan Ennis, is the Scale Team Manager on the customer success team at monday.com. One of Dan’s favorite things about monday.com is the data-driven culture. But we know that data is not the same thing as people.
If you enjoy our discussion, check out more episodes of the Happy Customers podcast. You can subscribe on Apple Podcasts, stream on Spotify, or grab the RSS feed in your player of choice. What follows is a lightly edited transcript of the episode.
Why Segment Customers Between High-Touch and Low-Touch Onboarding Models
Let’s start by understanding the specific customer segment that Dan works with and how the approach to customer success for that segment differs from everybody else at monday.com.
Dan Ennis (01:47): I know every company does scale differently, so at the highest level for us, scale is looking at how to achieve customer success goals: growth, accomplishment, good health, NDR (Net Dollar Retention), and everything that comes with customer success, at a larger scale of accounts. How do we leverage more tech-touch? How do we leverage more one-to-many? How do we do that so that we can have more strategic human intervention, knowing that our CSMs probably manage a very similar overall ARR (Annual Recurring Revenue) to higher-touch CSMs, just spread across more accounts rather than fewer? Knowing that we can’t have as hands-on a relationship with customers as the higher-touch side, what does it look like to achieve similar goals?
Stuart Balcombe (02:30): We know there’s a whole spectrum of engagement models from self-serve to the highest touch motions, typically reserved for enterprise customers. But can an account change their position on that spectrum over time?
Dan Ennis (02:42): Yeah, it all depends on company size, really. Our goal is for them to move through the segment. That’s the ideal flow: they come in, and as part of land and expand, they grow through it.
We’re one of the segments where we love to see when a customer leaves our segment. That’s the goal, and that’s the pipeline that we hope for with customers.
Although of course, if it’s a smaller company that’s forever going to be in that segment, that’s something that we encounter as well.
Stuart Balcombe (03:06): How is the team structured around that goal? Acknowledging that this operates a little differently than how you work with lower-touch and much higher-touch customers, how do you structure the team around helping customers be successful in this segment?
Dan Ennis (03:21): Yeah, so on the team that we’re on, we do some proactive engagement around their life cycle and what they need based on where they are in their journey with monday.com: their maturity, all those different components. Then there’s a little bit of really data-driven reactivity.
From the customer side, it might seem like it’s proactive because we’re reaching out to them, but it’s reactive in the sense that it’s related to things that are triggers that we are seeing in their data. It’s just really sophisticated data-driven reactivity.
Really, the team is structured around both, because we know that for some accounts, these proactive touchpoints are really all they need.
Dan Ennis (04:08): They’re healthy. They’re self-sufficient. Our couple-of-times-a-year proactive touchpoints, plus our one-to-many outreach and everything we do at that level, are all they’ll need. For other customers, we can tell from the data triggers that they need more. We’ve got to have some more human intervention to help achieve the goals that we have with them, which are primarily around health and retention. We’ll work with customers when we see those triggers come in. From their perspective, I’m sure it feels proactive, but on our end, it feels reactive because we’re reacting to triggers that we’re seeing.
Stuart Balcombe (04:39): This is really interesting to me and a theme we will continue to unpack throughout the episode: how does Dan take tactics that may be more common in a self-serve or higher-volume segment and combine them with some you might not expect to see in a self-serve segment at all? I wanted to know what data is used to prioritize these actions and what triggers they use as the jumping-off point for engagement.
How to Use Data to Prioritize Customer Engagement
Dan Ennis (05:03): For us, a lot of it looks like reverse engineering successful and unsuccessful customers. We know customers that have churned, and we know customers that have grown, so we can plot back against that, looking at milestones, health triggers, different usage trends, adoption trends, all those different kinds of things. We try to make everything really data-driven. That’s one of Monday’s core values, and one of the things that I’ve loved at monday.com is that they live that value. We try to map what was successful: what did successful customers look like? So we can say, okay, this is what we want to see with people on that path. And customers that churned, customers that weren’t successful, what did that look like? So we know what to look out for as warning signs.
Then we really use a lot of that data as our north star when we’re testing, following almost a scientific model: we think this is what being successful looks like, so let’s compare it against the accounts that we know are successful. There is some validation in that, where we’re able to validate our hypotheses against customers that we know have been successful. That’s really what a lot of it looks like. It tends to be very product-driven as far as how they’re using the product and how they’re adopting it. A lot of that goes into the specific triggers that we look at.
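The reverse engineering Dan describes can be sketched with a tiny Python example. This is a hypothetical illustration, not monday.com’s actual tooling: every feature name and account record below is invented, and the point is only the comparison of adoption rates between retained and churned accounts.

```python
# Hypothetical sketch: compare feature adoption between retained and
# churned accounts to find "sticky" features worth driving adoption on.
# All feature names and account records are invented for illustration.

def adoption_rate(accounts, feature):
    """Fraction of accounts that used the given feature."""
    used = sum(1 for a in accounts if feature in a["features_used"])
    return used / len(accounts)

retained = [
    {"id": 1, "features_used": {"automations", "dashboards"}},
    {"id": 2, "features_used": {"automations"}},
    {"id": 3, "features_used": {"automations", "integrations"}},
]
churned = [
    {"id": 4, "features_used": {"dashboards"}},
    {"id": 5, "features_used": set()},
]

# A large positive gap suggests a feature correlated with retention
# (correlation, not causation, as Dan notes).
for feature in ("automations", "dashboards"):
    gap = adoption_rate(retained, feature) - adoption_rate(churned, feature)
    print(f"{feature}: adoption gap (retained minus churned) = {gap:+.2f}")
```

In practice this comparison would run over warehouse data rather than hard-coded lists, but the shape of the question is the same: which behaviors separate the accounts that grew from the accounts that left?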
Stuart Balcombe (06:22): Yeah. I’d love to hear more, because it sounds like a lot of these things don’t typically fall under the success function, or at other companies may not fall under the success function. How much cross-team collaboration is there here, versus this being a project that is the responsibility of your team, owned soup to nuts?
Collaborating Across Teams to Create Successful Customers
Dan Ennis (06:40): We have really great teams that we collaborate with, whether it’s business operations teams that help us define and use our data analytics and business intelligence tools. They train us to do some of that on our own with the tools we have, and they also handle more advanced queries and pull reports for us to look at. We’ve also worked with sales teams to get data on accounts: from our end, we can just see expansion, which is great and worth knowing, but which accounts were tough sells versus ones that just got it, grew with the product really quickly, and were a natural fit rather than an uphill battle?
We can get some of that validation from our sales team. We work with the marketing team to look at the types of customers that are opening what we send: what are the open rates on those communications, so we can get better data on what we send out? There is a lot of collaboration, while also having a lot of autonomy around building out what we’re doing.
We’ve got a fantastic leader who manages the team, and she does a great job with helping us build as much as we can autonomously while also being really aware of what other teams in the org can bring to the table and really how we can help be a part of their success and them be a part of our success.
Dan Ennis (08:11): It’s certainly not a silo, certainly not an island. There are times when it feels a little more like that: sometimes you’ve just got your head down, trying to figure out the process. What is the goal that we want to achieve within the bigger goals of monday.com? And from there, how do we partner with what other teams are already doing in those areas?
Stuart Balcombe (08:30): Right. Yeah. That makes a ton of sense. I know that you mentioned aligning on goals. How do you measure progress? How do you make sure that you are lining up your individual team goals, with the full CS function goal, with ultimately the company goal above that?
Aligning on Goals Across the Organization
Dan Ennis (08:47): Yeah. That’s where we just have really great leaders. I can’t say that enough. We’ve got leaders that are in communication with each other. We’ve got great communication from the top all the way down. There are leaders at the different functions who own the communication of what their goals are. As you can imagine, when we’re in hyper-growth, that becomes more of a challenge, but we’ve got leaders that are really committed to it because we know that, again, as an ethos, as a company, a rising tide lifts all boats.
Being able to make sure that our goals are in alignment with what other teams are doing, or being aware of what other teams are working on, helps us make sure that we’re not working in counter directions.
Dan Ennis (09:30): An easy example would be if one of our goals were something customer-marketing related, because marketing may have historically been focused on new business, top of funnel. Then we find out that marketing’s been working on a big initiative for customer marketing: great. We don’t need to own that. That’s something that we don’t want to have to do. We would rather let other teams do it if that’s where their goals are. We don’t want to be working in a silo. A lot of it comes from that high level of communication, which again is a culture that’s been hard-fought to build and maintain because it doesn’t come by accident.
Stuart Balcombe (10:09): Especially as you grow.
Dan Ennis (10:13): 100%. Which is a big part of the process, even as we’re growing: making sure that we’re always bringing in people that will be culture adds, who contribute to and grow that type of culture of sharing, learning, and growing, making sure we’re all moving in the same direction.
Stuart Balcombe (10:30): Totally. On that theme of growth and operating in a hyper-growth environment, what are some things that used to work that you’ve had to iterate on and change tack on for the scale and the goals that you have now?
Dan Ennis (10:46): One example would be the office hours that we used to do with customers, which used to be more product-focused, around features. As part of our scale segment, we would have feature-related office hours where we would present a little bit and then have a more open conversation with customers, truly office-hours style: bring your question, let’s talk, let’s have an open forum to work through this and answer any questions you have, in addition to a proactive presentation.
We’ve found that as we’ve grown and scaled as a company, and as our customers have gotten more sophisticated, and therefore our support structure has, in a really good way, gotten more sophisticated at providing that, we just didn’t need those sessions as much anymore. Instead, a more strategic angle that we’ve been working with is industry-vertical office hours. It’s less frequent, but a more strategic focus where we’re bringing customers together around their vertical rather than around a component of the product.
That’s been something that’s been really helpful as we’ve grown and scaled because, frankly, at our current scale, those office hours wouldn’t have been able to accomplish what we needed. Our support team responded and built out great resources that have taken the place of what we did before, because that wasn’t going to scale as we grew our customer base and brought in more customers. So we’ve now landed on this other way of doing it that’s much more strategic and, from my perspective, brings a lot more value to the customers. It brings a lot more of that human value, complementing what our support and customer experience teams were able to build out.
How to Allocate Budget and Resources For Customer Success
Stuart Balcombe (12:28): What does this look like in practice? How does Dan think about allocating resources to reach these goals? What goes in the proactive bucket versus the reactive bucket, and how is time allocated to each?
Dan Ennis (12:40): Yeah, absolutely. We know that we won’t have the level of relationship that exists on the higher-touch side. Knowing that, we want to reserve our human interactions for the times that are most strategic and impactful: just-in-time human interaction. Part of that is proactive and lifecycle-based. We factor it around where the customer is in the lifecycle, and how we allocate resources is, frankly, just digging into the data. Where are our customers? Where are they in the contract lifecycle? Where are they in the product lifecycle? What do we think they’ll need based on what we’ve built out around that? Then those are the resources that we plan, and we know this because these are fixed data points happening within a given timeframe.
Even if the timing shifts a little bit, give or take, these are the accounts falling into this portion of their life cycle in this quarter. Those are pretty fixed data points, which makes for really easy planning on the proactive side. Then we also plan out what our one-to-many looks like. We try to do that proactively: what are the campaigns that we can launch? What are the one-to-many office hours that we can do in a given quarter? We plan a lot of that, whether it’s the one-to-many initiatives or the one-to-one, more account-driven initiatives, by digging into the data to see where we need them. Then similarly, and I’m going to be a broken record when I talk about our data because I’m apparently a data nerd, we look at the data for the reactive side too.
Okay, how many accounts did we work with that experienced these different triggers in the past year? Once we’ve identified the triggers we want to look for, on average, obviously there’s no way to guarantee and fully predict it, but what does it look like to dig into the accounts that we have, see how many needed that intervention, and then plan our people’s time accordingly?
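The capacity planning Dan outlines can be expressed as a short back-of-the-envelope calculation. This is a hedged sketch, not monday.com’s model: the trigger names, yearly counts, and per-intervention hours below are all invented assumptions to show the shape of the math.

```python
# Illustrative capacity planning: count how many accounts hit each
# trigger historically, multiply by an assumed handling time, and size
# the team's reactive hours per quarter. All numbers are invented.

historical_triggers_per_year = {
    "usage_drop": 120,            # accounts whose weekly usage fell sharply
    "seat_underutilization": 80,  # accounts using few of their purchased seats
    "renewal_risk": 40,           # accounts flagged ahead of renewal
}
hours_per_intervention = {
    "usage_drop": 1.5,
    "seat_underutilization": 1.0,
    "renewal_risk": 3.0,
}

def reactive_hours_per_quarter(trigger_counts, effort):
    """Expected human-intervention hours per quarter from yearly counts."""
    yearly = sum(trigger_counts[t] * effort[t] for t in trigger_counts)
    return yearly / 4

# 180 + 80 + 120 = 380 yearly hours, so 95.0 hours per quarter.
print(reactive_hours_per_quarter(historical_triggers_per_year,
                                 hours_per_intervention))
```

As Dan says, this is an average, not a guarantee; the value is in planning headcount against historical trigger volume rather than gut feel.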
How to Use Data to Validate Decisions
Stuart Balcombe (14:36): I know we’ve talked about data throughout. What are the tools that you are living in every day that give you that data? Or is it a ‘we are going and running queries in SQL against Postgres or whatever’ situation? How do you access that data?
Dan Ennis (14:49): Absolutely. Salesforce reporting and our business intelligence tool. We use Looker, but frankly, any business intelligence tool would help you as long as it’s a robust one. We dig into our business intelligence tool for questions like “trending over time, how is usage of the product?” and “trending over time, how are they using the seats that they’ve purchased?” Salesforce is fantastic for tracking things like expansion: how is that looking over time? How is the number of seats in their plan growing over time, and when are they renewing? We combine the two and get a lot of our data through those.
A really robust business intelligence tool is something that I think is non-negotiable, and it has been super helpful for doing this at scale, so that we’re not just saying, “I feel this or I think this,” and trying it out. We’re saying, “I think or I feel this,” and testing it against data before we go out and do it at scale with our customers.
Stuart Balcombe (15:53): Yeah. You mentioned this earlier, being able to validate some of these thoughts with data and then ultimately with customers. How do you think about the cycles of ‘we thought this, we saw this trend in the data, and we did something’? How do you close the loop on whether what you did actually helped move the number or improved the experience for that group?
Dan Ennis (16:16): Yeah. There’s a lot of trial and error as far as iterating and checking against data. But there are two ways to think about it for me. One is the proactive gut validation: ‘We think feature X is really sticky. This is one that we think keeps customers.’ Let’s look at customers that grew and validate that. Does the data show that we’re right, that they did use this feature a lot? There are times when you find out, yep, we’re correct: those that churned did not use this. Even if there’s not causation, we can say there’s a correlation worth investing our energy into, trying to help customers use something that we know has made other customers more successful. Sometimes we get the negative validation, and we say, ‘Wow, this feature seemed really sticky. Sure, there are some that really use it, but a lot of our successful customers don’t.’ That doesn’t mean it’s not a great, helpful feature for those that do, but it’s not one worth going out of our way to drive adoption on. That’s one way: the proactive validation of ideas.
Then the second would be looking at the results of our impact. Part of that is thinking strategically, ‘Are we looking to have a long-term change on customer behavior? Or are we trying to have a short-term change?’ When we take an action, we try to look at what is the desired outcome, because it’s easy to just get into the habit of, okay, we want to do something with customers. This is what I’m doing. When in reality, it’s helpful before taking those actions with customers to say, ‘what is the goal that I want to see come out of this?’ Is it that I want to see them create more boards? Is it that I want to see them increase their user count? What is the goal for this particular action? Then is that a short-term goal where we should expect to see this impact quickly? Or is this a longer-term goal over a longer-term campaign?
We define a lot of those criteria so that we can then go back and look at the data and say, okay, this worked for these accounts and didn’t for these. Is there anything in common between these accounts? Is there anything we can glean from that to do this better next time? Or is it just that this is effective with some accounts and not with others, and worth doing at a broader scale? There’s a little bit of gray in there, but it all depends on what we’re trying to do.
That’s one of the things that I think comes out a lot is “Start with the end.” What’s the goal that we have in mind for this so that we can measure the success, so that we can measure the effectiveness, and then know what to iterate on.
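Dan’s “start with the end” loop can be sketched as a small Python example. The metric, threshold, and account data below are invented assumptions; the point is defining the target outcome up front so each account’s result can be checked afterwards, exactly as he describes.

```python
# Hedged sketch of "start with the end": define the target metric for an
# outreach campaign up front, then check it per account after the fact.
# Metric names, thresholds, and data are illustrative assumptions.

def campaign_results(accounts, metric, min_lift):
    """Split accounts into those where the metric moved enough and those
    where it didn't, so common patterns can be compared afterwards."""
    worked, didnt = [], []
    for a in accounts:
        lift = a["after"][metric] - a["before"][metric]
        (worked if lift >= min_lift else didnt).append(a["id"])
    return worked, didnt

accounts = [
    {"id": "acme", "before": {"boards": 10}, "after": {"boards": 18}},
    {"id": "globex", "before": {"boards": 12}, "after": {"boards": 13}},
]

# Goal defined before the campaign: each account creates at least 5 more boards.
print(campaign_results(accounts, metric="boards", min_lift=5))
```

The two output lists are the raw material for the next iteration: what do the accounts where it worked have in common, and what do the others?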
The Top Customer Success Metrics that Matter to Monday.com
Stuart Balcombe (18:48): Right. What is the metric that defines success for you and your team? What is the number that you report back up to the customer success team lead and ultimately out to the rest of the company?
Dan Ennis (18:59): Yeah, like most teams, it’s really twofold. It’s health/adoption, for which we’ve got pretty specific internal numbers, and then, like all CS teams, net dollar retention. Rolling it up, those two are the high-level ones.
Stuart Balcombe (19:17): One conversation that’s bubbling up in CS right now is how to measure and report on the business impact of making customers successful through NRR (Net Revenue Retention). I was curious to know how Dan and monday.com think about the difference between dollars added by sales in the initial sale and dollars added by success through account expansion and the recovery of unhealthy accounts.
Dan Ennis (19:39): Yeah. We do look at that. There are a couple of different components to it. One is we try to celebrate wins when they happen and accounts do expand. On the scale side, we don’t touch every account as closely, so it’s a little harder to see. But we own retention, and part of owning retention is owning net dollar retention. That’s why, on our side, I think health comes out a little stronger initially, when we are trying to measure “Did we help turn health around?” We know that it’s significantly harder for sales to sell into and expand an account that’s not healthy in the first place. If it’s new business, that’s one thing: if you’re closing a big deal, they might not have experience with the platform yet. But if they’re currently using the platform and in really poor health because they’re not using it, or not enjoying it because they already have frustrations, or fill in the blank, it’s a significantly harder battle for sales to turn that around.
A leading indicator of expansion is just helping get their health turned around, helping to work with them to get them into a healthy spot with their adoption because we want to see them accomplishing their goals because we care about seeing our customers be successful.
A lot of times that looks like expanding, because Monday is such a great, robust platform and expanding really is in their best interest. But it’s also because we just care about seeing their success. On our team, part of it is the leading indicator: was there a trend? Did the CSM work closely with this account? Because we do have a separate account management team that owns the expansion part of it, we know that these are a little inseparable.
Dan Ennis (21:18):
If a CSM is doing a good job to get customers directly bought in, then it’s going to make the account manager’s job that much easier to get the expansion out of it. While we don’t own the actual sales motion of it, that’s why we look at the overall net dollar retention because if we’re owning the customer and their success, part of that success includes the expansion.
I’m giving a bit of a longer answer, but that’s because it isn’t necessarily cut and dried when there are as many teams involved as there are, with all the complexity of it, because we care about seeing a customer grow. We care about seeing a customer be healthier.
Stuart Balcombe (21:58): One thing that you mentioned is that health is a leading indicator. It’s also a metric where you have an internal measure of health. This is maybe a little different in a much higher-touch, more one-to-one relationship where you’re actually having lots of conversations with a customer. But what goes into health for your team? What are the markers on that health score for you?
Dan Ennis (22:22): Yeah. We look at a number of things: the number of weekly active users in the platform; depth of usage, so how many average actions people are actually taking in the platform and how many days they’re logging in; and capacity, or how many seats they’re using out of what they’ve purchased. All of those factor in to give us a health score.
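The dimensions Dan lists can be combined into a composite score along these lines. To be clear, the weights, normalization constants, and equal weighting below are assumptions for illustration, not monday.com’s actual health formula.

```python
# Minimal sketch of a composite health score over the signals Dan names:
# weekly active users vs. seats purchased, depth of usage, and login
# frequency. All thresholds and weights are invented assumptions.

def health_score(weekly_active, seats_purchased, avg_actions,
                 login_days_per_week):
    """Return a 0-100 score from three equally weighted 0-1 signals."""
    seat_utilization = min(weekly_active / seats_purchased, 1.0)
    depth = min(avg_actions / 50, 1.0)            # assume 50 actions/week = "deep"
    frequency = min(login_days_per_week / 5, 1.0) # assume 5 days/week = max
    signals = [seat_utilization, depth, frequency]
    return round(100 * sum(signals) / len(signals))

# A healthy account: most seats active, deep and frequent usage.
print(health_score(weekly_active=45, seats_purchased=50,
                   avg_actions=60, login_days_per_week=4))
```

A real score would be calibrated against the churned-versus-grown comparison from earlier in the conversation rather than hand-picked thresholds.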
What Would You Change If You Could Wave a Magic Wand?
Stuart Balcombe (22:45): Gotcha, gotcha. A couple of final questions. This one’s kind of a fun one, hopefully. In all your experience in Customer Success, you’ve seen plenty of different challenges across different roles, different companies, and different segments of customers. What’s the one Customer Success or Customer Success-adjacent problem that you would just wipe off the slate if you had a magic wand?
Dan Ennis (23:07): Yeah. It might sound a little cliche, but I would magically wave the wand to have customer goals and CS goals be in alignment. You get this mismatch where sometimes customer success managers are in companies where the goals aren’t aligned with actually making the customers successful. Then you get people who are passionate about that, working in companies and not able to focus on helping a customer achieve their goal. We know that even something on our end, like a health score, tells us only one picture of health.
If a company is accomplishing the goal that they bought monday.com for, they’re golden, even if they have a “so-so” health score. If they bought us to make a process easier, one that was a headache before they came on board, and that process is no longer something they think about, they’re happy. They don’t care what their health score says.
Dan Ennis (24:01): That’s not a Monday-specific thing; that’s across the board. CS teams have a hard time lining up with what a customer’s goal is. If I could magically wave the wand, I’d figure out what that looks like, because I also get that there’s subjectivity around ‘oh, the customer’s achieving their goals, it’s fine, ignore the score.’ What does that look like? I don’t know. People way smarter than me and paid way bigger bucks are figuring that out, I’m sure. That’s why, if I could wave my magic wand, I would make that problem go away so CSMs could be freed up to really focus on making customers successful.
Stuart Balcombe (24:32): I love that. That also brings up another question, as these things always do, right? It’s always a rabbit hole when you talk about the things a company can track, the metrics and systems we have for understanding what customer success looks like, versus what the customer actually sees as success. How do you think about it? How is that incorporated into your team’s work: the product-centric data you get from actions, versus the things that the customer is actually saying, which are more qualitative and, I guess, squishy for lack of a better term?
Dan Ennis (25:07): Absolutely. This is my favorite thing here. Monday has the best culture of celebrating wins, and the breadth of wins, of any company I’ve ever been a part of. A big part of that is that it’s really hard to measure the more qualitative piece in a formal goal, right? Sometimes you can, by getting success stories and customer quotes, but even that can be hard to make into an objective thing you measure. Where Monday does that is that we’ve got our objective pieces, the goals that we have to measure, because we have to be data-driven to operate at the scale of a company like monday.com.
But in the meantime, we celebrate the heck out of those other moments, whether it’s something a customer said in a call, whether it’s something a customer said in an email, we make sure we celebrate that up.
Dan Ennis (25:45): We look at that and we say, hey, this is how the customer’s feeling. Here’s something they said before, when they weren’t as happy, but oh my goodness, now they’re talking about how they couldn’t imagine doing this any other way. That’s a win. We see some usage change in the data, and we would love to see that play out even more in the data, because we know that that individual isn’t necessarily going to be at the company forever. That’s why being data-driven is so important: when their replacement, who doesn’t have the emotional attachment, comes in and says, well, show me and justify this, we need data to back it up. I fully get that. And we celebrate the heck out of those moments. Monday has just a really strong culture of celebration around that. That makes all the difference for me.
Stuart Balcombe (26:39): It’s interesting to me, as I have more of these conversations too, that company culture, whether it’s alignment between sales and CS, whether it’s alignment around company success versus customer success, whatever it is, culture has such a huge impact on the ability to make customers successful and ultimately happy customers at the end of all this. This has been great. One final question to wrap things up. Who is somebody who you think would be great to have on the show as well? Who would you like to hear from to answer similar questions about what they’re up to at their company?
Dan Ennis (27:15): Fantastic question there. The number one person that always … two come to my mind. First and foremost, Diana De Jesus at Catalyst, just because she has such a broad perspective as a CSM and is part of a company like Catalyst, a company for CSMs. I think it’s such an interesting perspective. I would love to be a fly on the wall for that conversation, or listen to the episode when you produce it, of course. She’d probably be the number one pick, with a close second being Erica Villarreal of Condeco, because her experience spanning both a really small org that she was a part of and the much larger, more enterprise organization that she’s a part of now, that breadth of experience would be really cool to hear from. I know that she thinks really strategically about things. She’d be a great guest to snag.
Stuart Balcombe (28:11): Amazing. I will definitely reach out. Thank you so much for doing this. There’s so much great stuff here. I love the really data-driven approach and how iterative it is. Yes, there is the customer’s goal and the goal of making the customer successful proactively, but it doesn’t always have to be proactive: there’s a reactive element if you have enough data to do things just in time. Thank you so much for joining me.
Dan Ennis (28:34): Absolutely. Thanks for having me, Stuart. Really appreciate it.