Turning insights into action to create memorable customer experiences

Connor Cirillo is a Senior Conversational Marketing Manager at HubSpot. He leads the global internal rollout of all things conversational, chatbots, and messaging at HubSpot. Before that, he founded two companies and helped early-stage B2B and B2C companies grow in marketing and product. He specializes in creating conversational experiences that make marketing, sales, and service more natural and meaningful.

Enterprise companies have huge amounts of data they can examine. That’s great for them, but where does it leave businesses with small user bases? Can you still pull actionable customer insights out of statistically insignificant numbers? 

As someone who founded a small company and is a conversational marketer at a large one, Connor speaks from experience when he says the answer is yes. In this interview, we explore why creating a great customer experience takes ruthlessness, why you need both quantitative and qualitative data, how an insight turns into action at HubSpot, and what all this means for smaller players. 

Stuart Balcombe: What advice do you have for somebody who is building a product? A lot of what you do, like getting people to take action and creating frictionless experiences...those are things a product team should care about.

Connor Cirillo: I'm ruthless about finding a great user experience. Anytime I'm looking around the product, or thinking through an experience we're going to launch, I’m asking, “is this as short as it needs to be?” 

...throughout your whole product, you want to ask: “What is the least amount of stuff you can have people do to get them where they want to go?”

There's a great quote attributed to the mathematician Blaise Pascal: “I would have written you a shorter letter if only I had more time.” That’s how I think about UX. Great design is invisible. The more time you spend, the less the user should have to do. You're only giving them the absolute necessities in terms of what helps them. 

I apply that philosophy whenever we're building something. And product teams can do that too. Be ruthless. Remove the fact that you're the one building the thing and say, “If I’m at point A, and I need to accomplish this task and get to point B, what is the least amount of stuff I need to do to get there?” 

This goes beyond just conversational interface, bots, or messaging. Really throughout your whole product, you want to ask: “What is the least amount of stuff you can have people do to get them where they want to go?”

And we've seen lots of data on how if you can reduce the time to value for a customer, you greatly improve your chances of them being successful with your product. That impacts all sorts of things. It definitely improves retention and adoption. Can you give an example of a win you’ve seen at HubSpot? 

Yes, so let's zoom in on how you get help in-app. 

One thing we know is users get stuck. Particularly with a good product, there's a lot of stuff you can do when you log in, and people get stuck. That isn’t unique to us. That’s just tools. 

But one thing we found is when folks get stuck with us, they hit a point where they’re like, “I don't know how to do this thing. Let me go check Google.” They open up a new tab and search, “How do I import contacts?” and the articles they find are HubSpot articles.

We wrote the resources, but we weren't connecting the resource and the customer. So we built a bot for our users and customers where, if you type in the thing you need help with, we'll go look it up for you. We totally remove a step in the customer workflow.

What we're finding is, by removing that step in the workflow, the time to resolution is shorter. And it keeps people within a nice little sandbox where they can stay focused on the task at hand and not get sidetracked.

How did you find that insight? How did you discover people will go to Google and leave the product before coming back with the answer?

Our research team studied a bunch of users and watched how they interact in a live environment, but also asked more pointed interview questions when they got stuck. Questions like, “What's important to you?”  And time to resolution was really important. Quality of resources recommended, too. 

We took the quantitative and qualitative insights and blended them together to answer: what's a good holistic approach to this?

What does your interaction with customers look like? How do you do customer discovery to find the perfect copy or the things that are going to resonate? Where do you go to find those things out?

People literally tell you in their own words what they want to do. And there's no better roadmap than taking the things your customers say and asking how you can build against them. 

Every time a bot asks, “hey, what are you looking for help with?” we store the response and feed it into natural language processing software where we’re able to categorize and break it down. We’ve become really good at saying, oh, this segment of users has this problem on this page, and here's how we solve it. 
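The categorize-and-count step Connor describes can be sketched in a few lines. This is not HubSpot's actual NLP pipeline — the category names and trigger phrases below are hypothetical stand-ins — but it shows the shape of the idea: map each free-text help request to a coarse bucket, then count buckets to see which segment has which problem.

```python
from collections import Counter

# Hypothetical categories and trigger phrases. A real NLP system would
# classify more robustly; this keyword matcher is just a stand-in.
CATEGORIES = {
    "import_contacts": ["import contacts", "upload contacts", "csv"],
    "email_setup": ["connect email", "send email", "email template"],
    "billing": ["invoice", "upgrade", "billing"],
}

def categorize(message: str) -> str:
    """Map a free-text help request to a coarse category."""
    text = message.lower()
    for category, phrases in CATEGORIES.items():
        if any(phrase in text for phrase in phrases):
            return category
    return "uncategorized"

def top_issues(messages: list[str], n: int = 3) -> list[tuple[str, int]]:
    """Count categories across stored responses and return the top n."""
    counts = Counter(categorize(m) for m in messages)
    return counts.most_common(n)
```

With requests like "How do I import contacts?" and "csv upload failing" stored over a week, `top_issues` would surface `import_contacts` as the leading problem — the "top three issues" list Connor mentions building against.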

People literally tell you in their own words what they want to do. 

It’s reactive in a good way. Anytime we launch a new bot, web property, app, or part of the product, we’re able to say, what are the things coming in? What are the top three issues? Let's go build against that. 

Yeah, makes sense. And you're in a nice position at HubSpot where you have enough data to do that. Which is not true for everyone, particularly founders who are just starting out or have maybe 10 or 100 users. What would you recommend to founders who don't have that volume of data?

If you can hear from the 20 most high-action or valuable users you have, if you can get their thoughts or figure out a problem they're facing, those insights are really valuable. 

...there is value in hearing from whoever you have, and it doesn't have to be thousands or millions of people. 

And then being able to see themes. Maybe people are asking a lot of “how do I do this?” kind of questions. And you might realize, oh, I'm just not making it clear what the next steps are. Or they might be saying, “Who do I talk to about this thing?” And maybe you're not making it clear who the points of contact are for different things. 

The way you cut it is going to change a lot based on maturity and vertical and all those fun things. And I'm ignoring statistical significance and a lot of the big volume things you would ideally want. But there is value in hearing from whoever you have, and it doesn't have to be thousands or millions of people. 

Yeah, I advocate for getting those insights as often as you can and talking to as many people as you can. Specifically folks who are adding the most business value and doing as close to what you want them to be doing as possible. Segmenting your audience is super important. 

How do you identify intent in raw insight? And then where does it go? Take me on the journey of an insight. 

So a user comes in and they’re in-app looking for help. 

And they say, how do I import contacts? With the tool we built, we take the request and look up a couple articles. We say, what's the most relevant set of articles for that problem? And we return those. 

Then we have a feedback loop in there right away. It says, were these helpful? Did this solve your problem? And if they say yes, we say, great, thanks so much. If they say no, we hand them off to a human. 
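The loop Connor describes — look up articles, ask whether they helped, and hand off to a human on a "no" — could be sketched like this. The callables `lookup_articles`, `ask_was_helpful`, and `handoff_to_human` are hypothetical stand-ins for the real bot integrations, not HubSpot's API.

```python
# A minimal sketch of the lookup-then-feedback loop, under the assumption
# that the three callables wrap the real search, prompt, and ticketing steps.
def handle_help_request(question, lookup_articles, ask_was_helpful,
                        handoff_to_human):
    articles = lookup_articles(question)   # most relevant set of articles
    if ask_was_helpful(articles):          # "Did this solve your problem?"
        return {"resolved": True, "articles": articles}
    # A "no" routes straight to a person instead of dead-ending the user.
    return {"resolved": False, "handoff": handoff_to_human(question)}
```

The design choice worth noting is that the feedback question is asked immediately, so every interaction either confirms the articles worked or generates a labeled failure case to learn from.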

Then we’ll try to see: are a lot of these requests coming in? The best insights for action are high quantity and have high levels of repetitiveness. 

Once we identify an insight, we'll ask, is there a right answer to this? To import contacts, there is a right answer. Like set up your CSV this way or set up an integration that way. 

So in this particular case, we would build out the right answer and distill it. People don't necessarily want three articles; they want three sentences. They want just the right amount of info to do that thing. 

People don’t necessarily want three articles; they want three sentences. 

We identify, what's the least amount of information we need to get you to the next step? Then we'll feed it into the tool, so the next time someone says, “how do I import contacts?” we'll send them back those three sentences instead of the three articles.

We're able to give them less and have them get more out of it. It's much more tailored and contextualized. And that really works when there's a problem you know a lot of folks are having. 

And you don't need natural language for this. You just need to understand what people are getting tripped up on the most and what's the right answer or right next step to that question. Maybe the right next step is you give them to a human, because you know this is not going to be something you can automate. 

We identify, what's the least amount of information we need to get you to the next step?

But if you do nothing else than fast-track them to a human and set good expectations, that's a loop we see work really, really well.

We take this structured approach to find a couple of new things each week, find the right answers, and build that in. So every week we get a little bit better.

This really highlights the importance of getting insights on an ongoing basis. What does this process look like internally? Who owns looking at the raw data? 

The way the Conversational Marketing team is set up is, it's everyone's responsibility to be looking for these insights. And it’s on everyone to say, “what are the problems we're seeing?” 

In the past we’ve had a sheet where we would dump everything we've gotten the past week and start saying, “OK, what do we notice here? What sticks out?” And it's interesting. Even though that data is pretty unstructured, if you read a bunch of lines of text, you'll start seeing, “oh, well, that's a match, and that's a match. I've seen a lot of those and people are interacting in a way I didn't expect them to.” 
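The eyeball pattern-matching Connor describes (“that's a match, and that's a match”) can be roughly automated even without NLP tooling. A sketch using Python's standard-library `difflib` to bucket similar-sounding lines from such a weekly dump; the 0.6 similarity threshold is an arbitrary assumption you would tune against your own data:

```python
from difflib import SequenceMatcher

def group_similar(lines, threshold=0.6):
    """Greedily bucket free-text lines whose wording roughly matches,
    so repeated problems stand out even in an unstructured dump."""
    groups: list[list[str]] = []
    for line in lines:
        for group in groups:
            # Compare against the first line in each existing bucket.
            ratio = SequenceMatcher(None, line.lower(),
                                    group[0].lower()).ratio()
            if ratio >= threshold:
                group.append(line)
                break
        else:
            groups.append([line])
    # Biggest buckets first: high quantity, high repetitiveness.
    return sorted(groups, key=len, reverse=True)
```

Feeding in a week's worth of raw requests, the largest buckets at the top are the "what sticks out" candidates worth building against.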

The best insights for action are high quantity and have high levels of repetitiveness.

If you comb through even unstructured data like we do, you can come up with insights pretty quickly. 

How much time would you say that you allocate weekly to combing through those insights?

I try to do at least half an hour a day. So maybe three to four hours in a week. Particularly in this world, it is so important to hear what people are saying in their own words. 

Particularly in this world, it is so important to hear what people are saying in their own words. 

I am in a fortunate position where I have a lot of data. My job is better and easier the more time I spend looking at that stuff. I don't see it as a chore. I see it as part of the job I really relish because it helps build a road map. 

You're in a very conversational role where the output of your work is words. Do you think this also translates to people in other roles, whether that's a PM, designer, or marketer?

Yeah, absolutely. If you're working in another discipline—product, design, marketing—you have numbers to work with and those numbers tell a story. 

One thing I see great marketers do is learn how to tell a story with data. They see, “oh, our conversion rates from page A to page B are 5%.” And they learn to put together a story of why that might be, and what problems those people are experiencing. They try to think in the shoes of that user and work backwards to what test we need to run to validate whether that's true.

Even if you can't get in front of all of those people, you could think holistically. Who are they? Why are they going here, and what might they be trying to do? 

This is something I've heard a lot: “Great marketers can turn data into story.” Do you believe you can actually do that? Go from data to story?

Yes, I believe you can. I don't think it's one of those weird Jedi things where you're either born with it or you're not. You can get more comfortable over time looking at a problem and looking at all the numbers, pieces, and conversion points involved. Working backwards from, “why might this be happening?”

You may not get it 100% right all the time, but you can definitely form hypotheses. For example, we believe the reason people aren't going to this page is they don't know the buttons are there and they're getting lost looking for this other thing. I mean, that’s not MIT level stuff. Anyone can have those insights. 

You may not get it 100% right all the time, but you can definitely form hypotheses.

It’s an exercise and a muscle. If you're looking at numbers for the first time, you might be staring at a page thinking, “I don't know what to do with this.” 

But the more you challenge yourself to ask, “why might that be happening?” and the more persona work you do up front to figure out who is our user and what you think they’re trying to do, the more you do start to think this way. And it’s not that you'll come to the answer. But you will come up with a test.

You realize, “oh, people are definitely getting stuck here because X, Y, Z. If only this thing was clearer, or if only we positioned this differently.” 

Do you think people would be better served if they used qualitative data? Where do you think qualitative data falls on the value spectrum?

It really depends on the problem you're trying to solve. 

I appreciate that numbers make for an easier comparison, but qualitative data can be great when you're trying to get at the heart of something, or at what's under the hood. For example, “What would you expect to be able to do here?” Qualitative data can be really helpful for that.

For any marketer, product manager, or designer, what you need to make a decision is going to change, so it's learning to be comfortable with both. And trying to find when one serves you better than the other. 

If you’re only looking at one piece of data, you're gonna box yourself in.

Right. It depends on the stage, how much data you have, and the specific problem you're trying to solve. Having quantitative data is great. If you have it, it's definitely valid. It gives you those hypotheses you mentioned, and then you can dig deeper with qualitative data and test some assumptions. 

Yeah, and at HubSpot we talk a lot about, do we need statistical significance on every test? Or are there some things that are just the right thing to do? 

And there are times where the experience is just broken. We’ll monitor things, but you have a gut feeling of this is the right thing to do for our customers based on their goals, and based on what we need for them to be successful. 

Being ruthless about removing friction means sometimes going, “that’s just broken...”

And so there is a balance in learning to get comfortable with when you need each kind of data. The statistically significant stuff and being numbers-driven is great, and we believe in that for a lot of things. 

But being ruthless about removing friction means sometimes going, “That's just broken. There's no way this is the best thing we can be doing here, and I don't need another week of people suffering to know that it should be easier to do this thing.”

So one thing that’s come up a couple of times in this conversation is the difference between HubSpot as an enterprise company and the level of data you have compared to, say, a founder who only has their first hundred customers. They don't have the data to make statistically significant quantitative decisions. 

What’s transferable from what you do at a large company to what they’re going to be able to implement in their business?

Yeah, I get that problem all too well. I used to be the founder of a hundred-customer company that got less than a thousand page views. 

One thing I've seen translate is knowing when qualitative data is going to help me the most vs. when quantitative data is going to help me the most. You don’t need crazy large data sets. If you're seeing a thousand people came to your home page and 20 of them bought, you probably have some kind of problem you can figure out. You don't always need statistical significance to tell you something's wrong or something's right. 

If you're seeing there's an 80/20 split where 80% of your customers buy this one thing, well, what does that tell you? Maybe it's signaling, oh, the fit for the market is much more towards this one thing we're selling vs. this other thing. Don’t be afraid to talk to yourself and ask those questions. 

You don’t always need statistical significance to tell you something’s wrong or something’s right. 

Start taking things you see, even at a small business level, and saying, what does this really mean? What is someone really trying to do? When they say this, what do they really mean? 

From enterprise down to a one-person-shop SMB, those insights are so valuable. Learn how to surface more of them. It's something every founder, marketer, or designer can do, and it’s really impactful. Plus it scales as you grow.

This interview has been edited and condensed for clarity.
