
How we increased customer loyalty by 125% in 6 hours

Recently, I sent out a Net Promoter Score (NPS) survey to all Baremetrics[1] users. I was hesitant at first, but it quickly surfaced the areas where we were really excelling as well as the areas we needed to improve.

In a previous life, I ran a survey company[2] (which happened to be what I initially built Baremetrics for, since we used Stripe). We had an NPS question you could add to your surveys and, truth be told, I was never really a fan of it. That was mostly because we just didn’t have the tool set up to run a true NPS survey the right way.

Thankfully, the folks at Promoter.io[3] (who are also Baremetrics customers) have a product specifically for running a proper NPS survey, and sweet goodness, when an NPS survey is done correctly, it’s amazing.

So, let’s pick all this apart. First, we’ll take a brief look at what an NPS survey is, and how we sent ours out. Then, we’ll tackle the juicy bits: what the results were for our survey. We’ll also cover some tips on getting the most out of sending one yourself.

 

What is this “Net Promoter” survey you speak of?

Net Promoter[4] is a method for measuring the loyalty of your customers based on a 0–10 rating. The “score” you receive after the survey is an indicator of how likely your customers are to stick around, or how likely they are to churn in the coming months.

High score = more likely to stay an active customer.

Low score = more likely to churn in the next 60–90 days.

The score can range from –100 to +100, and is calculated as % of Promoters – % of Detractors. We’ll cover what that means shortly.

The individual score isn’t terribly important in and of itself, but tracking it over time is a great indicator of the customer loyalty trend, which is quite important.

 

Promoters, Passives and Detractors! Oh my!

At the core of Net Promoter are 3 categories of customers based on the 0–10 rating the customer gives you.

  • Promoters: 9 or 10
  • Passives: 7 or 8
  • Detractors: 0 – 6

Let’s look at what each category actually means.

Promoters (9–10)

These customers are your most loyal. They love the product, they love your company and they’re more than happy to recommend you to others. Keep them happy.

Passives (7–8)

These customers are generally satisfied but may have a small issue or two that keeps them from really loving your product. This is low-hanging fruit for you to pick off and turn into “Promoters.”

Detractors (0–6)

These customers are a major risk. If they stay as “detractors” they’ll likely churn in the next 30–90 days. They need quick attention to resolve the issues they’re having.
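The thresholds and formula above fit in a few lines of Python. This is just a sketch of the arithmetic (the function names are mine, not part of any NPS tool):

```python
def categorize(rating):
    """Map a 0-10 rating to its Net Promoter category."""
    if rating >= 9:
        return "promoter"   # 9 or 10
    if rating >= 7:
        return "passive"    # 7 or 8
    return "detractor"      # 0 through 6

def nps(ratings):
    """Net Promoter Score: % promoters minus % detractors (-100 to +100)."""
    counts = {"promoter": 0, "passive": 0, "detractor": 0}
    for r in ratings:
        counts[categorize(r)] += 1
    total = len(ratings)
    return round(100 * (counts["promoter"] - counts["detractor"]) / total)

# 5 promoters, 3 passives, 2 detractors out of 10 responses:
# 50% promoters - 20% detractors = 30
print(nps([9, 10, 9, 10, 9, 7, 8, 7, 4, 6]))  # → 30
```

Note that Passives count toward the total but not toward the score itself, which is why moving a Detractor up to a Passive still raises the score.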

So, what do you do with the responses?

Having some number attached to customer loyalty isn’t inherently valuable, but the opportunity to converse with your customers is.

We followed up with every single rating and either thanked them for their positive rating (and gave them a way to easily share Baremetrics), or we asked how we could do better and fix whatever issues they were having.

The ratings are useless without the conversation.

How we set up and ran our Net Promoter campaigns

So, how did we set up and run Net Promoter for our customer base?

Sending the survey

We have, roughly, 350 active users. These are people who are currently paying customers and have logged in to Baremetrics in the past month.

It’s important to send the survey out only to current, active users as the purpose of the survey is to measure loyalty…and I think we can agree that customers who are no longer active aren’t exactly the most loyal group of people for you. :)
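Selecting that “current, active” segment is a simple filter. A minimal sketch, assuming each user record carries an `is_paying` flag and a `last_login` timestamp (field names are my own invention, not Baremetrics internals):

```python
from datetime import datetime, timedelta

def active_users(users, now=None):
    """Keep only paying customers who logged in within the past 30 days."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=30)
    return [u for u in users if u["is_paying"] and u["last_login"] >= cutoff]
```

Anyone filtered out here belongs in a churn-recovery effort, not a loyalty survey.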

We divided up the users into two separate campaigns. The first was mainly to do a little tire-kicking and make sure there were no issues with the tool and that people would actually respond.

Here’s what the email survey looked like…

First responses

We sent the survey out to the first group on a Tuesday morning. Ratings started coming in. The first few were great…then a string of “bad” ratings came in, along with feedback on why they weren’t happy.

The feedback revolved primarily around two things: stability issues and lack of progress. Both of which we’ve been working tirelessly to address.

Our growth has created a lot of stability and performance issues with the platform and we’ve been working literally day and night to tackle them (this past week we actually made a major database upgrade that has almost completely eliminated performance problems).

In addition to those performance issues, the perceived lack of progress is something I knew was an issue, but the Net Promoter survey brought to light just how big of an issue it has been.

Many customers felt that after months and months of big product additions/changes, we’d kind of stopped actually progressing and making it better. That couldn’t be farther from the truth, though.

Then something interesting happened in the middle of sending these NPS campaigns: we shipped a major new feature[5].

So, how did that affect future feedback?

 

The second round and a different story told

I mentioned that we sent out the first campaign on a Tuesday morning. Later that morning, we launched a new feature that we’d been working on for quite some time. Something that many users had been requesting.

Then, that Tuesday afternoon, I sent out the NPS survey to the second group. The ratings and the responses told quite a different story.

In the time between the two campaigns, we had directly addressed one of the primary issues: lack of progress.

Here are the final outcomes of the two groups…

What you’ll notice is that the percentage of “Detractors” didn’t change much (though it did go down), but the percentage of “Passives” changed quite a bit.

Our addressing that primary issue had a direct, positive effect on customers’ view of Baremetrics, and people who were on the fence had their confidence in us boosted immediately.

The timing was convenient from a measurement standpoint as it quickly solidified that we’re on the right track.

 

Takeaways and tips

1. We’re doing better than we thought

We have a ton of room for improvement, but, honestly, a vocal minority of users with isolated issues had skewed my assumptions about the general feeling of our user base. This was a great opportunity to give ourselves a light pat on the back.

 

2. Following up with every customer is crucial

Regardless of the rating, each customer deserves a response. It’s fast and easy to do. Don’t skimp on the opportunity to talk to everyone who took the time to give you feedback.

 

3. Ignore the score…initially.

Our NPS score for the first group was 16, and 36 for the second. Neither number really means much on its own. Technically, they’re considered “good,” but what’s more important is the trend. As with any metric, a single, individual number isn’t all that useful; the long-term trend is what helps you make smart business decisions.

 

What about you?

So, what about you and your business? Have you run an NPS survey before? Any tips for other business owners?

This article was originally published here: https://baremetrics.io/blog/increase-saas-customer-loyalty-with-nps
