How do you apply user analytics to conversion rate optimization?
User analytics was a key factor that helped us convert 2x more visitors into customers.
But for months, we couldn't understand where to focus. Nothing worked.
Sometimes you might feel like you're stuck because you're not getting enough new customers. You might tell your sales team to up their calls and throw more money into marketing, but it feels like you're just running in place.
If you don't have more cash to spend but everyone's looking for growth, user analytics is your friend.
Follow the steps in this article and I guarantee you'll uncover what's holding you back.
We'll show you how fixing those things first will allow you to release the brake before pressing the gas pedal and achieve even better results than we did.
Let's dive in!
Step 1 - Set Up Tracking Using Segment & Mixpanel
The #1 thing you need to kick off user analytics is to start tracking your users' behavior along the funnel journey. Realistically, you can't really do anything if you don't have this in place. And don't worry, tracking doesn't mean spying on users or undermining their privacy. It's rather an anonymous way to identify patterns in your users' flow and pinpoint the issues with it so you can then improve the user experience.
There are different ways you can approach it, and I won't dive into too many details as I understand that the majority of our readers are not technical. But ultimately, it boils down to 2 kinds of tools: one to collect first-party customer data and one to analyze it. There are both free and paid options for each, which I will cover below:
Tools to collect data
There are only 2 tools that you realistically may ever need when it comes to tracking:
Segment - the most reliable option that I have the best experience with. It's also scalable and requires a minimum time investment compared to Google Tag Manager, which is free and considered the most popular one.
Google Tag Manager - has the most flexibility, is free, and has a lot of documentation. You can find plenty of guides online, as GTM is something you usually need whether or not you use Segment. The problem is it's not scalable, it's heavy on your website speed, and at some point it just becomes too much to maintain.
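Whichever collector you pick, the calls your site fires look roughly the same. Here's a minimal sketch using Segment's analytics.js (Segment generates the actual loading snippet in its UI; the event names and properties below are illustrative examples, not a required naming scheme):

```typescript
// Minimal sketch of Segment tracking calls. analytics.js is assumed to already
// be loaded via the snippet Segment generates for your workspace.
// Event names and properties are illustrative; agree on your own naming.

declare const analytics: {
  page: (name?: string, properties?: Record<string, unknown>) => void;
  track: (event: string, properties?: Record<string, unknown>) => void;
  identify: (userId: string, traits?: Record<string, unknown>) => void;
};

// Record a page view (Segment forwards it to Mixpanel, GA4, or any destination).
analytics.page('Landing Page');

// Record a funnel step as a named event with a few useful properties.
analytics.track('Signup Button Clicked', {
  cta_location: 'hero',          // which button was clicked
  acquisition_channel: 'paid',   // lets you break the funnel down later
});

// Once someone signs up, tie their anonymous activity to a stable user ID.
analytics.identify('user_123', { plan: 'trial' });
```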
Talk to your developers; most likely, you already have something in place. If not, I suggest going with Segment, as it's hassle-free and reliably routes your data to the right destinations. And speaking of which…
Analytics tools
I've tested 15 analytics tools, and to make it simple for you, I will only cover the #1 paid and the #1 free tool that I think are worth your time.
Mixpanel - covers 90% of user analytics and produces extremely accurate results when your tracking is implemented correctly. Historically, Mixpanel (and its alternatives) were considered best for web app/product analytics, with website analytics better left to Google Analytics. But with all of the recent privacy issues around Google Analytics, the inaccuracy of its latest version (Google Analytics 4), and new pricing updates on Mixpanel's side, it makes a lot of sense to rely on Mixpanel for every step of your funnel and spare yourself the headache of Google's product.
Google Analytics 4 - Yes, it still works. It's free, and there are no competitors with the same functionality (aside from Matomo, which is really technical) and flexibility. So if you can live with a lot of skewed data, an extremely inconvenient interface, and your users' data sitting under Google's hood, it's a workable (and the only free) option you can use for user analytics.
While both options are solid and a lot of people use Google Analytics, in the next steps I'll be using Mixpanel, as this is what we used and it worked well for us. I am sure you can emulate this in Google Analytics or any other software you're using to measure your growth metrics, be it Amplitude, Heap, or Kissmetrics. The process remains the same.
Step 2 - Build a Funnel Report and Find Key Drop-Off Points
The funnel report is basic but extremely helpful, and it's a great way to put user analytics to work. To create one, all you need to do is visualize your users' journey from the very first touchpoint through to purchase.
It's also important to break your funnel down by acquisition channels. Very often, you will see completely different behavior and purchasing power when you compare your ads, for example, with cold calling. So when building the report, make sure to analyze all of these segments.
Here's what our funnel looked like:
Landing Page Visit → Signup button click → First name fill in → Email fill in → Signup happened → Key Action X Happened → Plan Purchase
The idea is to be as specific as possible. We had around 25 steps in the funnel; I listed only the main ones here. To create a funnel like the one above, you will need:
Developers to implement the tracking. Sketch a simple flow like the one above and send it to your dev team so they know what you want to achieve (see the sketch after this list).
In Mixpanel, build a simple funnel report using the “events”, or steps, that your engineers will be sending into it from Segment or Google Tag Manager. If you're not sure how to do this, ask somebody technical on your team; I am sure they'll be able to assist you.
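To give you an idea of what that hand-off to developers can look like, here's a rough sketch of a tracking plan: one event per funnel step, in the order they fire. The event names mirror the funnel above but are my own examples; in Mixpanel you then add these same events, in the same order, as the steps of your funnel report.

```typescript
// Illustrative tracking plan: one event per funnel step, sent via
// Segment/GTM and then used as the steps of the Mixpanel funnel report.
// Event names and properties are examples; adapt them to your own flow.

interface FunnelStep {
  event: string;          // event name the developers will send
  properties?: string[];  // properties worth attaching to the event
}

const funnelTrackingPlan: FunnelStep[] = [
  { event: 'Landing Page Viewed', properties: ['acquisition_channel'] },
  { event: 'Signup Button Clicked' },
  { event: 'First Name Filled In' },
  { event: 'Email Filled In' },
  { event: 'Signed Up', properties: ['plan'] },
  { event: 'Key Action X Completed' },
  { event: 'Plan Purchased', properties: ['plan', 'amount'] },
];

// Handy for a quick review with the dev team before anything gets implemented.
console.table(funnelTrackingPlan);
```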
In the end, you should end up with a funnel report that shows the conversion rate between each step.
Now it's important to identify the drop-offs. All you have to do at this point is look at the conversion rates from step to step and see if they hit the benchmarks. Benchmarks vary from industry to industry, so a quick Google search will help you there, but generally speaking, in the SaaS industry these are the numbers you should aim for:
Website to signup form: 14%
Signup form to MQL: 38%
Signup to conversion rate: 30%
Pick the one step that falls furthest below the benchmark, and that's what you're going to be optimizing now!
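If you want to sanity-check the drop-offs outside of Mixpanel, the math is just step-over-step division compared against the benchmarks above. A small sketch with made-up counts (pull the real numbers from your funnel report):

```typescript
// Step-to-step conversion rates vs. rough SaaS benchmarks.
// The visitor counts are made up purely for illustration.

const funnel = [
  { step: 'Website visit', count: 10_000 },
  { step: 'Signup form', count: 1_500 },
  { step: 'Signup', count: 600 },
  { step: 'Conversion', count: 100 },
];

const benchmarks = [0.14, 0.38, 0.30]; // the benchmark rates listed above

for (let i = 1; i < funnel.length; i++) {
  const rate = funnel[i].count / funnel[i - 1].count;
  const benchmark = benchmarks[i - 1];
  const verdict = rate < benchmark ? '← below benchmark, optimize here' : 'ok';
  console.log(
    `${funnel[i - 1].step} → ${funnel[i].step}: ` +
    `${(rate * 100).toFixed(1)}% (benchmark ${(benchmark * 100).toFixed(0)}%) ${verdict}`
  );
}
```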
Step 3 - Set Up Hotjar Surveys & Watch Recordings + Heatmaps
Well, user analytics doesn't stop at quantitative insights. Sure, it can tell you where the drop-off is happening, but it will never tell you why. If your conversion rate is far off, sometimes it's enough to go through this step of the funnel yourself and try to put yourself in your customer's shoes, but because you know the product too well, that may not work. You need real customer insights to understand why this step is causing trouble.
Generally speaking, there are 2 easy ways to get them: using Hotjar and talking to customers. Let's focus on Hotjar for now.
First of all, Hotjar is not the only tool out there. Fullstory is another popular and reliable choice. I have used both, and Hotjar seemed to work best for smaller projects and has just the features you need.
Install Hotjar on your website & product. Give it some time to gather data. Then sit down and watch the most relevant recordings. You can filter them using its ML capabilities, which flag whether a session is likely to contain insights.
Check the heatmaps and see if people are clicking where they're not supposed to, or skipping important sections you don't want them to skip. Basically, look at what draws users' attention and compare it to what you want them to read/do.
This will give you some initial thoughts and ideas, but you will likely want to know what users are thinking while they do something. None of the experience-insights tools will allow you to do this. And here we come to the next step.
Step 4 - Conduct 3 Face-to-Face User Tests
Love it or hate it - talking to customers is an inevitable part of user analytics. There is always something you will never know till you ask your ICP, who's not aware of your product yet, to go through a specific experience firsthand. This is where the real magic happens.
There's a lot of theory on this topic. Make sure to read Steve Krug's book "Don't Make Me Think" if you'd like a good understanding of how to conduct these interviews correctly and why it matters. Maze has some great resources on this topic too.
Simply speaking, there are 2 ways you can conduct user tests.
Unmoderated testing: This is a more flexible option as contributors can complete their tests on their own time without you syncing with them in real-time. But it usually brings fewer results as you can't ask follow-up questions and dig deeper into some actions the user did.
Moderated testing: This is what you should be doing first. A direct, face-to-face Zoom call with a person from your target audience.
The idea is simple. You create a simple set of tasks, including the one you pinpointed earlier as a high-friction point, and just watch. All you can ask is "Why?". For example, if a user did something, you ask: "Why did you do that?" and then respond: "Thanks, that's helpful".
No: "Do you like it?" or "How cool is this?" kind of questions. Try to be as unbiased as possible.
Start from the homepage before giving the first task. Make no introduction about the company whatsoever. Explain the scenario in which you want the participant to imagine having a need/problem and using YOUR service to fix it. Then give your first task, for example: "How much would it cost you to do it using <your product name>?"
A very good first question to ask when you show the user the homepage is: "What do you make of it?". You will uncover a lot of interesting things here, trust me.
Now, make sure to record everything (ask for consent first) and don't spend time taking notes. Also, what worked really well for me is to jot down my thoughts right after the interview, as trust me, you will have a lot.
Step 5 - Brainstorm and Pick Top 5 Ideas to Address Issues
Now, this step is fairly self-explanatory: after all of the tests, your head will be full of ideas. But there are 2 steps people usually miss that I think are very important:
Set up a meeting with other interested people to get their feedback. I found having different perspectives very helpful. We can be blind to the obvious, and we are also blind to our blindness. So make sure to share your recordings and observations with others and see what they have to say. It will also build up trust and loyalty for user testing, so you will end up having more supporters.
Brainstorm a list of ideas. Basically, what you want to do here is think out of the box and create hypotheses for the problems you pinpointed during the user tests. Once you have the list of ideas, circle the top 5. Those should be your top priority. I like this prioritization method by Warren Buffett, and I think it fits in nicely here.
Step 6 - Implement the Ideas in an A/B Test
Once you have the ideas, it's very important to measure whether they actually work. It's always tempting to skip this step and rely on your gut feeling. Don't. Without a test, you will only be guessing that the new version is better until enough time passes to compare it with the old conversion rates, risking hurting the existing experience even further. And even if your hypotheses are correct, you will never know to what extent, which makes it hard to build a business case for this user analytics experiment.
There's one thing to keep in mind, though. A/B tests work only if you have a sufficient number of conversions flowing into the experiment. You can determine sufficiency by using this calculator. As a rule of thumb, if your experiment has fewer than 200 conversions a month, A/B testing will not work for you.
By conversion, I mean this: say your hypothesis is to improve the signup form. Look up in Mixpanel how many signups you get per month. If it's more than 200, you're good to go with a quantitative way of proving your hypotheses via an A/B test.
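I can't reproduce the exact calculator here, but most of them rest on a standard two-proportion sample-size approximation. A minimal sketch of that math, assuming a 95% confidence level and 80% power (my own approximation, not the referenced tool):

```typescript
// Rough per-variant sample size for an A/B test, using the common approximation
// n ≈ (z_alpha + z_beta)^2 * (p1(1-p1) + p2(1-p2)) / (p2 - p1)^2.
// This mirrors what typical online calculators compute; treat it as an estimate.

function sampleSizePerVariant(baselineRate: number, relativeMde: number): number {
  const zAlpha = 1.96; // 95% confidence, two-sided
  const zBeta = 0.84;  // 80% power
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeMde); // the rate you hope to detect
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (p2 - p1) ** 2);
}

// Example: 3% signup rate, hoping to detect a 20% relative lift.
const perVariant = sampleSizePerVariant(0.03, 0.2);

// Turn that into a duration, rounded up to whole weeks so the test always
// covers full business cycles. Weekly traffic below is an assumption.
const weeklyVisitorsPerVariant = 2_500;
const weeks = Math.ceil(perVariant / weeklyVisitorsPerVariant);

console.log(`~${perVariant} visitors per variant, run for about ${weeks} full week(s)`);
```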
If it's less, your best bet is to repeat the user testing cycle with new participants and see if they still run into the issues you were trying to fix. This is a more qualitative way of validating your ideas, and it also works well.
But A/B testing shouldn't be applied only to design or product experience. If you're in sales, you can A/B test the strategies aiming to help your customers along the way to see what resonates the best. It's another thing I like about user analytics. The insights you get can be applied in a variety of ways. So it's important to view user analytics as something that every team in your company can be using.
A/B testing tools to choose from
So if you have enough conversions to run A/B tests now, it's time to pick the right tool.
VWO is probably the best for this use case if your test is website-related and you're not running 20 or more A/B tests a month. If you're planning to integrate experimentation and user analytics into your culture and set up a dedicated team to run experiments and improve conversion rates (something we ended up doing), it will eventually make more sense to transition to Optimizely.
For product-related A/B tests, we used LaunchDarkly, which allows you to roll a variation out to a portion of your new customers. But you're better off consulting your engineering team and asking their advice on the best way to accomplish what you need.
Before you rush into putting your ideas to the test, it's important to group them by meaning. In other words, one A/B test can include multiple changes to the user's flow as long as they revolve around the SAME hypothesis. If your top 5 ideas are all based on completely different core assumptions, you will need to set up 5 different, consecutive A/B tests (never all at once) to see which assumption is correct.
As you decide on the A/B testing tool, ask your engineers to implement it correctly. This includes:
Making sure it works and doesn't flicker (when users can see a different version for a second).
Making sure the tracking is set up and the integration with Mixpanel is enabled (see the sketch after this list). While all of these A/B testing tools have built-in reporting, it's far from what Mixpanel or any other dedicated tool has to offer, so use VWO only to set up your test and analyze everything in Mixpanel.
Making sure to implement an A/B test and not an A/B/n test (a test with several challenger variants).
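Here's a rough sketch of the kind of exposure event your engineers can send so the experiment shows up in Mixpanel. The "Experiment Viewed" name and its properties follow a common Segment-style convention, but treat them as assumptions; the integration point is hypothetical too, since the exact hook that exposes the assigned variation depends on your A/B testing tool.

```typescript
// Illustrative sketch: report which variation a visitor saw so the experiment
// can be analyzed in Mixpanel. "Experiment Viewed" and its properties follow a
// common convention; rename them to match your own tracking plan.

declare const analytics: {
  track: (event: string, properties?: Record<string, unknown>) => void;
};

function reportExposure(experimentName: string, variationName: string): void {
  analytics.track('Experiment Viewed', {
    experiment_name: experimentName, // e.g. 'signup-form-shortened'
    variation_name: variationName,   // e.g. 'control' or 'variant-a'
  });
}

// Hypothetical integration point: call reportExposure from whatever callback
// your A/B testing tool fires when it assigns a visitor to a variation.
// (Check the tool's docs; this is not a real VWO/Optimizely API name.)
```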
Another tip is to always test for full business cycles (usually one week) to capture how people behave across the whole week. Meaning if you start on a Monday, you should also end the test on a Monday. And make sure to calculate in advance how long you will need to run the test by comparing the MDE (from the calculator) with the actual test performance.
This should be enough to get you started. I covered the main things, but by no means all of them. Experimentation is an art with so many nuances that these guys wrote 303 articles on it alone. So if you want to learn more, they are the best.
Step 7 - Analyze The Results and Pick The Winner
When it comes to analyzing your test results, you will be using the same calculator as you used to determine if you have a sufficient number of conversions to run the test.
As you input your data, it will compute the conversion rates and tell you whether you have a statistically significant winner or loser. There is also a nice Bayesian chart that I found helpful for reporting purposes.
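If you're curious what such a calculator does under the hood, the frequentist part is essentially a two-proportion z-test. A minimal sketch of that check (my own implementation, not the referenced calculator, and it leaves out the Bayesian chart):

```typescript
// Two-proportion z-test: is the variant's conversion rate significantly
// different from the control's? Roughly what significance calculators compute.

function normalCdf(z: number): number {
  // Abramowitz–Stegun style approximation of the standard normal CDF.
  const t = 1 / (1 + 0.2316419 * Math.abs(z));
  const d = 0.3989423 * Math.exp(-(z * z) / 2);
  const p =
    d * t * (0.3193815 + t * (-0.3565638 + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
  return z > 0 ? 1 - p : p;
}

function abTestPValue(
  controlVisitors: number, controlConversions: number,
  variantVisitors: number, variantConversions: number,
): number {
  const p1 = controlConversions / controlVisitors;
  const p2 = variantConversions / variantVisitors;
  const pooled = (controlConversions + variantConversions) / (controlVisitors + variantVisitors);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / controlVisitors + 1 / variantVisitors));
  const z = (p2 - p1) / se;
  return 2 * (1 - normalCdf(Math.abs(z))); // two-sided p-value
}

// Example with made-up numbers: call the test significant if p < 0.05.
const p = abTestPValue(14_000, 420, 14_000, 504);
console.log(`p-value ≈ ${p.toFixed(4)} → ${p < 0.05 ? 'significant' : 'inconclusive'}`);
```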
It's important to understand that the worst possible outcome is an inconclusive result: basically, when there is no statistical difference in conversion rates. Failures are just as good as winners, because either way you've discovered what really matters to users.
If you can't learn from the failure, you've designed a bad test. Next time you design one, imagine every variant failing. What would you do?
When you see inconclusive or bad results but your hypothesis is well grounded in research, it's always a good idea to check segments (another reason to use an analytics tool for the analysis). Often, different segments cancel each other out.
If you didn't move the needle in any segment but are still confident in your hypothesis, the implementation of your idea might have been lousy. Run 2-3 more tests with different executions of the same hypothesis and see if any of them reach significance.
If you run 2-3 more tests and still get lousy results, your hypothesis is probably bad. If you couldn't learn anything from all these tests, that's even worse.
So half of the analysis is really about what caused the effect. Dive into segments, talk to other stakeholders, and run more tests to prove or disprove it. The most important aspect of A/B testing is learning something about your customers, so you can then delight them with exceptional experiences and, in return, get the high-converting funnel every business dreams of.
As you wrap everything up, share the results and learnings with others in the company. Make them as easy as possible to understand; you can ask your UI designer to create a nice-looking presentation. Here's the structure I use:
Reasons and hypotheses
Setup of the test (duration, segmentation, and KPIs)
Results for each main KPI
Business case
Segmentation and other useful insights
Learnings & Recommendations
Always Learn, Always Iterate
The magic of user analytics happens when you view it as an ongoing journey, not a one-off task. If you look at your funnel, every step of it contains an experience for your customers, and you want to make sure that it's as smooth as possible.
And remember, perfection always comes with iteration. You can't just set it and forget it. It's extremely important to regularly revisit and update your strategies based on new learning and insights.
Follow the steps in this article as a part of your user analytics optimization cycle that you can repeat as many times as needed. Or even better, loop them into your monthly or quarterly routine and slowly but steadily create a user-centric data-driven environment rooted in your company culture.
In The End, It's All About User Experience
After diving deep into how user analytics can significantly boost your conversion rates, it's crucial to understand the multifaceted role of user data in shaping an optimal customer experience.
User experience goes beyond just the surface-level interactions - it encompasses understanding user behavior, journey, and engagement at every touchpoint. By utilizing the right user analytics tool, such as Google Analytics or Amplitude, you can gather a wealth of user data that illuminates the path to refining product design and enhancing user engagement.
The collection and analysis of usage data, through tools like user analytics software, enable us to map out the user journey in detail. This journey analysis offers insights into various user personas, revealing different types of users and their behaviors. Such data is invaluable, not just for tailoring the user experience but also for driving customer retention and optimizing the customer experience.
A comprehensive user analysis strategy involves both quantitative data and qualitative insights from user interviews and survey data. These insights help in understanding the nuances of customer behavior and user needs.
With the right analytics software, tracking key metrics becomes easier, allowing for a more targeted approach to product analytics. This, in turn, informs the development of new features, making the app or website more responsive to user needs.
Always Make Use Of The Insights Learnt
Incorporating user behavior data into your product design process ensures that new users find value in your app from their very first interaction. User behavior analytics, combined with user engagement metrics, guide the development of features that reduce friction and improve usability. The goal is to create an interface that resonates with different user personas, enhancing the overall user experience and driving active user engagement.
Moreover, user analytics data offers a clear view of different traffic sources, including mobile devices and various browsers, enabling you to adapt your product for a variety of user contexts. This level of user analysis and data collection allows for more effective customer experience strategies, focusing on what users genuinely need and how they interact with your product across different platforms.
Choosing the best tools for user analytics involves considering a variety of metrics and functionalities, from dashboards and notifications to AI-powered insights.
Tools like Google Analytics provide a broad overview of user engagement, while specialized software like Amplitude offers deeper insights into user behavior and retention metrics. These analytics tools are really important for marketers aiming to understand customer behavior in less time, with dashboards that highlight key metrics like recency, frequency, and engagement patterns.
By employing a user analysis process that includes both data analytics and user research, companies can discover new ways to engage users. Whether it's through personalized email campaigns, social media notifications, or in-app messages, the aim is to meet users where they are and address their needs directly. This approach not only enhances the user journey but also plays a critical role in customer retention and engagement.
User analytics tools, therefore, are not just about collecting data; they're about making that data actionable. By understanding user needs, behaviors, and journeys through a comprehensive analytics strategy, businesses can design better products, improve user engagement, and ultimately retain more customers.
This journey from data collection to user analysis to product improvement is what turns good apps into great ones, ensuring that new features meet user needs and that every update leads to higher engagement and retention rates.
In conclusion, embracing user analytics means embracing a data-driven approach to growth. It's about using a variety of metrics and insights to understand and improve the user experience, engagement, and journey. With the right tools and strategies, you can unlock new growth opportunities, enhance customer retention, and ensure that your product not only meets but exceeds user expectations.
How do you apply user analytics to conversion rates optimization?
User analytics was a key factor that helped us convert 2x more visitors into customers.
But why did we couldn't understand where to focus on for months. Nothing worked.
Sometimes you might feel like you're stuck because you're not getting enough new customers. You might tell your sales team to up their calls and throw more money into marketing, but it feels like you're just running in place.
If you don't have more cash to spend but everyone's looking for growth, user analytics is your friend.
Follow the steps I talked about in this article and I guarantee you'll uncover what's holding you back.
We'll show you how fixing those things first will allow you to release the brake before pressing the gas pedal and achieve even better results than we did.
Let's dive in!
Step 1 - Setup Tracking Using Segment & Mixpanel
The #1 thing you need to kick off user analytics is to start tracking your users' behavior along the funnel journey. Realistically, you can't really do anything if you don't have this in place. And don't worry, tracking doesn't mean spying on users or undermining their privacy. It's rather an anonymous way to identify patterns in your users' flow and pinpoint the issues with it so you can then improve the user experience.
There are different ways you can deal with it, and I won't dive into too many details as I understand that the majority of our readers are not technical. But ultimately, it boils down to 2 tools: what you use to collect first-party customer data and how you analyze it. And there are both free and paid versions of each which I will cover below:
Tools to collect data
There are only 2 tools that you realistically may ever need when it comes to tracking:
Segment - the most reliable option that I have the best experience with. It's also scalable and requires a minimum time investment compared to Google Tag Manager, which is free and considered the most popular one.
Google Tag Manager - has the most flexibility, is free, and has a lot of documentation. You can find dozens of content online as GTM is something you usually need no matter if you use Segment or not. The problem is it's not scalable, it's heavy on your website speed, and at some point just becomes too much to maintain.
Talk to your developers; most likely, you already have something in place. If not, I suggest going with Segment as it's hassle-free and always makes sure your data is coming to the right destination. And speaking of which…
Analytics tools
I've tested 15 analytics tools, and to make it simple for you, I will talk only about #1 paid and #1 free tool that I think are worth your time.
Mixpanel - covers 90% of user analytics and has extremely accurate results when your tracking is implemented correctly. Historically, it was considered that Mixpanel (and its alternatives) are the best for web app/product analytics, and website analytics are better left to Google Analytics. But with all of the recent privacy issues with Google Analytics, the inaccuracy of its last version (Google Analytics 4), and new pricing updates on Mixpanel's side, it makes a lot of sense to just rely on Mixpanel for every step of your funnel and save you a headache from using Google's product.
Google Analytics 4 - Yes, it still works. It's free, and there are no competitors to it that have the same functionality (aside from Matomo, but it's really technical) and flexibility. So if you're fine with lots of skewed data and an extremely inconvenient interface, aside from the privacy of your users that are now all under Google's hood, it's a good (and only free) version that you can use for user analytics.
While both options are solid and a lot of people use Google Analytics, in the next steps, I'll be using Mixpanel as this is what we did and worked well for us. I am sure you can emulate this in Google Analytics or any other software you're using to measure your growth metrics, be it Amplitude, Heaps, or Kissmetrics. The process remains the same.
Step 2 - Build a Funnel Report and Find Key Drop-Off Points
The funnel report is basic but extremely helpful. It's a great way to enable the power of user analytics. To create one, all you need is to visualize the journey of your users from the very first touchpoint to all of the purchases.
It's also important to break your funnel down by acquisition channels. Very often, you will see completely different behavior and purchasing power when you compare your ads, for example, with cold calling. So when building the report, make sure to analyze all of these segments.
Here's how our funnel looked like:
Landing Page Visit → Signup button click → First name fill in → Email fill in → Signup happened → Key Action X Happened → Plan Purchase
The idea is to be as specific as possible. We had around 25 steps in the funnel; I listed here only the main ones. To create a funnel like above, you will need:
Developers to implement the tracking. Do a simple flow as I did above and send it to your dev team so they know what you want to achieve.
In Mixpanel, build a simple funnel report using “event” or steps that your engineers will be sending into it from Segment or Google Tag Manager. If you're not sure how to do this, you can ask somebody technical on your team, and I am sure they'll be able to assist you.
In the end, you should end up with something like this:
Now it's important to identify the drop-offs. All you have to do at this point is to look at conversion rates from step to step and see if they hit the benchmarks. The benchmarks vary from industry to industry, so simple googling will help you on that one, but generally speaking in SaaS industry these are the numbers you should try to hit:
Website to signup form: 14%
Signup form to MQL: 38%
Signup to conversion rate: 30%
Pick the one step which is far beyond the benchmark level, and this is what you're going to be optimizing now!
Step 3 - Set Up Hotjar Surveys & Watch Recordings + Heatmaps
Well, user analytics doesn't stop at qualitative insights only. Sure, it can tell you where the drop-off is happening, but it will never tell you why it is so. If your conversion rate is too far off, sometimes it's enough to go through this step of the funnel yourself and try to put yourself in your customer's shoes, but because of your status quo, it may not work. You need the real customer insights to understand why this step is causing troubles.
And generally speaking, there are 2 easy ways of how you can do this, using Hotjar and talking to customers. Let's focus on Hotjar for now.
Well, first of all, Hotjar is not the only tool out there. Fullstory is another popular and reliable choice. I have used both, and Hotjar seemed to work best for smaller projects and just has features to get you what you need.
Install Hotjar on your website & product. Give it some time to gather data. Then sit and just watch the most relevant recordings. You can do it by using its ML capabilities to determine if the session has any insights or not.
Check the heatmaps and see if people are clicking somewhere they're not supposed to or skip some important sessions which you don't want them to skip. Basically, look at what draws users' attention and compare it to what you want them to read/do.
It will give some initial thoughts and ideas, but likely, you will want to see what they're thinking while doing something. None of the experience insights tools will allow you to do this. And here we come to the next step.
Step 4 - Conduct 3 Face-to-Face User Tests
Love it or hate it - talking to customers is an inevitable part of user analytics. There is always something you will never know till you ask your ICP, who's not aware of your product yet, to go through a specific experience firsthand. This is where the real magic happens.
There's a lot of theory related to this topic. Make sure to read Steve Krug's book: "Don't Make Me Think" if you'd like to get a good understanding of how to correctly conduct these interviews and why it is important. Maze has some great resource on this topic too.
Simply speaking, there are 2 ways of how you can conduct user tests.
Unmoderated testing: This is a more flexible option as contributors can complete their tests on their own time without you syncing with them in real-time. But it usually brings fewer results as you can't ask follow-up questions and dig deeper into some actions the user did.
Moderated testing: This is what you should be doing first. A direct, face-to-face zoom call with a person from your target audience.
The idea is simple. You create a simple set of tasks, including the one you pinpointed earlier as a high friction point, and just watch. All you can ask is "Why?". For example, if a user did something, you ask: "Why did you do it?" and then answer: "Thanks, that's helpful".
No: "Do you like it?" or "How cool is this?" kind of questions. Try to be as unbiased as possible.
Start from a homepage before giving the first task. Make no introduction about the company whatsoever. Explain the scenario in which you want the participant to imagine having a need/problem and using YOUR service to fix it. Then ask your first task, for example: "How much would it cost you to do it using <your product name>?
A very good first question to ask when you show the user the homepage is: "What do you make of it?". You will uncover a lot of interesting things here, trust me.
Now, make sure to record everything (ask for consent first) and don't spend time taking notes. Also, what worked really well for me is to jot down my thoughts right after the interview, as trust me, you will have a lot.
Step 5 - Brainstorm and Pick Top 5 Ideas to Address Issues
Now, this step can be self-explanatory, as you will find it while following the process, as your head will be full of ideas after all of the tests. But there are 2 steps people usually miss that I think are very important:
Set up a meeting with other interested people to get their feedback. I found having different perspectives very helpful. We can be blind to the obvious, and we are also blind to our blindness. So make sure to share your recordings and observations with others and see what they have to say. It will also build up trust and loyalty for user testing, so you will end up having more supporters.
Brainstorm a list of ideas. Basically, what you want to do here is to think out of the box and create hypotheses for the problems you pointed out during the user tests. As you get the list of ideas, have the top 5 ones circled. Those should be your top priority. I like this prioritization method by Warren Buffet and I think it fits in nicely here.
Step 6 - Implement the Ideas in A/B Test
As you got the ideas, it's very important to measure if they actually do work. It's always tempting to skip this step, relying on your gut feeling. Do not. Not only you will be only guessing that the new version is better until some time passes and you will be able to compare it with the old conversion rates, risking to hurt the existing experience even more. But also even if your hypotheses are correct, you will never know the extent of it and it will be hard to come up with a business case of this user analytics experiment.
There's one thing though to keep in mind. A/B tests work only if you have a sufficient number of conversions happening into the experiment. You can determine sufficiency by using this calculator. As a rule of thumb, if your experiment has fewer than 200 conversions a month, A/B testing will not work for you.
By conversion, I mean, let's say your hypothesis is to improve the signup form. You can look up in Mixpanel how many signups you have per month. If it's more than 200, then you're good to go with a quantitative way of proving your hypotheses with A/B test.
If it's less, then your best bet would be to repeat a user tests cycle with new participants and see if they experience the issues you were trying to fix. This is a more qualitative way of validating your ideas and also works well.
But A/B testing shouldn't be applied only to design or product experience. If you're in sales, you can A/B test the strategies aiming to help your customers along the way to see what resonates the best. It's another thing I like about user analytics. The insights you get can be applied in a variety of ways. So it's important to view user analytics as something that every team in your company can be using.
A/B Testing Tools to choose
So if you have enough conversions to run A/B tests now, it's time to pick the right tool.
VWO is probably the best for this use case if your test is website-related and if you're not running 20 A/B tests a month or more. If you're planning to integrate experimentation and user analytics into your culture and set up a dedicated team running experiments and improving the conversion rates, something we ended up doing, eventually it will make more sense to transition to Optimizely.
For product-related A/B tests, we used Launchydarkly that allows you to deploy the variation to some portion of your new customers. But you'd better consult with your engineering team and ask for their advice on what is the best way they can accomplish what you need.
Before you rush into putting your ideas to the test, it's important to aggregate them by semantics. In other words, one A/B test can represent multiple changes to the user's flow done around the SAME hypothesis. If your top 5 ideas are all based on completely different core assumptions, you will need to set up 5 different, consecutive A/B tests (never all at once) to see which assumption is correct.
As you decide on the A/B testing tool, ask your engineers to implement it correctly. This includes:
Making sure it works and doesn't flicker (when users can see a different version for a second).
Making sure the tracking is set up and integration with Mixpanel is enabled. While all of these A/B testing tools have in-built reporting, it's far from what Mixpanel or any other dedicated tools have to offer, so make sure you use VWO only to set up your test and analyze everything in Mixpanel.
Making sure to implement an A/B test and not an A/B/n test (a multivariate test).
Another tip is to always test for full business cycles (usually one week) to catch all of the people's behavior throughout the week. Meaning if you start on Monday, you should finish your test on Monday only. And make sure to calculate in advance for how long you will need to run the test, by comparing the MDE (from the calculator) with the actual test performance.
This should be enough to get you started. I covered the main things, but by no means all of them. Experimentation is an art that has so many nuances, these guys wrote 303 articles only around it. So if you want to learn more about it, they are the best.
Step 7 - Analyze The Results and Pick The Winner
When it comes to analyzing your test results, you will be using the same calculator as you used to determine if you have a sufficient number of conversions to run the test.
As you input your data, it will check for the conversion rate and whether you have a statistically significant winner or loser. There is also a nice Bayesian chart that I found helpful to use for reporting purposes.
It's important to understand here that the worst possible outcome is inconclusive results. Basically, when there is no statistical difference in conversion rates. Failures are always good as well as winners as now you discovered what is really important for users.
If you can't learn from the failure, you've designed a bad test. Next time you design, imagine all your stuff failing. What would you do?
When you see inconclusive or bad results, but your hypothesis is well research-driven - it's always a good idea to check segments (another reason to use analytics tools for analysis). Often different segments cancel each other.
If you didn't move the needle with any segment - and still confident about your hypothesis, the implementation of your idea could be lousy. So you can run 2-3 more tests with different execution of the same hypothesis to see if any of them reach significance.
If you run 2-3 more tests and still get lousy results - your hypothesis is probably bad. If you couldn't learn from all these tests anything - it's even worse.
So half of the analysis is basically done around what caused the effect. Dive into segments, talk to other stakeholders, run more tests around it to prove or disregard it. The most important aspect of A/B testing is learning something about your customers. So you can then delight them with exceptional experiences and in return, a high-converting funnel that every business is dreaming of.
As you wrap everything up - share the results and learnings with others in the company. Make it as easy as possible to understand. You can ask your UI designer to create a nice-looking presentation. Here's a structure I use:
Reasons and hypotheses
Setup of the test (duration, segmentation, and KPIs)
Results for each main KPI
Business case
Segmentation and other useful insights
Learnings & Recommendations
Always Learn, Always Iterate
The magic of user analytics happens when you view it as an ongoing journey, not a one-off task. If you look at your funnel, every step of it contains an experience for your customers, and you want to make sure that it's as smooth as possible.
And remember, perfection always comes with iteration. You can't just set it and forget it. It's extremely important to regularly revisit and update your strategies based on new learning and insights.
Follow the steps in this article as a part of your user analytics optimization cycle that you can repeat as many times as needed. Or even better, loop them into your monthly or quarterly routine and slowly but steadily create a user-centric data-driven environment rooted in your company culture.
In The End, It's All About Users Experience
After diving deep into how user analytics can significantly boost your conversion rates, it's crucial to understand the multifaceted role of user data in shaping an optimal customer experience.
User experience goes beyond just the surface-level interactions - it encompasses understanding user behavior, journey, and engagement at every touchpoint. By utilizing the right user analytics tool, such as Google Analytics or Amplitude, you can gather a wealth of user data that illuminates the path to refining product design and enhancing user engagement.
The collection and analysis of usage data, through tools like user analytics software, enable us to map out the user journey in detail. This journey analysis offers insights into various user personas, revealing different types of users and their behaviors. Such data is invaluable, not just for tailoring the user experience but also for driving customer retention and optimizing the customer experience.
A comprehensive user analysis strategy involves both quantitative data and qualitative insights from user interviews and survey data. These insights help in understanding the nuances of customer behavior and user needs.
With the right analytics software, tracking key metrics becomes easier, allowing for a more targeted approach to product analytics. This, in turn, informs the development of new features, making the app or website more responsive to user needs.
Always Make Use Of The Insights Learnt
Incorporating user behavior data into your product design process ensures that new users find value in your app from their very first interaction. User behavior analytics, combined with user engagement metrics, guide the development of features that reduce friction and improve usability. The goal is to create an interface that resonates with different user personas, enhancing the overall user experience and driving active user engagement.
Moreover, user analytics data offers a clear view of different traffic sources, including mobile devices and various browsers, enabling you to adapt your product for a variety of user contexts. This level of user analysis and data collection allows for more effective customer experience strategies, focusing on what users genuinely need and how they interact with your product across different platforms.
Choosing the best tools for user analytics involves considering a variety of metrics and functionalities, from dashboards and notifications to AI-powered insights.
Tools like Google Analytics provide a broad overview of user engagement, while specialized software like Amplitude offers deeper insights into user behavior and retention metrics. These analytics tools are really important for marketers aiming to understand customer behavior in less time, with dashboards that highlight key metrics like recency, frequency, and engagement patterns.
By employing a user analysis process that includes both data analytics and user research, companies can discover new ways to engage users. Whether it's through personalized email campaigns, social media notifications, or in-app messages, the aim is to meet users where they are and address their needs directly. This approach not only enhances the user journey but also plays a critical role in customer retention and engagement.
User analytics tools, therefore, are not just about collecting data; they're about making that data actionable. By understanding user needs, behaviors, and journeys through a comprehensive analytics strategy, businesses can design better products, improve user engagement, and ultimately retain more customers.
This journey from data collection to user analysis to product improvement is what turns good apps into great ones, ensuring that new features meet user needs and that every update leads to higher engagement and retention rates.
In conclusion, embracing user analytics means embracing a data-driven approach to growth. It's about using a variety of metrics and insights to understand and improve the user experience, engagement, and journey. With the right tools and strategies, you can unlock new growth opportunities, enhance customer retention, and ensure that your product not only meets but exceeds user expectations.
How do you apply user analytics to conversion rates optimization?
User analytics was a key factor that helped us convert 2x more visitors into customers.
But why did we couldn't understand where to focus on for months. Nothing worked.
Sometimes you might feel like you're stuck because you're not getting enough new customers. You might tell your sales team to up their calls and throw more money into marketing, but it feels like you're just running in place.
If you don't have more cash to spend but everyone's looking for growth, user analytics is your friend.
Follow the steps I talked about in this article and I guarantee you'll uncover what's holding you back.
We'll show you how fixing those things first will allow you to release the brake before pressing the gas pedal and achieve even better results than we did.
Let's dive in!
Step 1 - Setup Tracking Using Segment & Mixpanel
The #1 thing you need to kick off user analytics is to start tracking your users' behavior along the funnel journey. Realistically, you can't really do anything if you don't have this in place. And don't worry, tracking doesn't mean spying on users or undermining their privacy. It's rather an anonymous way to identify patterns in your users' flow and pinpoint the issues with it so you can then improve the user experience.
There are different ways you can deal with it, and I won't dive into too many details as I understand that the majority of our readers are not technical. But ultimately, it boils down to 2 tools: what you use to collect first-party customer data and how you analyze it. And there are both free and paid versions of each which I will cover below:
Tools to collect data
There are only 2 tools that you realistically may ever need when it comes to tracking:
Segment - the most reliable option that I have the best experience with. It's also scalable and requires a minimum time investment compared to Google Tag Manager, which is free and considered the most popular one.
Google Tag Manager - has the most flexibility, is free, and has a lot of documentation. You can find dozens of content online as GTM is something you usually need no matter if you use Segment or not. The problem is it's not scalable, it's heavy on your website speed, and at some point just becomes too much to maintain.
Talk to your developers; most likely, you already have something in place. If not, I suggest going with Segment as it's hassle-free and always makes sure your data is coming to the right destination. And speaking of which…
Analytics tools
I've tested 15 analytics tools, and to make it simple for you, I will talk only about #1 paid and #1 free tool that I think are worth your time.
Mixpanel - covers 90% of user analytics and has extremely accurate results when your tracking is implemented correctly. Historically, it was considered that Mixpanel (and its alternatives) are the best for web app/product analytics, and website analytics are better left to Google Analytics. But with all of the recent privacy issues with Google Analytics, the inaccuracy of its last version (Google Analytics 4), and new pricing updates on Mixpanel's side, it makes a lot of sense to just rely on Mixpanel for every step of your funnel and save you a headache from using Google's product.
Google Analytics 4 - Yes, it still works. It's free, and there are no competitors to it that have the same functionality (aside from Matomo, but it's really technical) and flexibility. So if you're fine with lots of skewed data and an extremely inconvenient interface, aside from the privacy of your users that are now all under Google's hood, it's a good (and only free) version that you can use for user analytics.
While both options are solid and a lot of people use Google Analytics, in the next steps, I'll be using Mixpanel as this is what we did and worked well for us. I am sure you can emulate this in Google Analytics or any other software you're using to measure your growth metrics, be it Amplitude, Heaps, or Kissmetrics. The process remains the same.
Step 2 - Build a Funnel Report and Find Key Drop-Off Points
The funnel report is basic but extremely helpful. It's a great way to enable the power of user analytics. To create one, all you need is to visualize the journey of your users from the very first touchpoint to all of the purchases.
It's also important to break your funnel down by acquisition channels. Very often, you will see completely different behavior and purchasing power when you compare your ads, for example, with cold calling. So when building the report, make sure to analyze all of these segments.
Here's how our funnel looked like:
Landing Page Visit → Signup button click → First name fill in → Email fill in → Signup happened → Key Action X Happened → Plan Purchase
The idea is to be as specific as possible. We had around 25 steps in the funnel; I listed here only the main ones. To create a funnel like above, you will need:
Developers to implement the tracking. Do a simple flow as I did above and send it to your dev team so they know what you want to achieve.
In Mixpanel, build a simple funnel report using “event” or steps that your engineers will be sending into it from Segment or Google Tag Manager. If you're not sure how to do this, you can ask somebody technical on your team, and I am sure they'll be able to assist you.
In the end, you should end up with something like this:
Now it's important to identify the drop-offs. All you have to do at this point is to look at conversion rates from step to step and see if they hit the benchmarks. The benchmarks vary from industry to industry, so simple googling will help you on that one, but generally speaking in SaaS industry these are the numbers you should try to hit:
Website to signup form: 14%
Signup form to MQL: 38%
Signup to conversion rate: 30%
Pick the one step which is far beyond the benchmark level, and this is what you're going to be optimizing now!
Step 3 - Set Up Hotjar Surveys & Watch Recordings + Heatmaps
Well, user analytics doesn't stop at qualitative insights only. Sure, it can tell you where the drop-off is happening, but it will never tell you why it is so. If your conversion rate is too far off, sometimes it's enough to go through this step of the funnel yourself and try to put yourself in your customer's shoes, but because of your status quo, it may not work. You need the real customer insights to understand why this step is causing troubles.
And generally speaking, there are 2 easy ways of how you can do this, using Hotjar and talking to customers. Let's focus on Hotjar for now.
Well, first of all, Hotjar is not the only tool out there. Fullstory is another popular and reliable choice. I have used both, and Hotjar seemed to work best for smaller projects and just has features to get you what you need.
Install Hotjar on your website & product. Give it some time to gather data. Then sit and just watch the most relevant recordings. You can do it by using its ML capabilities to determine if the session has any insights or not.
Check the heatmaps and see if people are clicking somewhere they're not supposed to or skip some important sessions which you don't want them to skip. Basically, look at what draws users' attention and compare it to what you want them to read/do.
It will give some initial thoughts and ideas, but likely, you will want to see what they're thinking while doing something. None of the experience insights tools will allow you to do this. And here we come to the next step.
Step 4 - Conduct 3 Face-to-Face User Tests
Love it or hate it - talking to customers is an inevitable part of user analytics. There is always something you will never know till you ask your ICP, who's not aware of your product yet, to go through a specific experience firsthand. This is where the real magic happens.
There's a lot of theory related to this topic. Make sure to read Steve Krug's book: "Don't Make Me Think" if you'd like to get a good understanding of how to correctly conduct these interviews and why it is important. Maze has some great resource on this topic too.
Simply speaking, there are 2 ways of how you can conduct user tests.
Unmoderated testing: This is a more flexible option as contributors can complete their tests on their own time without you syncing with them in real-time. But it usually brings fewer results as you can't ask follow-up questions and dig deeper into some actions the user did.
Moderated testing: This is what you should be doing first. A direct, face-to-face zoom call with a person from your target audience.
The idea is simple. You create a simple set of tasks, including the one you pinpointed earlier as a high friction point, and just watch. All you can ask is "Why?". For example, if a user did something, you ask: "Why did you do it?" and then answer: "Thanks, that's helpful".
No: "Do you like it?" or "How cool is this?" kind of questions. Try to be as unbiased as possible.
Start from a homepage before giving the first task. Make no introduction about the company whatsoever. Explain the scenario in which you want the participant to imagine having a need/problem and using YOUR service to fix it. Then ask your first task, for example: "How much would it cost you to do it using <your product name>?
A very good first question to ask when you show the user the homepage is: "What do you make of it?". You will uncover a lot of interesting things here, trust me.
Now, make sure to record everything (ask for consent first) and don't spend time taking notes. Also, what worked really well for me is to jot down my thoughts right after the interview, as trust me, you will have a lot.
Step 5 - Brainstorm and Pick Top 5 Ideas to Address Issues
Now, this step can be self-explanatory, as you will find it while following the process, as your head will be full of ideas after all of the tests. But there are 2 steps people usually miss that I think are very important:
Set up a meeting with other interested people to get their feedback. I found having different perspectives very helpful. We can be blind to the obvious, and we are also blind to our blindness. So make sure to share your recordings and observations with others and see what they have to say. It will also build up trust and loyalty for user testing, so you will end up having more supporters.
Brainstorm a list of ideas. Basically, what you want to do here is to think out of the box and create hypotheses for the problems you pointed out during the user tests. As you get the list of ideas, have the top 5 ones circled. Those should be your top priority. I like this prioritization method by Warren Buffet and I think it fits in nicely here.
Step 6 - Implement the Ideas in A/B Test
As you got the ideas, it's very important to measure if they actually do work. It's always tempting to skip this step, relying on your gut feeling. Do not. Not only you will be only guessing that the new version is better until some time passes and you will be able to compare it with the old conversion rates, risking to hurt the existing experience even more. But also even if your hypotheses are correct, you will never know the extent of it and it will be hard to come up with a business case of this user analytics experiment.
There's one thing though to keep in mind. A/B tests work only if you have a sufficient number of conversions happening into the experiment. You can determine sufficiency by using this calculator. As a rule of thumb, if your experiment has fewer than 200 conversions a month, A/B testing will not work for you.
By conversion, I mean, let's say your hypothesis is to improve the signup form. You can look up in Mixpanel how many signups you have per month. If it's more than 200, then you're good to go with a quantitative way of proving your hypotheses with A/B test.
If it's less, then your best bet would be to repeat a user tests cycle with new participants and see if they experience the issues you were trying to fix. This is a more qualitative way of validating your ideas and also works well.
But A/B testing shouldn't be applied only to design or product experience. If you're in sales, you can A/B test the strategies aiming to help your customers along the way to see what resonates the best. It's another thing I like about user analytics. The insights you get can be applied in a variety of ways. So it's important to view user analytics as something that every team in your company can be using.
A/B Testing Tools to choose
So if you have enough conversions to run A/B tests now, it's time to pick the right tool.
VWO is probably the best for this use case if your test is website-related and you're not running 20 or more A/B tests a month. If you're planning to build experimentation and user analytics into your culture and set up a dedicated team that runs experiments and improves conversion rates (something we ended up doing), it will eventually make more sense to transition to Optimizely.
For product-related A/B tests, we used LaunchDarkly, which lets you deploy a variation to a portion of your new customers. That said, consult your engineering team and ask their advice on the best way to accomplish what you need.
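To make the idea of a partial rollout concrete, here's an illustrative sketch of deterministic bucketing: the same user always lands in the same variant. This is not LaunchDarkly's actual API (a feature-flag SDK handles this for you); the function and experiment names are hypothetical.

```python
# Illustrative only: deterministically assign a user to a variant so the same
# person always sees the same experience across sessions.
import hashlib

def assign_variant(user_id: str, experiment: str, rollout_pct: int = 50) -> str:
    """Bucket a user into 'control' or 'variation' based on a stable hash."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100          # 0-99, roughly uniform across users
    return "variation" if bucket < rollout_pct else "control"

# Example: roll the new onboarding flow out to 25% of new customers.
print(assign_variant("user_42", "new-onboarding", rollout_pct=25))
```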
Before you rush into putting your ideas to the test, it's important to group them by the hypothesis they share. In other words, one A/B test can bundle multiple changes to the user's flow, as long as they are built around the SAME hypothesis. If your top 5 ideas are based on completely different core assumptions, you will need to set up 5 separate, consecutive A/B tests (never all at once) to see which assumption is correct.
Once you've decided on the A/B testing tool, ask your engineers to implement it correctly. This includes:
Making sure it works and doesn't flicker (when users briefly see one version before the other loads).
Making sure the tracking is set up and the integration with Mixpanel is enabled (see the sketch after this list). While all of these A/B testing tools have built-in reporting, it's far from what Mixpanel or other dedicated tools offer, so use VWO only to set up your test and analyze everything in Mixpanel.
Making sure to implement a plain A/B test rather than an A/B/n or multivariate test (more than two variants at once).
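To show what the Mixpanel side of that checklist can look like, here's a minimal server-side sketch using Mixpanel's Python library. The event and property names ("Experiment Started", "experiment", "variant") are conventions I'm assuming for illustration, not anything Mixpanel mandates.

```python
# Minimal sketch: send the experiment exposure to Mixpanel so every funnel
# report can be broken down by variant. Requires `pip install mixpanel`.
from mixpanel import Mixpanel

mp = Mixpanel("YOUR_PROJECT_TOKEN")  # placeholder project token

def track_exposure(user_id: str, experiment: str, variant: str) -> None:
    # The event and property names below are just a convention, not a Mixpanel default.
    mp.track(user_id, "Experiment Started", {
        "experiment": experiment,
        "variant": variant,
    })

track_exposure("user_42", "new-onboarding", "variation")
```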
Another tip: always test in full business cycles (usually one week) to capture how people behave across the whole week. That means if you start on a Monday, you should also end the test on a Monday. And calculate in advance how long you will need to run the test by comparing the MDE (from the calculator) with the traffic and conversions the test will actually receive.
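If you want that duration estimate up front, the arithmetic is simple: divide the total visitors you need by your weekly traffic and round up to full weeks. The numbers below are placeholders.

```python
# Back-of-the-envelope test duration, rounded up to full business cycles (weeks).
from math import ceil

n_per_variant = 10_000       # from the sample-size sketch above (placeholder)
num_variants = 2             # control + one variation
weekly_visitors = 6_000      # weekly traffic hitting the tested step (placeholder)

weeks = ceil(n_per_variant * num_variants / weekly_visitors)
print(f"Run the test for {weeks} full weeks")  # here: 4 weeks
```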
This should be enough to get you started. I covered the main things, but by no means all of them. Experimentation is an art with so many nuances that these guys wrote 303 articles on it alone. So if you want to learn more, they are the best.
Step 7 - Analyze The Results and Pick The Winner
When it comes to analyzing your test results, you will use the same calculator you used to determine whether you had enough conversions to run the test.
As you input your data, it will calculate the conversion rates and tell you whether you have a statistically significant winner or loser. There is also a nice Bayesian chart that I found helpful for reporting purposes.
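If you'd rather double-check the numbers in code, here's a minimal sketch of a standard two-proportion z-test on the same kind of data. The conversion and visitor counts are placeholders, and your calculator may use a different (e.g. Bayesian) method under the hood.

```python
# Quick significance check on A/B results (placeholder numbers).
# Requires `pip install statsmodels`.
from statsmodels.stats.proportion import proportions_ztest

conversions = [380, 445]       # control, variation
visitors = [10_000, 10_000]

stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"control: {conversions[0] / visitors[0]:.2%}, "
      f"variation: {conversions[1] / visitors[1]:.2%}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Statistically significant difference")
else:
    print("Inconclusive - iterate on the execution or the hypothesis")
```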
It's important to understand that the worst possible outcome is an inconclusive result, i.e. when there is no statistically significant difference in conversion rates. Failures are just as valuable as winners, because either way you've discovered what really matters to users.
If you can't learn from a failure, you've designed a bad test. Next time you design one, imagine every variant failing: what would you do then?
When you see inconclusive or negative results but your hypothesis is well grounded in research, it's always a good idea to check segments (another reason to use an analytics tool for the analysis). Often different segments cancel each other out.
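One lightweight way to do that segment check is to export the experiment data and break the conversion rate down by segment, for example with pandas. The column names and numbers below are hypothetical, but they show the classic pattern of segments cancelling out.

```python
# Hypothetical segment breakdown of exported A/B test data using pandas.
import pandas as pd

df = pd.DataFrame({
    "device":    ["desktop", "desktop", "mobile", "mobile"],
    "variant":   ["control", "variation", "control", "variation"],
    "visitors":  [5_000, 5_000, 5_000, 5_000],
    "converted": [190, 260, 215, 150],
})

rates = df.set_index(["device", "variant"])
rates["cvr"] = rates["converted"] / rates["visitors"]
print(rates["cvr"])
# Overall the test looks flat (405 vs 410 conversions), but desktop improved
# while mobile regressed - the two segments cancel each other out.
```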
If you didn't move the needle in any segment but are still confident in your hypothesis, the implementation of your idea could be the weak point. Run 2-3 more tests with a different execution of the same hypothesis and see if any of them reach significance.
If you run those 2-3 extra tests and still get poor results, your hypothesis is probably wrong. And if you couldn't learn anything from all of these tests, that's even worse.
So half of the analysis is really about understanding what caused the effect. Dive into segments, talk to other stakeholders, and run more tests to confirm or rule it out. The most important aspect of A/B testing is learning something about your customers, so you can then delight them with exceptional experiences and, in return, get the high-converting funnel every business dreams of.
As you wrap everything up, share the results and learnings with others in the company. Make them as easy as possible to understand; you can ask your UI designer to create a nice-looking presentation. Here's the structure I use:
Reasons and hypotheses
Setup of the test (duration, segmentation, and KPIs)
Results for each main KPI
Business case
Segmentation and other useful insights
Learnings & Recommendations
Always Learn, Always Iterate
The magic of user analytics happens when you view it as an ongoing journey, not a one-off task. If you look at your funnel, every step of it contains an experience for your customers, and you want to make sure that it's as smooth as possible.
And remember, perfection comes with iteration. You can't just set it and forget it. It's extremely important to regularly revisit and update your strategies based on new learnings and insights.
Follow the steps in this article as a part of your user analytics optimization cycle that you can repeat as many times as needed. Or even better, loop them into your monthly or quarterly routine and slowly but steadily create a user-centric data-driven environment rooted in your company culture.
In The End, It's All About User Experience
After diving deep into how user analytics can significantly boost your conversion rates, it's crucial to understand the multifaceted role of user data in shaping an optimal customer experience.
User experience goes beyond just the surface-level interactions - it encompasses understanding user behavior, journey, and engagement at every touchpoint. By utilizing the right user analytics tool, such as Google Analytics or Amplitude, you can gather a wealth of user data that illuminates the path to refining product design and enhancing user engagement.
The collection and analysis of usage data, through tools like user analytics software, enable us to map out the user journey in detail. This journey analysis offers insights into various user personas, revealing different types of users and their behaviors. Such data is invaluable, not just for tailoring the user experience but also for driving customer retention and optimizing the customer experience.
A comprehensive user analysis strategy involves both quantitative data and qualitative insights from user interviews and survey data. These insights help in understanding the nuances of customer behavior and user needs.
With the right analytics software, tracking key metrics becomes easier, allowing for a more targeted approach to product analytics. This, in turn, informs the development of new features, making the app or website more responsive to user needs.
Always Make Use Of The Insights You Learn
Incorporating user behavior data into your product design process ensures that new users find value in your app from their very first interaction. User behavior analytics, combined with user engagement metrics, guide the development of features that reduce friction and improve usability. The goal is to create an interface that resonates with different user personas, enhancing the overall user experience and driving active user engagement.
Moreover, user analytics data offers a clear view of different traffic sources, including mobile devices and various browsers, enabling you to adapt your product for a variety of user contexts. This level of user analysis and data collection allows for more effective customer experience strategies, focusing on what users genuinely need and how they interact with your product across different platforms.
Choosing the best tools for user analytics involves considering a variety of metrics and functionalities, from dashboards and notifications to AI-powered insights.
Tools like Google Analytics provide a broad overview of user engagement, while specialized software like Amplitude offers deeper insights into user behavior and retention metrics. These analytics tools are really important for marketers aiming to understand customer behavior in less time, with dashboards that highlight key metrics like recency, frequency, and engagement patterns.
By employing a user analysis process that includes both data analytics and user research, companies can discover new ways to engage users. Whether it's through personalized email campaigns, social media notifications, or in-app messages, the aim is to meet users where they are and address their needs directly. This approach not only enhances the user journey but also plays a critical role in customer retention and engagement.
User analytics tools, therefore, are not just about collecting data; they're about making that data actionable. By understanding user needs, behaviors, and journeys through a comprehensive analytics strategy, businesses can design better products, improve user engagement, and ultimately retain more customers.
This journey from data collection to user analysis to product improvement is what turns good apps into great ones, ensuring that new features meet user needs and that every update leads to higher engagement and retention rates.
In conclusion, embracing user analytics means embracing a data-driven approach to growth. It's about using a variety of metrics and insights to understand and improve the user experience, engagement, and journey. With the right tools and strategies, you can unlock new growth opportunities, enhance customer retention, and ensure that your product not only meets but exceeds user expectations.