This is part 2 of a series about A/B testing, and it really only makes sense if you read the posts in order. Click here to read part 1. In part 1, we decided to run an A/B test, formulated a clear question, and chose a metric to test. Today we will look at how to present what you found to your client so you can advise them on how to move forward.
OVERVIEW:
Return to your question
Remove outliers
What to present
How to present your findings
Return to your question
You might remember from part 1 that before we started any of our research, we posed a very specific question: what is the goal of this project? The goal is based on your client's business, so it can range from increased sales to greater awareness or a changed perception, but it must be focused. From this we set a metric to measure, for example how many people purchased a product. By setting this clear goal, we avoid confusing ourselves and the client with data that does not address it. Now, let's keep this logic in our presentation. We set out to measure metric x, so we should only present metric x. It might seem obvious, but the temptation to add more information can be strong.
Remove outliers
An outlier is an extreme result caused by something outside of your study or by a random event. Let's say your client visits an international conference partway through your test. On that particular day, sales spike for design B but not for design A. Looking at the sales, you notice they are all from Spain, since many of the conference visitors were Spanish. Should you remove this result? The best way to decide is to visualise the data. If you look at the image example below, we see a clear spike on day 6. Because it has a big impact on the overall result and contradicts the other days of the study, you can choose to remove this day from your study. Make sure to remove the data from both design A and design B.
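The step above can be sketched in a few lines of Python. The numbers here are hypothetical daily sales, invented purely to illustrate the idea; the important part is removing the same day from both designs so they stay comparable.

```python
# Hypothetical daily sales for a 10-day test. Day 6 (index 5) is the
# conference spike for design B that we decided to treat as an outlier.
sales_a = [10, 12, 11, 9, 13, 12, 10, 11, 12, 10]
sales_b = [5, 6, 4, 5, 6, 48, 5, 4, 6, 5]

outlier_day = 5  # zero-based index of day 6

# Remove the same day from BOTH designs, not just the one that spiked.
clean_a = [s for i, s in enumerate(sales_a) if i != outlier_day]
clean_b = [s for i, s in enumerate(sales_b) if i != outlier_day]

print(len(clean_a), len(clean_b))  # both lists now cover the same 9 days
```

Dropping the day from only one design would quietly bias the comparison, which is why both lists are filtered with the same index.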
What to present
Your presentation structure should take your client through the logic of the test. Start by recapping your goal, the metric you used and the different design options you tested. Make sure to remind them of the difference between the designs and why you chose to test this aspect.
Secondly, explain how the test was run. How many people participated in the test? How did you make sure they were the right target audience? All this information will show the client that your results are trustworthy and it will help establish you as an expert.
Then move on to visualising the results. Keep this as simple as possible and talk your client through what the data means.
How to present your findings
Depending on the level of detail of your study, you can present your data in a few different ways. You always want to include the number of people who participated in your study.
If you chose to only look at the percentage difference between your designs, show the result for each design, the percentage difference and the goal you wanted to achieve. For example, say you wanted to see at least a 10% difference between the designs. Design A got 60% of sales and design B got 40%, a 20-point difference that clears your goal. The image below shows an example of a clear way to present this.
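The percentage comparison above boils down to a little arithmetic. Here is a minimal sketch, assuming a hypothetical 100 tracked purchases split 60/40 between the designs, with the 10% goal from the example:

```python
# Hypothetical totals: out of 100 tracked purchases, 60 came through
# design A and 40 through design B.
purchases_a = 60
purchases_b = 40
total = purchases_a + purchases_b

share_a = 100 * purchases_a / total   # 60.0
share_b = 100 * purchases_b / total   # 40.0
difference = abs(share_a - share_b)   # 20.0 percentage points

goal = 10  # minimum difference we set in advance, in percentage points
print(f"A: {share_a:.0f}%, B: {share_b:.0f}%, difference: {difference:.0f} points")
print("Goal met" if difference >= goal else "Goal not met")
```

Keeping the goal as a named value in the calculation makes it easy to show the client exactly what threshold the result was judged against.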
If you decided to go a little deeper and run a t-test, you can present how confident you are in your results. Sticking with our example of sales, you would present the average number of sales per day for design A and design B. You would then present your "confidence interval". The confidence interval is given to you when you run a t-test. It looks at how much your data varies from day to day and shows a spread around your average: the more your data varies, the wider the confidence interval. In the image below, we can see that design A has outperformed design B with an average of 11 daily sales compared to 5. The key thing to look out for is whether the confidence intervals overlap. In this case they do not, which means your designs did perform differently.
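The overlap check can be sketched as follows. The daily numbers are invented so the averages land near the 11-versus-5 example above, and the interval uses the normal critical value 1.96 as a rough stand-in for the exact t-based interval your statistics software would report:

```python
import math
import statistics

# Hypothetical daily sales over a 9-day test, chosen so the averages
# come out near the article's 11 vs 5 example.
daily_a = [10, 12, 11, 9, 13, 12, 10, 11, 11]
daily_b = [5, 6, 4, 5, 6, 5, 5, 4, 5]

def approx_ci(data, z=1.96):
    """Mean plus an approximate 95% interval (normal critical value)."""
    mean = statistics.mean(data)
    margin = z * statistics.stdev(data) / math.sqrt(len(data))
    return mean - margin, mean + margin

low_a, high_a = approx_ci(daily_a)
low_b, high_b = approx_ci(daily_b)

# If the intervals do not overlap, the designs very likely performed
# differently; if they overlap, the difference may just be noise.
overlap = low_a <= high_b and low_b <= high_a
print(f"A: [{low_a:.1f}, {high_a:.1f}]  B: [{low_b:.1f}, {high_b:.1f}]")
print("Intervals overlap" if overlap else "No overlap: designs differ")
```

With these numbers the two intervals sit well apart, which mirrors the non-overlapping bars described in the example.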
Most software, like the tools mentioned in part 1, visualises the data for you. You can use this as is; just make sure to keep the key information that answers your question and remove anything that could confuse the results.
Good luck testing and if you have a great tip we missed, join our Facebook community and spread the wisdom!