Lifecycle A/B Testing
On this page, we will explain what it means to A/B test your Lifecycles, why you should consider implementing it, and how to get started.
🤷‍♂️ What is A/B testing for Lifecycles?
A/B testing, also known as split testing, is a way to compare different versions of a Lifecycle by testing your players' response to version "A" against version "B", and determining which of the two is more effective by looking at the collected conversion data.
In short: performing A/B testing allows you to compare the success of different versions of a Lifecycle.
🤔 Why should you perform A/B testing for your Lifecycle versions?
We believe there's always room for improvement, and by not making use of A/B testing you're missing out on valuable information that can help you improve your player engagement.
Performing A/B testing for your Lifecycles gives you the chance to make educated decisions on how to evolve your player engagement for the better.
⚙ How does it work?
When you have two or more versions of a Lifecycle, you can run them simultaneously by allowing the system to filter players into all active versions. You simply decide the percentage split between the versions.
Players entering the Lifecycle are then distributed between the versions according to the percentage split you've set.
👩‍🏫 Example:
Assuming you have 2 versions you'd like to assign to your Lifecycle:
Version 1: 50%
Version 1.1: 50%
When a player enters the Lifecycle (upon fulfilling the Lifecycle's entry conditions), that player has a 50% chance of joining Version 1 and a 50% chance of joining Version 1.1.
In addition, the system assigns players to each version at random. This means that if 100 players enter the Lifecycle, the split will not necessarily be exactly 50 players in each version. For the first 100 players, the split could instead be 55 players in Version 1 and 45 players in Version 1.1.
🙋 Please note, following the example above:
Due to the random factor, there will never be a "perfect split" matching the percentage setup exactly. However, the more players that enter the Lifecycle, the more the split evens out and approaches the percentages you've set.
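To make this concrete, here is a minimal sketch of a weighted random assignment like the one described above. It's an illustration in Python, not the platform's actual implementation; the version names and split mirror the example. Running it shows the observed split wobbling at small player counts and settling toward the configured percentages as more players enter:

```python
import random
from collections import Counter

# Percentage split from the example above (names and numbers are illustrative).
SPLIT = {"Version 1": 50, "Version 1.1": 50}

def assign_version(split: dict[str, int]) -> str:
    """Pick a version for one entering player, weighted by the configured split."""
    versions = list(split)
    weights = list(split.values())
    return random.choices(versions, weights=weights, k=1)[0]

# The more players that enter, the closer the observed split gets
# to the configured percentages.
for n_players in (100, 1_000, 100_000):
    counts = Counter(assign_version(SPLIT) for _ in range(n_players))
    shares = {v: f"{100 * c / n_players:.1f}%" for v, c in counts.items()}
    print(n_players, shares)
```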
🛠 How do you implement it?
In order to perform A/B testing, you need at least two versions in your Lifecycle. Then, depending on whether or not you already have a Lifecycle version in production, the process of running additional versions differs slightly.
No Lifecycle version(s) in Prod
With no Lifecycle versions in prod and two versions in the READY stage, this is how and where you select the versions and push them live:
READY stage > Prepare for launch
By default, as you can see to the right in the image above, the percentage split is handled automatically and distributed evenly between the selected versions. However, you can always adjust these settings to your liking.
Lifecycle version(s) already in Prod
If you already have one or more Lifecycle versions in production and you'd like to push an additional version live, this is where to do it:
READY or PROD stage > Edit > Modify Entry Conditions
Continue with the default percentage split, or change it according to your needs, and then Update Configuration.
👩‍🔬 Analysing the Results
After you've run your A/B test for some time, and enough players have passed through the Lifecycle to provide sufficient data, it's time to analyse the results.
This is where you'll find valuable information about which Lifecycle version performed best in different areas.
You can easily access the conversion data from your Lifecycle Projects by hovering over the Lifecycle project you wish to analyse:
Here you will be presented with the conversion analytics for the Lifecycle versions included in your Lifecycle project.
You can see the Lifecycle Version Summary, followed by a number of graphs and other interesting data:
Please note: slightly different data will be displayed in the conversion analytics for your Lifecycle depending on what Lifecycle template you have used.
We've simply adjusted the conversion analytics to show the most relevant information for the chosen template.
🔢 Understanding the numbers
Going back to the Lifecycle Version Summary, this is where you can get the most valuable information, as you can quickly get a good understanding of the overall performance of your A/B testing:
The information found here is straightforward and largely self-explanatory.
Looking at the conversion data from Version 1 and Version 2 (the two Lifecycle versions used in the A/B test) in the example above, we can see that Version 2 performed much better overall:
- Version 2 had a higher conversion rate (= a larger share of the players who entered this Lifecycle version made their first deposit)
- The players in Version 2 had a higher first deposit average (= their first deposits were, on average, higher than those of the players in Version 1)
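As a rough illustration of what these two metrics mean, here is a sketch of how they could be derived from raw player records. The `Player` fields and the toy data are made up for the example; in practice the platform computes these figures for you:

```python
from dataclasses import dataclass

@dataclass
class Player:
    version: str                 # Lifecycle version the player was assigned to
    first_deposit: float | None  # None if the player never made a first deposit

def conversion_rate(players: list[Player]) -> float:
    """Share of entered players who made their first deposit."""
    converted = sum(1 for p in players if p.first_deposit is not None)
    return converted / len(players)

def first_deposit_average(players: list[Player]) -> float:
    """Average first-deposit amount among converted players only."""
    deposits = [p.first_deposit for p in players if p.first_deposit is not None]
    return sum(deposits) / len(deposits)

# Toy data: Version 2 converts more players, at higher amounts.
v1 = [Player("Version 1", 20.0), Player("Version 1", None), Player("Version 1", None)]
v2 = [Player("Version 2", 50.0), Player("Version 2", 30.0), Player("Version 2", None)]

for label, players in (("Version 1", v1), ("Version 2", v2)):
    print(label,
          f"conversion rate: {conversion_rate(players):.0%},",
          f"first deposit average: {first_deposit_average(players):.2f}")
```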
❌ How to remove A/B testing
If you've run an A/B test with Lifecycle versions, analysed the data, and are now ready to stop testing, this is how you stop a specific version:
READY or PROD stage > Edit > Modify Entry Conditions > De-select the version you want to stop running with > Verify and launch
This moves the de-selected version back into the READY stage. Please note that the version is only partially stopped at this point: new players are prevented from entering this Lifecycle version, but any players currently inside it will remain and continue to trigger any relevant Activities.
If you'd like to disable the Lifecycle version and all of its Activities, click on the version to open its page and DEACTIVATE it in the top right corner.
🔝 Best Practice
Here is one recommended way to make use of both A/B testing and Control Groups for your Lifecycle:
- 🛂 Implement a Control Group for Version 1 (your first version). After running Version 1 for some time, it's time to analyse the conversion data. The Control Group works as a benchmark and helps you assess the success of your first Lifecycle version. By analysing the conversion data collected from Version 1, in combination with the Control Group data, you'll most commonly find a couple of areas for improvement. These form the basis for creating your second version and running the A/B test.
- 🔎 Create Version 1.1 (your second version) based on your analysis of Version 1. Create Version 1.1 by cloning Version 1 and implementing the changes you believe will improve your player engagement, based on your analysis of Version 1. Run the A/B test for some time before analysing and comparing the results from Version 1 and Version 1.1.
- ✨ Three suggestions on what to do next. Based on the results of the A/B test, here are three options for what to do next:
  1. 🔁 Continue the A/B testing. If there's no significant difference between the versions so far (see the sketch after this list for one way to check), you can continue running the A/B test to collect more data and analyse again further down the line.
  2. ➕ A/B test with a different version. If one version clearly performed better than the other, you can stop running the version that performed worse and replace it with a new and improved version. This way, you continue to A/B test and evolve your Lifecycle in order to get the best possible version running.
  3. 🛂 Stop the A/B testing and implement a Control Group again. If one version clearly performed better than the other and you don't wish to continue A/B testing for the moment, you can simply disable the version that performed worse. Keep the best-performing Lifecycle version and implement a Control Group again.
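The notion of a "significant difference" in option 1 isn't defined by the summary itself. If you want a quick sanity check of your own, one common approach is a two-proportion z-test on the conversion counts of the two versions. This is a minimal, self-contained sketch; the counts are hypothetical and the 0.05 threshold is just convention:

```python
from statistics import NormalDist

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical counts: 120 of 1,000 players converted in Version 1,
# 155 of 1,000 in Version 1.1.
p = two_proportion_p_value(120, 1000, 155, 1000)
print(f"p-value: {p:.4f}")  # roughly 0.02 here; below 0.05 suggests the
                            # difference is unlikely to be pure chance
```

If the p-value stays high, that matches option 1 above: keep the test running and collect more data before deciding.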
If you haven't already, we recommend that you also read up on Lifecycle Control Groups: what it is, what the perks are, and how to set it up. Read it all here.