Why commitment is important when you do A/B testing
24 June 2020
A/B testing has become standard practice for many digital companies, but a lot of them struggle to find the set-up that suits their processes best. At our DL Summit 2019, data scientist Sade Snowden-Akintunde presented insights into her work at Codecademy and Etsy, where she was responsible for growth and product analytics.
Commitment to A/B-testing
One of her main talking points was commitment. “Once you have started an experiment, you have to go through with it,” she told Sebastian Waschnik, who hosted the fireside chat. That may sound obvious, but the temptation, or even the pressure, to stop an experiment that isn’t working is always there. If you run an A/B test with a specific user group, you might genuinely hurt their user experience and even lose them as customers in the process.

While that may hurt in the short term, it’s important to see the experiment through as planned, says Sade. Otherwise, you run the risk of skewing your data entirely. “Before we did our experiments, we ran a power calculation to determine how long the test needs to run,” Sade explains. “And then you have to stick to that, otherwise the likelihood of false results is a lot higher. At Codecademy, we only stopped testing when we saw our conversion and our money tank seriously.”
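The power calculation Sade mentions can be sketched in a few lines. This is an illustrative Python version, not Codecademy’s actual tooling: it estimates the per-variant sample size for a two-sided, two-proportion z-test, and dividing that by daily traffic gives the minimum run time the team then has to commit to.

```python
import math
from statistics import NormalDist

def required_sample_size(p_base, p_target, alpha=0.05, power=0.8):
    """Per-variant sample size for a two-sided two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, ~1.96 for alpha=0.05
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    effect = p_target - p_base
    return math.ceil((z_alpha + z_power) ** 2 * variance / effect ** 2)

# Example: detecting a lift from a 10% to a 12% conversion rate
# needs roughly 3,800 users per variant before stopping the test.
n = required_sample_size(0.10, 0.12)
print(n)
```

Stopping before the precomputed sample size is reached is exactly the “peeking” that inflates the false-positive rate Sade warns about.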
Pros and cons of having your own tooling
Another important lesson Sade shared is to always keep a holdout group of five to ten percent of users who are never exposed to any experiments or user tests. This is in addition to the normal control group within each individual test. That way, you always know how regular users interact with your product in its unmodified state.
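A long-lived holdout like this is typically implemented with deterministic, hash-based bucketing, so a user’s assignment stays stable across sessions and across experiments. A minimal sketch, where the salt and the five-percent share are illustrative assumptions rather than details from the talk:

```python
import hashlib

def in_global_holdout(user_id: str, holdout_pct: int = 5, salt: str = "holdout-v1") -> bool:
    """Deterministically place ~holdout_pct% of users in a never-experimented holdout.

    Hashing the salted user ID (instead of random assignment) means the same
    user always lands in the same bucket, without storing any state.
    """
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < holdout_pct

# Every experiment's assignment logic would first check this gate
# and skip holdout users entirely.
```

Changing the salt reshuffles the holdout, which is why it is usually kept fixed for long periods.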
One question many companies face is whether they should build their own testing platform or buy one on the ever-growing market for A/B testing tools. The benefit of owning your toolset is that it lets you customize how you run your experiments. On the other hand, it requires you to maintain the tooling yourself, says Sade. So, in the end, it is a matter of resources and how many of them you can allocate to an owned platform.
Which, essentially, is the basic problem with data science in general. So Sade’s final advice, especially for smaller companies, is to get your house in order before you even think about scaling your data infrastructure: “Have a database in SQL, know how to operate that, so that your data has integrity and makes sense.”
Working for mid-stage e-subscription companies has given Sade a unique perspective on data science, as she’s seen many ways that technology companies can work efficiently and inefficiently with their data and experimentation tools. Prior to working at Etsy, she worked for Codecademy and Hello Fresh out of their New York City office.
Next: Listen to our podcast with Ingo Hettenhausen from Otto.de about A/B testing.