Should you ditch A/B testing? Learn how to use AI to discover the causes behind customer actions
Marketers use any number of data points to inform the recommendations they make to customers. But do they really know why customers prefer one product or message over another?
Well, one way to find out is to use AI to analyze data from the entire customer journey, instead of depending on the limited results of specific A/B tests.
"The ability to understand the true drivers behind customer behavior across the journey is transformative," said Zubair Magrey, GM, marketing for U.K.-based decision-making software company causaLens at The MarTech Conference.
That's because the data alone doesn't provide a full picture. To get the full picture, you need to know the causes behind outcomes.
"You're not only interested in predicting which one of your clients are going to churn, or predicting what your clients are going to buy next," said Andre Franca, director of applied data at causaLens. "What you're interested in understanding is what is the best product that I should recommend to my clients? What is the best action for me to retain my clients?"
Ultimately, that means assembling the best mix of digital channels to reach customers. This, in turn, optimizes revenue.
"The question that you should be asking is what is the causal impact of adding or removing a new ad channel, knowing everything that you already know about your current marketing mix," Franca said.
Avoiding correlations to discover real causes
Predictive AI is frequently used to express likelihood as a percentage or a score. But where does that number come from? More importantly, what causes lie behind it?
"The real question that I should be asking is: What causes customer loyalty?" Franca explained. (Loyalty is a particularly complicated subject in marketing right now.) The key to answering this is avoid conclusions based on correlations.
"Everyone knows that correlation does not imply causation," said Franca. "And why is that right? We need to understand in which situations a correlation is actually causal."
It's basic logic, but an example helps. Let's say you have a very hot room filled with smoke and you want to get rid of both. The solution isn't turning down the heat. The heat didn't cause the smoke. Instead, there is a third element, a fire, that is causing both. The solution is to put out the fire.
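The same logic can be shown with a few lines of simulated data. This is a toy sketch of the fire/heat/smoke example, not anything from causaLens: heat and smoke look strongly correlated, but once the hidden common cause, the fire, is held fixed, the relationship disappears.

```python
# Minimal illustration of a confounder: a hidden common cause makes two
# variables correlate even though neither causes the other.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

fire = rng.binomial(1, 0.3, n)                  # the confounder: is there a fire?
heat = 20 + 15 * fire + rng.normal(0, 2, n)     # fire raises room temperature
smoke = 5 * fire + rng.normal(0, 1, n)          # fire also produces smoke

# Heat and smoke appear strongly related...
print("corr(heat, smoke):", round(np.corrcoef(heat, smoke)[0, 1], 2))

# ...but once the fire is held fixed, the correlation vanishes.
for f in (0, 1):
    mask = fire == f
    print(f"corr within fire={f}:",
          round(np.corrcoef(heat[mask], smoke[mask])[0, 1], 2))
```

Turning down the thermostat in this simulation would change nothing about the smoke; only acting on the fire would.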
So, why ditch A/B testing? It has limits
Separating correlations from causes requires adding data from more stages of the customer journey. A/B testing, by contrast, is a relatively simple method for deciding which of two options works better in a limited context. For instance, it could be a choice between two different product recommendations served to the same segment of customers.
The A/B test is experimental and provides data about which of the two choices is more effective. But there are downsides. "Ultimately, this is very dependent on the experimental design and you're restricted in how you perform this experiment," said Franca. "Most importantly, it ignores the possibility that different clients are going to react differently to the options that you're giving them."
In A/B testing, the results show which option performs better in that specific instance. But they don't explain why similar customers in the same segment respond differently. "With causal discovery, however, you don't necessarily need to be restricted to these limitations, because all that you need is to look through the data, and the data is going to tell you what is the causal effect and, at the end of the day, which parts of your cohort actually responded better to that intervention," said Franca.
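A small simulation makes the contrast concrete. The code below is an illustrative sketch rather than causaLens's approach, and the "loyal" attribute and conversion rates are made up: the overall A/B read-out reports a single average lift, while splitting the same data by cohort shows one group responding positively and the other negatively.

```python
# Illustrative sketch: the same experiment data, read two ways.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 20_000
df = pd.DataFrame({
    "treated": rng.binomial(1, 0.5, n),   # A/B assignment
    "loyal": rng.binomial(1, 0.4, n),     # hypothetical customer attribute
})

# Conversion: loyal customers like the offer, non-loyal ones slightly dislike it.
base = 0.10 + 0.05 * df["loyal"]
effect = np.where(df["loyal"] == 1, 0.08, -0.02)
df["converted"] = rng.binomial(1, base + effect * df["treated"])

# Classic A/B read-out: one average lift for everyone.
overall = df.groupby("treated")["converted"].mean()
print("overall lift:", round(overall[1] - overall[0], 3))

# Cohort-level read-out: the lift differs in sign between groups.
by_cohort = df.groupby(["loyal", "treated"])["converted"].mean().unstack("treated")
print("lift by cohort:\n", (by_cohort[1] - by_cohort[0]).round(3))
```

In this toy example, the blended number would justify rolling the change out to everyone, even though it actively hurts one cohort, which is exactly the blind spot Franca describes.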