Watch the “Lift”
By Heidi Tolliver-Walker on July 25th, 2012
Yesterday, I wrote about some of the factors that can influence the response rate data reported in those channel comparison reports we all enjoy so much. There is another heavily used metric that needs to be called out once in a while: lift.
Often, you will see campaign results referred to in terms of their “lift” over previous campaigns, and that number can be substantial. “So-and-so went from static to 1:1 and got an 18% lift over previous campaigns.” Or “Such-and-such switched to our new marketing platform and saw a 27% lift in return visits.”
These numbers sound impressive, but what do they really mean? Sometimes they mean a lot. Other times, they inflate the perception of the results without meaning much at all.
For example, according to the 2012 DMA Response Rate Report data, marketing campaigns using traditional postcards receive a 3.4% response rate on average. Meanwhile, campaigns using over-sized cards return a 3.95% response rate on average. The difference between the two is 3.95% – 3.4% = 0.55 percentage points.
But 0.55 percentage points doesn’t sound very impressive. However, if we divide that 0.55-point difference by the original 3.4%, we get 16%. Now we can report that moving from traditional to oversized postcards creates a 16% lift in response rates. That sounds much better!
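The arithmetic behind that 16% figure can be sketched in a few lines. This is just an illustration of the calculation described above, using the DMA averages; the function name `lift` is my own shorthand, not anything from the report:

```python
def lift(baseline, new):
    """Relative lift of `new` over `baseline`, as a fraction.

    E.g. going from a 3.4% to a 3.95% response rate is a
    0.55-percentage-point difference, but ~16% relative lift.
    """
    return (new - baseline) / baseline

diff = 3.95 - 3.4            # absolute difference: ~0.55 percentage points
relative = lift(3.4, 3.95)   # relative lift: ~0.16, reported as "16%"
```

The same 0.55-point gap looks far more dramatic expressed as relative lift, which is exactly why it gets reported that way.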
In reality, this 16% lift may or may not be a significant difference. How do you know? One relevant measure is the standard deviation, which captures how much variation there is within a sample. If the standard deviation is 0.2 percentage points, for example, then a 0.55-point difference is significant. If the standard deviation is 0.61 points, then 0.55 points may mean nothing at all.
Without knowing the standard deviation, we are left to make guesses and assumptions about how much that lift really means. In the case of the DMA report, we are looking at a very large sample and (hopefully) professional researchers who are filtering the information, creating a certain level of trust. That’s not the case with all data.
So when data is reported in terms of lift, be careful. Look for the raw numbers, such as the actual response rates (or other underlying metrics), so you know how big an increase you’re really dealing with. Then you can weigh that number against other factors, such as sample size, to judge how much those lift figures actually tell you.