Back in my agency days, clients used to ask me which landing page to use when linking their banner creative. Of course, my recommendation was based on the campaign type (e.g., branding, direct response) and the objectives (e.g., drive leads or sales, product awareness). I had an idea which existing Web site page would create the best user experience, but more often than not, I was not in the campaign’s targeted audience or demographics. Who was I to decide which experience would perform better?

Over the course of a month, the analytics team completed three landing page tests using two different approaches: one for a financial institution, one for an e-commerce site, and one for an over-the-counter (OTC) allergy medication.

The test for the financial institution started with a methodical A/B/C split test, followed by multivariate testing to further optimize the winning page. We started with the control, which had two tabs (to explain its two-step offer), and tested it against two one-tabbed designs that were identical except for their background colors. I picked green to win, and was very surprised when blue beat it by 20%. I was even more amazed at how different the results were, all because we changed the accent color.
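A lift like that only means something if enough visitors saw each page. As a minimal sketch (the visitor and conversion counts below are made up for illustration, not from the actual test), a two-proportion z-test can check whether one variant's conversion rate is significantly better than another's:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis that both pages convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: green converts 500 of 10,000 visitors,
# blue converts 600 of 10,000 (a 20% relative lift).
z = two_proportion_z(500, 10_000, 600, 10_000)
print(round(z, 2))  # |z| > 1.96 means significant at the 95% level
```

With these made-up volumes the lift clears the 95% bar; with a tenth of the traffic, the same 20% lift would not, which is why a test needs to run long enough before declaring a winner.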

Phase II was more surprising still: adding a hero image beat the control by an astounding rate, more than 30%. Had the client asked me to choose a page instead of testing the landing pages, I would have hindered the campaign's success to the tune of more than 1,000 new applications.

The e-commerce site was having trouble getting consumers to add items to their cart. We tested the "add to cart" button first, and the control button won by a few percentage points. Then we moved the product image from one side of the product page to the other. At first it looked like the control was, again, the stronger performer, until we segmented the results by traffic source: revenue from Google-driven site visitors was up more than 20%, just from repositioning the image. Slight, subtle changes were all it took.
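This is why segmentation matters: a variant that loses in aggregate can win decisively for one traffic source. As a minimal sketch with entirely hypothetical visit records (the variant names, sources, and revenue figures are invented for illustration), grouping revenue by variant and source surfaces exactly that pattern:

```python
from collections import defaultdict

# Hypothetical visit records: (variant, traffic_source, revenue).
visits = [
    ("control", "google", 0), ("control", "google", 40),
    ("variant", "google", 55), ("variant", "google", 50),
    ("control", "direct", 60), ("control", "direct", 45),
    ("variant", "direct", 0), ("variant", "direct", 30),
]

# Sum revenue per (variant, traffic source) segment.
totals = defaultdict(float)
for variant, source, revenue in visits:
    totals[(variant, source)] += revenue

for (variant, source), revenue in sorted(totals.items()):
    print(f"{source:>7} {variant:>8}: ${revenue:.0f}")
```

In this toy data the control earns more overall ($145 vs. $135), yet the variant more than doubles revenue from Google traffic, a split an aggregate-only report would hide.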

The objective of the landing page test for the OTC medication was to determine whether consumers were more likely to click for the coupon from the homepage or from the product page. We created duplicate pages for each (so users couldn't reach them unless they came from a display ad) and drove equal traffic to both. Again, had the client asked which page to use as the landing page, I would have said the product page, to give consumers a chance to read about the product's benefits. And I would have been wrong: the click-to-coupon conversion rate was more than 25% higher from the homepage than from the product page.

The moral of the story is that marketers need to test, test, and test again. We (media and creative professionals alike, and self-proclaimed numbers geeks like myself) think we know what is right, but we don’t. We are often not our client’s target audience.

Landing page testing does not need to cost anything but time. Start with an A/B split on site traffic (which you can set up in under 10 minutes); you will be amazed at how much you can optimize your campaigns without changing a single thing in your media buy.

Click here to learn more about testing and marketing optimization at Level’s new Marketing Analytics bootcamp.
