Live from LeadsCon: It’s Not Just What You Say—It’s What the Customer Experiences

By Janet Driscoll Miller | Aug 15, 2014


This morning's first session at B2B LeadsCon featured a friend and a talented speaker, Scott Brinker of Ion Interactive. Scott opened with statistics on the state of content marketing: there is so much content being produced today that audiences face content overload. How can you solve that? Perhaps through personalization, targeting content to the right audience. Personalization helps, but it's not a panacea. You need enough data about the visitor to personalize, you need data to create the new content, things change over time, and if you guess wrong you risk creating a negative experience.

Scott said we're now crossing a barrier from passive content to interactive content. But what does that have to do with conversion optimization? He shared a case study of Ion Interactive's own content marketing, built around a game in which visitors predict which landing page won an A/B test. In the process, Ion shows off the benefits of A/B testing and builds social proof. Scott said a professor of his confirmed that this is a great way to educate visitors, because recall improves dramatically when students predict an outcome before the teacher reveals the answer. Scott also shared a graph illustrating the power of marketing apps as content. You can read more of his thoughts on marketing apps in his article for Search Engine Land: "The 4th Wave of Content Marketing: Marketing Apps".

The second speaker was Alhan Keser from WiderFunnel, an agency focused on conversion optimization. Alhan shared a case study from Magento, whose goal was to increase qualified leads. The first step was to determine which pages needed improvement; the team then formed a hypothesis about what could be improved on each page. Alhan described the LIFT framework that WiderFunnel uses to evaluate pages.

The page tested had a nice layout but definitely needed clarity. For example, the way the form was constructed made it appear that the page ended at that point, while a good deal of content sat below the form that many visitors might never realize was there. Areas they tested included the headline and layout, plus a privacy statement added in version A. Version B made the same form changes as version A but showcased the value proposition and moved the case study further down the page. Version B saw a 115% increase in qualified leads.

On another page, visitors arrived via a button that said "try a demo," and the challenge was to maintain lead quality. Version A clarified the value proposition with a clear call to action and removed the redundant "try a demo" button. Version B put the full form on a single page instead of splitting it across two pages, as the original had done. Version C revealed the full form only after a few fields had been filled out. The final version, D, removed all distractions from the page. Versions C and D won, with version D nearly doubling demo requests. But were they qualified? Version B, which described the demo, had the highest improvement in qualified leads.
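As a side note on reading results like these: neither speaker showed code, but here is a minimal sketch of how you might check whether a lift such as that 115% increase in qualified leads is statistically meaningful, using a simple two-proportion z-test in Python. The visitor and conversion counts below are hypothetical, chosen only to illustrate the calculation.

```python
# Minimal sketch: comparing two landing page variants with a two-proportion z-test.
# The visitor and conversion counts used below are hypothetical, not from the Magento test.
from math import sqrt
from statistics import NormalDist

def ab_test(conv_a, visits_a, conv_b, visits_b):
    """Return the lift of variant B over A and a two-sided p-value for the difference."""
    rate_a = conv_a / visits_a
    rate_b = conv_b / visits_b
    # Pooled conversion rate under the null hypothesis of no difference.
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    lift = (rate_b - rate_a) / rate_a
    return lift, p_value

# Hypothetical numbers: control converts 40 of 2,000 visitors, variant 86 of 2,000 (~115% lift).
lift, p = ab_test(40, 2000, 86, 2000)
print(f"Lift: {lift:.0%}, p-value: {p:.4f}")
```

With those made-up numbers the lift is about 115% and the p-value is well below 0.05, so the difference would be unlikely to be random noise; with smaller samples, the same percentage lift could easily fail the test.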

In the end, Alhan said, don't just rely on what worked for others. A/B testing isn't always simple. Every site is unique, and your approach should reflect that.
