Ideas for the next generation of conversion optimisation

Conversion optimisation has become a regular fixture in a marketer's toolkit. The ability to gain efficiencies based on data-driven facts is a great opportunity for many businesses. As testing becomes commonplace, we are approaching the next innovation in testing, which is rarely discussed. CRO testing currently involves running variants against a base. You monitor performance on single or multiple KPIs and a winner is found. The winner then gets the majority, or all, of the traffic and we move on to the next test. We look for statistically significant results in the data to show success in performance, and live and die by the numbers. But what happens when a user prefers the version of the test that "fails"? If 90% of people preferred the new version, you still have 10 out of every 100 who prefer your base. We talk about segmentation and targeting when setting…
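The statistical significance the teaser refers to is commonly checked with a two-proportion z-test on the base and variant conversion rates. Here is a minimal sketch; the function name and the example figures are illustrative assumptions, not from the article:

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the variant's conversion rate
    significantly different from the base's? (Illustrative sketch.)"""
    p_a = conv_a / n_a          # base conversion rate
    p_b = conv_b / n_b          # variant conversion rate
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return z, abs(z) > 1.96     # significant at the 95% level

# Hypothetical example: 200/5000 base conversions vs 250/5000 variant
z, significant = ab_significance(200, 5000, 250, 5000)
```

Note that even a clearly significant winner only tells you about the average visitor, which is exactly the limitation the teaser raises: significance says nothing about the minority segment that preferred the losing version.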

How and when to use A/B or Multivariate Test Design

When running a testing programme you will have to decide, for each test, whether it should be run as an A/B test or a multivariate test. The decisions you make when designing your experiments will significantly affect important variables such as the depth of your insights, the speed of testing and delivering winning variations, and therefore the overall impact of your testing efforts.

The 1,024 Variation Test

So when should we run that 1,024-variation multivariate test (Google actually did this), and when would an A/B test be more appropriate? First, let's consider some of the important variables: Time to test - the number of variations in your experiment will affect the length of time your test needs to run. Traffic required - your traffic volumes will also determine how quickly you…

Digging beyond Conversion Rate using primary and secondary conversion metrics and avoiding the common testing mistakes

A/B testing is certainly not new, and the number of people and companies involved in testing continues to grow at an impressive rate. Many companies start tentatively with a few sample tests, without investing in the expertise or training needed to embed robust testing processes. Drawing conclusions from half-cooked tests is a sure-fire way to kill internal faith in your testing programme. You're also potentially missing out on some of the most interesting insights. I've written before about the importance of using both qualitative and quantitative research to develop the strongest hypotheses for testing, and about the importance of expertise and experience in developing the strongest concepts and then prioritising your testing schedule. However, this post will focus primarily on how you then design experiments…

"The true price of free and cheap in A/B and Multivariate testing"

In this infographic, the provider of a paid service is looking to educate marketers about the benefits of paid-for outsourced services for A/B testing compared to Google's free Content Experiments tool (formerly Google Website Optimizer) or similar. It's an example of using an infographic to help position a service. We're including it since weighing up the costs and benefits of these services is an issue across the types of services listed at

Their infographic shares advice on how to choose the right tool/software and compares managing MVT in-house vs managed services where experiments are designed and managed by the service provider. It also suggests the hidden costs if you manage a free or low-cost solution in-house.

'Clients using managed services or enterprise-grade solutions have experienced 6x more uplift in conversion than those using DIY testing programs'

An example of using A/B Testing

Examples of A/B testing often cover landing page conversion, but a common problem, not covered so often, is how to get visitors to engage more with a key page such as a home page, product category page or resources page. In this post I will give an example of how we used A/B testing to increase engagement for a home page.

Engagement can mean different things for different sites. For some it is the pages per visit; for others it can be social shares, number of comments, or any other page activity.

For our customer, Inside Buzz, engagement meant visitors exploring the site further, beyond the homepage. It was probably because of this need for active engagement that their previous homepage design provided so many options, so visitors could click whichever one they were interested in.

An interview with Matt Lawson of on their ecommerce strategy

Matt Lawson is Head of Conversion at

In this interview he describes how they have grown their business by keeping the web experience focused on the customer through constant feedback, review and testing. Today they are the UK's largest online kitchen retailer, with over 4,000 large appliances ready for Next Day Delivery.

Their About page is one of the best we've seen for showcasing the proposition and integrating customer feedback and social networking. Taking such care with the About page may not be essential for established high-street retailers, but is important for online pureplays and startups. We also like the way their masthead below the navigation showcases their proposition.

Not that can be called a startup…

An example showing how to set up an A/B test on WordPress

What is A/B testing?

You'll probably be aware of the approach, I'm sure. A/B testing, also called split testing, involves your site serving one of several versions of a page to a site visitor. The aim is usually to find a better approach to converting your visitors.
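"Serving one of several versions" is usually done by deterministically bucketing each visitor, so the same person always sees the same variant across visits. A minimal sketch, independent of any particular WordPress plugin; the visitor IDs and function name are hypothetical:

```python
import hashlib

def assign_variant(visitor_id, variants=("A", "B")):
    """Hash the visitor's ID into a stable bucket so the same
    visitor is always served the same variant. (Illustrative sketch.)"""
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same visitor always lands in the same bucket:
variant = assign_variant("visitor-42")
```

In practice the visitor ID would come from a cookie, and a testing tool (or WordPress plugin) handles this assignment for you; the point is that the split is random across visitors but stable per visitor.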

Why do you need it?

Put simply, how do you know if the changes you made to a page are working better than the earlier version unless you can compare them side by side? A/B testing allows you to make changes from a position of intelligence - knowing the impact of the changes you make allows you to learn what works and what doesn't, and make sure that you repeat the successes and not the failures.

A/B testing in WordPress


Improving your testing workflow to get bigger, quicker wins

Should you test one thing at a time or many at once? A poor experimental workflow can waste loads of your time! Here’s an extreme example: We’ve seen a company take six months to do something that took another company thirty minutes. That’s 8,760 times slower. To grow quickly, you need to implement quickly, so our work with clients goes beyond suggesting what they should test; we build their in-house capability to “get stuff done.” This article describes a framework for speeding up your testing—so you can grow your profits quicker.

Many small changes or one big one?

If you’ve read the case study of our work with Crazy Egg on optimisation, you may recall that the winning challenger homepage was much longer and much more effective than the control: Our winning challenger was much longer than…

How run-of-site template design features can make a large impact

When it comes to website optimisation, is the footer something you usually consider? If it's not, it probably should be, because how you use your website footer can have a major impact on conversion rates. As part of a full conversion rate optimisation and UX-driven MVT strategy, we conducted an A/B test for one of our clients, Radley+Co (the luxury handbag people). It was a simple test, measuring how conversion and revenue would be affected by introducing a mega-footer containing the product list broken down into categories. Once our UX designers had created the new version, we used the Optimizely testing tool to implement this A/B test across every page of the website. We tested a smaller footer featuring only a simple one-line navigation menu against a larger mega-footer. Both…
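Measuring how revenue is affected in a test like this comes down to comparing revenue per visitor across the two footer variants. A sketch with hypothetical numbers (not Radley+Co's actual results):

```python
def revenue_per_visitor(total_revenue, visitors):
    """Average revenue generated per visitor exposed to a variant."""
    return total_revenue / visitors

# Hypothetical results for the two footer variants
small_footer = revenue_per_visitor(12_500.0, 5_000)  # one-line footer
mega_footer = revenue_per_visitor(14_000.0, 5_000)   # mega-footer

# Relative revenue uplift of the mega-footer over the small footer
uplift = (mega_footer - small_footer) / small_footer
```

Tracking revenue per visitor alongside conversion rate matters here, because a footer change could lift order volume while lowering average order value, or vice versa.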

Two winning tests for a multilingual Ecommerce site

In this case study, we'll show you how to increase your win rate by doing some diligent research. Plus, we'll show you the results of two interesting tests. E-commerce sites have a particular challenge when it comes to conversion rate optimisation, compared to landing page optimisation, for example: implementation can require significant technical resources. Because of this, it's important that e-commerce marketers change only things that are likely to work. They don't have the luxury of being able to "throw stuff against the wall to see what sticks." No one can win every test, of course, but you can increase your win rate. At Conversion Rate Experts, we have the advantage of having worked on many e-commerce stores, but even if you don't have the benefit of experience, there are simple things you can do to radically improve…