

5 Lessons From Multivariate Tests on Buttons

Conventional wisdom from the Information Architecture and User Experience gurus tells us that buttons and calls-to-action should be consistent across the user experience. The thinking is that if messaging, shape, size and color stay consistent, users will locate and click buttons and calls-to-action more easily and more often. This is wrong. The design and messaging of buttons can be optimized at each step of the conversion funnel.

At OTTO Digital we have been using Offermatica to do a lot of multivariate testing on buttons in recent months. We’ve learned a lot of things and have seen some tremendous lifts in CTR, conversion and especially revenue-per-visitor (RPV). Some of our learning turns conventional wisdom on its head. We like that.

What have we learned?

Different Buttons Perform Better on Different Pages:
To me this data is groundbreaking. It proves that the “best practice” of consistency in button design for improving user experience and increasing conversion rate is not optimal. Users are skilled enough to recognize “intelligent inconsistency” throughout the purchase path.

User goals and objectives change from page to page in-flow. The pages themselves change layouts and goals. It is therefore only logical that the buttons may benefit from being changed as well. It’s just so cool that as marketers we have technology that can identify a best button from hundreds of variations in a matter of a few weeks.

Buttons Impact Revenue Per Visitor (RPV):
Don’t underestimate the power of these little designs. At some point in our testing, most of these elements have shown a tremendous influence on revenue, especially the color red. I know I caused a bit of a stir with some in the optimization community when I mentioned that red buttons should be a rule of landing pages. I will retract a bit and say that the only “rule” should be that you test. Occasionally we see other colors perform better. But you know what? I’ve got reams of data from the past few months to back up that red buttons rock. Keep in mind that when you are testing for revenue, you should filter extreme orders out of the data so as not to skew the results.
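One way to apply that outlier filter is a simple cap before computing revenue-per-visitor. A minimal Python sketch follows; the function name and the 10x-median threshold are illustrative assumptions, not anything prescribed in the post:

```python
from statistics import median

def rpv_excluding_outliers(order_values, visitors, cap_multiplier=10):
    """Revenue-per-visitor with extreme orders filtered out.

    cap_multiplier is a hypothetical threshold: any order larger than
    cap_multiplier times the median order value is treated as an
    outlier and dropped before computing RPV.
    """
    if not order_values or visitors == 0:
        return 0.0
    cap = cap_multiplier * median(order_values)
    kept = [v for v in order_values if v <= cap]
    return sum(kept) / visitors

# A single $5,000 order among typical $20-$30 orders would otherwise
# dominate the metric; here it is dropped.
print(rpv_excluding_outliers([20, 25, 30, 5000], visitors=100))  # prints 0.75
```

How you set the cap (fixed dollar amount, percentile, multiple of the median) is a judgment call; the point is simply that a few huge orders can make a losing button look like a winner.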

Impact Changes Throughout the Funnel:
Strategically, you optimize each page to get users one step closer to their objective. As such, CTR should be the first metric you measure, but you should also track performance metrics at every step until purchase or lead. In one recent product page test, the best-performing message was “buy now.” However, one click deeper in the experience, at the shopping cart step, the best message for that product page was “add to cart.” The performance of “add to cart” carried through to purchase, and this button had an 8% lift in RPV vs. the control (existing) button. Did I mention this button was red?
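The 8% figure is a relative lift over the control. As a sketch (the example dollar figures are made up), it is just the difference divided by the control, and the same calculation applies at any funnel step, whether the metric is CTR, conversion rate, or RPV:

```python
def relative_lift(variant, control):
    # Relative lift of a variant metric over the control, as a fraction.
    # Works the same for CTR, conversion rate, or RPV.
    return (variant - control) / control

# e.g. a variant RPV of $1.08 against a control RPV of $1.00:
print(round(relative_lift(1.08, 1.00), 2))  # prints 0.08, i.e. an 8% lift
```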

Everything Matters:
Having run many MVT tests on buttons, I can tell you that I’ve seen each of the following factors be the overwhelming contributor to conversion:

• Color
• Shape
• Action vs. no action
• Action design pattern
• Location
• Message
• Icon vs. no icon
• Icon choice

Segmentation Yields Interesting Data:
Looking at results by segment can help inform strategies, especially for targeting. We’ve seen different button behavior based on new vs. return visitor, entry point and traffic source. Target your buttons and you start to take optimization to a whole other level.
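A minimal sketch of that kind of segment cut, assuming a hypothetical data shape of one (segment, revenue) pair per visit, with revenue 0 for visits that did not convert:

```python
from collections import defaultdict

def rpv_by_segment(visits):
    """RPV broken out by segment.

    visits: iterable of (segment, revenue) pairs, one per visit;
    revenue is 0 for visits that did not convert.
    """
    revenue = defaultdict(float)
    count = defaultdict(int)
    for segment, rev in visits:
        revenue[segment] += rev
        count[segment] += 1
    return {seg: revenue[seg] / count[seg] for seg in count}

visits = [("new", 0), ("new", 10), ("return", 20), ("return", 0)]
print(rpv_by_segment(visits))  # prints {'new': 5.0, 'return': 10.0}
```

The same cut works for any segment key you can attach to a visit: entry page, referral source, campaign, and so on.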

Button and call-to-action tests can yield some really great results. They also shouldn’t be a big strain on creative resources, so they are perfect tests for getting your feet wet with multivariate testing and optimization. Not to mention everyone loves testing buttons. It just makes sense.


Comments

Jonathan,

We never said red buttons don’t work, just that we wouldn’t say “use red buttons” as a rule. We agree, it should be tested. Your customers should always vote with their dollars on these decisions; they shouldn’t be decided by some designer’s whimsy about what looks good. The other point we try to make is that while testing buttons is fun and easy, it isn’t always the most impactful of the 1100+ variables out there in terms of maximizing the ROI of your testing efforts.

Thanks for all the sharing you do.

Bryan,

Thanks for the comment. I’m sorry if I gave the impression you guys said red buttons didn’t work.

Of course we both agree everything needs to get tested, and yes, while buttons may not always be the MOST impactful element to optimize, they can yield some quick results. When tested across multiple user touchpoints, this can aggregate into some major value.

Jonathan

Do you mean rather than the standard "More info" button, you made it "Buy now", which took you to view the actual product?

Tom- We didn't change the route from any of the buttons. They all went to the same place. Only message, color and dynamic elements were changed and tested.

Jonathan

Sorry, Jon, don't wanna clog up your blog. I was just curious where on the website you meant by "however one click deeper in the experience at the shopping cart step, the best messaging for that product page was “add to cart.”".

Is the "buy now" button on the product listing page, where you have a list of products around you? And after you click "buy now" to view the actual product, it changes to "add to cart"?

Just wanted to make sure I got that right, as you refer to them both as the product page, but one is one click deeper!

Thanks for your time.

Testing is the key to success for calls-to-action. You have to understand how your visitors will react to different words, colors, and placements. Good post!
