Businesses should always strive to make improvements to their websites, but constantly looking for ways to upgrade your website can be exhausting, even discouraging. The reality, however, is that a website that attracts visitors and keeps them engaged will make more money.
The difficulty comes in deciding which changes are legitimate improvements and which will actually drive your traffic away. In a business, you have many competing voices: the boss wants the website one way, the web designer wants it another, and the marketers have yet another idea of how to change it. So how can you decide which updates will ultimately benefit your business?
The best way to determine the right changes is through A/B testing. A/B testing is exactly what it sounds like – you create two different versions of your site and randomly split your traffic so visitors see either website A or website B. Whichever version performs best (lowest bounce rate, lowest exit rate, highest conversion rate, etc.) is the one to implement for good.
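To make the "random split" concrete, here is a minimal sketch of how a site might assign visitors to variants. It assumes each visitor carries a stable identifier (for example, a cookie value); `assign_variant` is a hypothetical helper, not part of any particular testing tool. Hashing the ID keeps the split roughly 50/50 while guaranteeing a returning visitor always lands on the same variant.

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically assign a visitor to variant "A" or "B".

    The same visitor ID always maps to the same variant, so repeat
    visitors see a consistent version of the site during the test.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The assignment is sticky: the same ID always gets the same answer.
assert assign_variant("visitor-123") == assign_variant("visitor-123")
```

In practice, most A/B testing platforms handle this assignment for you, but the principle is the same: the split must be random across visitors yet consistent for each individual visitor.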
A/B testing can be used for a number of different website elements, including:
- Landing pages
- Product descriptions
- Layout and style of the website
- The amount of text on a page
- Email marketing campaigns
- Product pricing and promotions
- Form lengths
- And more!
Some Tips for Your A/B Testing
Before you start your experimentation, there are a few things to keep in mind.
- Run the A/B tests at the same time. If you decide to test website A and then website B, there are a lot of factors that could account for the differences in their results. Instead, test both changes simultaneously.
- Be sure to run the test only on new visitors. Don’t put your current customers through the wringer with constant changes. You should also show your repeat visitors the same website each time so they don’t get confused.
- Look at all of the metrics before deciding whether website A or website B is the one you ultimately want to implement. Changing your website might affect one piece of data in your analytics without affecting another – or a decrease in one area might come with an increase in another. It’s important to decide, before you start testing, which metric you will use to measure success. For instance, if your bounce rate increases but so does your conversion rate, you are still making more money than you were before – yet a high bounce rate can negatively affect your SEO. The answers that come from testing don’t always point clearly to one choice.
- Different audiences might react differently to your testing. For instance, website A might appeal more to a younger audience while website B attracts an older one. Compare the demographics on your traffic before making any decisions.
- Test small things more often than big ones. Building an entire webpage takes far more time, effort, and money than testing out call-to-action buttons. Small changes can make a big impact on your metrics, so it is worth testing those details as well as the more dramatic changes.
- If your changes hurt your page’s performance, they may not be worth it. All those fancy upgrades may slow your website’s loading time, and a long loading time is a huge detriment to any business – users do not bother to wait more than 10 seconds for a page to load. Even the most beautiful upgrades might not be worth it if your audience is dropping off before they can even see them.
- Don’t give up on your testing. The chances are very high that your first batch of experiments will not go well; when that happens, try other updates. And don’t scrap your changes entirely: if you’ve built an entire webpage, trashing the whole thing not only wastes your effort but could easily be throwing the baby out with the bathwater. Save your work for another time.
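When comparing the metrics from the two variants, it also helps to check that the difference is bigger than random chance before declaring a winner. The sketch below uses a standard two-proportion z-test on conversion rates; the visitor and conversion counts are made-up numbers for illustration, not data from any real test.

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z-score for the difference between two conversion rates.

    conv_a / n_a: conversions and visitors for website A,
    conv_b / n_b: conversions and visitors for website B.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Made-up example: 5,000 visitors per variant.
z = two_proportion_z(conv_a=200, n_a=5000, conv_b=260, n_b=5000)

# |z| > 1.96 corresponds to roughly 95% confidence that the difference is real.
print(f"z = {z:.2f}, significant: {abs(z) > 1.96}")
```

If the z-score is small, the "winner" may just be noise – which is one more reason not to give up after a single round of testing.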
For more information on email marketing campaigns, read our blog post “Why Your Email Marketing Campaigns are Missing the Mark (and How You Can Fix That).”