Your web pages and content need testing to see how they connect with customers. Are there layouts and tweaks that work better than others? A/B testing can help you find out. Here's how to get started.
Jeff Lambert (00:13):
Hello everybody and welcome to another episode brought to you by Rizen. I am your host as always, Jeff Lambert. Joining me in the studio today is, well, the virtual studio because we are practicing safe social distancing, is Nichole Mena, who is the creative director at Rizen. Nichole, thanks for stopping by over the web.
Nichole Mena (00:30):
Hi, it's great to be here with you, Jeff.
Jeff Lambert (00:33):
How is your quarantine life going so far?
Nichole Mena (00:36):
I can't complain. Working as usual. We're used to working remote, so it's been okay.
Jeff Lambert (00:44):
That's true. I think for us, it hasn't been as much of an adjustment as for other companies, because we do pretty much everything digitally as it is.
Nichole Mena (00:51):
Yeah, that's true.
Jeff Lambert (00:52):
So today we're going to be talking to our audience about something called A/B testing. And just before we get into that term and talking about why it's important to practice that strategy, let's talk a little bit about versions of your content. Why is it important to try out different versions of the content that you publish digitally, like different versions of a blog post or a webpage? Why is that a good idea to do?
Nichole Mena (01:19):
Right, basically no design or content is perfect on the first try. You want to appeal to different audiences and make sure you're capturing their attention, so testing is so important. You can make a huge difference in the effectiveness of your marketing efforts just by figuring out what the most effective elements are and then combining them. And that really applies to anything. You can do that on landing pages, blog posts, calls to action, emails, web pages, anything a potential customer will see. And you can make your marketing efforts that much more profitable and successful by testing.
Jeff Lambert (02:03):
So then we have this term A/B testing. It's a technical term. We hear it a lot thrown around, especially in digital marketing. We may have people think they're doing it, but why don't we go ahead and define terms. Can you talk about, at its core, what is A/B testing about?
Nichole Mena (02:15):
Sure, so basically A/B testing is a method of comparing two versions of something, either a webpage or a content piece, against one another to determine which one performs better. It's essentially an experiment where two or more variants are shown to users at random. Some people will see one version, and the rest will see the other version. And it's all pretty much random.
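The random assignment Nichole describes can be sketched in a few lines. This is a minimal illustration, not any particular platform's implementation; the `user_id` and experiment name are hypothetical, and hashing the visitor's ID is one common way to keep the same person in the same bucket on every visit while splitting traffic roughly 50/50:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "homepage-cta") -> str:
    """Deterministically bucket a visitor into variant 'A' or 'B'.

    Hashing the user ID together with an experiment name means the
    same visitor always sees the same version, while the split
    across all visitors stays close to 50/50.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same visitor always lands in the same bucket:
assert assign_variant("visitor-123") == assign_variant("visitor-123")
```

Real testing tools handle this behind the scenes, usually with a cookie or visitor ID, so each person gets a consistent experience for the length of the experiment.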
Jeff Lambert (02:45):
So when I go to a website, say I just type in a URL and go to a homepage, you're saying that some people are going to see maybe one design, and other people are going to see another design, and it's done completely at random.
Nichole Mena (02:59):
That's correct, yep.
Jeff Lambert (03:00):
Okay. All right. Talk to us a little bit more about A/B testing, I guess in terms of what it gives you in terms of data.
Nichole Mena (03:07):
Right, so a statistical analysis is used to determine which of the variations performs better toward a given goal. And then really A/B testing takes the guesswork out of optimizing your website. You can make more data-informed decisions, which is what you want, instead of making assumptions. So it's really shifting from the "I think" to the "I know this is what's going to do better for my business."
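The statistical analysis Nichole mentions often boils down to a two-proportion z-test on conversion counts. This is a generic textbook sketch, not HubSpot's or any vendor's specific method, and the visitor and conversion numbers are made up for illustration:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare the conversion rates of variants A and B.

    Returns the z statistic and a two-sided p-value. A p-value
    below 0.05 is the usual threshold for treating the difference
    as real rather than random noise.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# 500 visitors per variant: A converted 40 (8%), B converted 65 (13%)
z, p = two_proportion_z_test(40, 500, 65, 500)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these example numbers the p-value comes out well under 0.05, so you would call variant B the winner; with a smaller gap or less traffic, the same test would tell you to keep waiting.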
Jeff Lambert (03:35):
So instead of asking all my cousins how my homepage looks, I can actually get a better view on things you're saying.
Nichole Mena (03:42):
That's exactly right. The data doesn't lie.
Jeff Lambert (03:44):
That's a good point. So can you provide maybe a concrete example of A/B testing? How would that work in the real world? I always like to use the example of a pizza shop.
Nichole Mena (03:55):
Okay. So there are a few things. You can do two versions of a webpage where one thing is different, like the location of a call-to-action button, or using a different headline on a blog post. You could run two different versions of a survey, maybe one with open-ended questions and another with rating scales, for example. And a couple more examples: the location of a promotion or a banner in an email. Or, something we do quite often, using different imagery in digital ads, something to that effect. So there are quite a few ways you can go about it.
Jeff Lambert (04:39):
Sure. And the goal here, like you said, just to circle back, is to see what's working with customers. Maybe a button at the top of your blog post is getting more people to notice it and click as opposed to at the bottom.
Nichole Mena (04:50):
Yes, exactly. So testing that is so important and you'll see, you can see higher conversion rates and things like that. So it's definitely worth testing.
Jeff Lambert (04:59):
Now this sounds like it could be a very time-consuming thing, right? I mean, you're looking at maybe two different blog posts that you publish. You're going to have to go into some sort of analytics page on the back end of whatever site you're using, comparing views and all these different stats. That sounds like something people maybe don't have a lot of time to do. Is there a more streamlined way of approaching this?
Nichole Mena (05:24):
Well, actually, it's not as difficult as you might think. You're just changing one element on a page, let's say, and comparing two versions. It's not going to take up that much time, but the effort is so worth it in the end. You'll see a big difference in conversions and click-throughs and things like that. So it's really worth your time to put a testing strategy in place, for sure.
Jeff Lambert (05:52):
Is A/B testing something that's already built into a lot of platforms that maybe people are using like Squarespace or WordPress or HubSpot or any of those?
Nichole Mena (06:02):
I know HubSpot for sure has a built-in tool. There are also third-party tools like Visual Website Optimizer, which we use as well. There's Optimizely, and even Google has a tool. And some hosting providers offer built-in A/B testing, so if you want, you can start by checking with your hosting provider. What we use is HubSpot, and it's incredibly easy to set up two versions and compare two webpages or two landing pages right in the tool you already use. So it's very easy to set up.
Jeff Lambert (06:40):
All right, good to know. And I'll make sure to include links in the show notes for our listeners, if you want to check out, like we said, Visual Website Optimizer or HubSpot, so you can jump right in and start using A/B testing. So Nichole, we've been talking about the basics of doing A/B testing, and we talked about some tools you can use to get started. There's got to be an overall strategy people can follow, in terms of doing it the right way versus the wrong way. Do you have some tips for how people should approach A/B testing, from the very first test they try, ways to do it right?
Nichole Mena (07:11):
Sure, sure. So really you need to decide what you want to test, first and foremost. Usually begin with something sales related, some kind of element on the page that's sales related. Those are usually easier to analyze. So start small, maybe with two versions of a digital ad, and then you can grow from there. Once you've made a list of items to conduct testing on, list the variables you want to test for a given piece of content. Again, with a digital ad, maybe you decide to test three items: the location of the text, the font used, and maybe the size of the CTA button, something like that. You also need to decide what stats you're going to be comparing, and this is really important too. Are you testing click-through rates, sales completed, or just overall engagement on the site? You have to know what you're measuring between the two versions you're comparing, right?
Nichole Mena (08:11):
So you want to make sure to run tests simultaneously. By this I mean make sure both ads or blog posts or whatever you're testing are published and accessible for the same period of time, so you're really comparing apples to apples, right? And keep the variables as uniform as possible.
Jeff Lambert (08:32):
That's a good point. If you publish a blog post in the morning and then compare it to a blog post that you published in the afternoon, that's not keeping the stats, like you said, apples to apples. There's going to be user behavior that comes into play there. So you're saying, put them both out at the same time in the same format; that way you get the same kind of feedback.
Nichole Mena (08:50):
Exactly. Exactly. And I think the major point there is you want to make sure you're running your tests for a proper period of time, and at the same time. A few hours really isn't enough to test something. Tests usually run for a minimum of one to two weeks, from what I've seen. And it's really dependent on the amount of traffic; the number of visitors you have going to those pages is going to determine the statistical significance of those tests at the end of the day. And then obviously you want to test, retest, and retest again. There's nothing wrong with running multiple experiments, especially if it's an important page on your site like a homepage. You want to run a test, and then maybe run another one right afterwards, and keep doing the same thing, since you can test different elements on that page.
Jeff Lambert (09:46):
Good practical tips Nichole. This should be enough for, like I said, if you're listening and you've never done A/B testing before, and maybe it's a strategy you're looking to refine, here's some very straightforward ways that you can improve your approach. Just to mention real quick, we have a blog post on this on Rizen's website. So I'd encourage our listeners too, if you want to see this in written form, in addition, go ahead and click on the blog post link in the show notes and you can read this, download a PDF, whatever you want to do to carry it with you. And Nichole, overall A/B testing, is this something that you find Rizen doing on a regular basis? Should it be an important part of your workflow as a marketer?
Nichole Mena (10:26):
Absolutely. We do this at Rizen for ourselves, and we do it with all of our clients, making sure that we're helping our clients get the most from their websites and get the most people converting on their pages, which is essentially what they're looking for.
Jeff Lambert (10:46):
Excellent. So hey, we're passing it along to you now, the listeners, to give this a shot to refine what you're doing now and hopefully it brings you ... well, not hopefully. We have reason to suggest this, because it does work. So yeah. Please get started today. And I think you will see a difference. So Nichole, thanks for sharing these straightforward tactics and strategies to approaching this, and I wish you all the best until the next time we talk.
Nichole Mena (11:11):
All right, great chatting with you today.
Jeff Lambert (11:13):
You too. And to our listeners, thank you so much for listening to another episode. And remember, you can expect a new episode every week. We always make sure they're filled with practical advice that's going to help you grow your business. And remember, if you're looking for an experienced, friendly and results-driven team, check out Rizen by going to gorizen.com, that's Rizen with the Z. You can also follow them on social media. They're on Facebook, Instagram, Twitter, and LinkedIn. You can find them just by searching for the username RizenInbound. That's one word, RizenInbound. And to our listeners overall, thank you so much for joining us. We appreciate connecting with you in this way on a regular basis. And you can do us a favor and help us get in front of new customers and listeners by leaving a review on the podcast app you're currently using. So overall, thanks for your support and we'll see you on the next episode.