Let’s say you want to support RAINN, the nation’s largest anti-sexual violence organization. You design a landing page, build an email list, and start a campaign to raise funds. You send an email to your list with links to visit the landing page and donate. When your readers open the email, 50% of them are shown a “Donate Now” button in blue and the other 50% are shown the same button in red.
You personally like the blue button. However, after you run the test, you notice that the red “Donate Now” button performed significantly better than the blue button.
The red button was clicked 20% more often than the blue one, so red is the right color choice for this campaign.
Welcome to the world of A/B Testing.
A/B testing is when you compare two versions of an element against a metric that defines success, e.g., click-through rate, bounce rate, or exit rate. You can use it to find the marketing content that best resonates with your audience and use the data to increase conversions. Here is a quick step-by-step guide on using A/B testing to maximize fundraising for your nonprofit.
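If you are curious how that 50/50 email split might be set up in practice, here is a minimal sketch in Python. The subscriber addresses and the function name are purely illustrative; most email platforms will handle the split for you.

```python
import random

def assign_variants(subscribers, seed=42):
    """Randomly split a list of subscribers into two equal-sized groups,
    one for each variant of the "Donate Now" button."""
    random.seed(seed)                 # fixed seed so the split is reproducible
    shuffled = subscribers[:]         # copy so the original list is untouched
    random.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return {
        "A_blue_button": shuffled[:midpoint],   # ~50% see the blue button
        "B_red_button": shuffled[midpoint:],    # ~50% see the red button
    }

groups = assign_variants(["amy@example.org", "ben@example.org",
                          "cara@example.org", "dev@example.org"])
print(groups)
```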
Tests should be designed to give you actionable insights and to show that any improvement was not due solely to chance.
The best testing framework is to use the statement: “If ____, then ____, because of ____.”
For example, if you are testing a “Donate” button color, it would look something like: if I change the button color to red, then it will get more clicks because it will stand out more.
You can test typefaces, such as Arial versus a serif font, to see which one resonates with your audience. A font change can make the page effortless to read and lower its bounce rate.
If you can change it, you can test it. Brainstorm changes to test, such as the location of the call-to-action button or the exact copy used.
You can test pretty much anything on your website or in your emails, but that does not mean you should.
After you have a list of clear ideas to test, it is important to prioritize which tests will yield the most significant results.
Some of the most important tests involve your headline copy and your call to action. Instead of testing every piece of color or copy on the website, prioritize these elements; they will yield the biggest insights correlated with your business goals.
One of the biggest mistakes you can make while doing A/B tests is drawing conclusions too early. Imagine testing a variant for a couple of days and then declaring a “clear winner.” You might be doing more harm than good to your campaign by selecting an incorrect variant.
Make sure your tests run long enough and collect enough data to determine a clear winner. However, running a test for too long can also skew results: over a longer time period, more variables outside your control can influence the outcome.
Run tests for at least a month to increase confidence in your results.
When in doubt about the accuracy of the test results, retest.
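How long is “long enough”? A rough way to answer that is a standard two-proportion sample-size formula, sketched below in Python. The 5% baseline donation rate and the hoped-for 6% rate are assumptions made up for illustration, not numbers from this campaign.

```python
from math import ceil
from scipy.stats import norm

def min_sample_size(baseline_rate, expected_rate, alpha=0.05, power=0.8):
    """Rough minimum visitors per variant needed to detect a change from
    baseline_rate to expected_rate with a two-sided test."""
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for the significance level
    z_beta = norm.ppf(power)            # critical value for the desired statistical power
    variance = (baseline_rate * (1 - baseline_rate)
                + expected_rate * (1 - expected_rate))
    effect = expected_rate - baseline_rate
    return ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# Illustrative numbers: a 5% baseline donation rate and a hoped-for 6% rate.
print(min_sample_size(0.05, 0.06))  # about 8,150 visitors per variant
```

The smaller the lift you hope to detect, the more visitors you need, which is why short tests on small email lists so often mislead.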
Now it is time to look at the data you collected to discover why your users preferred one variant over another.
For example, here are some results of an A/B Test:
A: 80% of your audience clicked the button with the copy “Save a Life”
B: 65% of your audience clicked the button with the copy “Donate blood today”
After running this test for a month, variant A is the clear winner.
To trust a result like this, you need a minimum number of users participating in each test and a long enough testing period to gather statistically significant results. It is also crucial to test only one thing at a time: if you change the color of a button, do not change the copy of the button, or you will not know whether the color or the copy made the difference.
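To see whether a gap like the 80% versus 65% above is statistically significant rather than luck, you can run a two-proportion z-test. The sketch below assumes 200 recipients per variant, a made-up sample size, since the example does not say how many people saw each button.

```python
from math import sqrt
from scipy.stats import norm

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for the difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)            # pooled click rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (p_a - p_b) / se
    p_value = 2 * norm.sf(abs(z))                           # two-sided p-value
    return z, p_value

# Hypothetical sample sizes: 200 recipients saw each variant.
z, p = two_proportion_z_test(clicks_a=160, n_a=200,   # 80% clicked "Save a Life"
                             clicks_b=130, n_b=200)   # 65% clicked "Donate blood today"
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests the difference is not chance
```

With these illustrative numbers the p-value comes out well below 0.05, so “Save a Life” really would look like the winner rather than a fluke.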
Ultimately, what works comes down to your specific audience, which is why I feel testing is so worthwhile.