What is A/B Testing? (And what does it mean for agents?)
Successful agents know that a great website is one that continually evolves and improves, and the A/B test (also known as a split test) is one of the most important tools for measuring how well it is working.
It tells us which components of a page are working and which aren’t. Two (or more) versions of a page are produced that differ in only one variable; everything else on the page is identical.
Almost every element on a page should be tested, and tested on a regular basis. In earlier posts, we’ve talked about the importance of loading speed tests and content production when it comes to making a website really provide value and boost an agent’s effectiveness, but A/B tests are about making sure the website’s performance is continually enhanced with the help of data. Inevitably, some things that seemed like a great idea at launch may not matter six months down the line, while others may work so well that they deserve to be built upon.
How does it work? The crash course version:
The ‘A’ refers to the original testing variable, i.e. the control, while ‘B’ refers to the variation, the new test variable that runs against the original A version.
For example, perhaps A refers to the way listing photographs have always been laid out on a page, and B refers to a new carousel or gallery system that an agent is starting to believe may work better in the future. It’s important to realize that this ‘hypothesis’ comes from data trends based on visitor behavior on the website that the agent has kept track of.
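To keep results clean, each visitor should consistently see the same version of the page on every visit. A minimal sketch of how that assignment might work, assuming a deterministic hash-based split (the function name, experiment name, and visitor ids here are hypothetical, not part of any particular platform):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "photo-layout") -> str:
    """Deterministically bucket a visitor into A (control) or B (variant).

    Hashing the visitor id together with the experiment name keeps each
    visitor in the same bucket across page loads, so results aren't
    contaminated by people seeing both versions.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same visitor always lands in the same bucket:
print(assign_variant("visitor-42"), assign_variant("visitor-42"))
```

Because the split is driven by a hash rather than a coin flip at page load, roughly half of visitors see each version, and no one flips between layouts mid-journey.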
Being able to determine which version moves the relevant metrics in a positive direction is extremely useful for optimizing a website’s ROI, and it gives agents a more comprehensive understanding of which business opportunities are worth pursuing further, simply because more data is being processed and given meaning.
What do agents need to keep in mind?
Although it’s recommended that every element of a page be tested, elements should be tested one at a time. Having said that, some elements are more relevant to real estate professionals than others:
Search filters - This is the element visitors are drawn to first; it’s what most have ultimately come for, and how well it works will determine whether they visit again and how long their journey with that agent lasts. A/B testing can help an agent figure out which search terms matter most to their audience and make those more prominent, or identify search options that aren’t as important and remove them entirely.
Navigation - A/B tests can help determine which order and spacing of navigation items perform best; for example, perhaps the Blog button gets more activity when placed higher up the navigation menu, or some tabs can be removed or replaced by new ones.
CTAs - A very important element to test because of their pivotal role in the visitor journey. CTA A/B tests can cover wording, sizing, design, placement, even background color, and many professionals are surprised by how drastically visitor activity can change with the smallest adjustments.
Forms - Another important element that takes time and plenty of testing to gradually ‘get right’. Shorter forms tend to do better overall, but that depends on how well you know what your audience comes online for. Quicksprout has done remarkable research showing that the fewer personal details visitors are asked for, the higher the conversion rate; for example, reducing the number of form fields from 6 to 3 can increase your conversion rate by an average of 66%.
Description/Copy - Copy should be carefully crafted (and tested) on every page of an agent’s site, but perhaps even more so in listing descriptions. An analysis of 24,000 home sales revealed that some words do far better than others in real estate: ‘updated’, ‘luxurious’, ‘landscaped’, ‘remodeled’, and ‘stainless’, to name a few. Beyond word choice, A/B testing should leave room to test which tone, font size, and headlines do better. Does familiar, friendly language draw an audience in, or does a more knowledgeable, to-the-point tone hit the spot?
A/B testing can be a lot of work, but done right, it is worth the investment of time and energy, as agents become better equipped to make data-backed decisions for the future.