Have you ever gone to a store and wondered why certain products are placed on one shelf versus another? Chances are that the coupon machine sticking out at eye level was put there for a reason. Most successful companies spend a lot of effort collecting data on how customers browse and interact with their stores. Why should a website be any different?
Websites are often the first look at a brand or company. When a person visits a website, there is almost always a reason for doing so. Whether it’s reading about a new service, buying a product, or researching a new topic, the user knows what they are there for. Knowing WHAT a user is doing and WHY can help content creators better tailor content to their users.
When a company takes a data-driven approach, it makes strategic decisions based on how collected data can be interpreted. Those decisions rely on quantitative data gathered from a number of sources: A/B tests show how users interact with one design versus another, analytics reveal which pages get more traffic or higher bounce rates, and surveys capture feedback directly. All of these are valid data collection methods that can inform future optimizations.
To start working in a data-driven way, you’ll need to collect data. That can be as simple as tracking page views with an analytics tool, or as advanced as recording the behavior of specific target audiences. Whatever data you have, it makes it much easier to spot problems in your UX or designs.
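Even the simplest page-view data can point you at the pages worth fixing. As a minimal sketch (the URLs and event structure here are made up for illustration, not from any particular analytics tool), counting views and form submissions per page is enough to surface low conversion rates:

```python
from collections import Counter

# Hypothetical raw analytics events: (page_url, event_type)
events = [
    ("/pricing", "page_view"),
    ("/contact", "page_view"),
    ("/pricing", "page_view"),
    ("/contact", "form_submit"),
    ("/pricing", "page_view"),
]

# Tally views and submissions separately, keyed by URL
page_views = Counter(url for url, kind in events if kind == "page_view")
submissions = Counter(url for url, kind in events if kind == "form_submit")

# Pages with many views but few submissions are candidates for testing
for url, views in page_views.most_common():
    rate = submissions[url] / views
    print(f"{url}: {views} views, {submissions[url]} submissions ({rate:.0%})")
```

A real analytics pipeline would pull these events from a tracking service, but the idea is the same: view counts alone tell you where the traffic is, and pairing them with a goal metric tells you where the problems are.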
Let’s look at an example QD was recently tasked with fixing. A client noticed that a particular form was not yielding many submissions compared to the number of people visiting the page. We needed a way to optimize the page so it would drive more form engagement. We needed data.
Collecting data generally starts with a hypothesis. What could we change about this page that might affect form usage? Maybe a different layout? Maybe move it somewhere else on the page? A form is less likely to be filled out if users aren’t seeing it.
In this situation, our hypothesis was: “I think this form will get more submissions if we move it closer to the top of the page.” From a UX point of view that may or may not be true, but instead of quoting articles or leaning on assumptions, the client will see more value in actual data. Let’s test it.
At QD we frequently use Adobe Target to run A/B tests. With its quick drag-and-drop tool, we can manipulate the DOM of the page to move the troubled form closer to the top. We can then serve this new “experience” to half of the traffic visiting the page.
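Adobe Target handles the traffic split for you, but the underlying idea is simple enough to sketch. One common approach (this is an illustrative sketch, not Target’s actual implementation) is to hash a stable visitor ID so each person is assigned to the same experience on every visit, with traffic landing roughly 50/50:

```python
import hashlib

def assign_experience(visitor_id: str, test_name: str) -> str:
    """Deterministically bucket a visitor into experience A or B.

    Hashing a stable visitor ID means the same person always sees
    the same experience, and across many visitors the split is
    approximately even.
    """
    digest = hashlib.sha256(f"{test_name}:{visitor_id}".encode()).hexdigest()
    return "B" if int(digest, 16) % 2 else "A"

# Over a sample of visitors, roughly half land in each bucket
buckets = [assign_experience(f"visitor-{i}", "form-position") for i in range(1000)]
print(buckets.count("A"), buckets.count("B"))
```

Seeding the hash with the test name means the same visitor can fall into different buckets for different tests, so one long-running experiment doesn’t bias the next.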
After a week (or any set block of time), we may well find that moving the form up does perform better. More importantly, we have the data to prove it.
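“Performs better” should mean statistically better, not just a bigger raw number. A two-proportion z-test is one standard way to check that the difference between the two experiences is unlikely to be noise (the submission counts below are hypothetical, purely for illustration):

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return the z-score and two-sided p-value for the difference
    between two conversion rates (conversions / visitors)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both rates are equal
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical week of data: 120/2400 submissions with the original
# layout vs. 168/2400 with the form moved to the top of the page
z, p = two_proportion_z_test(120, 2400, 168, 2400)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Significant: the new form position likely performs better.")
```

If the p-value is above your threshold, the honest conclusion is that the test was inconclusive and needs more traffic or more time, not that the variant “won by a little.”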
Sometimes a test won’t support your hypothesis, but it is important to remember that even a failed test provides value. Maybe that new CTA button style does not improve page click-throughs. That’s okay – you now know that changing all CTAs to the new style is a bad idea. Now what?
Let’s test something else. If the style of the CTA isn’t the problem, maybe it’s the text associated with it. New hypothesis: “I think this CTA will perform better if we change the wording.” Setting up a test to swap out text is just as easy and is a perfectly valid test in many situations. “Click Here” might be performing poorly, while the data might show that “Read More” gets more clicks.
It’s important to know that optimizing never ends. Once you’ve improved your form submissions or CTA click-throughs, there will always be more metrics to improve. The pages suffering the most are a great place to start.
At QD we love running tests and optimizing websites. Drop us a line and let us know how we can help you become data-driven!