All them A/B tests that never happened


It's fine to be indecisive. Designers who are curious and constantly looking to learn are, in my opinion, more valuable to companies than designers-who-know-it-all-based-on-zero-facts. That's the beauty of digital design: using data to make informed decisions about what an interface will look like. Art and science combined into a single discipline.

The problem with proposing A/B tests as a shortcut to end that type of debate is that the discussion leading to that conclusion is usually not about which version will perform best; the difference between the two options has more to do with competing business priorities than with different UI approaches to the same priority. The two versions are trying to achieve different things, and both matter to the business. Unable to prioritize one goal over the other, teams appeal to A/B tests as the solution to end all problems.

In many cases the team ends up running the tests, and unsurprisingly, the results point to:

- Version A gets more clicks on our sign-up button
- Version B gets longer session times exploring our content

Finally, the team goes back to the question that should have been answered before testing: do we want more sign-ups, or do we want longer engagement? What is our priority in the first place?

A/B tests are powerful tools, but designers and product managers need to understand more clearly what these tests are good for. Otherwise, they become an escape hatch for a lack of consensus, for the lack of a solid product strategy point of view. Luckily, most of those A/B tests never happen.

Source: https://uxdesign.cc/all-them-a-b-tests-that-never-happened-10ea0eddec80 (uxdesign.cc, 2019-11-03)
