UI Design: Using Machine Learning Thoughtfully

There are vast amounts of data available right now, and if you track clickstream data on your website or app, chances are you are using analysis, data science, or machine learning to try to make sense of it all. This is great! Whatever level of analysis you are using, moving to a data-informed culture is a great way to improve your product and the experience for your users.

But it is important to highlight some pitfalls that can come up if you lean too heavily on machine learning to decide how to design your product, along with ways to work around them.

Machine learning algorithms are black boxes, which may lead to non-intuitive UIs

One risk of using a machine learning algorithm as the sole input to a UI design is that it produces a non-intuitive UI. Users have learned how to use technology to do various tasks: see a blue, underlined piece of text and you might expect it to be a link leading somewhere (and be frustrated, upon clicking it, to discover that it is not!). Much of this learning has happened in the background of our lives on computers and online, and the result is a mental model of how we expect certain systems to work.

If machine learning algorithms are the sole input to a UI design, the result may not fit within a user’s model of how the system works. In a 14-person diary study, Nielsen Norman Group found that when people interact with systems built on machine-learning algorithms, they form weak mental models and have difficulty making the UI do what they want.

Part of this is because most machine learning algorithms are black box models: we put data in, we get outputs out… and we don’t necessarily know what happened in the middle to produce those outputs. Further, in most cases, users do not know which pieces of their data are being used to personalize their experience. Give their full article a read; I found it had some extremely valuable ideas.
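To make “black box” concrete, here is a minimal sketch on simulated data with hypothetical feature names: the model hands back a click probability with no rationale attached, and even an inspection tool like scikit-learn’s permutation importance only ranks which inputs matter overall; it says nothing about the reasoning behind any single prediction.

```python
# A minimal sketch of the "black box" problem, using simulated data and
# hypothetical feature names -- not any particular product's setup.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(42)
n = 1_000

# Hypothetical per-user features: weekly sessions, scroll depth, hour of day.
X = np.column_stack([
    rng.poisson(5, n),        # sessions_last_week
    rng.uniform(0, 1, n),     # avg_scroll_depth
    rng.integers(0, 24, n),   # hour_of_day
])
# Simulated outcome: did the user click the personalized element?
y = (0.2 * X[:, 0] + 3 * X[:, 1] + rng.normal(0, 1, n) > 2.5).astype(int)

model = GradientBoostingClassifier().fit(X, y)

# The model returns a probability with no explanation attached.
print(model.predict_proba(X[:1]))

# Permutation importance ranks which inputs matter overall, but it still
# says nothing about the reasoning behind any individual prediction.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, imp in zip(
    ["sessions_last_week", "avg_scroll_depth", "hour_of_day"],
    result.importances_mean,
):
    print(f"{name}: {imp:.3f}")
```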

Even with this in mind, I think that machine learning can still provide valuable information to help inform UI decisions. The trick is to use machine learning to inform the decisions: add in a layer of human experience. For example, supplement machine learning analyses with user interaction, like Nielsen Norman Group did in their diary study, or do user testing where possible. And before doing anything radical that provides a vastly different UI experience based on machine learning findings, make sure you know why the existing rules were in place, so you know the new alternative still provides what users expect that experience to provide. “Learn the rules like a pro, so you can break them like an artist.”

Data does not make us mind readers: We do not know why users did what they did, or even what they were trying to do

When we do any type of analysis of user behavioural data, we see what users did through their clickstreams. We do not see what they were trying to do.

So while we may notice that a particular element in a mobile app gets no engagement, or that another element, like sharing, gets lots of clicks but no completions, we don’t know what the user was trying to do. Did they expect clicking share to do something else? Are they not engaging with the element because they don’t know what it means, or that the app can even do that?
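To make this concrete, here is a minimal sketch of the kind of summary that raises those questions without answering them; the events table, element names, and actions are all made up. The numbers can show that “share” gets clicks but few completions. They cannot show what users expected “share” to do.

```python
# A minimal sketch, assuming a hypothetical events table with one row per
# user interaction -- your tracking schema will differ.
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 2, 3, 4, 1, 5, 6, 5, 6],
    "element": ["share"] * 5 + ["order"] * 4,
    "action": ["click", "click", "click", "click", "complete",
               "click", "click", "complete", "complete"],
})

# Clicks versus completions per element.
summary = events.pivot_table(
    index="element", columns="action",
    values="user_id", aggfunc="count", fill_value=0,
)
summary["completion_rate"] = summary["complete"] / summary["click"]
print(summary)
# "share" shows 4 clicks but only 1 completion -- the data flags the gap,
# but it cannot tell us why users abandoned.
```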

This is where user testing can reveal how real-life users intend to use your website or app. Another option is sifting through reviews of your product: if someone is really unhappy because they couldn’t make it do what they expected it to do, you can bet they are complaining about it somewhere.

The model must be trained on historical data, which tends to perpetuate the status quo

A machine learning model must be trained, and to train it you need data. That data is inherently historical in nature (oh, what I wouldn’t do to have a DeLorean sometimes). It is a snapshot of your current UI, and if you look solely at what is happening on it, you may find yourself perpetuating the status quo. In the less extreme cases this just leaves some money on the table, but in the worst cases it can trap you in unfortunate discriminatory equilibria we as a society may be trying to get out of (which I won’t get into here and will save for another blog post).

Let’s say that you have a web page with a giant green “Order Now” button in the middle of the screen. It accounts for 25% of order funnel starts, which is higher than any other entry point. Some might therefore be hesitant to change anything about it: why risk breaking a good thing?
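For illustration, that 25% figure is the kind of number a quick groupby produces; the entry-point names and counts below are made up.

```python
# A quick sketch of computing each entry point's share of order funnel
# starts -- the entry points and counts here are made up.
import pandas as pd

funnel_starts = pd.DataFrame({
    "entry_point": ["big_green_button", "search_results", "nav_menu",
                    "email_link", "footer_link"],
    "starts": [2500, 2200, 2000, 1800, 1500],
})
funnel_starts["share"] = funnel_starts["starts"] / funnel_starts["starts"].sum()
print(funnel_starts.sort_values("share", ascending=False))
# big_green_button leads with a 0.25 share -- but this only describes the
# layout you already have.
```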

However, what if, unbeknownst to your team, some completely different page layout (one that renames the button, changes its colour, and moves its location) could lead to an even higher click rate and order completion rate?

Of course, it’s probably not going to surprise some of you that I’m now going to bring up A/B testing. A way around this pitfall of historical data is to start doing A/B or multivariate testing. Create an alternative web page layout with incremental changes, show it to a randomized percentage of your customers, and compare the results of A versus B.
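Here is a minimal sketch of that workflow, assuming a 50/50 split and made-up conversion counts: a hypothetical helper buckets each user deterministically, and statsmodels’ two-proportion z-test compares the variants.

```python
# A minimal sketch of A/B assignment and comparison -- the helper, test
# name, and conversion counts are all made up for illustration.
import hashlib

from statsmodels.stats.proportion import proportions_ztest

def assign_variant(user_id: str, test_name: str = "order_button_v1") -> str:
    """Deterministically bucket a user into variant A or B (50/50 split)."""
    digest = hashlib.sha256(f"{test_name}:{user_id}".encode()).hexdigest()
    return "B" if int(digest, 16) % 100 < 50 else "A"

# Made-up results: conversions and visitors for each variant.
conversions = [530, 620]       # A, B
visitors = [10_000, 10_000]

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"A: {conversions[0] / visitors[0]:.2%}")
print(f"B: {conversions[1] / visitors[1]:.2%}")
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value is evidence that the layout change itself moved the
# metric -- the causal signal historical data on layout A alone cannot give.
```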

A/B testing allows us to ask “what if?” and see what actually happens. It helps us get at causality.

So while machine learning might unveil interesting insights and recommendations, sometimes you will want A/B testing to confirm that your causal model of what is happening really holds, or to make sure you are not stuck implementing only the changes that historical data can suggest. Shake it up!

Conclusion

Machine learning on clickstream data is just one tool you have at your disposal to inform your design.

Other possibilities include:

  • User testing
  • A/B or multivariate testing
  • Reading negative reviews
  • Interviewing product users
  • Examining competitor products
  • Thinking through mental models of how systems work