Storytelling with data: Making data analytics work for your user

This article is part of the CX School series, where UX leaders share the most valuable lessons they’ve learned about building and delivering exceptional customer experiences. 

When your data analytics game isn’t up to snuff, it can have a serious impact on your customer’s user experience.

Take Flybits.

Our user, typically an associate-level marketer at a mid-market or enterprise financial institution, has to process a lot of data.

That is just one of their many pain points.

They use our product to deploy personalized experiences across their digital channels and measure how those campaigns perform, focusing on metrics like open rates. Most of them have used (or are currently using) marketing tools like Google Analytics, Salesforce Marketing Hub, Monetate, Mailchimp, and so on.

They are being overwhelmed by a deluge of marketing data. And nobody wants that. Least of all us.

Thankfully, when your data analytics provides the user with the information they need in a format that’s easy to digest, you can make their lives a whole lot easier. Plus, you can help them make better decisions, faster.

And when users can make better decisions, faster, they can iterate more rapidly, from deployment to results.

Which is exactly what our product design team wants to facilitate.

Now that I’ve briefly explained why data analytics matters, I want to highlight some of the exceptional work Jen, one of our UX designers, has been doing here at Flybits with push notification analytics.

The problem with push analytics

When our marketing users want to evaluate push notification performance, they typically refer to push open rate. It’s one of their KPIs.

Push open rate is total opens divided by total sends. Easy enough, right?

Well, not exactly.

You see, “total push notifications sent” tracks every push notification Flybits attempts to send to the end user through APNs, FCM, and other third-party push brokers.

I say “attempts to send” because not every push notification is successfully delivered to the end user’s device.

What does all of this mean?

It means that push open rates aren’t 100% accurate. They’re actually lower than they should be, because the total-sends denominator counts every push notification we attempt, including the ones that never reach a device.
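To make the arithmetic concrete, here’s a quick sketch with made-up numbers (nothing here reflects real campaign data):

```python
# Illustrative numbers only -- not real campaign data.
attempted_sends = 10_000                 # every push handed off to APNs/FCM
failed_deliveries = 2_500                # pushes that never reached a device
delivered = attempted_sends - failed_deliveries
opens = 1_500

reported_open_rate = opens / attempted_sends   # what the dashboard shows today
actual_open_rate = opens / delivered           # engagement among pushes that arrived

print(f"Reported open rate: {reported_open_rate:.0%}")  # 15%
print(f"Actual open rate:   {actual_open_rate:.0%}")    # 20%
```

That five-point gap is exactly the kind of distortion we’re talking about.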

So, when our marketing user consults their analytics dashboard to track push notification performance, they aren’t getting the complete picture. This can make it harder to understand whether a push was actually effective.

This is the problem with push analytics.

And it’s made more difficult by the fact that push delivery failures are pretty common. Failure rates run as high as 25% for Android and 10% for iOS, if you believe this article by CleverTap (and I do).

Non-delivery can happen for a number of reasons. Unreliable cell connectivity is definitely a big one. In some cases, it happens because the user has simply uninstalled your app or disabled push notifications. Too many requests made to the same device can also impact deliveries.

This is all to say that push non-delivery is a fact of life. That said, users should be able to see why non-deliveries happened and whether they were auto-resolved or required manual intervention.

[Table: Improving push delivery rates]

Scoping out a solution

Thankfully, Jen identified two potential solutions to the push problem.

  1. We can track push notification delivery to the user’s device. This approach would require us to send all push notifications as silent (there’s a rough sketch of what that looks like just after this list). That way, we’d be able to determine the total number of pushes that were actually delivered to end users. The tradeoff? We’d have to deprioritize our pushes, so in situations with high push volume, our notifications would be delayed.
  2. We can track push notification delivery to push brokers. Here, we’d be able to determine the total number of pushes that were actually delivered to the push broker. We’d use this number as a proxy for total pushes delivered to the end user’s device. The issue here is that we won’t have any insight into pushes that failed after they passed through the broker.
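
For a rough sense of what the first option involves: a “silent” (background) push carries no visible alert; it just wakes the app so it can record that the notification arrived. The payload below is a simplified APNs-style sketch, and the receipt step is hypothetical rather than an existing Flybits endpoint.

```python
# Simplified sketch of a "silent" background push (APNs-style payload).
# Background pushes omit alert/sound/badge and set content-available; APNs
# also expects them to be sent at low priority, which is the deprioritization
# tradeoff mentioned in option 1 above.
silent_payload = {
    "aps": {"content-available": 1},
    "notification_id": "abc-123",  # hypothetical key the app echoes back
}

# On the device, a background handler would report a delivery receipt
# (e.g. the notification_id plus a timestamp) to an analytics service;
# counting those receipts yields "total pushes actually delivered".
```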

We decided to go with the second approach because we didn’t want to risk deprioritizing our push notifications, since many of our customers require them to be sent to end users at the right time.
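
To sketch what that looks like in practice: push brokers respond per notification, telling you whether they accepted it and, if not, why. Counting acceptances gives the “delivered to push brokers” number, and the rejection reasons become the bounce breakdown. The snippet below is a simplified illustration; the status codes and reasons follow the general shape of APNs/FCM responses, but none of this is Flybits’ actual integration.

```python
from collections import Counter

def classify(status_code, reason=None):
    """Classify a single broker response as accepted or bounced (illustrative)."""
    if status_code == 200:
        return ("accepted_by_broker", None)
    return ("bounced", reason or "unknown")

# Example responses for one campaign (made-up data).
responses = [
    (200, None),
    (200, None),
    (410, "Unregistered"),     # e.g. the end user uninstalled the app
    (429, "TooManyRequests"),  # too many pushes sent to the same device
]

results = [classify(code, reason) for code, reason in responses]
delivered_to_broker = sum(1 for outcome, _ in results if outcome == "accepted_by_broker")
bounce_reasons = Counter(reason for outcome, reason in results if outcome == "bounced")

print(delivered_to_broker)  # 2 -- our proxy for pushes delivered to end users
print(bounce_reasons)       # Counter({'Unregistered': 1, 'TooManyRequests': 1})
```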

There were two risks with this approach.

First off, imagine what would happen if we deployed pushes and a substantial number of them failed after passing through the push broker. Even though this wouldn’t be Flybits’ fault (not to mention that it’s an extremely unlikely scenario), it wouldn’t look good for us. So, that was a very real business risk.

The first risk has a relatively easy fix, though: we could provide marketers with an in-depth breakdown of where pushes failed and why. The second risk is that even with that breakdown, marketers couldn’t fix the underlying issue themselves; they’d still need to contact their development team or Flybits. That was a usability risk.

Surfacing the right data

Once we’d settled on the right approach, it was time to brainstorm. The objective? To determine the optimal way of communicating push notification data to the user via our analytics dashboard.

So, we took to the drawing board and sketched out four different ideas to help us tackle the problem.

Idea #1:

First, we split the data by device platform and added columns for “Total Delivered” and “Bounces”.

[Table: Push analytics idea 1]

Idea #2:

In this iteration, we changed the “Sends” column label to “Successfully delivered to push brokers” and provided a definition for the label in a tooltip that appeared whenever the user hovered over the table header.

[Table: Push analytics idea 2]

Idea #3:

We realized that “Unique Sends” and “Unique Opens” didn’t make much sense as column labels, since a user might have both an iOS and Android device, so we changed them to “Total Sends” and “Total Opens”.

[Table: Push analytics idea 3]

Idea #4:

We wanted to keep “unique” metrics, as they refer to the actual number of end users, so we decided to eliminate platform differentiation.

We also wanted to provide more context around push notification errors, without overwhelming the user.

That’s why we settled on an expandable table. It gives users the option to investigate push failures on their own terms.

[Table: Push analytics idea 4]
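
One way to picture the information hierarchy behind this idea: the top-level row keeps the headline metrics, and an expandable section groups send errors by reason, flagging whether each one was auto-resolved or needs manual follow-up. The structure and field names below are purely illustrative, not our actual schema.

```python
# Illustrative data shape for one row of an expandable push analytics table.
push_row = {
    "push_name": "Spring savings offer",  # hypothetical campaign name
    "total_sends": 10_000,                # pushes accepted by the push brokers
    "unique_opens": 1_500,
    "open_rate": 0.15,
    # Collapsed by default; shown only when the user expands the row.
    "send_errors": [
        {"reason": "Device token unregistered", "count": 320, "status": "auto-resolved"},
        {"reason": "Payload too large", "count": 12, "status": "needs developer fix"},
    ],
}
```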

We decided to concept test our final iteration because it captured all the data our users would need to evaluate push notification performance.

Getting market feedback

We tested with five users. “Why five?” you might be asking yourself.

Well, because after five, you end up observing the same findings over and over again (more on that here).

The results were clear.

Four out of five users found the metrics in the push notifications table to be very useful in determining push performance. The outlier noted that they could find that data in other places on the analytics dashboard.

“Very useful to see how many push notifications you’re sending, and the open rates. Seeing send errors is also a nice touch. It gives you the opportunity to prevent some of the failed deliveries.”

Three out of five users indicated that they would’ve liked to have seen push data split across iOS and Android.

“I would like to see push metrics broken down by iOS and Android. We run multiple apps, we decide which ones we’re going to spend time on and that’s budget money. It’d be good to know where most of our users are, and what’s working where.”

While the majority of users didn’t understand the data listed in the expandable errors table, they unanimously agreed that it was critical information to have, should they need support from their development teams.

“I’d like to have more context into delivery failures. It’s really important to be able to understand exactly what the errors were for the development team to be able to fix them.”

All users agreed that knowing why push deliveries failed would influence how they planned future campaigns.

“When you understand how many push errors you had and why, you can determine whether the campaign failed or whether there were technical errors.”

The big learning we picked up was simple. The more information we can provide marketing users with, the better—keeping in mind, of course, that there needs to be an information hierarchy, like the expandable errors table, to make sure they aren’t overwhelmed.

Where we landed...

Voila!

[Table: Push analytics performance]

As you can see, we’ve incorporated user feedback by providing data for iOS and Android, so that when marketers deploy push notifications, they can see where they performed best.

To be clear, this isn’t the final design.

Today, we’ve looked at how product designers research, test, and implement new designs. Next time, we’ll look at how data visualization figures into the equation by unveiling our final design for the push analytics feature.

Until then!
