It's the nightmare that keeps analytics leaders up at night:

Our analytics—the ones we spent so much time and money on—the analytics that were supposed to generate new revenue and new customers, are failing.

Whether you are building a customer-facing data product or implementing analytics inside your organization for internal use, the fact is that some analytics projects do fail. But when they fail, they almost never fail the way you think they're going to... At the start of an analytics effort, most project leaders imagine a cloud of all the technical implementation disasters that might occur:

  • We picked the wrong analytics vendor... Fail!
  • We can't access the data effectively... Fail!
  • The features we were promised don't work... Fail!
  • It's too slow... Fail!

While all of these issues certainly can occur, my experience has shown that they are very rarely the root causes of a failed data project. Technical failures such as these tend to be easily addressed. Modern analytics vendors are experienced in all manner of data configuration and are highly motivated to get your project implemented successfully. Between basic project planning and support from your analytics vendor/partner, it's almost never the technical issues that result in your downfall.

But analytics are complex and as a result our minds race to think about all of the bad technical things that complexity could bring. And that's the problem. Focusing on the potential technical risks masks what's really going to get you. It's like locking your gaze on the tiger on a distant shore while a crocodile is lurking just below the surface of the water by your feet.

When analytics projects fail, it's rarely that they just "don't work." It's rare that you simply can't access your data, render it into charts, or make those charts available to your users. What happens far more frequently is a "failure to thrive." The analytics are implemented successfully from a technical perspective, but they just never "take off." They don't attract the new users that were expected, they don't generate the revenues promised, or in the case of internal analytics, they don't deliver the insights required to run the business. This is actually far worse than a technical failure because the remedies aren't as obvious or as easy to apply. There's no code to correct, no data to cleanse, no patch to apply. It's a bewildering situation when your analytics are working from a technical perspective, but aren't achieving overall project goals.

As a data product strategist—someone who helps businesses design analytic and data-driven products—I've seen my share of troubled projects. What I've found is that these struggling analytic efforts tend to share common root causes. Three main problem categories account for the majority of the instances when analytics fail to thrive and deliver results. And the good news is that these problems are solvable.

 

Reason for Failure #1: You built it for you, not the customer

Early in my career, I joined a business that was looking to turn its huge volume of data into paid insights for its customers. It wanted to build a data product. I joined not long after the decision to build the data product had been made, and right out of the gate I made a giant mistake. The product lead had very strong opinions about the analytics we should be creating for our users. I took those opinions as fact and used them as the basis for the data product. It didn't work out too well.

The biggest mistake you can make in implementing your analytics is to create them for yourself, not the actual user. It doesn't matter whether your user is a paying customer outside your company or one of your company's employees who will be using the analytics. They are not you and you are not them. Your perspectives and needs are often very different.

In my situation, we built the analytics based on the opinions of our own team—a team of very savvy engineers and technologists who understood our product completely and knew exactly how to interpret the data. Our customers, however, didn't share that perspective. They were experts in their OWN businesses, not in our product. When they used a product designed around our technical expertise, they found it confusing. Our view of their business needs simply didn't match what they experienced every day. The product saw limited adoption in its initial form until we wised up and built functionality based on user pain points and not our own technology goals.

Solving this problem:

The best way to solve the problem of analytics designed for you and not the users is to avoid the situation in the first place. Start your project with these steps:

  1. Identify the 2-3 key personas to be served at launch.
  2. Interview potential users who represent those personas and find out what they do in their jobs and how they solve problems today in the absence of analytics.
  3. Create a diagram linking the persona, what they do, and the problems they are experiencing.
  4. List the analytics that you can provide to solve the problems you identified.

Every analytic you place on the page should be directly tied to a problem, a workflow, and a persona. If you can't link a chart to a problem you identified for a persona—don't include that analytic. It's likely there because of your desire to add elements, not because of a user's need.
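
To make this concrete, here's a minimal sketch in Python of the persona-to-problem-to-analytic mapping described above, along with a simple check that flags any chart that can't be traced back to a persona's problem. The personas, problems, and chart names are purely hypothetical; treat this as an illustration of the bookkeeping, not a prescribed implementation.

```python
# A minimal sketch of the persona -> problem -> analytic mapping.
# All personas, problems, and chart names are hypothetical examples.

persona_map = {
    "Regional Sales Manager": {
        "Can't see which reps are behind quota": ["quota_attainment_by_rep"],
        "Doesn't know which deals are stalling": ["pipeline_age_by_stage"],
    },
    "Customer Success Lead": {
        "Can't spot accounts at risk of churn": [
            "usage_trend_by_account",
            "support_ticket_volume",
        ],
    },
}

def unjustified_charts(persona_map, charts_on_dashboard):
    """Return charts that aren't tied to any persona's problem."""
    justified = {
        chart
        for problems in persona_map.values()
        for charts in problems.values()
        for chart in charts
    }
    return [c for c in charts_on_dashboard if c not in justified]

# Two of these charts exist only because someone on the team liked them.
dashboard = ["quota_attainment_by_rep", "word_cloud_of_notes", "3d_revenue_globe"]
print(unjustified_charts(persona_map, dashboard))
# -> ['word_cloud_of_notes', '3d_revenue_globe']
```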

But what if—like many analytics teams—you are too far down the development path to follow these steps? What if you've already launched, only to discover that you built the analytics for yourselves instead of for your users? Don't worry, recovery is still possible. In this situation, what you need to do is "fork" your analytics. Keep the "designed for us" dashboards up and running, but immediately start following the four steps outlined above. Once you've linked personas to workflows to problems, try to arrange your existing analytics by persona and problem. Often you'll find that some portion of the analytics can be re-ordered and re-used. If not, at least you've defined the roadmap for the next quarter.

 

Reason for Failure #2: You focused on features, not solutions

Analytics platforms these days are feature-rich. In fact, they have so many features that analytics teams often start their search for an analytics provider by creating a checklist of features. Does the MegaBI vendor have built-in ETL? Yes! +3 points! What about easy integration of outside predictive analytics models? Nope. Bummer. -3 for that miss.

Checklists of required functionality can be very useful as a basis for comparison, but they can lead to the second big problem that causes analytics projects to fail: a focus on features, not solutions. This problem occurs when analytics teams emphasize adding functionality to the application and lose sight of the core problems they are trying to solve for their users.

I like to think of this as the comparison between early Apple iPhones and early Android phones. The Android commercials would point out all of the great features the phones contained—far more than the Apple competition. 2.4GHz Snapdragon processor: yes! SD card slot: yes! A special dock that turns it into a desktop computer: included! Apple commercials were different. They rarely mentioned specifications or individual features, instead focusing on how the product was used and how it could enhance a buyer's life. Compared to Android, those early iPhones lacked key features (remember the missing cut & paste?), but Apple showed users how the product would solve their problems. Apple focused on the experience, not the specific features. We all know the results.

As an analytics leader, you don't get points for adding features to your data product. You get points for solving problems for your users. No one cares if you're missing a currently trendy bit of technology when you've understood and solved a key analytics issue that was preventing your users from being effective at their jobs. Having a lot of features might be enticing to buyers initially, but once the newness factor wears off it will be the problem-solving aspects of your analytics that keep users engaged.

Analytics that were designed in an attempt to check boxes on a features list will tend to have many options, filters, and methods of analysis but they won't guide the user from problem to solution. Instead, they'll present a series of options and hope the user is able to navigate the right path. Many users won't tolerate this frustration for long and will simply give up.

Solving this problem:

An easy way to determine whether you've focused on features instead of problem-solving is to look for themes or patterns in your dashboards, specifically patterns that guide the user to an understanding of where issues may be occurring. Do you have lots of analytics but no clear path for the user to follow? You might have a problem.

If you find that you've built analytics dashboards with many features (or charts) but without an emphasis on solutions, re-organization is the answer. For each persona you're serving, list the problems they're trying to solve and the analytics that may help them solve those problems. Re-arrange your analytics into a workflow based on the problems to be solved for each persona. This will result in multiple dashboards (perhaps one per persona), each with various tabs or sections (one per problem to be addressed). In future development, focus your efforts on the roadblocks preventing the user from finding solutions to those problems ("I can't find top performers because I don't have a chart by employee") rather than adding new functionality that isn't directly tied to a user pain point. When you structure your analytics into this type of workflow, you reduce the mental load on the user as you guide them to a solution and reinforce your product's position as the trusted analytics advisor. The end result is increased user engagement with your analytics.
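
If it helps to visualize the end state, here's a rough sketch (again with hypothetical personas, problems, and chart names) of what a persona-first layout might look like: one dashboard per persona, one tab per problem, and each tab containing only the analytics that help solve that problem.

```python
# A sketch of a persona-first dashboard layout: one dashboard per persona,
# one tab per problem, each tab listing only the charts that address it.
# All names are hypothetical.

dashboard_layout = [
    {
        "persona": "Regional Sales Manager",
        "tabs": [
            {
                "problem": "Which reps are behind quota?",
                "charts": ["quota_attainment_by_rep", "rep_activity_trend"],
            },
            {
                "problem": "Which deals are stalling?",
                "charts": ["pipeline_age_by_stage", "stalled_deals_table"],
            },
        ],
    },
    {
        "persona": "Customer Success Lead",
        "tabs": [
            {
                "problem": "Which accounts are at risk of churn?",
                "charts": ["usage_trend_by_account", "support_ticket_volume"],
            },
        ],
    },
]
```

Whatever form this takes in your tooling, the point is that the top-level navigation follows personas and problems, not chart types or data sources.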

 

Reason for Failure #3: You launched it and forgot about it

Whether your analytics are to be used inside your business or by paying customers, dashboards aren't "fire and forget" cruise missiles of analytical insight. You can't launch them, breathe a sigh of relief that the project was successful, and head to the bar for a drink. Analytics are living, breathing entities that must change over time to keep pace with the evolving needs of their users. Even if you were 100% correct in your assumptions about which analytics your customers needed on launch day, this likely won't hold true a year after launch. Your customers' needs will evolve as their businesses and processes change and as they become more proficient at using your analytics. Unfortunately, many analytics teams forget this. Analytics are frequently a side project performed in addition to the "core product" work, and launch day is often viewed as the end of a journey, not the beginning. This is a problem.

Imagine that you decide to pursue your lifelong ambition and run a marathon. You spend months training, buying the right shoes, learning good technique, and preparing for race day. On the big day, you get your number pinned to your shirt, line up, and wait for the starting gun to fire. Upon hearing the crack of the pistol, you take your first three steps and then veer off to the sidelines to relax. Good race, everyone! As absurd as this example is, it's what many analytics product teams do. They spend vast resources racing to launch day... and then relax after the analytics are released. As a result, user engagement tends to spike early and then slowly decline as users find flaws and gaps and become frustrated. Eventually you're left with a situation where you still have operating costs for your analytics platform, but too few engaged, paying users to generate a profit.

Solving this problem:

This is a straightforward problem to solve. Treat your analytics like a business-essential product line and give them the care and attention they deserve. Don't expect the analytics to drift along satisfying users with no focus from your team.

After launch, dedicate resources specifically to the analytics "product". Monitor user behavior—see which analytics are frequently used and which gather dust on your dashboards. Read every analytics support ticket or feature request that is received and start developing themes for future improvement. Even better, go and visit your analytics users. Whether inside your business or outside your walls, sit down and watch how they use your analytics to solve their problems. You'll likely witness gaps and ideas for future development firsthand using this technique.

Frequently examine who is using your analytics, what they are using, and how they're using the data. Staying in touch with your real-life user personas and the evolution of their analytic usage is the best way to ensure that your data product doesn't wither after launch.
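
As an illustration of that kind of monitoring, here's a small sketch using pandas. It assumes you log dashboard view events with columns like user_persona, chart_id, and viewed_at; that schema and the file name are hypothetical, so adapt it to however your platform actually records usage.

```python
import pandas as pd

# Sketch of post-launch usage monitoring. Assumes view events are logged with
# (hypothetical) columns: user_persona, chart_id, viewed_at.
views = pd.read_csv("chart_view_events.csv", parse_dates=["viewed_at"])

# Views per chart over the last 90 days, broken out by persona.
cutoff = views["viewed_at"].max() - pd.Timedelta(days=90)
recent = views[views["viewed_at"] >= cutoff]
usage = (
    recent.groupby(["chart_id", "user_persona"])
    .size()
    .unstack(fill_value=0)
)
usage["total_views"] = usage.sum(axis=1)
usage = usage.sort_values("total_views", ascending=False)

# Charts that ship in the product but attracted no recent views: candidates
# for redesign or retirement at the next roadmap review.
all_charts = {"quota_attainment_by_rep", "pipeline_age_by_stage", "word_cloud_of_notes"}
dusty = all_charts - set(recent["chart_id"].unique())

print(usage.head(10))
print("Gathering dust:", sorted(dusty))
```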

 

So there's good news and bad news for leaders looking to create analytical insights for their users. The good news is that you don't need to worry so much about the technical stuff. Find a good vendor, stay on top of the project, and it will work out just fine.

The bad news is that even a technically flawless project can fail to thrive and deliver the results you promised at the last board meeting. But this is only a problem if you don't take action, if you scratch your head and chalk up your dying analytics to users just not "getting" the application. The problem isn't them and it isn't technical—it's the result of your decisions. Make sure you've developed for your personas, not your engineering team. Build dashboards to solve user problems, not to show off the impressive features you've created. And once you've done these things, don't stop: never assume that because users like it today, they'll still be satisfied tomorrow.

Building great analytics isn't hard, but it does require discipline and attention to the non-technical side of the project. If your data product is failing to thrive and isn't delivering the results you expected, take a close look at the three areas described above. By taking note of the problems and then taking the right actions, you can change a failing project into a success.