The first time I was handed the task of monetizing the data generated within a company, it’s possible that I was… a bit… perhaps… overconfident. I mean, it’s theoretically accurate to say that I thought I knew way more than I really did when I look back at that first project. I’ve heard it said that the human body completely regenerates itself every ten years, so I feel completely confident in saying: that other guy made a lot of mistakes.

The new me, however, has learned from that other person’s missteps.

Building a data product isn’t easy. There are technical, sales, marketing, pricing, go-to-market, operations, support, and strategy issues that need to be addressed. I thought I had a firm grip on absolutely everything I needed to know with that first embedded analytics project for our customers. Here’s what I wish I’d known back then:

#1: White Labeling? What Does It Mean?

These days, it’s hard to find a vendor that doesn’t claim that you can white-label their product when embedding analytics in your application. But what exactly does “white labeling” mean? It turns out that it can mean a lot of different things depending on the vendor. Some allow you to add your logo to the analytics; some let you suppress their “Powered by X” tagline at the bottom of the page; others eliminate any mention of the underlying analytic engine. It’s not the same from vendor to vendor, so you need to check.

I found this out the hard way on my first analytic product. I asked (what I thought were) the right questions and learned that “yes, you can completely white-label the analytics when embedding.” Later, when there was an outage on the analytic platform, our perfectly white-labeled dashboards displayed the message “ERROR: UNABLE TO REACH (analytics vendor) CLOUD ENGINE. PLEASE CONTACT (analytics vendor) SUPPORT FOR ASSISTANCE”. Not good. Our customers were confused, and we had a lot of explaining to do. Make sure you understand what white labeling means to the vendor you select and that it matches your requirements.
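
If I were building it again, I’d also push to own the failure path in code. Here’s a minimal sketch in TypeScript of wrapping the embed call so that a failure shows your branding rather than the vendor’s (the names are hypothetical; “embedFn” stands in for whatever embed function your vendor’s SDK actually provides):

```typescript
// Minimal sketch: wrap the vendor embed call so failures render *our*
// branding instead of the vendor's error screen. "embedFn" is a stand-in
// for the real vendor SDK call; the markup and CSS class are ours.
async function renderDashboard(
  container: HTMLElement,
  dashboardId: string,
  embedFn: (el: HTMLElement, id: string) => Promise<void>, // hypothetical vendor call shape
): Promise<void> {
  try {
    await embedFn(container, dashboardId);
  } catch (err) {
    // Show a white-labeled message; no vendor names anywhere.
    container.innerHTML = `
      <div class="dashboard-error">
        <p>Analytics are temporarily unavailable.</p>
        <p>Please try again shortly, or contact our support team.</p>
      </div>`;
    console.error("Dashboard embed failed:", err); // for our on-call, not the customer
  }
}
```

One caveat: if the vendor renders its error state inside an iframe it controls, no wrapper can intercept it. Whether failures are even interceptable is exactly the white-labeling question to put to the vendor.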

#2: Get Vendor References

Embedded analytic capabilities used to be much rarer than they are today. Now, it’s tough to find a vendor that doesn’t promise the ability to add their analytics to your application (in days, not months!). But I’ve learned that it’s a good idea to check references when it comes to embedded capabilities. On my first project, I did what everyone does — I scoured the Internet and found the latest Enchanted Grid* report from a reputable analyst firm. I picked the vendors in the upper-right grid square and contacted them for demos. The problem is, vendors that rank highly on these reports are the ones whose functionality closely matches what the analyst firm thinks is essential.

Unfortunately, these reports rarely focus on embedded analytic data products and are nearly always written by people who have never developed a data product themselves. They don’t understand what a product team creating monetized analytics requires to be successful long-term.

The analytic platform that is best-in-class for internal marketing or sales analytics may be an awful choice for customer-facing products that require an ecosystem of management tools for deploying and maintaining customer instances at scale. Rather than relying solely on reports (yes, I still use them), I like to ask vendors for references. I want to speak to customers that are using the analytic platform to display dashboards or other functionality to real, live, breathing customers. I’ve found that a shocking number of vendors are unable to provide these references. They might claim embedding capabilities, but they can’t show battle-hardened examples of the functionality performing at scale. In these cases, I move on to a vendor that has real-world examples of long-term successful customer projects.

#3: Pricing Will Be a Challenge

Determining the price of your data product isn’t an easy task. In fact, in several of my projects, establishing the price at which to offer analytics took longer than the technical implementation itself. Yep, the analytics were ready to go, but we couldn’t launch because we couldn’t figure out a pricing plan. You can avoid this problem by starting early — like, at the beginning of your project when determining scope.

Decide “if” you’ll be charging for analytics (hint: yes, always charge for analytics) and then decide what functionality you could offer to command the price point you’d like to achieve. Pricing should be tied closely to product design. Determining price after all of the code is deployed and the data is flowing is a recipe for a lengthy pricing battle.

#4: Quality Assurance Is Key

I could be wrong, but I’m not sure that anyone races through the design phase of a product because they are really, really looking forward to performing the quality assurance steps. I mean, maybe there are people like that, but I’m not one of them. In fact, it is theoretically, conceivably, maybe, sort of possible that I’ve given QA less attention than I should have on past projects. On that first data product, we were completely ready to go when I scheduled the first review with a customer. Not the customer’s development team or product managers — the CEO. The dashboard looked fantastic and was a tremendous improvement over the static reports we’d delivered in the past. The customer’s CEO was impressed. He commented on how great they looked and how he could see the potential for improving the way he ran his business. And then he dropped the death blow: “Of course, these numbers are wrong.” Wait, what?!? “You’re showing repairs of $27 million last quarter. That’s more than my entire annual budget. Your numbers are wrong.”

It turns out that we had failed to catch that we weren’t applying currency exchange rates correctly. We weren’t finding the issue because we had no formal process for performing quality assurance. The result is that we spent an enormous amount of time getting that customer to trust our data again after seeing that errors not only could occur but were likely. It wasn’t a fun process. These days, I make sure that quality assurance is a key step in the development process and that we are 100% confident in our numbers before even the first demo outside our walls.
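
That formal process now includes automated sanity checks that run before anyone outside the team sees a number. Here’s a minimal sketch in TypeScript of the kind of check that would have caught our bug; the rates and the budget threshold are illustrative, and in practice the expected figures should come from numbers a domain expert has verified by hand:

```typescript
// Minimal sketch of a data QA check. Rates are illustrative; real checks
// should assert against figures a domain expert has verified independently.
type Repair = { amountLocal: number; currency: string };

const USD_RATES: Record<string, number> = { USD: 1, EUR: 1.08, JPY: 0.0067 };

function totalInUsd(repairs: Repair[]): number {
  return repairs.reduce((sum, r) => {
    const rate = USD_RATES[r.currency];
    if (rate === undefined) throw new Error(`No USD rate for ${r.currency}`); // fail loudly, never silently
    return sum + r.amountLocal * rate;
  }, 0);
}

// Plausibility check: a quarter's repair total should never exceed the
// customer's annual budget. Run against every instance before any demo.
function assertPlausibleQuarter(repairs: Repair[], annualBudgetUsd: number): void {
  const total = totalInUsd(repairs);
  if (total > annualBudgetUsd) {
    throw new Error(`QA failed: quarterly repairs $${total.toFixed(0)} exceed annual budget $${annualBudgetUsd}`);
  }
}
```

A check like this wouldn’t prove the numbers right, but it would have flagged a $27 million quarter long before a CEO did.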

#5: Scale Is About More Than Data

It used to be that one of the key questions I’d ask was about scale. “Can you scale to 100 million rows? How about a billion? What’s the response time at that scale?” These days I find that scale isn’t an issue. Almost all of the major analytic platforms can easily handle massive datasets given the correct implementation. But I still ask a different scale question: “Can you scale to 1,000 customer instances? How about 10,000? At this scale, how would I manage, upgrade, downgrade, etc. in bulk?” This scale question is still essential for the data product team to understand before they commit to a platform for their product. 

Analytic engines that look great, are easy to use, and are well-priced might turn into a nightmare once you realize that you must manually implement changes across thousands of customer instances. I found this out when I realized that the platform I had selected worked great for our five beta customers but would require an entire support team to operate at the customer scale we needed. Ask about scale, but not “data scaling.” Ask about “customer scaling.”
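
When I evaluate a platform now, I ask to see a change rolled out across every customer instance in one operation. As a rough illustration of what you want to be possible, here’s a sketch in TypeScript that assumes the platform exposes a management API; the endpoints and payload shapes are entirely hypothetical:

```typescript
// Hypothetical sketch: push one dashboard change to every customer instance
// through a management API. Endpoint paths and payloads are invented for
// illustration; the point is that this must be scriptable at all.
async function upgradeAllInstances(apiBase: string, token: string, version: string): Promise<void> {
  const headers = { Authorization: `Bearer ${token}`, "Content-Type": "application/json" };

  // List every customer instance (assumes a list endpoint exists).
  const res = await fetch(`${apiBase}/instances?limit=10000`, { headers });
  const { instances } = (await res.json()) as { instances: { id: string }[] };

  // Apply the upgrade to each instance; collect failures instead of halting.
  const failures: string[] = [];
  for (const inst of instances) {
    const r = await fetch(`${apiBase}/instances/${inst.id}/dashboards`, {
      method: "PUT",
      headers,
      body: JSON.stringify({ version }),
    });
    if (!r.ok) failures.push(inst.id);
  }
  console.log(`Upgraded ${instances.length - failures.length}/${instances.length} instances`, failures);
}
```

If the honest answer is that every change means clicking through an admin console once per customer, you’ve just met your future support team.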

 

Not surprisingly, I made more than five mistakes that I'd like to go back and warn myself about before that first project. I'll cover the next batch of things I wish I'd known in the next post.


Footnotes:
* “Enchanted Grid” copyright 2018, Kevin Smith