I had just accepted a new role at a small technology company when an email from my soon-to-be boss changed the direction of my career:

"We've got all this data we've gathered from our application.  We should do something with it, like adding some sort of business intelligence.  Start thinking about what we could do."  

Back then, I had no idea just how little I knew about selecting a business intelligence vendor, getting the system implemented and integrated, and creating an actual product offering based on BI capabilities.

Three years and four analytics implementations later, I know just how naive I was and I've got the scars to prove it.

For me that first BI project was an exciting (if a little nerve-wracking) experience — I had no concept of what was around each corner and which decisions were going to haunt me later.  I've implemented business intelligence into existing products multiple times now and learned quite a bit with each additional project.

I'd like to share the lessons I learned, the best practices, and the worst mistakes.  You'll still make plenty of mistakes on your own project, but you can avoid the pitfalls I encountered by using the techniques I learned through painful trial and error, the same techniques that ultimately made my implementations successful.

The Quest Begins: Trying to Find a Vendor

If you do a quick search on Google for "business intelligence tools", you will find 110,000,000 results.  Assuming 10% of those results are for BI tools, that each reviewer can assess two vendors per day, and that you have a team of five people to help in the assessment process — it would take you roughly 3,000 years to complete your selection process.
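
For the skeptics, here's the back-of-the-envelope math (every number in it is obviously tongue-in-cheek):

    # Back-of-the-envelope vendor-search math (tongue firmly in cheek)
    results = 110_000_000          # Google hits for "business intelligence tools"
    candidates = results * 0.10    # assume 10% of hits are actual BI tools
    reviews_per_day = 2 * 5        # two vendors a day, five teammates
    years = candidates / reviews_per_day / 365
    print(f"{years:,.0f} years")   # prints "3,014 years"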

This is a problem.

How do you cull through the massive number of BI offerings available today in a reasonable amount of time and without missing key players?  Analysts' reports are your friends...

I started my search by making a list of all the potential players in business intelligence/analytics that were reasonable solutions for our needs (more on that to come).  The problem was that I didn't have access to any Gartner reports and I didn't have the budget to pay $1,995 for the latest "Magic Quadrant".  What I learned early on is that you can call any of the BI vendors such as GoodData, Birst, Tableau, or Alteryx and they will give you a copy of the latest Gartner report with little convincing.  I called one of the vendors and explained that I needed to implement a white-labeled BI solution inside an existing web-based application and I had a copy of the latest Magic Quadrant report in my email within the hour.  In fact, many of the vendors now allow you to get a copy of the report from their Web site after filling out a quick form.  The only caveat is that it's often a truncated version only highlighting features of their product.  Call a vendor and ask for the full report.  

Why do you care about the full report?  Because it has a comprehensive listing of all the vendors you should be reviewing — it becomes your master list from which to start narrowing down solution candidates.  But how do you know who to include and exclude from your search?  Even the Gartner report has too many options for you to fully demo each application.  Start by defining your goals and personas.

Who Will Be Using This and Why?

One of the mistakes I made was that I didn't frame our goals crisply.  We had started early on by listing project goals, so we knew that:

  • We wanted to offer BI as part of our core product
  • It had to be embedded in an existing software-as-a-service application and maintain the look and feel of our application
  • It had to be easy to install and maintain (we didn't have many Engineering resources available to help in the implementation)
  • We had to be able to control costs as the application grew
  • It had to have a great user experience — both for us creating dashboards and for the end user consuming the analytics

I thought I had a good handle on exactly what we needed.  I didn't.  Basic project goals are necessary, but not sufficient for BI implementation.

The second time I implemented analytics, I corrected a major oversight from project #1.  I created personas describing exactly who would be using the system.  By persona, I don't mean a marketing persona: "35-45 year old male who lives in San Diego and enjoys the soulful sounds of the Backstreet Boys."  I mean a user experience persona that describes the user you are serving and the problems that keep them up at night.

It is important to fully define these personas because your BI application will likely have a wide range of users — from tactical users to executive-level strategic users.  I overlooked this step during the first project and ended up co-mingling tactical and strategic charts and graphs on the same dashboards.  As a result, I created pages that didn't quite fit any one user type well.  Charts that made sense for a front-line manager were useless to the COO.  The second time around, we had a workshop and defined the personas that would use our system, such as:

  • Chief Marketing Officer
  • Sales manager
  • Head of Customer Advocacy

For each of these personas, we then created a quick persona brief, an example of which is shown below.
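
A brief doesn't need to be elaborate.  Boiled down to plain text, a hypothetical one (the details here are illustrative, not from our actual project) might look something like this:

    Persona: Sales Manager
    Role: Runs a team of 10-12 account executives
    Goals: Hit quarterly quota; spot at-risk deals before they slip
    Pain points: Lives in spreadsheets; has no single view of pipeline health
    Questions the dashboard must answer: Which reps are behind pace?
      Which deals have stalled?  How does this quarter compare to last?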

The purpose of the brief was to get us thinking about the users that needed to be served by each dashboard we would create.  We still ended up with a mix of tactical and strategic roles, but on the second project we realized this and separated the charts more logically into different role-based pages.

I recommend that you start your process of vendor selection by making a list of the project goals and then move on to thinking about the specific users whose lives you are trying to improve through the use of analytics.  

Is There a Company Out There That Doesn't Sell Analytical Tools?

At this point in the implementation process you will have three key pieces of information:  a long list of vendors from whom to choose, a list of your project goals, and a list of personas that you need to serve.

The next step is to eliminate those vendors whose products don't align with the needs of your users.  For example, we quickly eliminated several "Traditional BI" vendors because although they had well-established products, they couldn't operate embedded inside our web-based application.  One strike — you're out.  From the twenty initial vendors we started with (selected via the Gartner report), we narrowed the list to seven potential solutions by eliminating those that couldn't fit our operating model.  We continued our narrowing process by evaluating the remaining candidates against a combination of factors.

To simplify the process, I created a matrix of both objective and subjective criteria.  I listed the project goals as criteria, but also added items such as "ease of working with the vendor".  Do not overlook the subjective criteria.  While meeting the basic technical requirements is critical, it's the subjective stuff that will make your life either fun or hellish over the next few months.  

For example, during the initial discussion with a particular vendor, the salesperson told our head of Engineering that he should go back and read the product specs webpage so that he'd be able to ask better questions on the next call.  That incident gave us a hint of what we might expect working with the vendor's team and factored heavily into our "ease of doing business" category.  And there wasn't another call.

Here's a sample of the matrix we used (vendor names sanitized to protect the guilty):
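
If you'd rather script it than spreadsheet it, the mechanics fit in a few lines of Python.  The criteria, weights, and scores below are made up for illustration; plug in your own:

    # Hypothetical weighted-scoring matrix; the criteria, weights, and
    # scores are illustrative, not the actual values from our project.
    weights = {
        "embeds in our SaaS app": 0.30,
        "ease of installation":   0.20,
        "cost control at scale":  0.20,
        "user experience":        0.20,
        "ease of doing business": 0.10,   # don't skip the subjective stuff
    }

    vendors = {
        "Vendor A": {"embeds in our SaaS app": 5, "ease of installation": 3,
                     "cost control at scale": 4, "user experience": 4,
                     "ease of doing business": 2},
        "Vendor B": {"embeds in our SaaS app": 4, "ease of installation": 4,
                     "cost control at scale": 3, "user experience": 5,
                     "ease of doing business": 5},
    }

    # Weighted total per vendor (each criterion scored on a 1-5 scale)
    for name, scores in vendors.items():
        total = sum(weights[c] * scores[c] for c in weights)
        print(f"{name}: {total:.2f} out of 5")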

This wasn't the only mechanism we used to evaluate the vendors.  For each candidate, I prepared a one-pager that covered items such as:

  • Years in business 
  • Size of the vendor
  • Number of implementations to date
  • Number of embedded or "Powered by" type implementations
  • Cost of year 1 (setup plus service fees)
  • Cost years 2-5 (mostly the on-going service fees)
  • Cost of professional services for implementation 
  • Support ecosystem (did they provide training, marketing support, etc.)

We combined all of these factors into our evaluation and selected three business intelligence vendors to perform proof-of-concept trials.

"Our Pricing Is Based on Several Random Factors Designed to Make Comparison Impossible..."

One item I didn't anticipate when starting the BI journey was how different the pricing models would be from vendor to vendor.  The schemes I encountered included:

  • Pay per user
  • Pay per server/CPU
  • Pay based on volume of data
  • Pay a percentage of your product's base cost to the BI vendor 
  • Pay a straight annual fee and pay for services

The incredible disparity in the pricing models created huge headaches for me and made it tough to compare pricing across systems.  I solved the problem by calculating the cost of implementation through the end of year one, and then the cost of each year past year one.  A key here was establishing the project assumptions based on the goals and personas.  Our assumptions on that first project included:

  • 20 million rows of data (we could control this number to limit our potential cost)
  • 1 refresh of the data per day
  • Professional services to help determine what data we needed, build the loading processes, create the initial charts, and teach us how to use the system
  • Two changes to the underlying data model per year

Using this information, I went back to each of the seven finalists and asked them to calculate the costs for the first five years.  If you use this approach — making the sales team do the costing for you — you'll save a lot of time.  
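
If you want to sanity-check the vendors' numbers yourself, the model is simple.  Here's a minimal sketch in which every rate is invented for illustration; substitute the figures from each vendor's actual quote:

    # Hypothetical 5-year cost model; every rate below is invented.
    # Substitute the numbers from each vendor's actual quote.
    setup_fee        = 50_000   # one-time implementation services
    annual_fee       = 60_000   # base subscription
    per_million_rows = 1_000    # data-volume charge per million rows per year
    rows_millions    = 20       # our assumption: 20 million rows
    change_fee       = 5_000    # professional services per data-model change
    changes_per_year = 2        # our assumption: two model changes per year

    recurring = annual_fee + per_million_rows * rows_millions \
                + change_fee * changes_per_year
    year_1 = setup_fee + recurring
    five_year_total = year_1 + 4 * recurring

    print(f"Year 1: ${year_1:,}")
    print(f"Years 2-5: ${recurring:,} per year")
    print(f"Five-year total: ${five_year_total:,}")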

With all the permutations for pricing that exist, I can't overstate how vital it is that your selected vendor's pricing model matches your intended product offering model.  If you get it wrong, you can limit the potential ways in which you can offer the product to your users.  As an example, we wanted to include basic BI for all existing customer accounts, all users, but with limited data.  If we had chosen a vendor that based pricing on the number of users, we'd have been in the position of either reducing margin as more people used the system or trying to limit the total users per account.  This was the opposite of what we wanted to achieve — we wanted everyone using the system and getting hooked!

As a result, the vendors that based pricing on user counts were eliminated.  Our product also had a very thin profit margin, so we eliminated from further consideration the vendor who wanted a percentage of our revenue.

You Talk a Good Game, Now it's Time to Show Us

It was down to three vendors and time to see the applications in action using our data and our desired charts.

The proof of concept (POC) trials were designed to mimic what we might see in a production situation — similar data, similar charts — just less data and no integration.  We wanted to see how fast each vendor could get up and running with our data, as well as how quickly they would respond to changes in the requirements and the little challenges that always pop up in a complex project.  The vendor we eventually chose completed not one but three separate trials, faster and with less stress than it took the second-place vendor to get anything up and running.

As an added bonus, the POC also allowed us to "test" the personalities of the implementation team.  We would be working with these people for the next 90 days — this was a great chance for a preview.  If a problem came up on a Friday evening when the demo for the CEO was on Monday morning, would they be willing and able to help?  Would they quickly pick up on our business challenges, or would we need to explain our company's business model repeatedly?  The answers to these questions were as valuable as the technical aspects of the POC and factored heavily into our final decision.  We ended up choosing the vendor that had great capabilities and a great team, moved quickly, and felt more like a partner than a vendor.  But now, we needed to negotiate the deal.

Send in the Lawyers

The work on the contract details was perhaps some of the most enjoyable, fulfilling time of my life.  I look back on those hours and days fondly...  

Oh wait, sorry, it was the most grueling, exhausting portion of the project.  Not because of the lawyers (they were delightful people who just happened to enjoy hour-long conversations about mutual indemnification) but because of me and what I overlooked.

Here's where I made another major mistake in the first project — I got the legal and finance teams involved way too late.  We'd already conducted the POC and had picked the vendor that we thought would be best for our product.  We'd even settled on the pricing both for the product and implementation services.  We're done, right?  No.  Wrong.  It's details time.

As soon as the legal and finance teams became involved in the project, a couple of things became clear.  First, I would need to spend significant time reviewing the project with them, explaining the business purpose, explaining our selection process, discussing security, etc.  We even had extensive discussions about why we were buying a solution instead of building our own (hint: building is always more expensive and time-consuming).  It was frustrating and took a significant amount of time. 

I should have gotten these teams involved from day one so they could understand our decisions as we made them.  My fault, lesson learned.

The second thing that became very obvious very quickly was how many little details I missed. Here are some of the items that I now know are essential to address early:

  • Contracts with existing customers — you'll need to change the terms so that they understand that data will be handled/processed by a third party (the BI vendor)
  • Right to audit security.  Do your customers have the right to view security audit documents provided by your BI vendor?
  • Service-Level Agreements:  Does the BI vendor's contractually obligated uptime match what you are promising to your users today?
  • Professional service budget — what happens if you don't use all the money?  Can it be used for other projects?
  • How do you define a "project"?  Is the BI tool only to be used for a single application, or can it be used to power another effort such as your internal metrics?
  • What happens if the vendor decides not to renew the contract after the term expires?  How much time do you have to implement a new solution?
  • When does the meter start running on pricing?  Do you pay for the BI system while it's still in development or only after sign-off is complete?
  • Intellectual property:  Where does the vendor's application end and yours begin?

There are many more finance/legal issues that I've learned need early resolution, but these are the ones most deeply seared into my psyche for the future.

Wait, You Mean We Still Need to Implement This Thing?

Vendor chosen, contract signed — it's time to get going!

And...  wait.  What are we building again?  Yet another of the mistakes from the first project that I corrected in project #2 was a failure to understand exactly what we wanted to show our users.  Having the personas developed for the second project helped, but so did having a complete "dictionary" of the charts, metrics, and dimensions we needed.

For the first project, we burned through a significant part of our services budget just trying to get a handle on what kinds of questions we needed to answer, what data those answers required, and where that data was located.  For the second project, we got a little smarter.  The first thing we handed the vendor's project team was a spreadsheet with the following for each analytic to be displayed:

  • A sample of the chart you want to show (e.g. a bar chart with dual axes)
  • A list of the metrics required by the chart (e.g. net revenue, customer count)
  • A definition of each calculation required (e.g. uplift = current month performance - baseline performance, baseline performance = rolling 12-month mean)
  • The dimensions required (e.g. by year, quarter, month, day, region, product, team, etc.)
  • The next level of drill-down (e.g. click the bar for the month to see the specific items that make up that month's performance)

It doesn't have to be pretty, but providing a simple dictionary of charts, metrics, definitions, and dimensions will accelerate the initial phases of implementation and save you money along the way.
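
We kept ours in a spreadsheet, but the shape of each entry is easy to show.  Here's a hypothetical row, rendered as Python for concreteness (the names and definitions are illustrative):

    # One hypothetical entry from a chart "dictionary"; the names and
    # definitions are illustrative, not from our actual project.
    chart = {
        "name":        "Monthly uplift by region",
        "chart_type":  "bar chart with dual axes",
        "metrics":     ["net revenue", "customer count"],
        "calculations": {
            "baseline performance": "rolling 12-month mean",
            "uplift": "current month performance - baseline performance",
        },
        "dimensions":  ["year", "quarter", "month", "region", "product"],
        "drill_down":  "click a month's bar to see the items behind it",
    }
    print(chart["name"])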

Quality Assurance is Key

I'll never forget sitting in a meeting with our CEO where we revealed the results of our hard work from the past month.  One of his first comments was "Why does this chart say the cost of this type of repair was $27 million?  Our whole budget is less than $20 million.  This can't be right."

Oops.  We hadn't performed quality assurance (QA) to the level we should have in that first project.  Although we spent hours reviewing charts, testing performance, and tweaking colors and layout — we had missed an issue with the data we were providing the vendor.  We had built a process to convert all expenses from local currency to U.S. dollars, and then forgot to change the loading process so that it used the new converted data.  Rubles mixed with Yen mixed with Pesos.  We never noticed it, but the CEO sure did.  With business intelligence you've got one chance to get it right.  Show someone data that they know is inaccurate and you'll have a hard time getting them to trust the system ever again.

The second time around I made sure this wouldn't happen again.  At the start of the project I formed a small team of people with expertise in various parts of the business to review each and every chart and make sure it made sense.  We provided the team with a copy of the same data we provided the vendor for the POC and had them run the same calculations manually.  No silly calculation errors this time around.  Again, lesson learned.  The hard way.
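
Much of that review can be mechanical.  Here's a minimal sketch of the kind of cross-check that would have caught our currency bug, assuming you can export both the raw data and the dashboard's numbers (the file and column names are hypothetical):

    import pandas as pd

    # Hypothetical files and columns; adapt to your own extracts.
    raw = pd.read_csv("raw_expenses.csv")       # the data we gave the vendor
    dash = pd.read_csv("dashboard_export.csv")  # the numbers the BI tool shows

    # Recompute the metric independently: total cost per repair type, in USD.
    ours = raw.groupby("repair_type")["cost_usd"].sum()

    # Flag any chart value that doesn't match the manual calculation.
    for repair_type, dash_total in dash.set_index("repair_type")["cost_usd"].items():
        if abs(ours.get(repair_type, 0) - dash_total) > 0.01:
            print(f"MISMATCH for {repair_type}: "
                  f"ours={ours.get(repair_type, 0):,.2f}, "
                  f"dashboard={dash_total:,.2f}")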

All That Other Stuff...

We made it through that first implementation successfully and in less than six months.  The second, third, and fourth implementations took less time and were equally successful.  But some things took far longer than others, and I recommend that you start on these tasks early in your project.

  • Product tiers:  Do you have levels of BI such as basic, plus, and pro?  What are the differences?  How are these priced?
  • Sales training:  You need to get the Sales team up to speed so that they can sell your new BI functionality.  How will you perform both initial and ongoing training?
  • Support processes:  How does a problem get initiated, triaged, and handed to the right party for resolution?  How do you define what gets handled by the BI team vs. what is handled by your team?
  • Customization:  If a customer wants a completely customized dashboard, are you willing to do it?  How will you price it?  How will you handle support for that unique instance?
  • Marketing:  You might need new logos, sales collateral, press releases, etc.

As I performed each implementation project, I kept a running record of all the details that could be easily overlooked.  Below is a sample of the mind-map I use:

Pulling It All Together

Analytics are hot, they are addictive, and they sell products.  But implementing business intelligence isn't a simple 1-2-3 type process.  You can learn as you go, as I did, but it's easier if you know where the trouble spots may lie and how to avoid them.  I've wrapped up the steps I followed (during the fourth implementation, not the first!) into a flowchart that shows the steps you need to consider.  They don't all have to be done in this exact sequence, but it will give you a sense of the basic process.

Pick a good vendor, understand your users, start on the little details early, and don't forget QA and you'll get your project successfully implemented and look like a rockstar.

Good luck with your project!