This wasn't the only mechanism we used to evaluate the vendors. For each candidate, I prepared a one-pager that covered items such as:
- Years in business
- Size of the vendor
- Number of implementations to date
- Number of embedded or "Powered by" type implementations
- Cost of year 1 (setup plus service fees)
- Cost of years 2-5 (mostly the ongoing service fees)
- Cost of professional services for implementation
- Support ecosystem (did they provide training, marketing support, etc.)
We combined all of these factors into our evaluation and selected three business intelligence vendors to perform proof-of-concept trials.
"Our Pricing Is Based on Several Random Factors Designed to Make Comparison Impossible..."
One item I didn't anticipate when starting the BI journey was how different the pricing models would be from vendor to vendor. The schemes I encountered included:
- Pay per user
- Pay per server/CPU
- Pay based on volume of data
- Pay a percentage of your product's base cost to the BI vendor
- Pay a straight annual fee and pay for services
The incredible disparity in the pricing models created huge headaches for me and made it tough to compare pricing across systems. I solved the problem by calculating the cost of implementation through the end of year one, and then the cost of each year past year one. A key here was establishing the project assumptions based on the goals and personas. Our assumptions on that first project included:
- 20 million rows of data (we could control this number to limit our potential cost)
- 1 refresh of the data per day
- Professional services to help determine what data we needed, build the loading processes, create the initial charts, and teach us how to use the system
- Two changes to the underlying data model per year
Using this information, I went back to each of the seven finalists and asked them to calculate the costs for the first five years. If you use this approach — making the sales team do the costing for you — you'll save a lot of time.
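The normalization described above can be sketched as a small script. All numbers, vendor labels, and pricing parameters below are illustrative assumptions, not figures from the actual project; the point is only to show how wildly different schemes collapse into one comparable five-year number once you fix the assumptions (rows of data, refreshes, user count).

```python
# Hypothetical five-year cost comparison across vendor pricing models.
# Every dollar figure here is made up for illustration.

def five_year_cost(setup, annual_fee, services_year1):
    """Year-1 cost (setup + services + first annual fee) plus four more years of fees."""
    year1 = setup + services_year1 + annual_fee
    return year1 + 4 * annual_fee

# Fix the project assumptions so each scheme prices the same workload:
USERS = 500          # assumed seat count for the per-user scheme
ROWS_MILLIONS = 20   # the 20 million rows from the project assumptions

vendors = {
    "per_user":   five_year_cost(setup=10_000, annual_fee=120 * USERS,          services_year1=40_000),
    "per_server": five_year_cost(setup=15_000, annual_fee=50_000,               services_year1=40_000),
    "per_volume": five_year_cost(setup=10_000, annual_fee=2_000 * ROWS_MILLIONS, services_year1=40_000),
    "flat_fee":   five_year_cost(setup=5_000,  annual_fee=60_000,               services_year1=50_000),
}

for name, total in sorted(vendors.items(), key=lambda kv: kv[1]):
    print(f"{name:>10}: ${total:,.0f} over 5 years")
```

Note that the per-user line is the one that scales against you: change `USERS` to 2,000 and that vendor's total quadruples while the others stay flat, which is exactly the margin problem described below.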
With all the permutations for pricing that exist, I can't overstate how vital it is that your selected vendor's pricing model matches your intended product offering model. If you get it wrong, you can limit the potential ways in which you can offer the product to your users. As an example, we wanted to include basic BI for all existing customer accounts, all users, but with limited data. If we had chosen a vendor that based pricing on the number of users, we'd be in the position of either reducing margin as more people used the system or trying to limit the total users per account. This was the opposite of what we wanted to achieve — we wanted everyone using the system and getting hooked!
As a result, the vendors that based pricing on user counts were eliminated. Our product also had a very thin profit margin and we eliminated the vendor who wanted a percentage of our revenue from further consideration.
You Talk a Good Game, Now It's Time to Show Us
It was down to three vendors and time to see the applications in action using our data and our desired charts.
The proof of concept (POC) trials were designed to mimic what we might see in a production situation — similar data, similar charts — just less data and no integration. We wanted to see how fast each vendor could get up and running with our data, as well as how quickly they would respond to changes in the requirements and the little challenges that always pop up in a complex project. The vendor we eventually chose performed not one but three separate trials for us, faster and with less stress than it took the second-place vendor to get anything up and running.
As an added bonus, the POC also allowed us to "test" the personalities of the implementation team. We would be working with these people for the next 90 days — this was a great chance for a preview. If a problem came up on a Friday evening when the demo for the CEO was on Monday morning, would they be willing and able to help? Would they quickly pick up on our business challenges or would we need to explain our company's business model repeatedly? The answers to these questions were as valuable as the technical aspects of the POC and factored heavily into our final decision. We ended up choosing the vendor that had great capabilities, a great team, moved quickly, and felt more like a partner than a vendor. But now, we needed to negotiate the deal.
Send in the Lawyers
The work on the contract details was perhaps some of the most enjoyable, fulfilling time of my life. I look back on those hours and days fondly...
Oh wait, sorry, it was the most grueling, exhausting portion of the project. Not because of the lawyers (they were delightful people who just happened to enjoy hour-long conversations about mutual indemnification) but because of me and what I overlooked.
Here's where I made another major mistake in the first project — I got the legal and finance teams involved way too late. We'd already conducted the POC and had picked the vendor that we thought would be best for our product. We'd even settled on the pricing both for the product and implementation services. We're done, right? No. Wrong. It's details time.
As soon as the legal and finance teams became involved in the project, a couple of things became clear. First, I would need to spend significant time reviewing the project with them, explaining the business purpose, explaining our selection process, discussing security, etc. We even had extensive discussions about why we were buying a solution instead of building our own (hint: building is always more expensive and time-consuming). It was frustrating and took a significant amount of time.
I should have gotten these teams involved from day one so they could understand our decisions as we made them. My fault, lesson learned.
The second thing that became very obvious very quickly was how many little details I missed. Here are some of the items that I now know are essential to address early:
- Contracts with existing customers — you'll need to change the terms so that they understand that data will be handled/processed by a third party (the BI vendor)
- Right to audit security: Do your customers have the right to view security audit documents provided by your BI vendor?
- Service-Level Agreements: Does the BI vendor's contractually obligated uptime match what you are promising to your users today?
- Professional service budget — what happens if you don't use all the money? Can it be used for other projects?
- How do you define a "project"? Is the BI tool only to be used for a single application or can it be used to power another effort such as your internal metrics?
- What happens if the vendor decides not to renew the contract after the term expires? How much time do you have to implement a new solution?
- When does the meter start running on pricing? Do you pay for the BI system while it's still in development or only after sign-off is complete?
- Intellectual property: Where does the vendor's application end and yours begin?
There are many more finance/legal issues that I've learned need early resolution, but these are the ones most deeply seared into my psyche for the future.
Wait, You Mean We Still Need to Implement This Thing?
Vendor chosen, contract signed — it's time to get going!
And... wait. What are we building again? Yet another of the mistakes from the first project that I corrected in project #2 was a failure to have a good understanding of exactly what we wanted to show our users. Having the persona developed for the second project helped, but so did having a complete "dictionary" of the charts, metrics, and dimensions we needed.
For the first project, we burned through a significant part of our services budget just trying to get a handle on what kinds of questions we needed to answer, what data those answers required, and where that data was located. For the second project, we got a little smarter. The first thing we handed the vendor's project team was a spreadsheet with the following for each analytic to be displayed:
- A sample of the chart you want to show (e.g. a bar chart with dual axes)
- A list of the metrics required by the chart (e.g. net revenue, customer count)
- A definition of each calculation required (e.g. uplift = current month performance - baseline performance, baseline performance = rolling 12-month mean)
- The dimensions required (e.g. by year, quarter, month, day, region, product, team, etc.)
- The next level of drill-down (e.g. click the bar for the month to see the specific items that make up that month's performance)
It doesn't have to be pretty, but providing a simple dictionary of charts, metrics, definitions, and dimensions will accelerate the initial phases of implementation and save you money along the way.
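One row of that spreadsheet can be expressed as structured data, which is handy if you want to version it or validate it before handing it over. The field names and the sample chart below are hypothetical, chosen to mirror the five items listed above.

```python
# A single, illustrative entry from the chart "dictionary" described above.
# The schema (field names) is an assumption, not the author's actual spreadsheet.
chart_dictionary = [
    {
        "chart": "Monthly uplift, bar chart with dual axes",
        "metrics": ["net_revenue", "customer_count"],
        "calculations": {
            "baseline_performance": "rolling 12-month mean of net_revenue",
            "uplift": "current month performance - baseline_performance",
        },
        "dimensions": ["year", "quarter", "month", "day", "region", "product", "team"],
        "drill_down": "click a month's bar to see the line items behind that month",
    },
]

# A cheap completeness check before the handoff: every entry defines all five fields.
REQUIRED = {"chart", "metrics", "calculations", "dimensions", "drill_down"}
for entry in chart_dictionary:
    missing = REQUIRED - entry.keys()
    assert not missing, f"dictionary entry incomplete: {missing}"
```

A plain spreadsheet works just as well; the value is in forcing every chart to answer the same five questions before the vendor's clock starts running.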
Quality Assurance Is Key
I'll never forget sitting at a meeting with our CEO where we revealed the results of our hard work from the past month. One of his first comments was "why does this chart say the cost of this type of repair was $27 million? Our whole budget is less than $20 million. This can't be right."
Oops. We hadn't performed quality assurance (QA) to the level we should have in that first project. Although we spent hours reviewing charts, testing performance, and tweaking colors and layout — we had missed an issue with the data we were providing the vendor. We had built a process to convert all expenses from local currency to U.S. dollars, and then forgot to change the loading process so that it used the new converted data. Rubles mixed with Yen mixed with Pesos. We never noticed it, but the CEO sure did. With business intelligence you've got one chance to get it right. Show someone data that they know is inaccurate and you'll have a hard time getting them to trust the system ever again.
The second time around I made sure this wouldn't happen again. At the start of the project I formed a small team of people with expertise in various parts of the business to review each and every chart and make sure it made sense. We provided the team with a copy of the same data we provided the vendor for the POC and had them run the same calculations manually. No silly calculation errors this time around. Again, lesson learned. The hard way.
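A check of the kind that would have caught the mixed-currency bug can be sketched in a few lines. The field names, the sample rows, and the idea of bounding totals against a known budget are all assumptions for illustration; the $20 million figure echoes the CEO's comment above.

```python
# Hypothetical pre-release data validation: flag unconverted currencies and
# implausible totals before any chart reaches an executive.

PLAUSIBLE_BUDGET_USD = 20_000_000  # a known upper bound, per the CEO's comment

def validate_expenses(rows):
    """rows: list of dicts with 'amount' and 'currency' keys. Returns error strings."""
    errors = []
    for i, row in enumerate(rows):
        if row["currency"] != "USD":
            # Rubles or yen loaded as-is would trip this check immediately.
            errors.append(f"row {i}: unconverted currency {row['currency']}")
    total = sum(r["amount"] for r in rows if r["currency"] == "USD")
    if total > PLAUSIBLE_BUDGET_USD:
        errors.append(f"total ${total:,.0f} exceeds plausible budget")
    return errors

# Two made-up rows: one unconverted, one clean.
problems = validate_expenses([
    {"amount": 1_500, "currency": "RUB"},
    {"amount": 900,   "currency": "USD"},
])
```

Automated bounds checks like this complement, rather than replace, the human review team: the script catches the mechanical errors cheaply, leaving the domain experts to judge whether the numbers make business sense.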
All That Other Stuff...
We made it through that first implementation successfully and in less than 6 months. The second, third and fourth implementations took less time and were equally successful. But, some things took far longer than others and I recommend that you start on these tasks early on in your project.
- Product tiers: Do you have levels of BI such as basic, plus, and pro? What are the differences? How are these priced?
- Sales training: You need to get the Sales team up to speed so that they can sell your new BI functionality. How will you perform both initial and ongoing training?
- Support processes: How does a problem get initiated, triaged, and handed to the right party for resolution? How do you define what gets handled by the BI team vs. what is handled by your team?
- Customization: If a customer wants a completely customized dashboard, are you willing to do it? How will you price it? How will you handle support for that unique instance?
- Marketing: You might need new logos, sales collateral, press releases, etc.
As I performed each implementation project, I kept a running record of all the details that could be easily overlooked. Below is a sample of the mind-map I use: