Business Discovery Blog

5 Posts authored by: Brad Peterman

QlikView skill sets are a perfect fit for IT teams. To illustrate this point, the chart below breaks down the skills you need for QlikView development into two areas: general BI skills and QlikView-specific skills. The vast majority of skills needed for an enterprise deployment of QlikView are not only common to deployments of other BI platforms but are likely already present in most IT departments.

This is important because new technologies can seem disruptive. The ability to use existing skills to get a new technology established and scaled out gives an IT group flexibility, continuity, and predictability in sourcing, training, and hardening a deployment.

There are a few skills that are specific to QlikView, although most people who have designed or developed in QlikView will tell you that these are intuitive features that take a fraction of the time to learn compared with other BI technologies.

When a BI tool has this close a match to an already-existing IT skill set, it opens the doors to many deployment and co-development options. IT groups can own and control as much of the development and design as they like, while still enabling business groups to participate in the delivery of Business Discovery applications that can expand the value and reach of a BI deployment more broadly throughout the organization.

QlikView and IT skills are like chocolate and peanut butter together (without the peanut allergies).

Part 2 of 2

 

In the previous post, I discussed how prior "successes" in data warehousing and data mart creation in my business life had not necessarily translated into business intelligence successes. Beautifully modeled data marts can go largely unused. Rock-solid and comprehensive data warehouses can be all but ignored by business users and BI teams. So how and why does this happen? What are we doing wrong?

I submit that the problem is not with modeling and building the BI data structures, but rather with the process of scoping them in the first place. How would a business user know what requirements to include in a data warehouse that is supposed to meet his or her needs for value-added business discovery and data mining? Consider that this business person has little or no knowledge of the contents of data sources around his or her company. The business users and IT both need a way to visualize and communicate the information opportunities that are locked away in their data sources. Once we know where these information opportunities are, we can proceed to gathering requirements against them for a data warehouse or data mart. This is not a discipline that is mature, technically enabled, or staffed in most companies today. So how will we do this?

Simply put, BI is the best discipline available to pinpoint warehouse needs. Yet, we're told that the warehouse needs to get built and perfected prior to BI. A catch-22 at best. A land mine at worst. So what do we do about that?

I'm seeing a new approach crop up with large QlikView clients. They are, of course, using QlikView as the front end to navigate and explore data structures like data warehouses and data marts. But they are also now starting to use QlikView as a discovery tool prior to building new BI data structures. Data warehouse teams are sitting down with business users and performing interactive discovery sessions, using QlikView to connect data sources as potential inputs to the warehouse. Is that the chicken before the egg? Maybe so. But to business users it's like having a full refrigerator waiting for them to experiment with and lead us to the menu they would like for lunch. What a great thing for IT and for the users!

With this approach, IT gets a seat at the table of discovery. IT is there for the "ah-ha" moments of epiphany and information opportunity. IT learns more about the business and potential value of BI in these meetings than in any others. The business teams see the value of IT in collaborative sessions like this, increasing the likelihood of funding and support for new projects and efforts. If you haven't seen this happen before, bring QlikView into a data requirements meeting and try it. Your users will thank you.

What's your opinion? If you've seen your company stuck in that rut of creating BI data structures in anticipation of use (if we build it, they will come), send me a reply and let me know how (or if) you've gotten them out of it. I know there are great successes out there leading the way to better BI. Tell me what you think.

This post is part 1 of 2

The first data mart I created was back in the late 1990s. I did everything right (or at least by the book). The users knew what they wanted, the data existed, the ETL tool was adequate, and the data model came together like a dream (a geeky dream, not a cool dream like having super powers or winning the lottery). I was stoked! We unveiled the masterpiece to our stakeholders to great fanfare and excitement. Then, a funny thing happened over the next 6 months. It barely got used. I had a hard time accepting this, and pushed for answers (my wife says I do that a lot).

Here is what I learned: It turns out that having a full set of easily attainable requirements doesn't always mean you are creating something of business value. In my case, I was creating a data mart and BI solutions that were based on what was already known. Sure, the data was now easier to report on since it was consolidated, so the users thought it was a great candidate for a data mart and a BI solution. Once it was in front of them, they were mildly impressed and basically went back to just pulling standard data from pre-defined parameterized reports. They were less interested in navigating, data mining, and performing what-if analysis than I was anticipating.

Fast forward 12 years and many failures later...I've come to realize that the most valuable epiphanies in business are much like those in our personal lives. The combinations of things taken in new ways create wonder and enlightenment. Consider your own experiences. Do you remember the first time you had strawberries with cream? How about a sandwich with mayonnaise? Did the discovery of ketchup impact your creativity and desire for lunch as a child? Did new options and combinations come to mind? Did you open up to new possibilities and explore the wonders of food? I know I did (wish I could stop now that I'm...ahem...older).

I submit that "real" BI is reaching into our companies' data refrigerators and coming up with new combinations of business facts and dimensions to create wonder and enlightenment. Unfortunately, most BI vendors and industry research firms (the usual suspects) would have us retreat to the safety of the most common and well-conformed data in our companies. This helps guarantee the success of the warehouse or the data mart. But when was the warehouse our finish line? When did completing a well-modeled data mart become the value in the BI chain? It never did. We need to own up to that and realize that the reason to create BI data structures is to promote the extraction and discovery of business value from them. Without that, we are presenting users with the same peanut butter sandwich on wheat bread that they have eaten for years.

Look for Part 2 of this blog very soon. In the meantime, reply with thoughts or experiences in this area. I'd love to hear them. I'll be here eating my pickle-banana-tomato sandwich while you type.

I'm continually amazed (and horrified) to see the annual BI maintenance and development budgets of clients that I visit. I think they are truly amazed themselves, and many times they wonder how they got to that point in the first place. It got me thinking about the parallels to automobile ownership. Did you ever have a car that you bought new, and then 5 years later you looked back at the cost of ownership of the car and said to yourself "HOW did I spend this much over 5 years on a single car?!"

Does the cost cycle feel like this?

Purchase price, warranty, add-ons, upgrades, repair parts, labor, tires, plugs, oil changes, parts, labor, towing, tune-ups, new parts, recalls, repairs, labor, brake job, tires, labor, etc...

It's likely you were caught in the complexity trap of cost. Some cars come off the line for $20k and cost you $5k to operate over 5 years. No real problem there. You have to expect some cost of ownership and maintenance. So why is it that many cars that cost $50k (new) will cost you an additional $30k just to operate over 5 years? Shouldn't they be even cheaper to operate, given the high initial cost? The problem is the complexity cost formula. It always prevails.

That which is more complex and requires greater specialty of skills will naturally cost more to maintain over time.

Think about it. Doesn't this same cost formula hold true for anything you own? Cars, houses, power tools. How about your company? Your government? It always holds true. That's why it's no surprise that this formula is at the center of the skyrocketing maintenance costs for traditional BI solutions. Here is the formula in its simplest form:

Cost of BI = (# of moving parts) X (# of specialty skills needed)

Using this formula, break down the BI solution(s) you already have, or even those you might be evaluating.

  1. How many tools are involved to get data from source to user? How many layers and processes does a column of data have to go through? How many repositories, databases, cubes, warehouses, data models, scripts and libraries does the data need to travel through to get onto a dashboard? These are your moving parts.
  2. How many specialty BI skill sets and software skill sets are needed to deliver BI with that solution? How many data modelers, ETL designers, DBAs, data architects, BI tool specialists, OLAP analysts, warehouse technicians, report developers, project managers and BI designers does it take to produce a dashboard or BI analytics interface? These are your specialty skills.

Multiply these together and you quickly get an accounting for those massive development and maintenance costs of traditional BI. The costs of specialty skill sets have gone up rapidly in IT over the past 10 years and will continue to rise. This puts great pressure on companies to either do less BI or to find a way to outsource some of the skills. If you haven't already seen this happening, keep your eyes open; you will.
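To make the formula concrete, here is a minimal sketch in Python. The component lists, stack names, and counts below are illustrative assumptions for comparison, not figures from any real deployment:

```python
# Toy sketch of the "BI Cost Formula":
# Cost of BI = (# of moving parts) x (# of specialty skills needed)
# All inventories below are hypothetical examples.

def bi_cost_score(moving_parts, specialty_skills):
    """Relative cost score: count of moving parts times count of specialty skills."""
    return len(moving_parts) * len(specialty_skills)

# A hypothetical traditional stack: many layers, many specialists.
traditional_stack = bi_cost_score(
    moving_parts=["ETL scripts", "staging DB", "warehouse", "cubes",
                  "semantic layer", "report server"],
    specialty_skills=["ETL designer", "DBA", "data modeler",
                      "OLAP analyst", "report developer"],
)

# A hypothetical simpler stack: fewer layers, generalist skills.
simpler_stack = bi_cost_score(
    moving_parts=["extract script", "in-memory app"],
    specialty_skills=["BI developer"],
)

print(traditional_stack)  # 6 parts x 5 skills = 30
print(simpler_stack)      # 2 parts x 1 skill  = 2
```

The point of the multiplication (rather than addition) is that each new layer tends to demand its own specialist, so complexity compounds rather than accumulates.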

The good news is that more and more companies are rejecting the tendency to add even more complexity and specialty to their cost formulas. This is helping bring about simpler, more powerful BI tools that can utilize traditional skill sets and business knowledge that are abundant in our companies today. Do the same. Be aware of the BI Cost Formula, and don't fall victim to it.

 

Brad Peterman

What is Scalability?

Posted by Brad Peterman Mar 15, 2010

One of the best buzz words in BI is "scalability". Heck, it's even fun to say...scay-luh-bill-uh-tee. The problem, like any good BI buzz word, is that it has way too many meanings that are convenient, but misleading. Here is the Wikipedia definition of scalability:

"scalability is a system's ability to either handle growing amounts of work in a graceful manner or to be readily enlarged."

So, this means that mowing your lawn with scissors IS scalable. Or put another way, "Scissors are a scalable solution to cutting your lawn".

Uh, no. Not so much. The Wikipedia definition is missing something to ground it in reality. If we add something to the Wikipedia definition, we get....

"scalability is a system's ability to either handle growing amounts of work in a graceful manner or to be readily enlarged, at a cost threshold under which you are willing to perform the scaling."

Ah, better. Now scissors are only a scalable solution to cutting our lawns if we are willing to hire 60 people to simultaneously snip away for 4 hours each week. OK, so what does this mean to the business intelligence world? It means fewer of these conversations with your vendor:

Vendor: "Our tool scales to 10,000 users."
Client: "Yes, but I will need 100 servers to do it."

Vendor: "Our tool scales to 10 TB of data."
Client: "Sure, but I will need dozens of cubes and 16 hours of batch processing at night."

Vendor: "Our tool can scale to the largest enterprise needs."
Client: "As long as I have the largest enterprise support staff to keep it running."

The answer to whether or not a BI solution is "scalable" is, in real life, directly correlated to your ability to perform the scaling under acceptable cost thresholds. Put another way, scale is not relevant when it requires costs that exceed your tolerance. Make your cost tolerances part of the BI tool selection process and you will be able to focus on much more relevant and meaningful requirements.

  • How much does a 3-tab dashboard with 12 charts cost to build?
  • Can I add 5 new columns and a new drill-down metric to the dashboard, with testing, in under 8 hours?
  • How long does it take to build 4-5 dashboards with 500 MM rows of data for a user base of 1,000 users?
  • How much will it cost me (in time and hardware/software) to scale out my solution from one production server to a cluster of 3 servers?
  • How many skill sets (and which ones) are needed to maintain a 15 dashboard solution for a user group of 2,000 users?

These questions get to the heart of our "new" definition of scalability, which only accounts for scalability "at a cost threshold under which you are willing to perform the scaling". So the next time your BI vendor starts throwing the "scalability" word around, remember that scissors are made for paper, not grass.
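The amended definition can be sketched as a simple check: a solution only "scales" if the projected cost of scaling stays under your threshold. All of the numbers below are made up for illustration (including the scissors crew from the lawn example):

```python
# Hypothetical sketch of the amended scalability definition.
# A solution is scalable only at a cost threshold under which
# you are willing to perform the scaling.

def is_scalable(projected_cost, cost_threshold):
    """Scalable, per the amended definition, means affordable scaling."""
    return projected_cost <= cost_threshold

# Scissors technically handle a lawn of any size...
scissors_cost = 60 * 4 * 52 * 15   # 60 people x 4 hrs/week x 52 weeks x $15/hr
mower_cost = 400 + 52 * 1          # one mower plus roughly $1/week of fuel

threshold = 1_000                  # what a homeowner will actually spend per year

print(is_scalable(scissors_cost, threshold))  # False: $187,200 > $1,000
print(is_scalable(mower_cost, threshold))     # True:  $452 <= $1,000
```

Swap in your own cost estimates for servers, batch windows, and support staff, and the same check applies to the vendor claims above.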

Note: this blog was not endorsed by or funded by any lawn mower manufacturers, nor does the author promote, sell or profit from the purchase of said machines. ;-)
