I.

In my last post, we began a discussion about the goal of becoming a data-driven organization. I shared my opinion that it’s often not so much about the tools as it is about the leadership, philosophy, and decision processes of a company that help it reach a more data-driven state.

I also shared that I believe the role of analytics is to deliver the highest degree of situational awareness to the leadership team: a comprehensive understanding of the current state of the operation, a thorough examination of historical data and the story it tells, and insight into the near-, mid-, and long-term future. That awareness informs decisions.

The end game of becoming data driven is all in service of advancing the core goals of every company:

Growth: Data-driven companies leverage analytics to determine how to sell better and sell more. They understand what constitutes good service from the perspective of their customers, how it is measured, and how to leverage it.

Efficiency: Data-driven companies operate on a cycle of continuous improvement, wherein the underlying cost models are constantly examined. Analytics are used to model that cost equation, and to drill down to understand unit-level contributions that can be targeted for improvement.

Defense Against Disruption: Data-driven companies anticipate disruption. If you are a leader in your company, this topic should keep you up at night. Disruption occurs in both small and large ways. Small disruption is when a competitor outmaneuvers you to be first to market with a new product or service that degrades your share, for example. Large disruption? Let’s put it this way: if anybody BUT you executes a strategic disruption of your marketplace, it’s probably already too late. When someone has found a way to do what you do at a fraction of your cost, you’re an also-ran. The leader of a data-driven company is constantly leveraging analytics to examine all aspects of the operation’s business assumptions, and relentlessly seeks to understand where they are vulnerable to a radical change of state: the core disruption.

If you are data driven, then your analytic tools and insights are helping you drive another dollar of revenue, reduce another dollar of expense, find ways to do more with less, and secure your future against disruption. If your analytics aren’t doing these things, why bother?

We also talked about a mechanism for leaders trying to make ground on becoming more data driven: conducting a Business Threat Assessment (BTA). The outcome is a list of the three greatest tactical threats, the three greatest strategic threats, and the three most persistent challenges to your operation’s efficiency. It’s a way to get organized by establishing a meaningful priority set that should be evergreen. It’s as much a way of thinking as it is a process, and it should be scalable up and down your team: your managers should be able to give you their interpretation of it, as relevant to their areas of responsibility.

In this post, we’ll talk about using the first level of analytics, descriptive analytics, to begin to act on the outputs of your BTA and to align with the fundamental business goals we’ve listed as part of becoming more data driven. Descriptive Analytics encompasses how your existing data sets are produced, manipulated, and displayed, and consequently has an emphasis on historical data (as opposed to predictive, or forward-looking, tools).

II.

Let’s talk a little bit more about what makes up the building blocks of Descriptive Analytics at a typical company, and some first steps at the most basic level that will let you begin to make progress against the targets from your BTA. We want to learn the differences in this domain that separate “reports” from “analytics”. In order to do that, a few things need to be in place. Here are the foundation elements:

1) The data assets. These are the data elements generated and captured by your operation. This is the ocean, so to speak. In our very first blog post, we talked about creating a data map that inventoried these assets, and we presume, for purposes of this conversation, that the operation has a good handle on this inventory.

2) The data management infrastructure. These are the various repositories where you are storing data today. They might be organized, they might not be, or somewhere in between.

3) The data presentation layer: your existing reports.

There are a few other things that may or may not be present, depending on the maturity of your data infrastructure and governance.

4) A data model. This is an overlay that ties all your data elements together, defines their types and values, and illuminates the relationships and sequencing between them. This can be as-built, meaning a reflection of what has grown over time, or it can be optimized (more to come on that down the road).

5) A data archive. This represents an advancement over run-of-the-mill data management and storage: a designed repository and database infrastructure that typically integrates and organizes your data elements into an efficient structure that is more easily accessed and manipulated for reporting and analytics. These can be data marts, data warehouses, data lakes, and so on.

6) A Business Intelligence (BI) Tool. These are toolkits optimized for the visual display of data and reporting, and they typically integrate user-configurable dashboards, report schedulers, and distribution and publication functionality. You’ve certainly heard of the big ones like Tableau, QlikView and Qlik Sense, Alteryx, and the up-and-coming contender, Microsoft Power BI.

Finally, if you’re well down the path, you might have one more thing in place:

7) An Analytic Data Set and Toolkit. In short, this is a specialized data repository created by your analytics team, populated with the critical data element subsets that are most often relevant to your analytic requirements. This subset has typically been statistically validated, through exploratory data analysis (a topic for another day), as the most useful for query and investigation. In fact, you will likely have a process for creating custom analytic data sets oriented to the specific lane of investigation you’re interested in. For our purposes here, in an overview of descriptive analytics, we’ll really be talking more about the capabilities it can afford than about the specific structure or structures.

So, reports vs. analytics? In this case, Wiki has a pretty darn good definition of analytics: “Analytics is the discovery, interpretation, and communication of meaningful patterns in data, and the process of applying those patterns towards effective decision making.” Let’s go to the Business Dictionary to get a good working definition of reports: “A document containing information organized in a narrative, graphic, or tabular form, prepared on ad hoc, periodic, recurring, regular, or as required basis.”

Descriptive analytics manipulate the assets we’ve listed here (including your reports), examining your current and historical state, to allow you to make more effective business decisions and to advance your progress against strategic business goals.

III.

Now, let’s talk about applied Descriptive Analytics, in the context of building your game plan around the outputs of your Business Threat Assessment and your business goals.

One of the targeted parts of the BTA was to outline the three greatest ongoing obstacles to increasing efficiency in your enterprise. The goal of increasing efficiency in a systematic manner is, fundamentally, a commitment to the philosophy and process of continuous improvement. Measure, analyze, respond, act. Repeat.

Descriptive Analytics has a lot to offer in this arena. For example, common initiatives that fall under the Descriptive domain to increase efficiency are:

1) Reports Audit and Data Model Optimization

2) KPI Review and Testing

3) Attribution Analysis

Reports Audit and Data Model Creation or Optimization

In short: Our experience tells us that most organizations spend money and resources on reports that offer only marginal utility. Here’s an example:

One of our clients is a financial institution that had decided to implement a new BI tool and wanted help migrating the reporting infrastructure of their investment accounting team. We found a reporting infrastructure of 200+ spreadsheet-based reports that had accreted over a period of a decade. Each was essentially hand-built and hand-operated, and tied to the critical processes of month-end and quarter-end close cycles.

There was no data model, and the data feeds driving these reports came from over 50 discrete sources. We conducted a reporting audit and found that many of the report elements overlapped.

Our team finished the audit (an exhaustive, field-level review of content and source), created a data model that established the key sources and data elements, their relationships, and their locations, and built a data repository for the BI tool, populated based on the results of the model development. We reduced the absolute number of reports by over 50% and returned the equivalent of 1.5 FTE in hours saved to the director.


Going forward, new reports are built via the BI tool, and data is sourced from the repository. The only thing this took was TIME: the BI tool was in place, the reports were there, and the location and database toolkit for the repository were present.

You are very likely to have similar opportunities. If you have a data map constructed, you are well on your way to being able to do a reporting audit. Understanding the type, frequency, and audience of all the reports you produce means you can establish control and impose efficiency where it probably is not currently present. I won’t trot out the old “every report should drive a decision” axiom, because we all know that’s not true: some reports are regulatory, some are for context, and some support service-level documentation, for example. BUT (and it’s a big but) if you haven’t consciously reviewed the time, dollars, and staff that are supporting your reporting, you’ll never know what opportunity you missed. If you can’t point to a data model that classifies and organizes your data elements, you can’t control the evolution of your reporting infrastructure. It will STILL evolve, but instead of coming out the other end with a best-in-breed prize winner, you’re going to have something from the Island of Dr. Moreau.
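If your data map can be exported as a simple mapping of reports to the source fields they use, even a first-pass overlap check can be automated. Here’s a minimal sketch in Python; the report names, fields, and threshold are hypothetical, and a real audit would still follow up with the field-level review described above:

```python
from itertools import combinations

# Hypothetical export from a data map: report name -> source fields it uses.
report_fields = {
    "monthly_close_summary": {"gl.account", "gl.balance", "gl.period", "fx.rate"},
    "quarter_end_positions": {"gl.account", "gl.balance", "gl.period", "pos.cusip"},
    "fx_exposure":           {"fx.rate", "fx.pair", "gl.balance"},
}

def overlap(a: set, b: set) -> float:
    """Jaccard similarity: the share of fields two reports have in common."""
    return len(a & b) / len(a | b)

# Flag report pairs whose field sets overlap heavily -- consolidation candidates.
THRESHOLD = 0.5
for (name_a, fields_a), (name_b, fields_b) in combinations(report_fields.items(), 2):
    score = overlap(fields_a, fields_b)
    if score >= THRESHOLD:
        print(f"{name_a} <-> {name_b}: {score:.0%} field overlap")
```

A crude pass like this won’t replace the audit, but it surfaces the likely duplicates quickly, so the expensive human review time is spent where it matters.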

And finally: a commitment to continuous improvement means that this topic should, of course, be periodically revisited. The data model should be refreshed, and the reports re-evaluated. It’s a lot easier to do this once you’ve cleaned house.

KPI Review and Testing

Your reports feed a set of KPIs. They may be formal (in a dashboard) or informal (the elements in the reports that the management team routinely goes to out of practice and experience).

The question is: Do these reports inform KPIs that actually reflect the critical criteria that define both your short- and long-term business success? We can all think of examples of teams, departments, and companies that spend lots of time watching dials and metrics that tell them what they want to hear, while the bottom falls out beneath them. Aviation folks call this controlled flight into terrain. If your reporting apparatus is not delivering the right KPIs, then your decision making is not data-driven.

Example: We helped a company that provided social services to various states, primarily placing at-risk kids into home solutions (adoption, foster, family, etc.). We were discussing how analytics could help increase efficiency, and one of the clients brought up a problem they faced. Periodically they had to provide a summary to the states they contracted with of where the kids were in the system at a given point in time (just entering, placed in care, receiving counseling or other services, or exiting the system, for example). We were astounded to hear that it took 2 to 3 weeks to assemble the information to populate this report. And it should be noted that since the company depended on these state contracts, this report was, in effect, strategic to their future.

Think about that: this company produced hundreds of reports for audiences internal and external, but had to manually gather data to answer what many of us on the outside would expect to be the most common information demand. These folks weren’t tracking the right KPIs, and consequently were stuck with a very inefficient process.

Your KPIs should be testable: you should be able to demonstrate that they are predictive of the outcomes you’re driving for as a company. One common problem is that KPIs are created from the bottom up: this metric is available in this report, so it becomes the de facto selection. Useful KPIs should be created top-down. In other words, the executive and leadership team should be responsible for determining the strategic definitions of success, and the metrics defining those conditions should be either selected or created. Here is a place where having a data model gives you a leg up, as the analytics team can do the testing to confirm that the various elements are, in fact, statistically powerful in terms of the outcomes you’ve defined as positive. A minimal sketch of that testing follows below.
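To make the testing concrete, here is a minimal sketch in Python, assuming the analytics team can assemble a table of candidate KPI values alongside the outcome leadership has defined as success. The column names and figures are hypothetical, and a real validation would use far more observations and guard against spurious correlation:

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical analytic data set: one row per quarter, with candidate KPIs
# alongside the outcome leadership has defined as success (retention).
df = pd.DataFrame({
    "first_call_resolution": [0.71, 0.68, 0.80, 0.75, 0.69, 0.83, 0.77, 0.72],
    "avg_handle_time_min":   [6.2, 6.8, 5.1, 5.9, 7.0, 4.8, 5.5, 6.4],
    "reports_published":     [41, 39, 44, 40, 42, 38, 45, 41],  # a dial we watch
    "quarterly_retention":   [0.88, 0.85, 0.93, 0.90, 0.84, 0.95, 0.91, 0.87],
})

outcome = df["quarterly_retention"]
for kpi in df.columns.drop("quarterly_retention"):
    # Correlate each candidate KPI with the defined outcome.
    r, p = pearsonr(df[kpi], outcome)
    verdict = "worth keeping" if p < 0.10 else "weak evidence; challenge this KPI"
    print(f"{kpi:>22}: r={r:+.2f}, p={p:.3f} ({verdict})")
```

The point is not the specific statistic; it’s that a KPI earns its place on the dashboard by demonstrating a measurable relationship to the outcomes you care about, rather than by being the number that happens to be easy to report.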

One last note here: KPIs are not fixed. They, like everything we’re discussing, are subject to a process of continuous improvement. We’ve worked with companies that review them, for example, after every quarterly financial period is completed. What has changed in the make-up of the company in that time, and do the critical KPIs still reflect the desired strategic goals? If you’re not asking these questions, then the more time that elapses, the more likely it is that your KPIs will drift, and you’ll be wasting time and money monitoring the wrong metrics.

Attribution Analysis

Attribution modeling is a tool that lets you examine how multiple variables contribute to outcomes. The classic efficiency play here revolves around marketing channel spending. If your company spends money on multiple online channels to drive outcomes (typically sales or conversions), do you know if you’re spending the right amount in each channel? Do you know how much each channel contributes to the desired outcome? How much credit should be attributed to each channel?
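To make the idea of “credit” concrete, here is a minimal sketch of two simple attribution schemes, last-touch and linear, applied to hypothetical conversion paths. Real attribution models get considerably more sophisticated (position-based, time-decay, or data-driven approaches), but the contrast between even these two is instructive:

```python
from collections import defaultdict

# Hypothetical conversion paths: the ordered channels each converting
# customer touched before purchase.
paths = [
    ["social", "search", "email"],
    ["display", "social"],
    ["social"],
    ["search", "email"],
    ["social", "display", "email"],
]

last_touch = defaultdict(float)
linear = defaultdict(float)

for path in paths:
    # Last-touch: the final channel gets 100% of the credit.
    last_touch[path[-1]] += 1.0
    # Linear: every channel on the path gets an equal share of the credit.
    for channel in path:
        linear[channel] += 1.0 / len(path)

total = float(len(paths))
for channel in sorted(set(last_touch) | set(linear)):
    print(f"{channel:>8}: last-touch {last_touch[channel] / total:6.1%}, "
          f"linear {linear[channel] / total:6.1%}")
```

When the two columns disagree sharply for a channel, that’s often the first hint that the channel your default reports credit is not the one doing the work, which is exactly the pattern in the story that follows.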

Here’s a fun example we like to share. We worked with a financial client in Hong Kong that was spending millions on multiple channels, online and offline, to sell a product. They wanted to make sure that they were allocating their funds correctly to optimize their growth. We worked with their digital team to examine the reporting streams associated with each of the digital channels and built an attribution model to measure contribution. What we found in the data was interesting: the digital team was seeing a very high number of arrivals to their site from various social media channels, rather than from the more direct digital marketing channels. The contribution of social to conversion was higher than that of the channels they were spending more on. Why?

We examined the specific content in the digital channels using various social listening platforms, and found a truly fascinating result: the social chatter revolved around a series of advertisements that the institution had taken out on billboards affixed to buses seen around the city. The male and female models used in those advertisements, and the clothes and accessories they were wearing, had caught the attention of the public. That attention translated to pull-through visits to the website, and sales of the product.

In other words, the attribution model not only showed that the cheapest digital channel was contributing the most conversions; further analysis of the model’s results showed that the true root cause of those conversions was a ridiculously inexpensive series of physical ads. And yes, those male and female models got more work, thanks to the power of analytic attribution modeling.

In this case, leveraging analytics to build models using descriptive data resulted in a major spending re-alignment for the client, increasing marketing efficiency.


IV.

These are just some examples of how Descriptive Analytics can help drive value for your company and make your decision making better informed and more data driven. In our next issue on the topic of Becoming Data Driven, we’ll shift gears and look into the realm of Predictive Analytics. One of the themes you’ll start to see is that data builds on data, in terms of increasing the depth and maturity of your analytics. The building blocks we’ve reviewed here will be part of that continuing story!