Lately I’ve been working with clients who are trying to build analytics and data-driven strategy into the core of their businesses—with varying degrees of success.
This has led me to research how organizations are set up to succeed or fail at growing a true analytics practice within a company as a core competency. And that got me thinking about the Software Capability Maturity Model (SCMM). The same concepts of chaos, organization, standards and innovation apply to analytics and data-driven strategy as they do to software development.
Ten years ago I first came across the Software Capability Maturity Model.
The SCMM was developed in 1987 by the Software Engineering Institute as the result of research into how software was developed. Pretty quickly it became apparent that there were distinct levels of organization, skills and quality control that separated the great from the good, the good from the mediocre, and the mediocre from the insane.
I was working for Knight Ridder in San Jose, which at the time was the second largest newspaper group in the US. We were going through a massive organizational shift in the online group, and I was responsible for leading the development of a single e-publishing platform for all 32 daily newspapers owned by Knight Ridder. At the time each of the newspapers ran its own little Website, and these 32 sites contained everything from Windows NT to AppleTalk.
We were re-organizing the digital group at the same time, so I was reading a lot of organizational and software development books. One of them was “After the Gold Rush” by Steve McConnell of Microsoft, which was built around the SCMM. The SCMM is the logical result of what is known as Conway’s Law, that the structure of a computer program reflects the structure of the organization that built it. You can apply Conway to almost any organized human endeavor.
Level One is easy to understand: everyone runs around with their hair on fire.
There is no project management, and all quality of product results from extraordinary individual effort. The structure of the resulting programming—to apply Conway’s Law—is a mess.
Level Two adds dedicated project management, management tools, and quality controls. Level Three is about codifying and building standards into the organization—there is a way we do things around here. Level Four organizations have made quality in design, development and quality assurance core values, and as a result software development is a core competency—these can be ISO 9000 shops. Level Five is for the true alpha innovators, the shops with the leading thinkers creating real innovation.
You get the idea. Each of us in our careers has instinctively known how well—or poorly—a project or a company is operated when we work there. We hope for the best, but we can feel in our bones when the train is definitely not on the rails.
So let me present my version of the Analytics Capability Maturity Model.
I’ve taken my own experience and research and, after vetting it with colleagues, developed a description of the five levels organizations go through to become truly innovative, strategic companies through their expertise and practice in analytics, and their use of it to drive company strategy.
LEVEL 1 – INITIAL – Ad Hoc
Gathering, managing and analyzing data about the business is ad hoc, sometimes even chaotic. Few processes are defined and success depends on individual effort to find data and perform analytic processes. The process can’t be repeated, and the people performing analysis have widely different definitions of metrics and their uses. The only tools for analysis and reporting are Excel, PowerPoint and e-mail.
LEVEL 2 – REPEATABLE – Organized
Basic processes and systems are established to import, manage and analyze data. The necessary organizational discipline and structures ensure that these processes are followed consistently. Data is structured consistently to allow for simple queries and reports. There is a strategy of long-term data management, with processes to support that strategy. Specialized tools and systems are in place to handle segments of the overall data management process. But analysis is not built into the central practices of the business.
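The practical difference between Level 1 and Level 2 is that metrics and ingest stop living in individual spreadsheets and become shared, repeatable processes. A minimal sketch of that shift, in Python—the data fields, metric name, and function names here are hypothetical, not part of the model:

```python
# Level 1 -> Level 2 in miniature: instead of each analyst computing
# "conversion rate" their own way in a spreadsheet, the metric is defined
# once, in code, and every report reuses it. All field names are invented
# for illustration.
import csv
import io

def conversion_rate(visits: int, orders: int) -> float:
    """The single, agreed definition of conversion: orders divided by visits."""
    return orders / visits if visits else 0.0

def load_daily_stats(csv_text: str) -> list:
    """Repeatable ingest step: the same parsing and typing rules every run."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rows.append({"day": row["day"],
                     "visits": int(row["visits"]),
                     "orders": int(row["orders"])})
    return rows

def weekly_report(rows: list) -> dict:
    """A standard report anyone can rerun and get the same numbers."""
    visits = sum(r["visits"] for r in rows)
    orders = sum(r["orders"] for r in rows)
    return {"visits": visits, "orders": orders,
            "conversion": conversion_rate(visits, orders)}

sample = "day,visits,orders\nMon,1000,20\nTue,1500,45\n"
print(weekly_report(load_daily_stats(sample)))
```

The point isn’t the code itself but the organizational discipline it represents: one definition of each metric, one ingest path, and reports that are reproducible rather than dependent on whoever built the spreadsheet.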
LEVEL 3 – DEFINED – Standardized
The ingest, management and analysis of data is built into the standard procedures of the business. Systems dedicated to the analysis and reporting about data are in place to provide high quality tools and templates for analytics. There are business practices and long-term strategies effectively in place for the analysis of data. Some people in the organization perform analysis queries in their daily duties, and regular reports are established. There are people dedicated to analysis of data. Consistent business value is created through the analysis of data in the regular business practices. There are tools and training to create queries and reports in near real-time.
LEVEL 4 – MANAGED – Strategized
Data management and analysis is built into all facets of the business. There is a long-term analytics strategy which is integral to the success of the business. Analytics has become a significant competitive advantage. The processes for ingest and management of data include creating robust and consistent metadata that can be brought into interactive queries and calculations in “if/then” dialogues. Interactive queries and visualization happen in near real-time. All major operational functions of the organization rely on the ability to draw insights from data. Tools are integrated and available to most people in the organization, who regularly receive training and support in data analysis wherever appropriate.
LEVEL 5 – OPTIMIZING – Invented
The organization has the systems, tools and experience to innovate and evolve how data is used in the business. There is continuous research, piloting and introduction of new data practices within the business and as an integral component of products and services delivered to the business’s customers. Data has become a critical barrier to entry for competitors. The leaders of data analysis within the organization are part of the key management team and are expected not only to provide insights from data to drive business success, but to lead innovation in how data is managed within the organization.
When I’ve asked clients to self-rate themselves during workshops, invariably they land somewhere between a one and a two. Sometimes I’ll get threes if I’m talking to the marketing department of a large retail or consumer packaged goods company, but talk to people in the other departments of those same companies and you’ll hear more ones, twos and “what’s analytics?”
It’s astonishing to me that in the midst of this financial crisis, with all of the demands for accountability, senior management isn’t demanding more attention be paid to analytics and data-driven strategy. Technology isn’t the problem—it’s organizational change and corporate culture that are the big challenges.
And that’s only after the boss has first said s/he wants analytics to become a core competency.
Often the word analytics is confined to Website analytics. The rest of analytics is called “market research.” This false connection suggests that both are “finger in the wind” exercises that have nothing to do with the real business of business, which is selling and making money.
Analytics should stand for the collection of data that provides insights about company performance characteristics in order to inform strategic decision making. Note that this also draws a distinction between analytics and strategy. The same people may do both jobs, but usually not. Analytics teams develop the data, organize and examine it, and then provide analysis. Smart analytics people will also provide ideas for strategies, but analytics should support all aspects of a company’s activities, from marketing to sales, to supply chain management to customer support.
But all this begins with the first steps from level one to level two.