10 Signs Your Analytics Program Will Fail

by CXOtoday News Desk    May 15, 2018


Are you sure your analytics initiative is delivering the value it's supposed to? These days it is rare to find a CEO who doesn't know that businesses are analytics driven. Many business leaders have, to their credit, been charging ahead with bold investments in analytics resources and artificial intelligence (AI). Many CEOs have launched analytics programs, appointed chief analytics officers (CAOs) or chief data officers (CDOs), and hired all sorts of data specialists. Yet there have been many cases of failure, because company executives have been unable to convert their analytics pilots into scalable solutions, according to a recent report by McKinsey.

Overall, McKinsey has observed that only a small fraction of the value that could be unlocked by advanced-analytics approaches has actually been unlocked, as little as 10 percent in some sectors. The authors of the report list 10 reasons why your analytics initiatives are bound to fail.

1. CXOs don't have a clear vision for their advanced-analytics programs

There is often a lack of understanding of the difference between traditional analytics (that is, business intelligence and reporting) and advanced analytics (powerful predictive and prescriptive tools such as machine learning). To illustrate the point, the authors give the example of an organization that built a centralized capability in advanced analytics, with heavy investment in data scientists, data engineers, and other key digital roles. In practice, the company ran a lot of pilot AI programs, but not a single one was adopted by the business at scale. The fundamental reason? Top management didn't really grasp the concept of advanced analytics. They struggled to define valuable problems for the analytics team to solve, and they failed to invest in building the right skills. As a result, they failed to get traction with their AI pilots.

2. Nobody has determined the value that the initial use cases can deliver in the first year

Applying analytics tools and methods like wallpaper, in the hope that they will somehow benefit the entire organization, can result in large-scale waste, slower results (if any), and less confidence, from shareholders and employees alike, that analytics initiatives can add value. That is how the story went for one conglomerate. The company identified a handful of use cases and began to put analytics resources against them. But it did not precisely assess the feasibility of those use cases or calculate the business value they could generate, and, lo and behold, the ones it chose produced little value, the authors explained.

3. There's no analytics strategy beyond a few use cases

In one example, senior executives had learned a bit about advanced analytics and had identified several potential use cases where they were sure the technology could add value. What was missing was a strategy for generating value with analytics beyond those specific situations. Meanwhile, a competitor began using advanced analytics to build a digital platform, partnering with other manufacturers in a broad ecosystem that enabled entirely new product and service categories. According to the report, by tackling the company's analytics opportunities in an unstructured way, the CEO achieved some returns but missed the chance to capitalize on this much bigger opportunity. That missed opportunity will now make it much more difficult to energize the company's workforce to imagine what transformational opportunities lie ahead.

4. Analytics roles, present and future, are poorly defined

Few executives can describe in detail what analytics talent their organizations have, let alone where that talent is located, how it is organized, and whether the right skills and titles are in place. The authors recount how the CEO of one large financial-services firm, an enthusiastic supporter of advanced analytics, proudly noted that his firm had hired 1,000 data scientists, each at an average loaded cost of $250,000 a year, an outlay of roughly $250 million annually. Later, when it became obvious that the new hires were not delivering what was expected, it was discovered that they were not, by strict definition, data scientists at all. The firm suffered because neither the CEO nor its human-resources group had a clear understanding of the data-scientist role, or of other data-centric roles for that matter.

5. The organization lacks analytics translators

If there's one analytics role that can do the most to start unlocking value, it is the analytics translator. This sometimes overlooked but critical role is best filled by someone on the business side who can help leaders identify high-impact analytics use cases and then translate the business needs to data scientists, data engineers, and other tech experts so they can build an actionable analytics solution. Translators are also expected to be actively involved in scaling the solution across the organization and generating buy-in with business users. They possess a unique skill set to help them succeed in their role: a mix of business knowledge, general technical fluency, and project-management excellence.

6. Analytics capabilities are isolated from the business

Organizations that have made a mark with successful analytics initiatives embed analytics capabilities into their core businesses. Those struggling to create value through analytics tend to develop analytics capabilities in isolation, either centralized and far removed from the business or scattered in sporadic pockets of poorly coordinated silos.

7. Costly data-cleansing efforts are started en masse

There's a tendency for business leaders to think that all available data should be scrubbed clean before analytics initiatives can begin in earnest. McKinsey estimates that companies may be squandering as much as 70 percent of their data-cleansing efforts. Not long ago, a large organization spent hundreds of millions of dollars and more than two years on a company-wide data-cleansing and data-lake-development initiative. The objective was to have one data meta-model, essentially one source of truth and a common place for data management. The effort was a waste. The firm did not track the data properly and had little sense of which data might work best for which use cases. And even when it had cleansed the data, there were myriad other issues, such as the inability to fully trace the data or understand their context.

8. Analytics platforms aren't fit for purpose

Some companies see a modern data architecture as the foundation of their digital efforts. A mistake in the making is thinking that legacy IT systems have to be integrated first, or that a data lake has to be built before figuring out the best ways to fill and structure it. In many instances, the costs of such investments can be enormous, often running to millions of dollars, while they produce meager benefits, sometimes in the single-digit millions. The findings indicate that more than half of all data lakes are not fit for purpose. Significant design changes are often needed, and in the worst cases the data-lake initiatives must be abandoned, according to the authors.

9. Nobody knows the quantitative impact that analytics is providing

A lot of companies are pouring money into advanced analytics and other digital investments, yet nobody can say what quantitative impact those investments are delivering. According to the authors, analytics applied to an inventory-management system could, for instance, uncover the drivers of overstock for a quarter. To determine the impact of analytics in this instance, the metric to apply would be the percentage by which overstock was reduced once the problem with the identified driver was corrected.
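To make the metric concrete, here is a minimal sketch of the calculation in Python (the function name and the figures are hypothetical illustrations, not taken from the report):

    def overstock_reduction_pct(before: float, after: float) -> float:
        """Percentage by which overstock fell once the identified driver was corrected."""
        if before <= 0:
            raise ValueError("baseline overstock must be positive")
        return 100.0 * (before - after) / before

    # Hypothetical example: quarterly overstock falls from $2.0M to $1.4M,
    # a 30 percent reduction attributable to the analytics-driven fix.
    print(overstock_reduction_pct(2_000_000, 1_400_000))  # 30.0

The point is less the arithmetic than the discipline: the baseline must be measured before the fix and the delta attributed to it, or the impact of analytics remains anyone's guess.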

10. Not identifying potential ethical, social, and regulatory implications of analytics initiatives

One large industrial manufacturer ran afoul of regulators when it developed an algorithm to predict absenteeism. The company meant well; it sought to understand the correlation between job conditions and absenteeism so it could rethink the work processes that were apt to lead to injuries or illnesses. Unfortunately, the algorithms were able to cluster employees based on their ethnicity, region, and gender, even though such data fields were switched off, and it flagged correlations between race and absenteeism. Luckily, the company was able to pinpoint and preempt the problem before it affected employee relations and led to a significant regulatory fine. The takeaway: working with data, particularly personnel data, introduces a host of risks from algorithmic bias. Significant supervision, risk management, and mitigation efforts are required to apply the appropriate human judgment to the analytics realm.
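What tripped the manufacturer up is often called proxy leakage: even when protected fields are switched off, other features (shift, site, job type) can encode them, so a model's outputs still track the protected attribute. Below is a minimal sketch of the kind of audit that can surface this, using synthetic data and scikit-learn (all names and figures are hypothetical, not from the report):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000

    # Synthetic workforce: a protected attribute that is NOT used in training...
    protected = rng.integers(0, 2, size=n)
    # ...and a "neutral" feature (say, assigned shift) that correlates with it.
    shift = protected + rng.normal(0, 0.5, size=n)
    # Absenteeism is driven by the shift, not by the protected attribute itself.
    absent = (shift + rng.normal(0, 0.5, size=n) > 1.0).astype(int)

    # Train only on the "neutral" feature; the protected field is switched off.
    model = LogisticRegression().fit(shift.reshape(-1, 1), absent)
    preds = model.predict_proba(shift.reshape(-1, 1))[:, 1]

    # Audit: the predictions still correlate strongly with the protected attribute.
    print(f"corr(predictions, protected) = {np.corrcoef(preds, protected)[0, 1]:.2f}")

An audit like this, run before deployment, is one concrete form that the "significant supervision" the authors call for can take.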

Identifying these 10 red flags, according to the authors, can help companies get back on track and recover some value from the millions they have invested. Returns so far have been low, and it is high time that businesses get analytics right.