Feature

Are Predictive Analytics Trustworthy?

By Dom Nicastro
No question that predictive analytics is a hot topic, but according to recent data, more than half of CEOs don't trust its accuracy. Here's why.

Do CEOs trust predictive analytics? According to a report by KPMG, most do not. More than half of the CEOs surveyed are “less confident in the accuracy of predictive analytics compared to historic data,” according to the report, 2018 Global CEO Outlook.

But the potential business impact of artificial intelligence (AI)-powered statistical analysis like predictive analytics is enormous, according to some reports. McKinsey & Co. reported three deep learning techniques — feed-forward neural networks, recurrent neural networks and convolutional neural networks — could enable the creation of between $3.5 trillion and $5.8 trillion in value each year (PDF).

Still, KPMG in its 2019 Global CEO Outlook report found only 16% of CEOs said they have already implemented AI to automate their processes, 31% are just piloting the tech and 53% have begun limited implementation.

What is Predictive Analytics?

As a refresher for some, SAS defines predictive analytics as “the use of data, statistical algorithms and machine learning techniques to identify the likelihood of future outcomes based on historical data."
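To make that definition concrete, here is a minimal sketch of the core idea: learn a pattern from historical data and project it forward. The function names and the sales figures below are made up for illustration; real predictive analytics systems rely on far richer data and machine learning models, not a single linear trend.

```python
# Minimal illustration of "predicting future outcomes from historical data":
# fit a straight-line trend to past monthly sales and forecast the next month.
# All numbers are hypothetical.

def fit_linear_trend(values):
    """Ordinary least-squares fit of y = a + b*t for t = 0, 1, 2, ..."""
    n = len(values)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_y = sum(values) / n
    cov = sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, values))
    var = sum((t - mean_t) ** 2 for t in ts)
    b = cov / var              # slope: change per time step
    a = mean_y - b * mean_t    # intercept
    return a, b

def predict_next(values):
    """Forecast the value at the next time step after the series."""
    a, b = fit_linear_trend(values)
    return a + b * len(values)

history = [100, 104, 108, 112, 116]   # hypothetical monthly sales
forecast = predict_next(history)      # 120.0 for this perfectly linear series
```

The point is not the arithmetic but the shape of the workflow: every prediction is only an extrapolation of the historical data it was fit on, which is exactly why the quality and relevance of that history matters so much.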

Iba Masood, co-founder and CEO of Tara AI, cited this example of predictive analytics in action: on a long-term software project that has run over budget, you can understand where things went wrong in terms of engineering output. Predictive analytics-based systems can signal earlier in the project if things may not go according to plan. Humans can only process so many code commits, whereas predictive systems can signal earlier in the product development life cycle that the marketing team may not hit its planned product launch dates.

“Predicting employee performance requires data from workforce management systems, along with ongoing check-ins with systems that monitor output in the department,” Masood said, citing another example. “With engineering output, ongoing monitoring of code commits in version control systems can provide a level of insight into product velocity and enable management teams to make decisions around product launches and/or overall product velocity.”

Related Article: A Pragmatic View of Predictive Analytics

Use Caution With Predictive Analytics in Changing Markets

However, CEOs and particularly startups in general need to be careful when using predictive analytics, according to Mike Volpe, CEO of Lola.com. “These models use historical data, so they cannot always account for changes in behavior by buyers and competitors,” Volpe said. “Plus, the faster your buyers and the market are changing, the harder it is to use models to predict the future. With many industries experiencing rapid and massive change in recent years, CEOs need to proceed with caution where predictive analytics is concerned.”

Predictive Only Considered When Data Is Clean

Decisions based on predictive analytics start and end with data, according to Masood. Consider the quality of the data gathered and the methods for data collection, and ask whether the data has been de-biased and scrubbed. These are all considerations for CEOs relying on predictive analytics to make decisions, she said. Executives such as CDOs and CIOs need to invest time in ensuring data is clean before management teams and CEOs start relying on recommendations from predictive analytics-based systems, according to Masood.
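The kinds of checks Masood describes can start very simply: count the gaps and duplicates before any model sees the data. A minimal sketch, with hypothetical field names and records:

```python
# Hypothetical pre-flight data-quality checks a team might run before
# feeding records into a predictive model. Field names are illustrative.

def data_quality_report(records, required_fields):
    """Count missing required fields and exact duplicate rows."""
    missing = sum(
        1 for r in records for f in required_fields if r.get(f) in (None, "")
    )
    seen, duplicates = set(), 0
    for r in records:
        key = tuple(sorted(r.items()))
        if key in seen:
            duplicates += 1
        seen.add(key)
    return {"rows": len(records), "missing_values": missing, "duplicates": duplicates}

rows = [
    {"id": 1, "revenue": 120.0},
    {"id": 2, "revenue": None},      # a gap the model would silently absorb
    {"id": 1, "revenue": 120.0},     # exact duplicate of the first row
]
report = data_quality_report(rows, ["id", "revenue"])
# report == {"rows": 3, "missing_values": 1, "duplicates": 1}
```

Real pipelines go much further (outlier detection, bias audits, schema validation), but even a report this crude makes the "is the data clean?" question answerable before the predictions are trusted.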

Those considering deploying predictive analytics for their businesses should note that a well-prepared, well-conducted predictive analytics program can be trusted, according to Krzysztof Surowiecki, managing partner at Hexe Data. “However, I do not trust a bad prediction or badly conducted predictive analytics. That is probably the main reason people hesitate when it comes to predictions. Sometimes they cannot verify whether they are properly made.”

Related Article: Why a Cat May Be Better Than Predictive Analytics in Picking World Cup Winners


Wrong Design Staff, Tools Could Cause Problems

So where do problems with predictive analytics exist? The essence of the problem, Surowiecki said, boils down to recognizing when predictive analytics is badly conducted. Look for these potential problems:

  • Selection of design staff without relevant experience and knowledge
  • Selection of inappropriate tools and prediction methods
  • Wrong selection of data, i.e. an incorrectly chosen time series (e.g. too short, too long, lacking seasonality, etc.)
  • Wrong reasoning, especially when assessing the correlation between variables: a strong statistical correlation between cheese consumption and the number of pool drownings does not mean that conclusions should be drawn from a logical or business point of view.
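The cheese-and-drownings trap is easy to reproduce: any two series that merely trend in the same direction will show a high correlation coefficient. A short sketch with made-up numbers:

```python
# Two independently trending series (entirely made-up data) correlate
# strongly even though neither causes the other -- the classic
# spurious-correlation trap.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

cheese_kg = [14.3, 14.8, 15.1, 15.7, 16.0, 16.4]  # hypothetical per-capita consumption
drownings = [300, 312, 318, 330, 336, 344]        # hypothetical annual count

r = pearson(cheese_kg, drownings)
# r comes out close to 1.0 because both series simply trend upward over
# time; the correlation says nothing about causation.
```

A model built on a correlation like this can look statistically impressive while being useless, or worse, for business decisions, which is exactly the "wrong reasoning" failure mode above.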

“It is also crucial to remember two important phenomena: the self-destructive forecast and the self-fulfilling forecast,” he said. A forecast that sales will fall will automatically trigger processes meant to prevent that outcome. “As a result, the forecast does not work, because sales increase or the decline is suspended,” Surowiecki said. “However, this is not a sign that the forecast was bad. It let us act, so it fulfilled its role.”

Related Article: Put the Power of Predictive Analytics in Business Users' Hands

Starting Small May Be the Answer

Success with predictive analytics usually requires you to start small, flesh out the use cases and define what success looks like when it comes to business outcomes, according to Chris Connolly, vice president of product marketing at Genesys. “Data analytics is not data science,” Connolly said. “You need to understand the outcome of data and the organization and not just the number.”

The machine learning capability in the predictive analytics solution should be left unimpeded, Connolly said. “Do not expect to influence the machine learning scoring of agents because the system will adapt over time to any coaching or staff performance improvements, and conversely, performance degradation,” he said.

Focusing on the Analytics

Lisa Loftis, principal consultant for customer intelligence at SAS, said overcoming barriers to successful predictive analytics programs requires a top-down approach, leadership that is willing to learn enough about analytics to gain confidence in its outcomes, and a retooling of culture and organization. “Finance, marketing and technology are all areas that are analytically driven today,” Loftis said. “The CEO can rely on executives in these business areas to champion analytics, to detail how analytics are being applied to improve outcomes and to highlight via solid metrics the positive outcomes that analytics are yielding.”

Actively look for areas where analytics teams exist and ensure that they are involved in the entire decision process, from planning through measuring outcomes. “This will help to eliminate the walls between analytics groups and the rest of the organization and also help to reinforce the value and reliability of analytics to not only business outcomes but also decision planning,” Loftis said. “Positive outcomes will help the entire organization understand the value analytics can bring.” 

The Bottom Line

CEOs can only rely on their gut and instinct to a certain extent, Masood said. “There are limits to human processing,” she said, “as we can only glean insights from small data sets — vs. predictive analytics systems that can rely on terabytes or more of cleaned data.”