If the digital workplace is being driven by the emergence of more effective collaboration and communication apps, the development of these apps, in turn, is being driven by the availability of enough data to build apps that can address organizations’ problems. However, it is becoming clear that much of the data being used in app development, or to drive these apps, is not the latest data available – it is not "fresh" data.
In a recent survey of IT decision makers, Palo Alto, Calif.-based Actian, a hybrid data management, analytics and integration company, polled over 300 IT professionals with key decision-making power in companies of at least 250 employees and asked them what their critical data management pains are.
Using Not-So-Fresh Data
The Actian Datacast 2019: Hybrid Data Trends Snapshot found that enterprises are struggling to keep up with the growing demand for business-as-usual reporting, which leaves no bandwidth for generating new insights.
This requires that enterprises provide not only the necessary data management infrastructure to make data from all their sources available for analytics discovery, but also that they invest in and empower data teams that can pursue new insights from new data spaces.
Unsurprisingly, it also identified hybrid as the future, though we have seen that prediction many times before in other areas of technology.
There were three major findings in the report:
- 94 percent of IT decision makers (ITDMs) say it is important to have a system that ensures users are receiving current data, yet 58 percent of ITDMs say it is only somewhat likely, or not likely, that they are using fresh or current data.
- Only 34 percent of enterprises using data to drive decision-making are using it to drive breakthrough insights and innovations rather than business-as-usual operational reporting, the reporting that keeps enterprises running from day to day.
- 84 percent of enterprises would deploy more data if it were cheaper and easier to do so. In addition, over 50 percent of these businesses say data complexity issues, due to siloed applications, are a top barrier to accessing data.
The Impact Of Fresh Data
Two factors typically determine whether a data-driven decision succeeds or fails: the timing of the decision and the accuracy and availability of the data behind it. According to Rahul Nair, director of strategic engagement at Optymyze, freshness typically affects the timing of the decision, assuming ethical practices are in place to ensure only clean, verified, usable data is being used.
It’s the process, and the time it takes, to get data to a clean, verified, usable state that causes delayed decisions rather than agile decision making. As the volume and sources of data grow, so do the complexities that can quickly stifle business growth and decision making. To solve these challenges, enterprises should not focus solely on how "new" the data is; instead, they should focus on how it is collected, stored and analyzed, and how efficiently they make it usable. “This may sound like a daunting task to undertake, but thankfully by implementing a data management framework, organizations can ensure that they are collecting the most accurate, up-to-date and error-free data sets that can be exploited and used across the organization to drive strategy and action throughout the enterprise,” he said.
As part of a well-structured data management framework, he advises enterprises to automate the data-gathering process to ensure that data is consistently updated and cleansed. For sales organizations in particular, the CRM data collection process is typically a manual exercise that reps must conduct themselves. Automating the process makes it easier for sales reps to report their data, leading to a more complete and accurate data set.
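A minimal sketch of what such automated cleansing might look like, using a hypothetical `cleanse_records` step that drops incomplete records, deduplicates the rest, and stamps each survivor with an ingestion time so downstream consumers can judge its freshness (the field names and rules here are illustrative assumptions, not any vendor's actual pipeline):

```python
from datetime import datetime, timezone

def cleanse_records(raw_records, required_fields):
    """Drop incomplete and duplicate records, then stamp each
    surviving record with the time it was ingested."""
    seen = set()
    clean = []
    for record in raw_records:
        # Skip records missing (or with empty) required fields.
        if any(not record.get(field) for field in required_fields):
            continue
        # Deduplicate on the required fields.
        key = tuple(record[field] for field in required_fields)
        if key in seen:
            continue
        seen.add(key)
        stamped = dict(record)
        stamped["ingested_at"] = datetime.now(timezone.utc).isoformat()
        clean.append(stamped)
    return clean

# Hypothetical CRM records reported by sales reps.
raw = [
    {"rep": "alice", "account": "acme", "amount": 1200},
    {"rep": "alice", "account": "acme", "amount": 1200},  # duplicate
    {"rep": "bob", "account": "", "amount": 900},          # incomplete
]
clean = cleanse_records(raw, ["rep", "account", "amount"])
print(len(clean))  # 1
```

Running this step on every ingest, rather than as a periodic manual exercise, is what keeps the data set consistently current.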
Using The Data You Have
Anil Kaul is CEO of Alameda, Calif.-based Absolutdata. He agrees that "freshness" is not the real issue; rather, it's about what data organizations have and in what ways they use it. With the proper tools, such as AI and machine learning, companies can use data they’ve stored in the past alongside real-time data to receive business recommendations in a matter of seconds, rather than the weeks or months it may take to collect and interpret new data.
Market research has typically been viewed as an expensive tool that’s used only for high-stakes marketing projects. Though there are less expensive options, such as surveys, most initiatives require expensive consultants and several weeks to complete. AI and ML give businesses the opportunity to bring together a range of research methodologies to collect and deploy data, and to take that data out of silos and apply it across the enterprise rather than leaving it segmented in different departments. “With the right technology, enterprises can make quicker and more accurate business decisions based on research they’ve already collected as well as relevant up-to-date data,” he said.
There is a problem though, Irina Farooq, chief product officer of San Francisco-based Kinetica, said. The more latency you have, the more out of date your insights. On top of that, traditional data platforms can’t analyze the entire data corpus. So not only is data out of date, but it also tends to be based on a sample. Both issues can lead to decisions that either don't reflect the current situation, or are too late to the game to matter.
The problem, she said, can be solved by a new approach to data analysis. Platforms purpose-built to combine and analyze billions of live and historical data points continuously and automatically can shape decisions instantly. GPU-accelerated platforms analyze massive volumes of data in real-time.
Machine Learning (ML)-capable data platforms complement human analysis with AI algorithms that can process data much more quickly, so data scientists can focus on the most relevant details. It can take years to build custom, in-house data platforms that ensure users are receiving current data. “The best solution is to find a unified platform that combines continuous analysis of streaming and historical data, location analysis, and predictive analytics using AI and machine learning, and instead focus development efforts on building the smart, analytical applications that deliver a competitive advantage,” she said.
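The idea of blending historical and streaming analysis can be sketched in a few lines. The class below is a deliberately simplified illustration, not any platform's actual API: it seeds a metric from historical values, then lets a window of live events take over as they arrive, so the reported figure never lags behind the freshest data.

```python
from collections import deque
import statistics

class RollingMetric:
    """Report a metric from a historical baseline until live
    events arrive, then from a sliding window of live events."""

    def __init__(self, historical_values, window=100):
        # Baseline computed once from the historical corpus.
        self.baseline = statistics.mean(historical_values)
        # Fixed-size window of the most recent live events.
        self.live = deque(maxlen=window)

    def ingest(self, value):
        self.live.append(value)

    def current(self):
        # Fresh events supersede the historical baseline.
        if not self.live:
            return self.baseline
        return statistics.mean(self.live)

metric = RollingMetric([10, 20, 30])
print(metric.current())  # 20 (historical baseline)
metric.ingest(50)
print(metric.current())  # 50 (live data now dominates)
```

A production platform would do this continuously and at far greater scale, but the design choice is the same: decisions are read from the freshest available data, with history as the fallback.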
Even so, without fresh data there is no way to make smart decisions in today's world, said Marek Talarczyk, CEO of Poland-based Netguru. In the past, companies operated in a highly predictable environment in which projects and contracts were signed years ahead, providing stability and security.
“Companies’ financial statements were created after each quarter. Those days are long gone, as the world today is way more dynamic than ever before,” he said.
These constantly changing conditions affect predictability, so you have to be agile in your approach. It is necessary not only to assess performance on a regular basis, but also to plan for all the possible scenarios that might lie ahead.
To make this possible, enterprises need to collect only relevant data in one place, maintain a single source of truth and constantly iterate over the model. This allows them to predict what might happen in the future. Agility and flexibility play a big role in this process. If something could not be foreseen in advance, you need options to minimize the negatives and maximize the positives.
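"Constantly iterating over the model" can be as simple as re-fitting a forecast every time a new observation lands. The sketch below uses exponential smoothing as an assumed stand-in for whatever model an enterprise actually runs; the point it illustrates is the iteration loop, not the particular technique.

```python
class IterativeForecast:
    """Exponentially weighted forecast that is updated with every
    new observation, so predictions never lag the incoming data."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # weight given to the newest observation
        self.level = None   # current smoothed estimate

    def update(self, observation):
        if self.level is None:
            self.level = observation
        else:
            # Blend the new observation with the running estimate.
            self.level = (self.alpha * observation
                          + (1 - self.alpha) * self.level)
        return self.level

    def predict(self):
        return self.level

forecast = IterativeForecast(alpha=0.5)
forecast.update(10)
forecast.update(20)
print(forecast.predict())  # 15.0
```

Each update is cheap, which is what makes it feasible to re-run the model on every fresh data point rather than waiting for a quarterly reporting cycle.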