The time element, which not too long ago drove the analytics industry, is now a given. PHOTO: Veri Ivanova

We, as an industry, need to retire our obsession with “real-time,” “near real-time,” “speed-of-thought” and every other time-specific type of analytics.

The demand for accurate, real-time data analysis spawned a feeding frenzy of analytics product design and shaped the entire market. Yet the evolution of analytics technology, together with the acceptance of “good enough” analytics as exactly that, has turned the time element from an issue into table stakes.

Time is an absolute. However, with new technology, it's far less of a restrictive factor. Now time serves more as an elastic window through which we can observe change and make decisions.

Effective and accurate data analytics are now within reach, regardless of time frame. 

With the rise of high-performance computing (HPC), in-memory computing power and deep and wide data lakes, the insights we need are there, when we need them.

3 Factors Putting the Real-Time Analytics Obsession to Rest

So what's pushing us to finally rid ourselves of our “real-time” analytics obsession?

1. HPC Is the Powerful Engine

HPC is the engine driving high-speed analytics. Over the years, we’ve seen the democratization of HPC as it has become more cost-effective and accessible to organizations.

In fact, industry analyst group IDC reported that HPC systems are on a solid upward trajectory, outpacing other types of server sales. HPC has also become synonymous with lightning-fast analytics. To extract value from data, companies are coupling big data platforms with HPC-powered data analytics systems, making any time element moot.

2. In-Memory Computing Provides the Platform and Capacity

We also now have access to huge in-memory stores, so calculations and computations can happen entirely in memory. Today’s in-memory data analytics solutions let users consume all the data points they need to produce the most well-rounded analysis.

Perhaps we are closer to the 360-degree view of the customer than ever before?

3. Evolved Data Lakes Contain Rich Fuel

Data lakes used to be wide but shallow. Today’s data lakes are wide and deep, accommodating the volume, variety and velocity of modern big data. Data lakes that have evolved in this fashion let us pull any data we need out of their deep reservoirs, supporting the most robust and comprehensive analytics.

We do not need 100 percent of the data, 100 percent refreshed, 100 percent of the time. However, we do need the data to be in a mode where we can easily go in and ask all manner of questions, especially those complex questions that are most important to business insights and success. 
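To make that concrete, here is a minimal sketch of what asking an ad-hoc question of a data lake could look like, assuming the lake stores order events as Parquet files and DuckDB serves as the in-memory query engine. The file path and column names are hypothetical, not from any particular product.

```python
# A minimal sketch: an ad-hoc question answered in memory on demand,
# rather than on a refresh schedule. Path and columns are hypothetical;
# DuckDB is assumed as the in-memory query engine.
import duckdb

top_segments = duckdb.sql("""
    SELECT customer_segment,
           COUNT(*)         AS orders,
           AVG(order_total) AS avg_order_value
    FROM read_parquet('/data/lake/orders/*.parquet')
    GROUP BY customer_segment
    ORDER BY avg_order_value DESC
""").df()

print(top_segments.head())
```

The point is not the specific tooling but the pattern: the question is posed and answered at the moment of need, with no dependence on a real-time refresh cycle.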

HPC, in-memory computing power, and deep and wide data lakes have all matured to the point where they fuel analytics in which time is no longer a constraint.

Goodbye Real Time

So what does that mean for the execution and delivery of analytics?

We can lay to rest our concerns about whether real-time analytics is robust or accurate enough. If you build a back-end engine with HPC, a middle tier with in-memory applications, and a data feed that originates in deep and wide data lakes, you'll have the analytics you need at the moment you need them.
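As a rough illustration of that layering, the sketch below stands a pandas DataFrame in for the in-memory middle tier and a directory of Parquet files in for the lake-fed data feed. The path, column names and helper function are hypothetical.

```python
# A rough sketch of the layered stack described above. The lake path, columns
# and helper are hypothetical; pandas stands in for the in-memory middle tier.
import pandas as pd

# Data feed: pull a working set out of the deep, wide data lake.
orders = pd.read_parquet("/data/lake/orders/2024/")  # directory of Parquet parts

# Middle tier: the working set now lives in memory, ready for any question.
def answer(mask, metric):
    """Answer an ad-hoc question the moment it is asked, with no batch window."""
    return orders.loc[mask, metric].describe()

# Ask at the moment of need, not on a refresh schedule.
print(answer(orders["region"] == "EMEA", "order_total"))
```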

And when the industry stops worrying about the time element, we can finally home in on other areas of analytics, such as how to encourage broader access for everyday business users.

Share your experiences of time-unbounded analytics in the comments — I would love to hear what you have seen and learned.