Who can say what will be the next big business need for computing?
More to the point, when we do identify what this use case will be, what about its time to value — that is, the length of time it could take for the business to realize value from that project?
With such developments as the Internet of Things (IoT), artificial intelligence, wearables and mobile everything, the next big business need could be anything.
Even, let's say, getting an audience of thousands of people to log onto a special site via their smartphones and then wave said smartphones in the air over and over while the man on the podium tells the audience how many are waving their devices, from what companies, what states they represent and even which devices are being waved.
Like Splunk did this morning during its keynote for its annual conference.
The 'oh wow' moment for the audience — besides the fact that in a crowd of very serious IT geeks there are more than a few Windows Phone owners — was that Splunk has cracked the code of not only collecting streaming data in real time but also quickly aggregating it and rendering it in easy-to-understand visuals.
Time to value for Splunk: I would guess about 2 minutes.
This functionality is one of the main features of Splunk Enterprise 6.3, the company's flagship platform.
It boasts many features, including improved reporting, querying and indexing performance; more advanced analytics and visualizations; and deployment options spanning on-premises, cloud and hybrid models.
Then there is the HTTP event collection feature, displayed so aptly at the keynote.
Yes, all of us waving our smartphones gleefully in the air (although, full disclosure, mine wouldn't register because it was set on private browsing and there it will stay, no exceptions) were "events" being collected by the application and translated into actionable data.
How Splunk Did It
How? Well, there is a custom alert action feature that lets the user define, relatively easily, which events need to be collected.
There is also a standard API through which the applications and devices are able to send the events — which can number in the millions per second — directly to Splunk Enterprise or Splunk Cloud for analysis.
It is a developer-standard HTTP/JSON API that allows for "agentless" or direct data onboarding.
This is how Splunk was able to analyze the smartphones in the audience after users logged into the website that the Splunk executive on the stage, Nate McKervey, director of Technical Marketing, provided.
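To make the mechanics concrete, here is a minimal sketch of what posting one audience "event" to an HTTP/JSON collector of this kind could look like. The endpoint URL, token and field names are placeholders invented for illustration, not Splunk's documented values; the point is simply that a device wraps its reading in JSON and POSTs it, with no agent installed.

```python
import json

# Hypothetical endpoint and token -- stand-ins, not real deployment values.
COLLECTOR_URL = "https://splunk.example.com:8088/services/collector/event"
COLLECTOR_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder

def build_event(device, state, company):
    """Wrap one smartphone 'wave' reading in a JSON envelope for the collector."""
    return json.dumps({
        "sourcetype": "smartphone:wave",  # assumed sourcetype name
        "event": {"device": device, "state": state, "company": company},
    })

payload = build_event("Windows Phone", "NV", "ExampleCorp")
headers = {"Authorization": "Splunk " + COLLECTOR_TOKEN}
# An HTTP client (urllib.request, requests, etc.) would POST `payload` with
# `headers` to COLLECTOR_URL; the network call is omitted so the sketch
# stays self-contained.
print(payload)
```

Because the envelope is plain JSON over HTTP, any device or service that can make a web request can send events directly, which is what "agentless" onboarding means in practice.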
In the bigger picture, this means the event collector can be integrated into developer services such as Amazon Web Services' Lambda and Docker.
IoT services are part of the mix, with enablement offered for Citrix Octoblu and Xively by LogMeIn.
This also explains how the "events" at the keynote were analyzed so quickly and matched up with complementary data such as device type and corporate users at the conference.
Any Kind of Data, Any Format
But to really understand the event collector and its potential let's get to the crux of it all: It has been architected to collect any kind of machine data and retain it in its original raw format — that is, the format generated by the actual devices.
In other words, Splunk doesn’t force a schema on the data when it is collected.
Only when the user asks a question like "tell me about the smartphones in the MGM Grand's Conference room right now" does the system set about figuring out the best way to structure the data and give it that higher value.
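The schema-on-read idea can be shown in miniature: raw lines are stored exactly as emitted, and fields are only extracted when a question is asked. The log format and field names below are invented for illustration, not Splunk's actual index format.

```python
import re

# Raw machine data is retained exactly as the devices emitted it.
raw_events = [
    "2015-09-22T09:01:12 device=iPhone state=CA action=wave",
    "2015-09-22T09:01:13 device=WindowsPhone state=NV action=wave",
]

def search(raw, wanted_field):
    """Impose structure only at query time (schema-on-read)."""
    results = []
    for line in raw:
        # Parse key=value pairs out of the raw line on demand.
        fields = dict(re.findall(r"(\w+)=(\w+)", line))
        if wanted_field in fields:
            results.append(fields[wanted_field])
    return results

print(search(raw_events, "device"))  # → ['iPhone', 'WindowsPhone']
```

Nothing about the stored lines changes between queries; asking about `state` instead of `device` would simply extract a different field from the same untouched raw data.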
Collecting Data in a Single Cluster Environment
Before the start of the conference Shay Mowlem, vice president of product management and product marketing at Splunk, explained why the company knew it needed to take this open approach.
Most APIs used by IoT devices are either proprietary or controlled by a third party that must grant permission to interface with them. Which is fine, but it defeats the purpose (the time to value) of IoT, which is already made up of countless moving parts.
"We look at IoT and data mining in a more holistic way," Mowlem told CMSWire. "We have architected the event collector so that it can collect data at extreme velocity, but in just in a single cluster environment.
"Then, once the data is in Splunk it can be correlated with other data," an approach that Mowlem described as "unique and powerful."
BMW and the Art of Predictive Maintenance
There are other advancements in the platform worth noting (and we shall in a moment) but the event collector is clearly a major leap forward not just for Splunk but also the industry.
It takes direct aim at the next generation of data usage, namely streaming and unstructured data and/or data stored in containers or transmitted from IoT.
With this data, a whole world of new services and capabilities can be developed, Mowlem said.
"If I can stream data, analyze it immediately and compare to insights and data from other industry norms, I can potentially issue alerts about an event that is about to happen," he said.
One of Splunk's early adopters of Enterprise 6.3, BMW, described something similar, only it called it predictive maintenance. As in, it could monitor the production process of its high-end cars, analyzing the incoming IoT-based data and deciding that a potential problem might be developing with a particular process or machine part.
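The kind of streaming alert Mowlem and BMW describe can be sketched as a simple check against a rolling baseline: flag any reading that deviates sharply from recent history. The sensor values, window size and threshold here are invented for illustration; a production system would use far richer models.

```python
from collections import deque
from statistics import mean, stdev

def monitor(readings, window=5, sigmas=3.0):
    """Flag readings that deviate sharply from the recent baseline."""
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            baseline, spread = mean(recent), stdev(recent)
            # A large jump relative to recent variation may signal an
            # event that is "about to happen."
            if spread and abs(value - baseline) > sigmas * spread:
                alerts.append((i, value))
        recent.append(value)
    return alerts

# Vibration readings from a hypothetical machine part; the spike stands out.
print(monitor([1.0, 1.1, 0.9, 1.0, 1.05, 1.02, 7.5, 1.0]))  # → [(6, 7.5)]
```

The value of doing this on streaming data, rather than in batch, is exactly the time-to-value argument: the alert fires while the suspect part is still on the line.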
Enterprise 6.3 comes with a range of new features besides the event collector.
- The speed of search, reporting and data onboarding has been doubled compared with earlier versions
- The hardware requirements have been reduced by over 50 percent compared to version 6.0
- It now has advanced analysis and visualization of large datasets
- There is anomaly detection aimed at uncovering rare events for further investigation
- New geospatial maps offer location-based insights
- There is a single value display for an 'at-a-glance' visualization
- It offers new monitoring and visualization for operational management
- New data integrity controls support compliance and protect against data tampering
Title image from Twitter.