The major technologies driving the digital workplace, including artificial intelligence (AI), big data, cloud computing and the Internet of Things (IoT), are starting to converge. Enterprises' growing need to access, use and exploit data harvested from intelligent apps is driving that convergence. In fact, it could be argued that data is the cornerstone of the intelligent digital workplace.
This has all been enabled by easy and economical access to cloud computing, which, because it is always on and connected to everything in the enterprise, can access and analyze the data found in the different repositories across the enterprise. Data in the cloud is the common thread keeping everything together; the digital workplace would not be possible without it.
Jake Freivald, VP of Information Builders, shared some emerging trends regarding intelligent data that enterprises are attempting to manage with new technologies.
Centralize, Contextualize and Manage Data
Healthcare organizations, for example, need to bring information about patients back from walk-in clinics, hospitals, home nursing services and other sources to get a complete view of every patient. Because physicians and nurses may work across different hospitals, facilities and their own practices, they need that complete patient view as well.
In every industry there are similar situations where information about people, products, customers and processes needs to be assembled, put into business context and managed. Freivald said the cloud is a natural place for this to happen because:
- It is connected to every system or process.
- It automatically accommodates increases in big data volumes and sources.
- It only requires payment for computing power when something needs to happen with all of that data, which allows businesses to dump data in without worrying too much about exactly when they’ll use it.
Action at the Edge
At the same time, the cloud enables us to use computing power "at the edge." This enables autonomous action closer to individual locations of devices to improve response times and reduce traffic to the central processor. The human analogue to this is the “reflex arc” — when you touch a hot stove your nervous system responds by making you remove your hand immediately, even before your brain knows what’s going on, thereby reducing the damage done to your hand.
The greatest strength of the reflex arc is that it enables rapid, “intelligent” action without waiting to engage the brain. With cloud computing, you can do things such as feed a data stream locally into an AI system that governs local response, while still packaging up information to be sent back to the central processor for centralization, contextualization and management.
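As a rough illustration of that split (the threshold, function and message names here are hypothetical, not from any vendor's API), a local "reflex" can act on a hot reading immediately while still batching every reading for the central cloud service:

```python
THRESHOLD = 80.0  # hypothetical temperature limit that triggers the local "reflex"

def process_reading(reading, batch, alerts, batch_size=3):
    """Act locally on a dangerous reading, and queue every reading for the cloud."""
    if reading > THRESHOLD:
        # Immediate local response, no round trip to the central processor
        alerts.append(f"shutdown: {reading}")
    batch.append(reading)
    if len(batch) >= batch_size:
        # In practice this would be sent to the central analytics service
        # for centralization, contextualization and management
        sent = list(batch)
        batch.clear()
        return sent
    return None
```

The local alert fires the instant the threshold is crossed, while the batch travels to the center on its own schedule, mirroring the reflex arc acting before the brain is engaged.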
Big Data Is Ordinary Data
In fact, the cloud is actually changing data itself, Freivald added. “Big data is becoming plain old data, with an increased emphasis on governance, while ‘normal’ data is adopting appropriate big data techniques. The cloud has allowed us not to care as much about where data gets stored, and SQL interfaces to big data have made big data look a lot more like ordinary data,” he said.
At the same time, the old way of handling big data — just dumping it somewhere and hoping you’ll use it someday — has been the root of some significant failures. People are placing increased emphasis on data quality, governance and mastering, and that makes it look much more like the normal data sources we’ve always used. In that way, cloud is facilitating the adoption of more big data while also making that big data look more familiar.
Data Leads Digital Societies
Just as the invention of the wheel helped lead to an agricultural society, today's inventions in data storage, analysis and decision-making are leading to a digital society, said Ankur Teredesai, a professor of computer science at the University of Washington and co-founder and CTO of KenSci.
He explained that the confluence of those advances with cloud computing is giving us scale, and the algorithmic advances made possible by the availability of data and computing are giving us probability-driven decision-making.
Advances in IoT also make it possible to run these decision frameworks hyperlocally, at the edge, completing the vision for a smart digital framework. “The next set of challenges now is to make this smart digital framework applicable for social impact, for the betterment of humanity in verticalized domains such as healthcare,” he said.
Real-Life Use Cases
So how does this work in the real world? The digital world we live in has changed the way we interact with physical spaces, which has bred the idea of Space-as-a-Service. This new mindset and focus on our surroundings has set high expectations that the buildings in which we live, work and play will function according to our needs, flawlessly, according to Maya Gal, co-founder and chief revenue officer at Okapi.
For example, maintaining high tenant satisfaction is the No. 1 priority for 97% of Commercial Real Estate (CRE) owners and operators. But CRE managers are not paying close enough attention to the actionable insights at their fingertips that would help them address tenants' needs, root out inefficiencies and spot business opportunities.
“The most common issue that we see in CRE organizations that are not leveraging the full power of their data is that their data exists in silos. In these cases, managers can’t be expected to allocate the right staff or reduce response times; both reported as the top challenges for property managers,” she said.
When it comes to multi-campus enterprises, tenant satisfaction can only be improved with insights from AI, which detects anomalies and patterns invisible to the human eye in data such as the nature and frequency of inbound service requests, work order analyses, tenant leasing patterns, and ticket response and completion times. The raw material that enables all of this, however, is data combined with intelligent applications.
Without AI, operational teams and managers are forced to dig through endless spreadsheets, manually looking for patterns and behaviors among tens, hundreds and sometimes even thousands of tenants.
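The pattern detection described above can be sketched with a simple statistical check. This toy example (the weekly request counts and function name are invented for illustration) flags weeks whose service-request volume deviates sharply from the norm:

```python
import statistics

def flag_anomalous_weeks(weekly_counts, z_cutoff=2.0):
    """Return indices of weeks whose request volume is a statistical outlier."""
    mean = statistics.mean(weekly_counts)
    stdev = statistics.stdev(weekly_counts)
    # A week is anomalous if it sits more than z_cutoff standard
    # deviations from the mean request volume
    return [i for i, count in enumerate(weekly_counts)
            if stdev and abs(count - mean) / stdev > z_cutoff]

# A sudden spike in service requests (week index 4) stands out:
requests = [10, 12, 11, 9, 50, 10, 11]
```

A production system would use far richer features and models; the point is only that such patterns are machine-detectable rather than something to eyeball in a spreadsheet.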
“In a world of faster, seamless user experiences, operational teams and CRE owners and operators need to be able to analyze vast amounts of tenant data to automatically produce insights on patterns and anomalies in order to gain a better understanding of their tenants’ needs and difficulties, and address these issues as quickly as possible,” she said.
The Rise of Fast Data
The convergence of these technologies is being driven by a single common thread. AI's power and value are directly correlated with how much data it can access, and big data is attracted to clouds for simplicity and scale, Hazelcast CEO Kelly Herrell said.
While systems are indeed getting smarter, another key observation is that big data is beginning to be overshadowed by fast data. In the digital world, latency is the new downtime and microseconds matter.
This means that data processing must happen at precisely the moment data is generated. This stream processing will simultaneously change the data processing paradigm and, for latency reasons, push the compute activity back out to the edge and away from the centrality of large clouds. This addition of fast data to the macro trends is driving the era of in-memory computing as a default element of emerging architectures, which is enabling net-new digital innovation.
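A minimal sketch of that stream-processing pattern (the names here are illustrative, not from any particular stream-processing product): each event is handled the instant it arrives, with the running state held in memory rather than written to disk for a later batch job:

```python
from collections import deque

def rolling_average(stream, window=3):
    """Emit an up-to-date aggregate per event, with no batch to wait for."""
    buf = deque(maxlen=window)  # the in-memory state of the stream processor
    for value in stream:
        buf.append(value)       # process the event the moment it is generated
        yield sum(buf) / len(buf)
```

Because the aggregate is updated per event, a consumer sees fresh results with only in-memory latency, which is the property that makes in-memory computing a default element of these architectures.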