That we're all suffering from information overload comes as news to no one. And while some people's eyes glaze over at the mention of it, metadata can be the key to reducing the noise.
My last article highlighted what the NSA can teach us about the importance of metadata, in particular how metadata can be used to categorize and later find enterprise documents. But as more work gets done out of the office, another type of metadata is gaining prominence: mobile metadata.
Mobile metadata is information created by a mobile device about its user, its location or related objects. Mobile devices generate this metadata automatically in the course of everyday operation, so mobile metadata can provide far more utility than just classifying documents. In fact, the NSA subpoenaed a form of mobile metadata from carriers -- specifically, call detail records (CDRs) -- to learn about callers and their conversation habits.
For business purposes, auto-generated mobile metadata can define work "contexts," which can be used to reduce information overload and simplify your workday -- by presenting "the right information at the right time." How does this work?
Where Does Mobile Metadata Come From?
The sensors built into today’s mobile phones inherently generate a wide variety of data. These sensors and their associated systems include chronographs, accelerometers, global positioning (GPS), indoor positioning, near-field communications (NFC) and imaging (i.e., cameras).
These metadata are important because information generated by the phone can be aggregated and cross-referenced with other enterprise information, like application data, to expose just the things you need to see "right here, right now." Combining this information in a useful way is contingent on our ability to define smart work contexts.
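To make this concrete, here is a minimal sketch of the cross-referencing idea: a phone's GPS metadata matched against enterprise application data to surface nearby customer accounts. The account records, coordinates and the 1 km radius are all made up for illustration, not drawn from any real system:

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical enterprise records: customer accounts with office coordinates.
ACCOUNTS = [
    {"name": "Acme Corp", "lat": 40.7128, "lon": -74.0060},
    {"name": "Globex Inc", "lat": 34.0522, "lon": -118.2437},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 6371 * 2 * asin(sqrt(a))

def nearby_accounts(phone_lat, phone_lon, radius_km=1.0):
    """Cross-reference the phone's GPS metadata with account records."""
    return [a["name"] for a in ACCOUNTS
            if haversine_km(phone_lat, phone_lon, a["lat"], a["lon"]) <= radius_km]
```

A context engine running this check when the phone reports a new location could then surface the matching account's open items instead of the full firehose of enterprise updates.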
Context: 'I Know It When I See It'
Context is a difficult concept to nail down. When pressed to define context, many experts fall back on the response given by US Supreme Court Justice Potter Stewart when asked to define hard-core pornography: “I shall not … attempt … to define … and perhaps I could never succeed in intelligibly doing so. But I know it when I see it.” This "definition" may have worked for the Supreme Court, but it’s not enough to be useful in business.
A more appropriate definition for context was provided by Anind K. Dey, now an associate professor in Human-Computer Interaction (HCI) at Carnegie Mellon University:
“[Context is] any information that can be used to characterize the situation of a … person, place or object that is considered relevant to the interaction between a user and an application.”
Simply put, context defines a situation that makes certain kinds of information particularly interesting or relevant.
In this regard, mobile devices generate information that, when context is applied, can accurately pinpoint, filter and highlight information for a given time and place. By providing a method for surfacing important information snippets and suppressing others, context is our best hope for addressing information overload.
Let’s see how this works in practice.
Mobile Sensors, Metadata, Context
Sensors in today’s mobile devices generate metadata for context engines that identify the information we need right here, right now. Here are some examples of how this works:
- Chronographs -- internal clocks that keep track of the current time. When cross-referenced with calendar appointments, an app on your phone can display information about upcoming meetings or task deadlines. In this case, the calendar provides a time-based context for defining which information is important.
- GPS/geo-location -- location sensors and associated positioning technologies can define which information is important based on your current location. For example, when a salesperson visits a customer, information related to that customer's account can automatically be displayed on the salesperson's phone. Location-based context can be particularly important for people who work outside the office or away from their desks.
- Near-field Communications (NFC) -- NFC sensors help your phone exchange information with other smart devices in close proximity. For example, a smart phone badge scanner (using NFC) at a trade show can automatically incorporate contact information from a badge directly into your phone’s address book. When this information is merged with profile information from other applications, relevant business information can be surfaced. For example, when badge information is cross-referenced with CRM records, a trade show attendee can be flagged as an employee of a company that is already an active prospect -- important information for salespeople.
- Cameras and image sensors -- photos taken with mobile phone cameras can be cross-referenced with image recognition databases, so photos of people can be automatically ‘tagged’ and associated with existing contact profiles. Then, when information related to this person is displayed, it can be accompanied by an updated profile picture.
- Gesture and eye-motion sensors -- sensors that follow eye movements (like Samsung’s Smart Scroll™ technology) can automatically scroll a mobile phone screen, providing a smooth reading experience for a user perusing a document. This could be particularly useful for creating a "deep thinking" mode on your phone. While in deep-thinking mode, the phone could block interruptions; notifications and updates could be suppressed, reducing stressful, productivity-killing distractions and allowing users to focus on the task at hand.
- Accelerometers -- motion detectors that can be used to define an "on the go" context, during which the phone could suppress updates and notifications, so a user could focus on the task at hand … like driving a car.
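The time- and motion-based contexts above can be sketched as a toy context engine. Everything here is illustrative -- the 15-minute meeting window, the acceleration threshold and the calendar record format are assumptions for the sketch, not any real mobile API:

```python
from datetime import datetime, timedelta

def active_contexts(now, calendar, accel_magnitude_g, moving_threshold_g=1.3):
    """Derive work contexts from device metadata (illustrative thresholds)."""
    contexts = set()
    # Time-based context: a calendar entry starting within the next 15 minutes.
    for event in calendar:
        if timedelta(0) <= event["start"] - now <= timedelta(minutes=15):
            contexts.add("upcoming-meeting")
    # Motion-based context: sustained acceleration suggests the user is on the go.
    if accel_magnitude_g > moving_threshold_g:
        contexts.add("on-the-go")
    return contexts

def should_notify(notification, contexts):
    """Suppress low-priority notifications while the user is on the go."""
    if "on-the-go" in contexts and notification["priority"] != "high":
        return False
    return True
```

The point of the sketch is the shape of the logic, not the thresholds: raw sensor readings become named contexts, and contexts then decide which information is surfaced and which is suppressed.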
Where to Next? Where No Man Has Gone Before
The rapid deployment of sensor-rich smart mobile devices, coupled with the proliferation of distributed, heterogeneous cloud services, provides fertile ground for almost limitless opportunities to define contexts that could pinpoint and surface the information you need "right here, right now."
Validation of this trend was provided by Microsoft’s recent announcement of the Office Graph. Microsoft’s Office Graph uses “signals from email, social conversations, documents, sites, instant messages, meetings and more to map the relationships between the people and things that make your business go.” Apps that can tap into the intelligence of Office Graph and related sources might finally be able to crack the information overload problem.
The Internet of Things represents the ultimate level of sophistication for context-aware computing. Once devices can communicate among themselves, the sky is the limit for what's possible. The opportunities to reduce information overload afforded by the coupling of sensors, context and machine-to-machine interactions will be covered in a future article.
Title image by Monkey Business Images (Shutterstock)