CIOs and CMOs have stopped talking about big data and data analytics as something they're exploring or planning on looking into in the near future. Analysts and experts rarely, if ever, call big data the "next big thing" any more. Does all this mean big data is over? Just the opposite.

Big data has finally arrived and is quickly maturing. IT leaders are now shifting from thinking about the possibility of making investments in big data platforms to thinking about how to get more out of the investments they’ve already made.

And with good reason. When it comes to extracting true value from big data, “build it and they will come” doesn’t suffice. Standing up a Hadoop cluster or investing in a new big data platform or analytics application is just one step (and it shouldn’t be the first -- identifying a business question that your company would benefit from answering should always be your first step) on the path to turning data into actionable business insight. With that in mind, let’s examine four ways businesses can optimize their big data investments.

Replicate Key Content

Replication remains one of the most critical and yet most often overlooked ways companies can optimize big data investments. Successful data analysis today requires bringing together modern unstructured data with traditional structured data. This data blending remains a challenge for many organizations, one that can be solved with replication.

By replicating data from your transactional systems to a big data platform such as Hadoop, not only can you save money on licensing costs and achieve better performance by offloading demand from those transactional systems, but more importantly, you can get the 360-degree view of data that’s needed to create an optimal analytic environment. With replication, you can set up multiple analytic sandboxes and share information across the enterprise, facilitating greater data discovery in the process.
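To make that concrete, here's a minimal sketch of one way to offload a transactional table into columnar files that a Hadoop-style analytic environment can query. The connection string, table name, and output path are placeholders, and a production pipeline would more likely use a dedicated replication or change-data-capture tool than a nightly bulk copy.

```python
# Minimal sketch: copy recent rows from a transactional table into Parquet files
# that a Hadoop/Spark/Hive environment can query, keeping load off the source system.
# The connection string, table name, and output path are placeholders.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql://analytics_ro@oltp-host/sales")  # read-only replica

# Pull yesterday's orders in chunks so the export never holds the whole table in memory.
chunks = pd.read_sql(
    "SELECT * FROM orders WHERE updated_at >= CURRENT_DATE - INTERVAL '1 day'",
    source,
    chunksize=50_000,
)

for i, chunk in enumerate(chunks):
    # Each chunk lands as its own Parquet file; HDFS or S3 paths work the same way
    # if the appropriate filesystem library (e.g. pyarrow plus s3fs) is installed.
    chunk.to_parquet(f"/data/landing/orders/orders_part_{i:04d}.parquet", index=False)
```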

Leverage the Power of Prediction

As valuable as replication is, it would be a mistake to think that true analytic value is achieved solely by bringing data together into a single platform. Doing so may enable you to achieve a clear view of what’s happened in the past, but what’s needed today more than ever is the ability to make predictions about what’s going to happen in the future. Prediction is a powerful asset in the world of big data -- arguably the single best way to maximize the value of your data reservoir.

In spite of the obvious need for them, predictive analytics tools have gained momentum slowly over the years. Some of this is due to a shortage of the right skills (namely, data scientists), but it's mostly a function of the changing data landscape: data that's available today simply wasn't available yesterday. The competitive need for predictive analytics is only now starting to take off -- and it is taking off fast. The ability to make predictions is fast becoming a competitive necessity for modern businesses, especially those looking to keep pace with nimble, online competitors. Invest now to stay ahead of the curve.
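As an illustration, here's a minimal sketch of a first predictive model built with scikit-learn. The file and column names are invented for the example; the point is simply that once your historical data is blended in one place, an initial churn-prediction model is a few dozen lines of code rather than a multi-year program.

```python
# Minimal sketch: predict which customers are likely to churn from historical data.
# The file name and feature columns are placeholders for whatever your blended dataset holds.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

df = pd.read_parquet("/data/marts/customer_history.parquet")

features = df[["tenure_months", "orders_last_90d", "support_tickets", "avg_order_value"]]
target = df["churned"]  # 1 if the customer left, 0 otherwise

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Score the holdout set; AUC gives a quick read on whether the model ranks risk sensibly.
probs = model.predict_proba(X_test)[:, 1]
print(f"Holdout AUC: {roc_auc_score(y_test, probs):.3f}")
```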

Augment Pre-Packaged Analytic Applications

Many vendors have taken to offering pre-packaged analytics applications that deliver a certain degree of analytic functionality, and many customers have invested in these offerings. These pre-packaged applications do have value -- they abstract complexity and roll up many hours of analytic work and development into a simplified offering.

Just be aware that most of these pre-packaged offerings are vanilla in nature and will only meet a portion of your analytic needs. To optimize this investment, organizations should consider augmenting pre-packaged analytic applications with smaller, complementary data marts that better align with the specific analytic needs of the business. This is likely where you'll uncover the secret sauce, so to speak, that drives companies to invest in data analytics in the first place.
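For example, a complementary mart can be as simple as a purpose-built summary table sitting alongside the packaged application. The sketch below, using hypothetical table and column names, rolls raw order events up into a monthly revenue-by-region mart that answers a question the vendor's canned dashboards may not.

```python
# Minimal sketch: build a small, purpose-specific data mart next to a packaged
# analytics application. Table and column names are placeholders.
import pandas as pd

orders = pd.read_parquet("/data/landing/orders/")  # raw events replicated earlier

mart = (
    orders
    # Bucket each order into a calendar month for reporting.
    .assign(month=pd.to_datetime(orders["order_date"]).dt.to_period("M").astype(str))
    .groupby(["month", "region"], as_index=False)
    .agg(revenue=("order_total", "sum"), order_count=("order_id", "count"))
)

# Persist the mart where analysts and BI tools can reach it.
mart.to_parquet("/data/marts/monthly_revenue_by_region.parquet", index=False)
```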

Make Use of Metadata Management

The well-documented explosion of data impacting organizations large and small has created a corresponding explosion in metadata -- data about your data. Smartly managing that metadata layer is critical to deriving maximum value from your big data investments. You can track and analyze metadata directly in a system such as Hadoop. Doing so will tell you things like how old a dataset is, how frequently it changes, how often you run reports against it, and whether or not it needs to be archived. In other words, understanding your metadata will tell you whether a given dataset is or isn't valuable, so that you can build a more robust system around genuinely high-demand data. Going a step further, with today's metadata management capabilities, you can even start to predict which data is most likely to need retiring or archiving at a given point in time.
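Here's a minimal sketch of that idea, assuming a simple metadata catalog that records last-modified dates, recent query counts, and sizes per dataset (the file and column names are invented for illustration): score each dataset and flag the cold ones as archive candidates.

```python
# Minimal sketch: flag archive candidates from a metadata catalog.
# The catalog file and its columns (dataset, last_modified, queries_90d, size_gb)
# are hypothetical stand-ins for whatever your metadata layer exposes.
import pandas as pd

catalog = pd.read_csv("/data/metadata/dataset_catalog.csv", parse_dates=["last_modified"])

now = pd.Timestamp.now()
catalog["days_since_change"] = (now - catalog["last_modified"]).dt.days

# A dataset that hasn't changed in six months and is rarely queried is a strong
# candidate for archiving; tune the thresholds to your own access patterns.
catalog["archive_candidate"] = (
    (catalog["days_since_change"] > 180) & (catalog["queries_90d"] < 5)
)

report = catalog.sort_values("size_gb", ascending=False)[
    ["dataset", "size_gb", "days_since_change", "queries_90d", "archive_candidate"]
]
print(report.to_string(index=False))
```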

Metadata management can be one of those easier-said-than-done initiatives. Most architects are loath to touch their data warehouses, believing that any change to the warehouse architecture will break the way it functions. However, if you take the time to truly understand your metadata, you can act on datasets that you know for a fact are dormant and that, as a result, will have no impact on future queries. The payoff -- faster, better performance and more accurate analytics -- will be well worth the effort.

Whatever path you've taken and whatever investments you've made to date, remember that implementing big data technology doesn't have to be an either/or proposition. If you've already made investments and haven't quite achieved the return you're looking for, don't immediately jump to a costly rip-and-replace solution. Instead, consider what you can do to further optimize the solutions you already have in place. Success just might be simpler and less costly to achieve than you think.

Title image by llahbocaj (Flickr) via a CC BY 2.0 license