Hortonworks Heats Things up for the Competition
The Yahoo spin-off may have a better handle on Open Source Hadoop than any of its competitors.
Consider that 24 engineers from the original Yahoo team that developed the Open Source framework for data-intensive distributed applications now work at the company. And they say they have contributed more code to the Apache Hadoop project than any of their competitors (namely, MapR and Cloudera). That isn’t hard to believe, because Hortonworks writes nothing proprietary in house and holds nothing back.
Why does this matter? Because it wouldn’t be too far a reach to conclude that the folks who wrote the most code are in the best position to support it and to provide services around it.
Not only is that an attractive proposition for Enterprise customers who don’t want to venture into unknown, difficult waters alone, but the same goes for third-party vendors who want to build ancillary products to sell in the rapidly growing Big Data world. Hortonworks’ competitors will, no doubt, have a problem with much of the aforementioned. (I’m all ears.)
There’s another differentiator between Hortonworks and the other two vendors. Hortonworks’ products are 100% Open Source and free, as opposed to some of MapR’s and Cloudera’s Enterprise-grade or value-adding Hadoop products, which are not.
While it’s easy, but possibly irresponsible, to say that “free is better,” all things being equal, free is better. And if Hortonworks’ Data Platform (HDP) actually proves to be better than MapR’s and/or Cloudera’s, in the long run, that spells trouble for one or both of them.
The Latest (Free) Release
That being said, earlier this week Hortonworks released its enterprise-grade data platform, HDP 1.2. This is the third release of its Hadoop distribution in less than a year. “We’re on a quarterly cadence,” says Jim Walker, Director of Product Marketing at Hortonworks.
This release includes the latest version of Apache Ambari, a stable, next-generation console for the comprehensive management, monitoring, and provisioning of Apache Hadoop clusters. It is likely to become an operator’s best friend because, when there’s a problem, it provides root-cause analysis and insight.
Walker says that Hortonworks has a customer running 18,000 jobs on 200 nodes, and that without a tool like this version of Ambari, solving problems would be more than difficult. With it, the operator can tell which service and which node the problem resides on, what’s causing it, and so on.
HDP 1.2 also introduces additional new capabilities for improving security and ease of use.
It’s remarkable that not only can Hortonworks’ customers download HDP 1.2 right now, free of charge, but so can its competitors’ customers — and its competitors themselves.
Those who are new to this sort of Open Source model might be scratching their heads, wondering how these folks are going to make money, and whether giving products away for free and selling only support and services is smart.
But neither Walker nor Hortonworks CEO Rob Bearden seems anything other than confident about the company’s strategy. They are certain that they’ll win the market by having the most stable, easily adoptable Enterprise-grade platform and by selling the support and services to go with it. “We are uniquely positioned to do so,” says Walker, referring to the strength and depth of the company’s development and support teams.
And as for all the code they’re writing for free, they say it’s part of their mission to create a reliable, highly available, and recoverable Hadoop platform, so that the vast majority of Enterprises can adopt it, put it to work, and reap rewards from it as soon as possible. The money will come, it seems, when all those companies need services and support, and the “guy who wrote the code” will be their obvious partner of choice.
As an afterthought, it’s worth noting that, at this very moment, Hortonworks is more than likely earning a pretty penny from at least some of its long list of partners, such as Microsoft, for which it provides services around getting Hadoop clusters running on Windows.