
A two-year-old technology is spearheading a genuine revolution in data center architectures, for both software and hardware.

With yesterday's release by the Docker organization of open source tools for orchestrating the deployment of containerized applications anywhere from a data center cluster to a single laptop, the very definition of a business application is changing.

Docker is a means for deploying a Linux program (although it won’t be just Linux for long) on any system. That may not seem like a big deal when it’s phrased this simply.

So let me put it this way: It eliminates the dependencies between a program and the particular operating system installation of the machine that hosts it.

This way, you don’t have to install a program to run it.

Instead, a containerized program runs inside a lightweight, isolated package that bundles only the resources the program requires to run, and that package can be transported anywhere, thus the shipping container analogy.
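To make the idea concrete, here is a minimal sketch of what such a package might be built from: a hypothetical Dockerfile for a small Python web service (the file and service names are invented for illustration). The recipe declares everything the program needs, and nothing else.

    # Hypothetical Dockerfile for a small web service (illustrative only)
    FROM python:2.7                # a base image with just enough Linux plus the runtime
    COPY app.py /srv/app.py        # the program itself
    EXPOSE 8000                    # the port the service listens on
    CMD ["python", "/srv/app.py"]  # what runs when the container starts

Building that recipe produces an image (docker build -t my-service .), and running it (docker run -d -p 8000:8000 my-service) needs nothing pre-installed on the host beyond Docker itself.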

As CMSWire writer Virginia Backaitis explained so simply a few months ago, "In non-geek speak, [Docker] is an open platform for distributed applications that makes the lives of developers and sysadmins a lot more pleasurable and easier. It takes away the non-value adding drudgery of your job."

Why Docker Matters

This new system of conducting digital business is significant for the following reasons:

Operating Systems Matter Less

Up until the advent of virtualization, there were two classes of business applications: those that were hosted on servers and those that the IT department installed on client PCs. We don’t think very deeply about it today, but installation is the process of connecting software to the resources an operating system provides so that it can run.

Deploying software in virtual machines (VMs) made it portable and more manageable, and subsequently improved the utilization rates of the processors that ran them. But those VMs still had operating systems of their own — self-contained environments that often were not aware they were being virtualized.

Business Applications Are More Portable

A Docker container is essentially shrink-wrapped. It behaves like a stripped-down virtual machine whose sole purpose is to run the program, although it shares the host’s Linux kernel rather than booting an operating system of its own. So if your program is a service (for example, a CMS), then finding a host for it becomes a lot easier, like finding an open table at a restaurant.

This is especially important in the realm of Linux, where a multitude of distributions and versions introduces seemingly infinite combinations of dependencies. Once an application is installed on a Linux system, it’s thoroughly rooted to that system, even when it’s being hosted in a VM.

By contrast, a Docker container provides just enough Linux for the application to run. That container may then be moved from system to system without much effort.
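As a rough sketch of what that mobility looks like in practice (the image and file names below are hypothetical), an image can be exported to a single file, copied to any other Linux machine running Docker, and started there with no installation step:

    docker build -t my-cms .           # package the application once, on any machine
    docker save my-cms > my-cms.tar    # export the image as one portable file
    # ... copy my-cms.tar to another Linux host that runs Docker ...
    docker load < my-cms.tar           # import it on the new host
    docker run -d -p 80:80 my-cms      # and run it; there is nothing else to install

In day-to-day use the copy step is usually handled by pushing the image to a registry and pulling it elsewhere, but the principle is the same.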

Application Architectures Can Become More Flexible

The categories of applications upon which this publication was founded, the content management system most prominent among them, have historically been monolithic constructs of lock-tight code. Features and functions can be added or updated only every few years, and even patches for security vulnerabilities require significant effort.

Encasing these applications within Docker containers provides them with a few benefits, such as easier manageability and portability. But the true opportunity made feasible through Docker is the complete re-architecture of the applications themselves, from imperialistic monoliths to distributed modules.

Each module can be developed to provide a discrete service to the other modules, and their work products can be simply directed to a Web browser. So instead of a central control room where every interaction with the application takes place, imagine a variety of bureaus to which specific tasks are delegated.
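Docker’s existing container-linking mechanism already hints at how such a delegation might be wired together. Here is a small sketch (the front-end image name is hypothetical, and newer releases may offer other ways to connect containers):

    # One container provides the data store ...
    docker run -d --name contentdb postgres:9.4

    # ... another provides the editing front end, wired to the first through a link
    docker run -d --name cms-frontend --link contentdb:db -p 80:80 my-cms-frontend

Each piece can then be updated, replaced or relocated on its own schedule, without rebuilding the whole.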

Here is where Docker may not just alter but completely replace the fundamentals of businesses’ information technology. For a quarter-century, work processes have been centered on the workings of monolithic applications. When software becomes deconstructed into interoperable modules, those modules no longer need to be subdivided into vertical industry categories (for example, ERP, CRM, CMS) to be useful.

Client Systems Become More Mobile

All these modules may be collected into something that looks to the user like a cohesive platform. The thing that collects them could be an app residing on an iPad, a Chromebook, a smartphone or a much thinner, smaller PC.

Consider if all the “processing” done by a word processor or a CMS took place in modules hosted within relocatable Docker containers. The identity of the job these modules perform as “a word processor” or “a content management system” may be a function of the client-side app.

To the user, it looks like all the functions are installed on whatever device she’s using at the moment. In the background, the app is dispatching function calls to specific container services someplace in the network.

Full Orchestration

Now let’s bring back yesterday's big headline: The Docker team has released, on schedule, two components intended to radically simplify the way this new kind of software can be deployed.

Docker Machine is a command-line tool that provisions Docker-ready hosts on demand, so a containerized program can be launched on practically any device. By “any device,” I’m including not just Windows, Mac and Linux machines, but virtual hosts such as Oracle’s VirtualBox, as well as cloud platforms such as Amazon’s EC2, DigitalOcean and Microsoft Azure.
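As a rough sketch of that workflow (the host names here are invented, and subcommands and flags vary between drivers and the early releases of the tool):

    # Create a Docker-ready host inside a local VirtualBox VM
    docker-machine create --driver virtualbox dev

    # Or provision one on a cloud provider (credential flags omitted; they differ per driver)
    docker-machine create --driver azure my-cloud-host

    # List every host Machine is managing
    docker-machine ls

    # Point the local docker client at one of them, then run containers as usual
    eval "$(docker-machine env dev)"
    docker run -d -p 80:80 my-cms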

In fact, Microsoft is a Docker partner in this release, making Azure one of the first and most prominent platforms to support this Linux-derived technology.

In a blog post yesterday, Azure’s director of program management, Corey Sanders, said, “Docker Machine provides you the ability to manage and configure your hosts from a single remote client. You no longer have to connect to each host separately to perform basic monitoring and management tasks, giving you the flexibility and efficiencies of centralized DevOps management.”

Additionally, a new tool called Docker Swarm enables data centers with clusters of servers not only to deploy containers into those clusters, but also to orchestrate which individual servers in those clusters run them. All of a sudden, huge virtualization systems such as VMware’s vSphere begin to look a little overweight.
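The early Swarm beta works roughly along these lines, using a token-based discovery service; the addresses and the token below are placeholders, and exact commands vary between releases:

    # Generate a cluster token using Docker's hosted discovery service
    docker run --rm swarm create                 # prints a token to use as <cluster_token>

    # On each server in the cluster, join its Docker daemon to the swarm
    docker run -d swarm join --addr=192.168.0.11:2375 token://<cluster_token>

    # On one machine, start the Swarm manager
    docker run -d -p 3375:2375 swarm manage token://<cluster_token>

    # An ordinary docker client pointed at the manager now schedules containers across the cluster
    docker -H tcp://192.168.0.10:3375 run -d -p 80:80 my-cms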

“I think the announcement from Docker is a significant expansion of the Docker effort towards a fuller suite of tools to adapt it for larger and more complex applications,” said Al Hilwa, IDC’s program director for software development research, in an email to CMSWire.

“The company has been actively developing early betas of these tools and clearly sees them as ready for more adoption. This is likely to generate even more adoption of Docker, especially pushing it in the enterprise.”

Hilwa continues to be impressed with the broad and growing list of Docker partners, which now includes not only Microsoft but also such key players as Red Hat, Rackspace and the OpenStack organization.

While Docker Machine and Docker Swarm may compete with partners’ products, he believes, “Docker is in a good position to push its tools as standards evolve. I think the industry sees Docker as a Switzerland of container technology and that will help Docker move its tooling forward.”

Keep in mind, the 1.0 version of Docker emerged from testing only last summer. If the technology infrastructure of our businesses can truly be changed this quickly, perhaps such a change was long overdue.

Title image by Airwolfhound, licensed under a Creative Commons Attribution-Share Alike 2.0 Generic License.