A two-year-old technology is spearheading a genuine revolution in data center architecture, in both software and hardware.
With yesterday's release by the Docker organization of open source tools for orchestrating the deployment of containerized applications, anywhere from a data center cluster to a single laptop, the very definition of a business application is changing.
Docker is a means for deploying a Linux program (although it won’t be just Linux for long) on any system. Phrased that simply, it may not seem like a big deal.
So let me put it this way: It decouples a program from the operating system of the host that runs it.
This way, you don’t have to install a program to run it.
Instead, a containerized program runs in an isolated environment that bundles only the resources the program requires to run. Unlike a virtual machine, a container shares the host's operating system kernel rather than carrying a whole one of its own, and that lightweight bundle can be transported anywhere: thus, the shipping container analogy.
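To make that concrete, here is a minimal sketch of a Dockerfile, the recipe Docker uses to package a program with its dependencies. The file names (`app.py`) and image names used here are hypothetical, chosen only for illustration:

```dockerfile
# Hypothetical example: package a small Python script together with
# just the runtime it needs, and nothing else from the host.
FROM python:3                  # base image supplying the Python runtime
COPY app.py /app/app.py        # the single file this program needs
CMD ["python", "/app/app.py"]  # command run when the container starts
```

Building this recipe with `docker build` produces an image that runs identically on a laptop or a data center node, because the image carries its dependencies with it instead of relying on what happens to be installed on the host.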
As CMSWire writer Virginia Backaitis explained so simply a few months ago, "In non-geek speak, [Docker] is an open platform for distributed applications that makes the lives of developers and sysadmins a lot more pleasurable and easier. It takes away the non-value adding drudgery of your job."