March 04, 2017

Containerization Is The New Virtualization

What is Docker?

Docker is an open-source project that automates the deployment of applications inside software containers.

At a high level, Docker is a container management tool: a utility that efficiently builds, ships, and runs containers.

Software containers are self-contained, immutable execution environments. They do not change as they are promoted through the pipeline or development cycle. Each container has its own compute resources, and all containers share the kernel of the host operating system.

Containers offer an environment as close as possible to that of a virtual machine (VM) without the overhead that comes with running a separate kernel and simulating the hardware. A container can accurately be described as "operating system virtualization": running multiple isolated user-space operating environments (containers) on top of a single kernel.

User-space is that portion of system memory in which user processes (i.e., everything other than the kernel) run. This contrasts with kernel-space, which is that portion of memory in which the kernel executes and provides its services. User processes can access kernel-space via the use of system calls.
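To make the user-space/kernel-space boundary concrete, here is a small, hypothetical shell sketch (nothing Docker-specific; the file path is illustrative). Even a trivial file write from a user process is serviced by the kernel through system calls:

```shell
# User-space processes never touch the disk directly; the shell and
# cat below ask the kernel to do the work via system calls such as
# open(2), write(2), and read(2).
echo "hello from user space" > /tmp/userspace-demo.txt   # open + write
cat /tmp/userspace-demo.txt                              # open + read

# On Linux, strace can make the boundary visible, e.g.:
#   strace -e trace=openat,read,write cat /tmp/userspace-demo.txt
```

Every container on a host crosses this same boundary into the one shared kernel; what differs per container is the user-space view each process gets.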

Docker containers wrap, by default, a single application in an environment that contains everything it needs to run: code, runtime, system tools, system libraries. It does this with minimal duplication of resources and maximal isolation between environments.
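As a sketch of that idea, a minimal Dockerfile bundles one application with its runtime and libraries. The base image, file names, and tag below are hypothetical, chosen only to illustrate the single-application pattern:

```dockerfile
# Small base image that supplies the user-space runtime and libraries.
FROM python:3-slim

# Copy in the single application this container wraps.
COPY app.py /app/app.py

# The container runs exactly one process: the application itself.
CMD ["python", "/app/app.py"]

# Build and run on any Docker-supported host, e.g.:
#   docker build -t myapp .
#   docker run myapp
```

Because the image carries its own user-space dependencies, the same `docker run` behaves the same wherever the image is pulled.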

Containers Transform Applications, Infrastructure and Processes:

Applications: decomposing applications into services that can be developed independently, improving efficiency, agility, and innovation
Infrastructure: unlocking a move away from the traditional data center, through the cloud, to a more flexible hybrid model
Processes: enabling easy adoption of Agile and DevOps practices over the traditional Waterfall model, with the goal of improving flexibility, innovation, and go-to-market speed

From: Why containers - Beginning of the buyer’s journey -- IT Leader audience by Red Hat

Containers provide functionality to both the infrastructure and application:
  • Infrastructure
    • Isolate application processes on a shared OS kernel
    • Create light, dense execution environments
    • Enable portability across platforms
  • Application
    • Create a portable, immutable environment packaged with the application and its dependencies
    • Facilitate continuous integration and continuous delivery (CI/CD)
    • Easy access and sharing of containerized components

The goal of the container is to guarantee that the application will run the same way regardless of the environment. It does this by abstracting away machine-specific settings. With containers, "it works on my laptop" is no longer an excuse for delays in moving to production: if it works on the developer's laptop, it works in production.

A Docker container can be executed on any Docker-supported platform with the guarantee that the execution environment exposed to the application will be the same in development, testing, and production.

Sources:
  • User Space
  • Why containers - Beginning of the buyer’s journey -- IT Leader audience by Red Hat
  • Containers for the Enterprise: A Red Hat Virtual Event
