What is Docker? An Introduction to Containers

Photo by Ian Taylor on Unsplash

Software Delivery without Containers

Software delivery is the process that spans from conceptualizing an application idea to developing it and deploying it to the market. Over the years, software delivery processes have changed as new technologies have emerged.

Traditionally, software delivery relied on manually configuring your software stack and services on your local environment or server, and the configuration process would differ from one operating system to another.

Imagine you’re a developer working on a complex web application with a team. Your routine would be to write code, run tests on your local machine, and deploy to a server.

Now here are some of the challenges associated with this method:

  1. Inconsistent Environments: You’ve probably faced a scenario where your code works well in your development environment but breaks when it is deployed to production. Differences in operating systems and server configurations can lead to such issues.

  2. Dependency Management: Manually installing and configuring dependencies can lead to version conflicts, especially when working with a team. Different operating systems may require different dependency versions and configurations.

  3. Scalability Issues: As more users adopt your product, you will need to increase its capacity to handle more traffic and demand. With this method, however, that means manual intervention: provisioning new servers and configuring each of them by hand.

  4. Pushing New Updates: Updating the application would be a source of concern because of unforeseen bugs, code conflicts, and backward-compatibility issues for users who haven’t updated.

  5. Isolation Issues: This occurs when multiple applications or processes run on the same server without isolation, leading to mysterious bugs. These bugs arise from contention for system resources and from dependency conflicts.

Containers to the Rescue

Intermodal shipping containers, like the ones pictured above, are used to transport and deliver goods to destinations around the globe. For example, a car produced in Italy can be shipped to the United Kingdom in such a container and still function optimally regardless of the environment. Software containers work similarly: they are used to ship software from one environment to another.

Containers package software code with all the tools and services it needs to run in any environment, whether that’s a public cloud, a private data center, or your personal computer. This makes software development and deployment fast and efficient.

For instance, when working on a web application, you can run your core code base in one container and databases like PostgreSQL or MongoDB in separate containers. You don’t need to install or configure any of these tools manually; a single command is enough to start them, as the sketch below shows.
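
As a minimal sketch (assuming Docker is already installed, and using placeholder names and credentials), this is all it takes to get a PostgreSQL database running:

    # Pull the official PostgreSQL image and start it as a container in the background.
    # POSTGRES_PASSWORD is required by the image; the value here is a placeholder.
    docker run --name my-postgres -e POSTGRES_PASSWORD=secret -d -p 5432:5432 postgres

    # The database is now reachable on localhost:5432 -- no manual installation needed.
    # Stop it again with: docker stop my-postgres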

The idea of containers and virtualization can be traced back to Unix Version 7 and its chroot system, introduced in 1979. The isolation that chroot provided paved the way for the evolution of container technology, from FreeBSD jails to Solaris Containers and finally to Linux Containers (LXC) in 2008. LXC was the most stable container technology at the time, and newer technologies such as Warden and Docker were built upon it due to its reliability.

What is Docker?

Docker is an open-source technology that allows you to build, test, and deploy applications in isolated environments called containers by virtualizing the application layer of an operating system. The name also refers to the suite of tools and components around containers, maintained by Docker, Inc. and the open-source community.

The term "container" is frequently used to refer to Docker due to its current level of adoption. However, there are still alternative containerization choices available, such as Linux containers. So why use Docker?

Why use Docker?

Docker and other containerization technologies share the same fundamental concept of creating an isolated environment for applications. However, they differ in a few aspects, including community support, architecture, and their suites of tools and components.

Here are some reasons why you should use Docker:

  1. Container Portability: From the creation of the chroot system in 1979 until recently, container technologies were anything but portable. For instance, LXC containers often require configuration for specific environments due to kernel dependencies. Docker containers, on the other hand, can run across different environments without modification.

  2. Automated Container Building: This feature lets you automatically build a container image whenever you push changes to your application’s code repository (see the workflow sketch after this list).

  3. User Experience: Docker provides a range of tools, including the Docker CLI and the Docker Desktop GUI, along with various components and features that make management and usage easier.

  4. Ecosystem and Orchestration: Docker is a cross-platform technology, compatible with Windows, macOS, Linux, and other operating systems. It also has strong support from container management and orchestration tools such as Kubernetes and Docker Swarm.

  5. Strong Community Support: As of the time of writing, Docker is the leading container technology, with more than 4 million developers using Docker Desktop, over 3,000 contributors, and more than 105 billion container image downloads.
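
To make point 2 above concrete, here is a minimal sketch of an automated image build using GitHub Actions; the workflow file path, trigger branch, and image tag are assumptions for illustration:

    # .github/workflows/docker-build.yml -- rebuild the image on every push to main
    name: docker-build
    on:
      push:
        branches: [main]
    jobs:
      build:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          # Builds the image from the Dockerfile in the repository root.
          - uses: docker/build-push-action@v5
            with:
              tags: myapp:latest   # hypothetical image name
              push: false          # set to true (after a registry login) to publish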

How does Docker work?

Docker works by virtualizing the host operating system. It is often compared to a virtual machine because both virtualize a layer of the OS; however, there are fundamental differences between the two. To understand how Docker works, let’s look at these differences and at how an operating system relates to virtualization.

An operating system has two main layers:

Image: the kernel and application layers of an operating system (from The Great Learning Blog)

  1. Application layer: This is where applications are installed and run.

  2. OS kernel layer: This is the core of every operating system. It is responsible for interacting with the hardware and allocating system resources to the software that runs on the application layer.

Image: containers vs. virtual machines (from Wikimedia)

The main difference between virtual machines and Docker is that the former virtualizes both the OS kernel layer and the application layer, while the latter virtualizes only the application layer. This makes Docker images occupy less disk space and start faster than virtual machines, which require more space and take longer to boot.

Unlike virtual machines, however, Docker’s approach comes with a compatibility constraint. For instance, you cannot natively run a Linux image on a Windows or macOS host, because Linux images require a Linux kernel to function properly.
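
You can observe this kernel sharing directly. The following sketch (assuming Docker is installed) starts a throwaway Alpine Linux container and prints the kernel version it sees: the host’s Linux kernel (or, on Docker Desktop, the kernel of its bundled Linux virtual machine), not one belonging to the container itself.

    # Run a disposable Alpine container and print the kernel version it is using.
    # Containers have no kernel of their own; this reports the host's Linux kernel.
    docker run --rm alpine uname -r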

The reason is that Docker was initially developed only for Linux. To bring Docker to other operating systems, a solution known as Docker Desktop was introduced. This application uses a hypervisor, which is software that creates and runs virtual machines, together with a lightweight Linux distribution to provide the Linux kernel that containers need.

Docker components and tools

Some of the tools you’ll encounter while using Docker include:

  1. Dockerfile: This is a text file that contains the command-line instructions for building a Docker image (see the example after this list).

  2. Docker Image: A Docker image contains your application's source code and all the tools and dependencies it needs to run as a container. Developers usually pull their images from registries like Docker Hub or build them from scratch.

  3. Docker Container: This is a running instance of an image.

  4. Docker Daemon: This is the background service that builds images, runs and monitors containers, and handles communication between containers.

  5. Docker Registry: This is like a huge online library that stores Docker images and makes them available to anyone. Developers and organizations can also upload their own images and keep them private.
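
To tie these components together, here is a minimal Dockerfile sketch for a hypothetical Node.js app (the file names, port, and image tag are assumptions), along with the commands that turn it into an image and then a container:

    # Dockerfile -- instructions for building an image
    FROM node:20-alpine           # base image pulled from a registry
    WORKDIR /app
    COPY package*.json ./
    RUN npm install               # bake dependencies into the image
    COPY . .
    EXPOSE 3000                   # port the app listens on (assumed)
    CMD ["node", "server.js"]     # hypothetical entry point

    # Build the image, then run it as a container (both handled by the Docker daemon):
    docker build -t myapp:1.0 .
    docker run -d -p 3000:3000 myapp:1.0

From here, docker push could publish the image to a registry such as Docker Hub (after tagging it with your registry username) so that others can pull it.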

How does Docker fit into software development?

Now that you understand how Docker works, let’s get into its practical application in software development. When you include Docker in your development workflow, the process looks like this:

  1. Conceptualization: You decide what you want to build, the tech stack of your choice, and the features of your software.

  2. Setting Up Your Development Environment: You set up a directory containing your application’s files in your preferred editor (VS Code, Sublime Text). You also need to install Docker Desktop on your machine.

  3. Dockerizing Your Application: You create a Dockerfile for each component of your application (frontend and backend). Each Dockerfile contains the instructions for building a Docker image, including steps that copy your application code and dependencies into the image.

    With your Dockerfiles in place, you can also use tools like Docker Compose, which helps you manage multi-container applications and your development environment. In simple terms, Docker Compose manages the different containers that make up your application and ensures they are properly linked together (frontend, backend, and database services); see the sketch after this list.

  4. Code and Debug: At this stage, you write the code for your application and run it with Docker Compose.

  5. Version Control: This is where Git comes in; you can use any repository host of your choice (GitHub, GitLab, Bitbucket) for collaboration and code storage.

  6. Deployment: At this stage, you deploy your finished application to a production environment. A small-scale application will most likely run on a single server, while a large-scale application would use multiple servers for scalability, or managed cloud services from providers like AWS and Azure.
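
As a sketch of the Docker Compose setup from step 3, here is a hypothetical docker-compose.yml wiring a frontend, a backend, and a PostgreSQL database together; the service names, ports, and password are illustrative:

    # docker-compose.yml -- defines and links the application's containers
    services:
      frontend:
        build: ./frontend          # built from the frontend's Dockerfile
        ports:
          - "3000:3000"
      backend:
        build: ./backend           # built from the backend's Dockerfile
        ports:
          - "8000:8000"
        depends_on:
          - db                     # start the database container first
      db:
        image: postgres:16
        environment:
          POSTGRES_PASSWORD: example   # placeholder credential

A single docker compose up then builds and starts all three containers together, which is what the "Code and Debug" step above relies on.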

Conclusion

Docker is no doubt a valuable tool; it has streamlined the process of developing and deploying software. It has also become an essential tool in the DevOps field alongside other CI/CD tools (tools used to automate repetitive tasks and speed up the software development process).

To learn more about containers and Docker, check out the following resources:

What is Docker by IBM

Understanding Docker Containers by Lucero del Alba

A Docker Tutorial for Beginners