Docker is a containerization tool developed by Docker Inc. in 2013 that helps you create, deploy, and run containers smoothly. In other words, Docker acts as a wrapper around the software, OS, and dependencies that a particular piece of code needs to work. Even though the concept of containerization existed years earlier, Docker made it easy and straightforward to handle containers efficiently.
Docker was originally built for Linux, but it now works fine on Windows and macOS as well. It is a common scenario for issues to arise between developers and testers when the same code runs in different environments. Most of these issues go away if we use Docker, since it relies on OS-level virtualization, also known as containerization.
Differences between Virtualization and Containerization
Containerization and virtualization are two different technologies, each with its own advantages and disadvantages.
- In virtualization, you can run entirely different operating systems on the same hypervisor. For example, you can run a Windows VM on a Linux hypervisor and vice versa. However, since the VMs share the hardware resources, there is always a limit on the number of VMs you can create, and performance may also vary due to hardware limitations.
- Containers, on the other hand, use the same OS kernel, and applications run on top of the Docker engine. Besides, containers are lightweight and easily portable. A container image can be as small as 50 MB.
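A quick way to see this kernel sharing in practice, assuming Docker is installed and the public `alpine` image is used (both assumptions, not from the original article):

```shell
# Print the host's kernel version
uname -r

# Run the same command inside an Alpine Linux container;
# the reported kernel version matches the host, because the
# container shares the host kernel instead of booting its own OS
docker run --rm alpine uname -r
```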
Why is Docker relevant?
Imagine a developer creating a website in PHP. After completing the code, he wants to test the site. To do so, he needs to run it on a server that has an OS, PHP, and the necessary libraries installed. After getting the desired output, the developer passes the site on to a tester to examine it. The tester also has an OS, PHP, and the necessary libraries installed on his system. But since there are multiple PHP versions, there is a chance the website won't work as intended. On top of that, the production environment can also be different.
Here comes the relevance of Docker. Using Docker, the developer can create a Docker image with the necessary software and libraries, and it will run the same everywhere the Docker platform is installed. In addition, making updates and changes to an existing setup becomes much easier with Docker, as it is only a matter of updating the image and sharing it.
1. Docker File
A Dockerfile is a set of instructions for building a Docker image. It specifies the required OS, environment variables, file locations, ports, and the commands to run when the container starts.
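As a minimal sketch, a Dockerfile for the PHP website from the example above might look like this (the base image, directory layout, and port are illustrative assumptions):

```dockerfile
# Start from an official PHP image that bundles Apache
FROM php:7.4-apache

# Set an environment variable for the application (illustrative)
ENV APP_ENV=production

# Copy the website source into Apache's document root
COPY ./src /var/www/html/

# Document the port the container listens on
EXPOSE 80
```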
2. Docker Image
Once you have written all the instructions in a Dockerfile, it's time to make them portable by turning them into a Docker image; you can use the docker build command to create images. Once the image is created, you can share it with everyone, and they can download and run it.
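For example, building an image from a Dockerfile in the current directory might look like this (the image name and tag are illustrative assumptions):

```shell
# Build an image from the Dockerfile in the current directory
# and give it a name and version tag
docker build -t my-php-site:1.0 .

# List local images to confirm the build succeeded
docker images
```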
3. Docker Run
The run command launches a container. You can run multiple containers at a time and also have the option to start and stop them. Docker identifies each container by a unique name, which you may specify explicitly, and a container ID.
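A sketch of launching, listing, and stopping a container (the container name, image name, and port mapping are illustrative assumptions):

```shell
# Launch a container in the background with an explicit name,
# mapping host port 8080 to container port 80
docker run -d --name my-site -p 8080:80 my-php-site:1.0

# List running containers; the output shows names and container IDs
docker ps

# Stop and restart the container, referring to it by name
docker stop my-site
docker start my-site
```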
4. Docker Hub
Docker Hub is a central repository that helps you find official Docker images, and you can also use it to share your custom images. You upload images to Docker Hub with the docker push command and download them with docker pull.
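Sharing a local image through Docker Hub might look like the following sketch ("yourname" is a placeholder for a Docker Hub account, and the image name is an illustrative assumption):

```shell
# Authenticate with Docker Hub
docker login

# Re-tag the local image under your Docker Hub account namespace
docker tag my-php-site:1.0 yourname/my-php-site:1.0

# Upload the image to Docker Hub
docker push yourname/my-php-site:1.0

# Anyone can then download it with:
docker pull yourname/my-php-site:1.0
```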
5. Docker Engine
The Docker engine is the core on which containers run. It comes in two editions:
- The Community Edition (CE), which is free and open source
- The Enterprise Edition (EE), which is paid but offers advanced features beyond CE, such as cluster and image management
Docker Compose and Docker Swarm
Docker Compose is generally used for defining and running multi-container applications, described in a YAML file. It lets users run commands across multiple containers at once: building images, scaling containers, restarting containers that have stopped, and much more.
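As a sketch, a Compose file for a two-container application might look like this (the service names, images, ports, and password are illustrative assumptions, not from the original article):

```yaml
# docker-compose.yml — defines two services that start together
version: "3"
services:
  web:
    image: php:7.4-apache
    ports:
      - "8080:80"
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example
```

Running `docker-compose up -d` in the same directory starts both containers, and `docker-compose down` stops and removes them.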
Docker Swarm is a container orchestration tool that helps users manage multiple containers deployed across multiple host machines. Docker Swarm clusters the containers, and a swarm manager controls the activities of the machines that have joined the cluster, which are commonly referred to as nodes.
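Setting up a small swarm might look like the following sketch (the token and manager address are placeholders printed by the first command, not values to copy literally):

```shell
# On the manager machine: initialize a swarm; this prints a
# "docker swarm join" command containing a join token
docker swarm init

# On each worker machine: join the cluster using the printed
# token and the manager's address (placeholders shown)
# docker swarm join --token <token> <manager-ip>:2377

# Back on the manager: list all nodes that have joined the swarm
docker node ls
```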
Advantages of Docker
- Docker is easy to install and configure, and highly portable.
- When Continuous Deployment and Testing become the primary concern, Docker plays a vital role in ensuring consistent environments from development to production.
- Docker provides process isolation, where each container is isolated from the others.
- Docker containers offer security: one container does not know what processes are running inside other containers. Each container is segregated, and you can manage them individually.
To recapitulate: according to industry reports, Docker holds an 80% share of containerization technology. Portability, flexibility, and simplicity are the key reasons Docker has been able to generate such strong momentum.
– By Sabu Thomas Vincent