Docker enables developers to pack, ship, and run any application as a lightweight, portable, self-sufficient container that can run virtually anywhere. Containers give you instant application portability.
Containers, however, share the host operating system's kernel. This makes them far more efficient than hypervisors in their use of system resources. Instead of virtualizing hardware, containers sit comfortably on top of a single Linux instance, which lets you strip away the bulk of a full virtual machine and ship a small, neat package containing only what your application actually requires.
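To make the "small, neat package" idea concrete, here is a minimal sketch of packaging and running a service with Docker. The Dockerfile contents, the `app.py` filename, and the image name are illustrative assumptions, not details from this article:

```shell
# Sketch: package an app and its dependencies into a small image, then run it.
# Assumes a Dockerfile like the following sits in the current directory:
#
#   FROM python:3.12-slim             # slim base image, not a full OS install
#   WORKDIR /app
#   COPY requirements.txt .
#   RUN pip install --no-cache-dir -r requirements.txt
#   COPY app.py .
#   CMD ["python", "app.py"]
#
docker build -t myapp:1.0 .            # build the self-sufficient image
docker run -d -p 8080:8080 myapp:1.0   # run it anywhere Docker is installed
```

The same image runs unchanged on a laptop, a CI server, or a cloud host, which is the portability the paragraph above describes.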
Therefore, on well-tuned hardware, you can run almost five times as many server applications smoothly in containers as you can using separate virtual machines on the same machine.
Another reason containers are popular is that they fit naturally into Continuous Integration/Continuous Deployment (CI/CD). This is a DevOps process created to encourage developers to integrate their code into a shared repository early and often, and then to deploy that code quickly and efficiently.
With a tool like Jenkins, you can create a pipeline that builds your application code, builds a Docker image, pushes the image to any Docker registry, and deploys it to any Docker environment.
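The pipeline stages just described can be sketched as the shell commands such a pipeline would run. The registry hostname, image name, tag, and deployment host here are hypothetical placeholders:

```shell
# Sketch of the build -> image -> push -> deploy stages a CI/CD pipeline runs.
# registry.example.com, myapp, the tag, and prod.example.com are placeholders.
./gradlew build                                   # 1. build the application code
docker build -t registry.example.com/myapp:42 .   # 2. build a Docker image
docker push registry.example.com/myapp:42         # 3. push to a Docker registry
ssh deploy@prod.example.com \
  "docker pull registry.example.com/myapp:42 && \
   docker run -d --name myapp -p 80:8080 \
     registry.example.com/myapp:42"               # 4. deploy the image
```

In Jenkins these steps would typically live as stages in a Jenkinsfile, with the image tag derived from the build number so every commit produces a traceable, deployable artifact.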
In addition, Docker containers are easy to deploy in the cloud, and with the advancement of serverless technologies and services like AWS Fargate, deploying containers is cheaper than ever because you pay only for the time your containers are actually running.
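As a rough illustration of that serverless model, launching a container on Fargate with the AWS CLI looks something like the sketch below. The cluster name, task definition, subnet, and security group IDs are hypothetical, and the task definition must already be registered with ECS:

```shell
# Sketch: run a container on AWS Fargate via the AWS CLI.
# my-cluster, myapp:1, subnet-0abc, and sg-0abc are placeholder identifiers.
aws ecs run-task \
  --cluster my-cluster \
  --launch-type FARGATE \
  --task-definition myapp:1 \
  --network-configuration \
    "awsvpcConfiguration={subnets=[subnet-0abc],securityGroups=[sg-0abc],assignPublicIp=ENABLED}"
# Billing stops when the task stops, so you pay only while the container runs.
```

There is no EC2 instance to provision or keep idle; Fargate allocates compute for the task on demand, which is where the cost saving comes from.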