Microservices are winning in popularity. While users are accustomed to classic, monolithic applications, these are a nightmare for developers every time they need to change the codebase and update hundreds of software copies running on hundreds of servers. They have to keep in mind server configurations, various interdependencies, and many other things that might block a successful build.
Can developers cope with it? Sure, they can—and do.
This is why we interviewed our leading web developer, Artem, and asked him to tell us why his team opted for Docker and how it helped them streamline their apps with less effort.
Q.: Hi, Artem! Thanks for your time. Can you explain to us: What is Docker?
Docker is an open-source project that automates app deployment and management at the operating-system level, e.g., using Linux Containers (LXC). It allows packing a single piece of software with everything it needs to run (codebase, system libraries, system tools, runtime) into a container. These containers can be ported onto any Linux system whose kernel supports cgroups. Docker provides a layer for container management.
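As a quick illustration, running such a packaged container takes a single command. This is a sketch, assuming Docker is installed; `nginx` is just an example image from a public registry:

```shell
# Pull a packaged image (app + libraries + runtime) and
# run it as an isolated container, mapping port 8080 on
# the host to port 80 inside the container.
docker pull nginx
docker run -d --name web -p 8080:80 nginx
```

The same image runs identically on any Linux host with Docker, which is the whole point of the packaging.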
Q.: When do you need virtualization? What are the benefits of using Docker?
Here is an important note. Until now, the industry used virtual machines to run apps. Virtual machines start apps inside a guest OS that works on virtual hardware inside the main OS of the physical server. Virtualization is the best tool when you need to isolate your processes: almost no issues of the main OS can affect the software of the guest OS and vice versa. You do not have to start an epic battle to restore/port your environment settings each time you port your app from one server to another. All you need to do is start a virtual machine—and voila, your app is already deployed.
Certainly, such virtual machines are not without their drawbacks. The guest OS needs virtual hardware, which, in turn, needs virtualization. Therefore, the processing load increases substantially. Also, new app versions have relatively long deployment times because restarting a virtual machine is time-consuming. Needless to say, a virtual machine requires a lot of disk space.
But here is the good news: Docker is not a virtual machine!
Docker, or more precisely, containers, use a different approach. Although they provide an isolation level similar to virtual machines, they use the low-level technologies of the Linux OS more effectively, resulting in significantly less server load.
As a result, apps use the main operating system's kernel but run in a fully isolated environment.
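You can see the shared kernel for yourself. A small sketch, assuming Docker is installed and the `alpine` image is used as an example:

```shell
# Print the kernel version on the host, then inside a container.
# Both commands report the same kernel, because the container
# runs on the host kernel rather than inside a guest OS.
uname -r
docker run --rm alpine uname -r
```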
Q.: What changed for you when you started using Docker?
Firstly, Docker is convenient. Developers do not have to think about how to port and tune a new environment each time to make it as close to the production server as possible.
Without it, porting can consume a lot of developer time spent studying the architecture of the production server and reproducing it on localhost.
Secondly, development teams might have Windows as their main OS. Also, large teams might work simultaneously on several projects, and some of those projects might require incompatible technologies. Docker provides a solution in this case.
Another important feature is the Dockerfile: the file containing the list of instructions used to build the environment. You can open it at any time and see exactly which packages have been installed for the current version of the environment.
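A minimal sketch of such a file; the base image, packages, and app paths here are illustrative assumptions, not a specific project's setup:

```dockerfile
# Start from a known, versioned base image.
FROM ubuntu:22.04

# Every installed package is recorded here, in plain text.
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip \
    && rm -rf /var/lib/apt/lists/*

# Copy the application code into the image.
COPY . /app
WORKDIR /app

# The command the container executes on start.
CMD ["python3", "app.py"]
```

Reading this file top to bottom tells you everything the environment contains.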
Also, we can save each of the created containers to the public Docker Hub or to our own registry. This makes it possible to version the environment, or even the whole app together with its environment. Docker images work much like Git repositories: developers commit changes, control versions, and can see whether anyone changed the app or its environment and what kind of changes those were. If a component upgrade breaks the image, it is very easy to roll back to a previous version.
We get a hub (similar to a Git repo) with all versions of our ready-to-use systems, so we can deploy any of those versions at any time on any platform. Cool, isn’t it?
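In practice, versioning and rollback look roughly like this. The image name `myteam/app` and the tags are assumptions for illustration:

```shell
# Build and publish a new version of the image.
docker build -t myteam/app:1.3 .
docker push myteam/app:1.3

# If 1.3 turns out to be broken, roll back by simply
# redeploying the previous tag from the registry.
docker run -d --name app myteam/app:1.2
```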
Worth mentioning: Docker is well suited to continuous integration as a code-delivery and environment-configuration tool.
To sum up, with Docker we have:
- Quick prototyping of ideas
- Day-to-day development
- Collaboration and distribution
- Continuous integration
- Continuous delivery
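A typical CI step built around Docker might look like this sketch; the registry URL, tag variable, and test command are assumptions:

```shell
# Build the image, run the test suite inside it, then
# publish the exact environment that was tested.
docker build -t registry.example.com/app:$BUILD_ID .
docker run --rm registry.example.com/app:$BUILD_ID ./run-tests.sh
docker push registry.example.com/app:$BUILD_ID
```

Because the tested image is the deployed image, "works in CI" and "works in production" become the same claim.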
Q.: What project types need Docker? What types do not?
Docker is no silver bullet.
Before you start using any new technology, you should answer this one question: Does it help me? It is very possible that you are already using something that solves your problems and you are quite happy about it.
Q.: Can you tell us any success story?
Here is one about a project that had no documentation of its environment. There were issues with different packages being used for the same code on one server. Deployment sometimes took too long (especially when it included environment reconfiguration), and there was no version control of the environment. We also had to constantly maintain full isolation of the running code so as not to interrupt the rest of the server. And, of course, we had issues with porting to another server.
All those issues went away when we began using Docker.