A few words about containers
What is a container?
In a few words: probably one of the most practical tools invented in software development!
In a nutshell, it is an isolated space on your machine that contains a virtual system (more precisely, a lightweight, portable environment) based on a system image (an OS such as Ubuntu, for example). Additionally, it can encapsulate your application and its dependencies, allowing it to run consistently across any computing environment of your choice. For example, if you have a Windows machine but need to test your code on a specific Linux distribution, you can create a container with that exact specification and run your code inside it, without needing another machine.
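As a quick illustration, here is a minimal sketch of that scenario, assuming Docker is already installed on the host machine:

```bash
# Download the Ubuntu 22.04 image and start an interactive container.
# -i keeps STDIN open, -t allocates a terminal, --rm removes the
# container automatically when you exit.
docker run -it --rm ubuntu:22.04 bash

# Inside the container you now have an isolated Ubuntu environment,
# regardless of the host operating system. For example:
#   cat /etc/os-release
```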
A really good way to understand the concept of containers is to watch Liz Rice's talk:
Now that we grasp the core concept, let's take a look at one of the most popular containerization platforms: Docker.
Docker, a whale that makes a big difference:
Docker allows you to create, configure, and manage containers, making it easier to develop, test, and deploy applications across various environments, from development laptops to production servers.
To understand how Docker works, we need to define a few concepts:
A container is an instance of an image that can be executed in isolation. Containers share the host operating system's kernel but have their own isolated file systems and processes.
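To make the distinction concrete, here is a small sketch of the basic container lifecycle (the nginx image is just an arbitrary example):

```bash
# Start a container in the background (-d) from the nginx image,
# giving it a name so we can refer to it later.
docker run -d --name web nginx

# List running containers: 'web' is one isolated instance of the image.
docker ps

# Stop and remove the container; the image itself remains untouched.
docker stop web
docker rm web
```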
An image is a standalone package that contains everything needed to run an application, including the code, runtime, libraries, and system tools. It's a snapshot of a file system and configuration settings.
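For example, you can pull an image from a registry and inspect it without ever starting a container (a small sketch using the public ubuntu image):

```bash
# Download the image from Docker Hub without running it.
docker pull ubuntu:22.04

# List the images stored locally, with their tags and sizes.
docker images

# Show the layers that make up the image (each Dockerfile
# instruction typically produces one layer).
docker history ubuntu:22.04
```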
A Dockerfile is a text file that contains instructions for building a Docker image. It specifies the base image, application code, dependencies, and configurations needed for the container.
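Here is a minimal example of what such a file can look like; the app.py script and the requirements.txt file are hypothetical placeholders for your own application:

```dockerfile
# Base image: a slim Python runtime pulled from Docker Hub.
FROM python:3.11-slim

# Set the working directory inside the image.
WORKDIR /app

# Install dependencies first, so this layer stays cached
# as long as requirements.txt does not change.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code.
COPY app.py .

# Command executed when a container is started from this image.
CMD ["python", "app.py"]
```

You would then build an image from it with `docker build -t my-app .` and start a container with `docker run my-app` (my-app being a tag name of your choice).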
Container orchestration tools, like Kubernetes, help manage the deployment, scaling, and management of containers across clusters of machines. To orchestrate several containers on a single host, you can use a Docker Compose file as well.
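As an illustration, here is a minimal docker-compose.yml sketch describing two containers (a hypothetical web application and a Redis cache) that Docker Compose starts together with a single `docker compose up` command:

```yaml
# docker-compose.yml: a minimal two-service sketch.
services:
  web:
    build: .          # build the image from the Dockerfile in this folder
    ports:
      - "8000:8000"   # map host port 8000 to container port 8000
    depends_on:
      - cache         # start the cache service first
  cache:
    image: redis:7    # use the official Redis image from Docker Hub
```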
Benefits of using Docker include:
Consistency: Containers ensure that the application behaves the same way across different environments, reducing "it works on my machine" issues.
Isolation: Containers provide isolation from the underlying host system and from other containers, preventing conflicts between dependencies.
Portability: Containers can be easily moved between different environments, whether it's from a developer's machine to a testing environment or from an on-premises server to the cloud.
Resource Efficiency: Containers share the host operating system's kernel, which makes them lightweight and resource-efficient compared to running multiple virtual machines.
Rapid Deployment: Containers can be spun up and torn down quickly, enabling rapid development, testing, and deployment cycles.
Scalability: Containers can be easily scaled horizontally to handle varying levels of load, ensuring optimal resource utilization.
Version Control: Docker images can be versioned, making it easy to roll back to a previous version if needed (cf. Docker Hub; see the tagging sketch after this list).
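To illustrate that last point, here is a minimal sketch of image versioning with tags (my-app and the my-registry account are hypothetical names):

```bash
# Build the image twice, tagging each build with a version.
docker build -t my-app:1.0 .
docker build -t my-app:1.1 .

# Push a versioned tag to a registry such as Docker Hub
# (assumes you are logged in with 'docker login').
docker tag my-app:1.1 my-registry/my-app:1.1
docker push my-registry/my-app:1.1

# Rolling back is just running the older tag.
docker run my-app:1.0
```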
Containers have revolutionized application development by addressing many of the challenges associated with compatibility, consistency, and infrastructure flexibility. They have become a foundational technology in modern software development and deployment practices, which is why we need to know how to use them.
How to master Docker?
A good way to learn how to use Docker is (spoiler alert) to go through their tutorial documentation (start here). You can find their cheat sheet here. Additionally, you can take a look at my reminder repository, which condenses the Docker tutorial:
To go further, you can take the official Docker certification here.
© Pierre Congiu Consulting - 2023