Microservices in containerised environments


As containerisation technologies and microservices grew in popularity over the last few years, it was an almost inevitable progression for organisations to start distributing (micro)services in some sort of standardised container format. For the most part Docker filled this role, thanks to its relative simplicity and open-source nature.
It's important to remember that Docker isn't the only container technology out there; rkt and Podman are among the better-known alternatives. It's also worth noting that containers aren't a new concept at all. The underlying ideas have been around for a long while: the necessary building blocks were present in the Linux kernel as early as 2008, when LXC offered one of the first all-round container engines. And whilst Docker development started around 2013, it didn't become truly popular for another four to five years.


Docker is a containerization platform that allows developers to package their applications and dependencies into a single, portable container. This container can then be run on any system that has Docker installed, making it easy to move applications between development, testing, and production environments.
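As a minimal illustration of that packaging step, a hypothetical Node.js service might be described with a Dockerfile like the one below (the base image, file names and port are placeholders for the example, not a prescription):

```dockerfile
# Start from an official base image that already contains the runtime
FROM node:18-alpine

# Copy the application and install its dependencies inside the image
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .

# Document the port the service listens on and define the start command
EXPOSE 3000
CMD ["node", "server.js"]
```

Building this with `docker build -t my-service .` and running it with `docker run -p 3000:3000 my-service` reproduces the same environment on any machine that has Docker installed.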
Kubernetes, on the other hand, is a container orchestration platform. This means that it allows for the management and scaling of multiple containers, making it easy to deploy and manage large, complex microservice architectures.
One key difference between the two is that while Docker is focused on the individual container, Kubernetes is focused on the overall system and how all the different containers interact with each other. This means that while Docker is great for packaging and running individual applications, Kubernetes is better suited for managing and scaling a large, complex system of multiple interconnected microservices.
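To make the orchestration side concrete, here is a sketch of a Kubernetes Deployment manifest that keeps three replicas of a container running (all names and the image reference are illustrative placeholders):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-service
spec:
  replicas: 3            # Kubernetes keeps three copies running at all times
  selector:
    matchLabels:
      app: example-service
  template:
    metadata:
      labels:
        app: example-service
    spec:
      containers:
        - name: example-service
          image: registry.example.com/example-service:1.0.0
          ports:
            - containerPort: 3000
```

Applied with `kubectl apply -f deployment.yaml`, Kubernetes schedules the replicas across the cluster and replaces any that fail, which is exactly the system-level concern that Docker on its own doesn't address.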
Although less popular than Kubernetes, Docker Swarm is native clustering for Docker, which means it is tightly integrated with the Docker ecosystem and can be used to create and manage clusters of Docker engines. It is relatively easy to use and set up, making it a good choice for small- to medium-sized deployments.
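For comparison, Swarm reuses the familiar Compose file format. A hypothetical stack definition might look like this (image name and values are assumptions for the example):

```yaml
# docker-stack.yml, deployed with: docker stack deploy -c docker-stack.yml mystack
version: "3.8"
services:
  web:
    image: example/web:latest   # placeholder image name
    ports:
      - "80:3000"
    deploy:
      replicas: 3               # Swarm schedules three tasks across the cluster
      restart_policy:
        condition: on-failure
```

Because it's just a Compose file plus a `deploy` section, teams already using Docker Compose locally can move to a Swarm cluster with very little extra tooling.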

What's the appeal

Delivering Docker images is like having your own personal pizza delivery service, but instead of piping hot pies, you get fully-configured application environments that you can share with anyone, anywhere, anytime.
Developers love Docker because it allows them to package up their code, dependencies, and configurations into portable images that can run consistently across different environments, from development to production. With Docker, you don't have to worry about setting up complex infrastructure, managing dependencies, or dealing with compatibility issues. You can simply focus on writing code, building images, and shipping them out to the world.
Docker images also make collaboration and sharing a breeze. Whether you're working on a team or contributing to an open-source project, you can easily share your images with others, who can then run them on their own machines without any setup hassles. This means you can work faster, iterate more quickly, and get feedback from others more easily.
And let's not forget about the benefits of containerization itself. With Docker, you can isolate your applications from the underlying host system, which provides enhanced security, reliability, and scalability. You can also run multiple containers on the same host, each with its own resources and network stack, which allows you to optimize resource usage and reduce costs.
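That resource isolation can also be controlled explicitly per container. As a sketch (the service name and limit values are arbitrary), a Compose file fragment capping a service's CPU and memory might look like:

```yaml
# Compose file fragment: capping a service's resources (values are illustrative)
services:
  api:
    image: example/api:latest
    mem_limit: 256m      # container is killed if it exceeds this memory
    cpus: "0.5"          # at most half of one CPU core
```

Limits like these are what let several containers share one host predictably instead of competing for resources.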

So what's the catch

Complexity: Docker introduces a layer of complexity to the development process, requiring developers to learn new concepts and tools like Dockerfiles, containers, and registries.
Size: Docker images can be large, which can impact storage and transfer times. This is especially true for images that include large dependencies or binary files.
Security: While Docker provides some security features like container isolation and user namespaces, misconfigured or vulnerable images can still pose a security risk.
Compatibility: While Docker images are designed to be portable, they may not work as expected across different platforms or versions of Docker.
Maintenance: Docker images need to be updated regularly to ensure that they are running the latest security patches and dependencies. This can require ongoing maintenance and monitoring.
Learning Curve: There is a learning curve to working with Docker, especially for developers who are not familiar with containerization concepts.
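Some of these drawbacks can be mitigated. Image size, for instance, is commonly reduced with a multi-stage build, where the build toolchain is discarded and only the compiled artefact is copied into a slim runtime image (the Go program and paths here are placeholders):

```dockerfile
# Stage 1: full toolchain, used only to compile the binary
FROM golang:1.21 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /out/app .

# Stage 2: minimal runtime image; the toolchain above is not shipped
FROM alpine:3.19
COPY --from=build /out/app /usr/local/bin/app
CMD ["app"]
```

The final image contains only the binary and a small base layer, which can shrink a gigabyte-scale build image down to tens of megabytes.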


For the right use case, Docker in combination with Kubernetes or Docker Swarm is definitely the answer for organisations both small and large: it can speed up delivery and standardise tooling across the board. With proper attention it also enhances security and brings better overall consistency to your organisation's delivery methods.
If this is something that you or your organisation needs help with please reach out to us on the Contact page.