If you’re reading this, you’re browsing a website that relies on Docker to run smoothly. When I built this site, learning Docker was a critical step to ensure it could be deployed effectively using a Continuous Integration/Continuous Deployment (CI/CD) pipeline. In this guide, I’ll explain the basics of Docker and share practical examples, including the Dockerfile and docker-compose.yml files I used.
What is Docker?
At its core, Docker is a platform that enables you to package, ship, and run applications in isolated environments called containers. These containers include everything your application needs: code, runtime, libraries, and dependencies. Containers ensure your app runs consistently regardless of where it’s deployed—whether on your local machine, a server, or in the cloud.
Why Docker Matters
Docker is popular for several reasons:
- Consistency Across Environments: It eliminates the classic "works on my machine" problem.
- Efficiency: Containers are lightweight and start up faster than virtual machines.
- Simplified CI/CD Pipelines: Docker makes it easy to automate build, test, and deployment workflows.
- Portability: Containers can run anywhere—on a laptop, on-premise server, or cloud provider.
How Docker Works
Docker uses the following components:
- Docker Images: Read-only templates that define the contents and environment for a container.
- Docker Containers: Runtime instances of images.
- Dockerfile: A script that specifies how to build a Docker image.
- Docker Compose: A tool to define and manage multi-container applications with a single YAML file.
Here’s how it all comes together: you write a Dockerfile to create an image, and then you use the image to spin up containers. For more complex applications, you use docker-compose.yml to manage multiple containers.
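The image-to-container lifecycle described above boils down to a few commands. A minimal sketch (the `my-app` tag is just an example name):

```shell
# Build an image from the Dockerfile in the current directory,
# tagging it "my-app" (an example name)
docker build -t my-app .

# Start a container from that image
docker run --name my-app-container my-app

# List running containers
docker ps
```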
Example 1: The Dockerfile
A Dockerfile is a script that automates the steps needed to build a Docker image.
Here’s part of the Dockerfile I wrote for this website:
```dockerfile
# Use an official Node.js image as the base
FROM node:20-alpine AS base

# Set the working directory inside the container
WORKDIR /app

# Install the dependencies
FROM base AS deps
RUN apk add --no-cache libc6-compat
COPY package.json package-lock.json ./
RUN npm ci

# Build the application
FROM base AS builder
COPY --from=deps /app/node_modules ./node_modules
COPY . .
RUN npm run build

# Expose the port the app will run on
EXPOSE 3000

# Command to run the app
CMD ["npm", "run", "start"]
```
What’s Happening Here?
- Base Image: The `FROM` instruction pulls a lightweight Node.js image.
- Working Directory: Sets `/app` as the working directory inside the container.
- Install Dependencies: Installs the dependencies listed in `package.json`, using `npm ci` for reproducible installs.
- Build the App: Builds the app to prepare it to run in production.
- Expose Port: Documents that the app listens on port 3000 (the port is actually published with `-p` when the container is run).
- Run the App: The `CMD` instruction specifies the command that starts the app.
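With the Dockerfile in place, building and running the site looks like this (the `my-website` tag is just an example):

```shell
# Build the image defined by the Dockerfile above
docker build -t my-website .

# Run it, publishing container port 3000 on host port 3000
docker run -p 3000:3000 my-website
```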
Example 2: Using Docker Compose
If the website also requires a database, managing multiple containers can get tedious. This is where Docker Compose shines. With a docker-compose.yml file, you can define and orchestrate all the services your app needs.
Here’s an example docker-compose.yml file for a Flask app with a PostgreSQL database:
```yaml
version: "3.9"

services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - .:/app
    environment:
      - DATABASE_URL=postgresql://user:password@db:5432/mydatabase
    depends_on:
      - db

  db:
    image: postgres:14
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mydatabase
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  postgres_data:
```
What’s Happening Here?
- Version: Specifies the Compose file format version (optional in recent versions of Compose).
- Services:
  - `web`:
    - Builds the app using the `Dockerfile` in the current directory.
    - Maps port `3000` on the host to port `3000` in the container.
    - Sets environment variables, including the database connection URL.
    - Depends on the `db` service, ensuring the database starts first.
  - `db`:
    - Uses the official PostgreSQL image.
    - Configures the database credentials and initializes the database.
- Volumes:
  - `postgres_data`: Creates a named volume to persist PostgreSQL data.
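One caveat worth knowing: `depends_on` only waits for the `db` container to start, not for PostgreSQL to accept connections. If the web app needs the database to be ready before it boots, a healthcheck can be added. A sketch, using the `pg_isready` tool shipped with the official Postgres image:

```yaml
services:
  db:
    image: postgres:14
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U user -d mydatabase"]
      interval: 5s
      timeout: 3s
      retries: 5

  web:
    build: .
    depends_on:
      db:
        condition: service_healthy
```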
Running the Application
To start everything, simply run:
```shell
docker-compose up
```
This command builds and starts all the services defined in the docker-compose.yml file.
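A few more Compose commands come in handy for day-to-day work with the same docker-compose.yml:

```shell
# Start everything in the background
docker-compose up -d

# Follow logs from all services
docker-compose logs -f

# Stop and remove the containers (named volumes are kept)
docker-compose down
```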
How Docker Helps My CI/CD Pipeline
Once the site was containerized, I integrated Docker with my CI/CD tool. Here’s an example workflow using GitHub Actions:
.github/workflows/deploy.yml
```yaml
name: Deploy Website

on:
  push:
    branches:
      - main

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Set up Docker
        uses: docker/setup-buildx-action@v2

      - name: Build Docker image
        run: docker build -t my-website:latest .

      - name: Push Docker image to registry
        run: |
          docker tag my-website:latest myregistry/my-website:latest
          docker push myregistry/my-website:latest

      - name: Deploy to server
        run: |
          docker pull myregistry/my-website:latest
          docker compose up -d
          docker system prune --all --volumes --force
```
Key Steps
- Build: The workflow builds a Docker image for the site.
- Push: The image is pushed to a container registry (e.g., Docker Hub).
- Deploy: The image is pulled onto the server, replacing the old container with the updated one.
Conclusion
Docker has revolutionized how applications are developed, deployed, and maintained. It allowed me to build this website, integrate it into a CI/CD pipeline, and deploy it seamlessly. Whether you're running a simple static site or a complex multi-service app, Docker simplifies workflows and ensures consistency.