Understanding the basics of Docker
Written by Thomas Béchu
November 26, 2024

If you’re reading this, you’re browsing a website that relies on Docker to run smoothly. When I built this site, learning Docker was a critical step to ensure it could be deployed effectively using a Continuous Integration/Continuous Deployment (CI/CD) pipeline. In this guide, I’ll explain the basics of Docker and share practical examples, including the Dockerfile and docker-compose.yml files I used.


What is Docker?

At its core, Docker is a platform that enables you to package, ship, and run applications in isolated environments called containers. These containers include everything your application needs: code, runtime, libraries, and dependencies. Containers ensure your app runs consistently regardless of where it’s deployed—whether on your local machine, a server, or in the cloud.
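If Docker is installed locally, you can see this in action with the official hello-world image, which simply prints a greeting from inside a container:

docker run hello-world

Docker pulls the image from Docker Hub if it isn't already cached, starts a container from it, and the container exits once the message is printed.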


Why Docker Matters

Docker is popular for several reasons:

  1. Consistency Across Environments: It eliminates the classic "works on my machine" problem.
  2. Efficiency: Containers are lightweight and start up faster than virtual machines.
  3. Simplified CI/CD Pipelines: Docker makes it easy to automate build, test, and deployment workflows.
  4. Portability: Containers can run anywhere—on a laptop, on-premise server, or cloud provider.

How Docker Works

Docker uses the following components:

  • Docker Images: Read-only templates that define the contents and environment for a container.
  • Docker Containers: Runtime instances of images.
  • Dockerfile: A script that specifies how to build a Docker image.
  • Docker Compose: A tool to define and manage multi-container applications with a single YAML file.

Here’s how it all comes together: you write a Dockerfile to create an image, and then you use the image to spin up containers. For more complex applications, you use docker-compose.yml to manage multiple containers.
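Assuming a Dockerfile exists in the current directory, that basic lifecycle looks like this (my-app is a placeholder image name):

# Build an image from the Dockerfile in the current directory
docker build -t my-app .

# Start a container from that image, mapping host port 3000 to container port 3000
docker run -p 3000:3000 my-app

# List running containers
docker ps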


Example 1: The Dockerfile

A Dockerfile is a script that automates the steps needed to build a Docker image.

Here’s a part of the Dockerfile I made for this website.

# Use an official Node.js image as the base
FROM node:20-alpine AS base

# Set the working directory inside the container
WORKDIR /app

# Install the dependencies
FROM base AS deps
RUN apk add --no-cache libc6-compat
COPY package.json package-lock.json ./
RUN npm ci

# Build the application
FROM base AS builder
COPY --from=deps /app/node_modules ./node_modules
COPY . .
RUN npm run build

# Expose the port the app will run on
EXPOSE 3000

# Command to run the app
CMD ["npm", "run", "start"]

What’s Happening Here?

  1. Base Image: The FROM instruction pulls a lightweight, Alpine-based Node.js image.
  2. Working Directory: Sets /app as the working directory inside the container.
  3. Install Dependencies: Copies the package manifests and runs npm ci for a clean, reproducible install.
  4. Build the App: Copies the source code and installed dependencies, then builds a production bundle with npm run build.
  5. Expose Port: Documents that the app listens on port 3000.
  6. Run the App: The CMD instruction specifies the command that starts the app.
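Since the snippet above is only part of the full file, the final EXPOSE and CMD instructions end up in the builder stage. In a complete multi-stage build, you would typically add a dedicated production stage that copies only the build output from the builder, keeping the final image small. Here's a sketch (the copied paths assume a Next.js-style .next output directory — adjust them to your framework's build output):

# Production stage: copy only what is needed to run the app
FROM base AS runner
ENV NODE_ENV=production
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./package.json
COPY --from=builder /app/.next ./.next

EXPOSE 3000
CMD ["npm", "run", "start"]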

Example 2: Using Docker Compose

If the website also requires a database, managing multiple containers can get tedious. This is where Docker Compose shines. With a docker-compose.yml file, you can define and orchestrate all the services your app needs.

Here’s an example docker-compose.yml file for a web app with a PostgreSQL database:

version: "3.9"
services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - .:/app
    environment:
      - DATABASE_URL=postgresql://user:password@db:5432/mydatabase
    depends_on:
      - db

  db:
    image: postgres:14
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mydatabase
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  postgres_data:

What’s Happening Here?

  1. Version: Specifies the Compose file format version (recent versions of Docker Compose treat this key as optional and ignore it).
  2. Services:
    • web:
      • Builds the app using the Dockerfile in the current directory.
      • Maps port 3000 on the host to port 3000 in the container.
      • Sets environment variables, including the database connection URL.
      • Depends on the db service, so Compose starts the database container before the web container.
    • db:
      • Uses the official PostgreSQL image.
      • Configures the database credentials and initializes the database.
  3. Volumes:
    • postgres_data: Creates a named volume to persist PostgreSQL data.
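Note that depends_on only controls start order: the database container starts first, but PostgreSQL may not yet accept connections when web launches. Recent Compose versions can wait on a health check instead. Here's a sketch of the extra configuration the two services could add (the credentials match the example above):

  db:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U user -d mydatabase"]
      interval: 5s
      timeout: 5s
      retries: 5

  web:
    depends_on:
      db:
        condition: service_healthy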

Running the Application

To start everything, simply run:

docker compose up

This command builds and starts all the services defined in the docker-compose.yml file.
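A few other Compose commands are useful day to day:

# Run the services in the background
docker compose up -d

# Inspect service status and logs
docker compose ps
docker compose logs -f web

# Stop and remove the containers (add -v to also remove volumes)
docker compose down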


How Docker Helps My CI/CD Pipeline

Once the site was containerized, I integrated Docker with my CI/CD tool. Here’s an example workflow using GitHub Actions:

.github/workflows/deploy.yml

name: Deploy Website

on:
  push:
    branches:
      - main

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest

    steps:
    - name: Checkout code
      uses: actions/checkout@v3

    - name: Set up Docker
      uses: docker/setup-buildx-action@v2

    - name: Build Docker image
      run: docker build -t my-website:latest .

    - name: Push Docker image to registry
      run: |
        docker tag my-website:latest myregistry/my-website:latest
        docker push myregistry/my-website:latest

    - name: Deploy to server
      run: |
        docker pull myregistry/my-website:latest
        docker compose up -d
        docker system prune --all --force

Key Steps

  1. Build: The workflow builds a Docker image for the site.
  2. Push: The image is pushed to a container registry (e.g., Docker Hub).
  3. Deploy: The image is pulled onto the server, replacing the old container with the updated one.
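A common refinement is to also tag each image with the commit SHA, which makes rollbacks straightforward: any previous image can be pulled and redeployed by its SHA. In GitHub Actions the SHA is exposed as the github.sha context variable; a sketch of the adjusted push step:

    - name: Push Docker image to registry
      run: |
        docker tag my-website:latest myregistry/my-website:${{ github.sha }}
        docker push myregistry/my-website:${{ github.sha }}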

Conclusion

Docker has revolutionized how applications are developed, deployed, and maintained. It allowed me to build this website, integrate it into a CI/CD pipeline, and deploy it seamlessly. Whether you're running a simple static site or a complex multi-service app, Docker simplifies workflows and ensures consistency.

© 2024–present Thomas Béchu. All Rights Reserved