
Introduction to Docker

Build, ship, and run applications with containers


Docker is a platform for developing, shipping, and running applications in containers. Containers package an application with all its dependencies, ensuring consistency across development, testing, and production environments.

Why Docker?

Consistency

Containers include everything an application needs to run, eliminating "it works on my machine" problems.

Isolation

Containers run in isolated environments, preventing conflicts between applications and improving security.

Portability

Containers can run anywhere Docker is installed—on your laptop, in CI/CD pipelines, or in production clouds.

Efficiency

Containers share the host OS kernel, making them lighter than virtual machines while providing similar isolation.

Core concepts

Images

An image is a read-only template containing instructions for creating a container. Images are built from Dockerfiles.

```dockerfile
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
```

Containers

A container is a runnable instance of an image. Containers are isolated but can share resources with the host.

```shell
# Run a container
docker run -d -p 3000:3000 my-app

# List running containers
docker ps

# Stop a container
docker stop <container-id>
```
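
Beyond starting and stopping, a couple of standard Docker CLI commands are useful for inspecting a running container (same `<container-id>` placeholder as above):

```shell
# Follow the container's stdout/stderr logs
docker logs -f <container-id>

# Open an interactive shell inside the container
# (sh rather than bash, since alpine-based images may not ship bash)
docker exec -it <container-id> sh
```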

Dockerfile

A Dockerfile is a text file containing instructions to build an image.

```dockerfile
# Base image
FROM python:3.11-slim

# Set working directory
WORKDIR /app

# Copy requirements
COPY requirements.txt .

# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Run the application
CMD ["python", "app.py"]
```
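
As a usage sketch, assuming this Dockerfile sits next to `app.py` and `requirements.txt` (`my-python-app` is a placeholder tag):

```shell
# Build the image from the current directory
docker build -t my-python-app .

# Run it in the foreground; --rm removes the container on exit
docker run --rm my-python-app
```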

Docker Compose

Docker Compose defines and runs multi-container applications.

```yaml
# docker-compose.yml
version: "3.9"
services:
  web:
    build: .
    ports:
      - "3000:3000"
    environment:
      - NODE_ENV=production
    depends_on:
      - redis

  redis:
    image: redis:7-alpine
    volumes:
      - redis-data:/data

volumes:
  redis-data:
```
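
With this file saved as `docker-compose.yml`, the stack can be started and torn down with the standard Compose commands:

```shell
# Build (if needed) and start all services in the background
docker compose up -d

# Tail logs from every service
docker compose logs -f

# Stop and remove containers and networks
docker compose down
```

Note that `docker compose down` leaves the named `redis-data` volume in place; add `-v` to remove volumes as well.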

Building images

Basic build

```shell
docker build -t my-app:latest .
```

Multi-stage builds

Multi-stage builds reduce image size: build tools and intermediate artifacts stay in the builder stage, and only the files copied into the final stage end up in the image:

```dockerfile
# Build stage
FROM node:20-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Production stage
FROM node:20-alpine
WORKDIR /app
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules
EXPOSE 3000
CMD ["node", "dist/index.js"]
```
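
To see the effect, the builder stage can be built on its own with `--target` and its size compared against the final image (`my-app` is a placeholder name):

```shell
# Build the final (production) image
docker build -t my-app:prod .

# Build only the first stage for comparison
docker build --target builder -t my-app:builder .

# Compare the two image sizes
docker images my-app
```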

Best practices

Use specific image tags

```dockerfile
# Good
FROM node:20.10-alpine

# Avoid
FROM node:latest
```

Minimize layers

Combine RUN commands to reduce layers:

```dockerfile
# Good
RUN apt-get update && \
    apt-get install -y curl && \
    rm -rf /var/lib/apt/lists/*
```

Use .dockerignore

Exclude unnecessary files from the build context:

```
node_modules
.git
*.md
.env
```

Run as non-root user

```dockerfile
RUN addgroup -S appgroup && adduser -S appuser -G appgroup
USER appuser
```

Integration with DevOps Hub#

Build and push Docker images in your pipelines:

```yaml
stages:
  - name: build
    jobs:
      - name: docker-build
        runner: ubuntu-latest
        steps:
          - checkout
          - run: |
              docker build -t my-registry/my-app:${{ github.sha }} .
              docker push my-registry/my-app:${{ github.sha }}
```

Next steps