Glossary

Docker

Docker is a platform for building, shipping, and running containerized applications. A Docker container packages an application with all its dependencies into a standardized unit that runs consistently across any environment.

Explanation

The classic problem Docker solves is 'it works on my machine': different environments (developer laptops, staging servers, production) have different OS versions, library versions, and configurations that cause subtle bugs. Docker packages everything the app needs (runtime, libraries, configuration, code) into a container that runs identically everywhere.

Key concepts: a Docker image is a read-only template (layers of filesystem snapshots) that defines what's in the container — the blueprint. A Docker container is a running instance of an image — the actual running process. A Dockerfile holds the instructions for building an image: start FROM a base image, COPY files, RUN commands, set ENV variables, EXPOSE a port, and define the CMD to run.

Images are layered: each instruction in a Dockerfile creates a new layer. Layers are cached — if a layer hasn't changed (e.g., RUN npm install with an unchanged package.json), Docker reuses the cached layer, making builds much faster. This is why Dockerfile best practice copies package.json and runs npm install before copying source files: the install layer stays cached unless dependencies change.

Docker Compose orchestrates multiple containers: a docker-compose.yml file defines services (your app, a database, a cache) along with their images, ports, volumes, and networks, and docker-compose up starts everything together. This makes local development with multiple services (Node.js app + PostgreSQL + Redis) trivial.
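
One practical corollary of the caching behavior above: COPY . . sends the entire build context to the Docker daemon, so frequently changing or irrelevant files should be excluded with a `.dockerignore` file (analogous to `.gitignore`). A minimal sketch for a Node.js project; the exact entries are illustrative assumptions, not a complete list:

```
# .dockerignore: keeps the build context small and the COPY layer cache-friendly
node_modules
npm-debug.log
.git
.env
Dockerfile
docker-compose.yml
```

Without this, changes to local-only files (like node_modules) invalidate the COPY . . layer on every build even when the shipped source is unchanged.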

Code Example

dockerfile
# Dockerfile for a Node.js application

# Base image — small Alpine Linux with Node 20
FROM node:20-alpine

# Working directory inside the container
WORKDIR /app

# Copy dependency files first (cached layer — only re-runs if package*.json changes)
COPY package*.json ./

# Install production dependencies only (--only=production is deprecated in modern npm)
RUN npm ci --omit=dev

# Copy source code (separate layer — changes with every code update)
COPY . .

ENV NODE_ENV=production
EXPOSE 3000

# Security: run as non-root user
USER node

CMD ["node", "server.js"]

# Build and run:
# docker build -t myapp:latest .
# docker run -p 3000:3000 -e DATABASE_URL=... myapp:latest

---
yaml
# docker-compose.yml: local development with multiple services
services:
  app:
    build: .
    ports: ["3000:3000"]
    environment:
      DATABASE_URL: postgres://postgres:password@db:5432/myapp
    depends_on: [db, redis]

  db:
    image: postgres:16-alpine
    environment: { POSTGRES_PASSWORD: password, POSTGRES_DB: myapp }
    volumes: [pgdata:/var/lib/postgresql/data]

  redis:
    image: redis:7-alpine

volumes:
  pgdata:
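
One caveat worth noting: depends_on as written controls only start order, not readiness, so the app container can start before Postgres is accepting connections. The Compose specification supports healthchecks with condition: service_healthy to close that gap. A sketch of the relevant fragment (the interval, timeout, and retry values are illustrative assumptions):

```yaml
# Fragment: make `app` wait until Postgres reports healthy
services:
  db:
    image: postgres:16-alpine
    healthcheck:
      # pg_isready ships in the postgres image and exits 0 when the server accepts connections
      test: ["CMD-SHELL", "pg_isready -U postgres -d myapp"]
      interval: 5s
      timeout: 3s
      retries: 5

  app:
    build: .
    depends_on:
      db:
        condition: service_healthy
```

With this in place, docker-compose up holds back the app service until the db healthcheck passes, instead of merely starting the containers in order.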

Why It Matters for Engineers

Docker is the standard for application deployment in 2025. Kubernetes (the dominant container orchestration platform), CI/CD pipelines, and cloud platforms (AWS ECS, GCP Cloud Run, Azure Container Apps) all run containers built with Docker. Knowing Docker means knowing how to build deployable applications, debug environment-specific issues, and understand the infrastructure your code runs on. Docker also transforms local development: instead of installing PostgreSQL, Redis, and Elasticsearch on your laptop and managing version conflicts, docker-compose up starts all of them in isolated containers that are easy to reset, with the configuration living in version control.

Related Terms

Container · CI/CD · Environment Variable · nginx

Learn This In Practice

Go deeper with the full module on Beyond Vibe Code.

DevOps & Tools →