Running Applications with Docker and Docker Compose — A Practical Guide
Apr 27, 2025
4 minute read
Over my years as a full-stack developer, Docker has become almost second nature — and for good reason. It simplifies running applications in consistent environments, whether you're on your laptop, a server, or somewhere in between. The only downside? It took away my excuse of saying “but it runs on my machine!”
If you’re someone just getting started with containerizing your applications, or maybe you've dabbled with Docker before but haven't gotten to grips with Docker Compose, this guide is for you.
Let's get a simple app up and running — and more importantly, let’s understand what’s going on along the way.
First Things First — Why Docker?
When you're working on a project, chances are you need to install a bunch of dependencies, deal with different versions of Node.js, PostgreSQL, maybe Redis, and so on. It works on your machine... until it doesn’t on someone else's.
Docker solves that by bundling your app + everything it needs into an isolated container.
Your project runs the same way, everywhere!
And Docker Compose?
It’s like Docker’s project manager. Instead of running one container at a time, you can describe your whole app — backend, database, frontend, whatever — in a single YAML file, and bring it all up (or down) with one command.
Setting Up the Playground
Let’s say you have a basic Node.js app that needs a PostgreSQL database.
Here’s the structure we'll build:
my-app/
├── backend/
│   ├── Dockerfile
│   ├── app.js
│   └── package.json
├── docker-compose.yml
└── .env
We'll spin up a Node server, connect it to a Postgres database, and manage both containers using Docker Compose.
Building the Backend App
Inside your backend/ folder, create a minimal Node.js app (not a todo app though).
package.json:
{
  "name": "backend",
  "version": "1.0.0",
  "scripts": {
    "start": "node app.js"
  },
  "dependencies": {
    "express": "^4.18.2",
    "pg": "^8.9.0"
  }
}
app.js:
const express = require('express');
const { Pool } = require('pg');

const app = express();
const port = 3000;

// Pull connection settings from the environment (we set these in
// docker-compose.yml), with fallbacks matching our setup.
const pool = new Pool({
  user: process.env.PGUSER || 'postgres',
  host: process.env.PGHOST || 'db', // 'db' is the service name we'll define in docker-compose.yml
  database: process.env.PGDATABASE || 'mydb',
  password: process.env.PGPASSWORD || 'postgres',
  port: 5432,
});

app.get('/', async (req, res) => {
  try {
    const result = await pool.query('SELECT NOW()');
    res.send(`Hello! Time from DB: ${result.rows[0].now}! Get a clock, touch grass!`);
  } catch (err) {
    res.status(500).send(`Database error: ${err.message}`);
  }
});

app.listen(port, () => {
  console.log(`Server running on port ${port}`);
});
Nothing fancy here — just a simple Express server fetching the current time from the database. Who needs a clock :D
Containerizing the Backend
Now, let’s dockerize it.
Inside backend/, create a Dockerfile:
FROM node:18
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
What’s happening here?
- We're starting with a Node 18 base image
- Setting /app as the working directory
- Copying package*.json and installing dependencies first, so Docker can cache that layer when only your code changes
- Copying the rest of the code and telling the container to run npm start when it launches
Pretty straightforward.
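By the way, you don't need Compose just to test this image. If you want a quick sanity check first (the tag my-backend is just an example name), you can build and run it directly:

docker build -t my-backend ./backend
docker run -p 3000:3000 my-backend

The app will start but fail on database queries, since there's no Postgres for it to talk to yet; that's exactly the wiring problem Compose solves next.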
Wiring it Together with Docker Compose
In the root my-app/ directory, create a docker-compose.yml:
version: '3.8'

services:
  backend:
    build: ./backend
    ports:
      - "3000:3000"
    environment:
      - PGHOST=db
      - PGUSER=postgres
      - PGPASSWORD=postgres
      - PGDATABASE=mydb
    depends_on:
      - db

  db:
    image: postgres:15
    restart: always
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: mydb
    volumes:
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
What this does:
- The backend service is built from our Dockerfile
- The db service spins up a Postgres container
- They're networked automatically (you can refer to the database by its service name, db)
- We're also creating a persistent volume for Postgres data
Running Everything
From the root my-app/ folder, run:
docker-compose up --build
If everything's good, you’ll see both containers coming up.
Hit http://localhost:3000 in your browser — you should see:
Hello! Time from DB: 2025-04-14T10:25:15.000Z! Get a clock, touch grass!
Success. 🚀
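Prefer the terminal? The same check works with curl:

curl http://localhost:3000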
Managing Containers
Some useful commands you'll keep handy:
Command | Purpose
--- | ---
docker-compose up | Start services
docker-compose down | Stop and remove everything
docker ps | Check running containers
docker-compose logs backend | See backend logs
docker-compose exec backend sh | Open a shell inside the backend container
Want to keep services running in the background? Just add the -d flag:
docker-compose up -d
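One heads-up: on newer Docker installations, Compose ships as a CLI plugin and is invoked with a space instead of a hyphen. Everything in this guide works the same way with it:

docker compose up -d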
A Couple of Good Practices
- Add a .dockerignore file inside backend/ to avoid copying junk:
node_modules
npm-debug.log
.env
- Always pin your base images. Instead of node:latest, use something like node:18 to avoid surprises in production.
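- Remember that .env file in our project tree? Compose automatically reads a .env file from the project directory for variable substitution, so you can keep credentials out of docker-compose.yml. A minimal sketch (the variable name and value are just examples):

In .env:

POSTGRES_PASSWORD=supersecret

In docker-compose.yml:

environment:
  POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}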
Wrapping Up
Docker and Docker Compose can feel a bit much at first — but once you get it, it’s honestly hard to imagine managing local dev environments without them.
This setup — a simple app + database — is the foundation for scaling up to bigger architectures too (think Redis, queues, separate workers).
Once you’re comfortable here, spinning up production-grade containers with environment-specific configs becomes a breeze.
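For instance, Compose can layer multiple files, with later ones overriding earlier ones. A base docker-compose.yml plus a production override (docker-compose.prod.yml is just a hypothetical name here) could be run like this:

docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d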
That’s it! Now go dockerize everything. (Responsibly)