
How to Configure PostgreSQL with NestJS and Docker for Quick Local Development

Launching a new project and need Postgres for NestJS development, but you don't want to commit to a production DB provider (yet)? Running a local Postgres instance in Docker is your best friend. Simple. Reliable. No system footprint.

Below, I'll walk you through my reference setup for running NestJS and PostgreSQL with Docker – minimal friction, fully scriptable, always reproducible. You'll get a working configuration, commands for direct container access, and an example NestJS database configuration.

Why this configuration?

Early development is about moving fast: changing schemas, resetting data, running migrations, sometimes all in one day. Managed cloud databases (such as Neon) are an excellent final destination, but for local hacking and testing, Docker wins every time. It keeps Postgres off your host machine and avoids "works on my machine" surprises. It's truly plug-and-play for local development.

Project structure and required files

Here is what we are going to put in place:

  • Dockerfile for the NestJS application
  • docker-compose.yml to wire up Node and Postgres
  • .env file for environment variables
  • Example NestJS config and scripts
  • Handy commands for everyday workflows

Dockerfile: Simple Node environment

FROM node:18

WORKDIR /app

COPY package*.json ./

RUN npm install

COPY . .

EXPOSE 3000

CMD ["npm", "run", "start:dev"]

docker-compose.yml: Node + Postgres side by side

This is the magic sauce that glues your Node API and a disposable Postgres instance together.

version: "3.8"

services:
  db:
    image: postgres:13
    restart: always
    env_file:
      - .env
    ports:
      - "5432:5432"
    volumes:
      - db-data:/var/lib/postgresql/data

  api:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    depends_on:
      - db
    env_file:
      - .env
    command: sh -c "npm run migration:run && npm run start:dev"  

volumes:
  db-data:

Tip: the volumes key lets your database survive restarts without losing data.
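One caveat: depends_on only waits for the db container to start, not for Postgres to actually accept connections, so the migration step can race the database on a cold boot. Here is a minimal readiness poll sketched with Node's built-in net module — the waitForPort name and the retry/delay values are my own illustrative choices, not part of Docker or NestJS:

import * as net from "net";

// Poll a TCP port until it accepts a connection or retries run out.
// waitForPort and its retry/delay defaults are illustrative choices.
function waitForPort(
  host: string,
  port: number,
  retries = 30,
  delayMs = 1000
): Promise<void> {
  return new Promise((resolve, reject) => {
    const attempt = (left: number) => {
      const socket = net.connect({ host, port }, () => {
        socket.end();
        resolve();
      });
      socket.on("error", () => {
        socket.destroy();
        if (left <= 0) {
          reject(new Error(`Port ${host}:${port} never became reachable`));
        } else {
          setTimeout(() => attempt(left - 1), delayMs);
        }
      });
    };
    attempt(retries);
  });
}

// Usage: wait for the compose db service before running migrations, e.g.
// waitForPort(process.env.POSTGRES_HOST ?? "db", 5432).then(runMigrations);

You could call this at the top of your migration script, or keep the simpler sh -c approach from the compose file and just retry the whole command.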

.env

Create a .env file at the root of your project:

POSTGRES_USER=postgres
POSTGRES_PASSWORD=changeme
POSTGRES_DB=app_db
POSTGRES_HOST=db
POSTGRES_PORT=5432
PORT=3000

Keep your secrets out of Git! Add .env to your .gitignore.
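Since everything reads from .env, a small startup check keeps a missing variable from surfacing later as a cryptic connection error. A sketch using only Node built-ins — requireEnv is my own helper name, not a NestJS API:

// Fail fast at boot if a required environment variable is missing.
// requireEnv is an illustrative helper, not part of NestJS.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (value === undefined || value === "") {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage, e.g. at the top of main.ts:
// const dbUser = requireEnv("POSTGRES_USER");
// const dbName = requireEnv("POSTGRES_DB");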

package.json scripts: interactive containers

Why memorize container IDs? Add this to your package.json scripts for quick access:

"scripts": {
  "db": "docker exec -it $(docker-compose ps -q db) bash",
  "api": "docker exec -it $(docker-compose ps -q api) bash"
}

Now just run npm run db for a shell in the database container, or npm run api for one in the application container.

NestJS: connecting to your Dockerized database

In your main bootstrap file (for example main.ts):

import { NestFactory } from "@nestjs/core";
import { AppModule } from "./app.module";

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(process.env.PORT ?? 3000);
}
bootstrap();

Database configuration:

Here is a common configuration file for TypeORM:

const config = {
  type: "postgres",
  host: process.env.POSTGRES_HOST,
  port: parseInt(process.env.POSTGRES_PORT ?? "5432", 10),
  username: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
  database: process.env.POSTGRES_DB,
  entities: [__dirname + "/**/*.entity{.ts,.js}"],
  synchronize: false,  // rely on migrations instead of auto-sync
  migrations: [__dirname + "/migrations/**/*{.ts,.js}"],
  autoLoadEntities: true,
};
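Because process.env values are always strings (and may be undefined), it can help to centralize the parsing with defaults in one place. A sketch — buildDatabaseConfig and its fallback values are my own choices, mirroring the .env example above, not a TypeORM API:

// Build the connection options from the environment, falling back to
// the same defaults used in the .env example above.
// buildDatabaseConfig is an illustrative helper.
function buildDatabaseConfig(
  env: Record<string, string | undefined> = process.env
) {
  return {
    type: "postgres" as const,
    host: env.POSTGRES_HOST ?? "db",
    port: parseInt(env.POSTGRES_PORT ?? "5432", 10),
    username: env.POSTGRES_USER ?? "postgres",
    password: env.POSTGRES_PASSWORD ?? "",
    database: env.POSTGRES_DB ?? "app_db",
  };
}

// Usage: spread the result into the TypeORM config,
// e.g. { ...buildDatabaseConfig(), entities: [...], synchronize: false }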

Development workflow: daily commands

  • Start everything: docker-compose up --build (first time) or just docker-compose up
  • Show logs: docker-compose logs -f api
  • Tear it down (remove the containers): docker-compose down
  • Jump into the db shell: npm run db
  • Jump into the application container: npm run api
