Cannot execute next step after running docker up

Jouni Riimala
Contributor
March 7, 2023

Hi,

I have the following pipeline on Bitbucket. The issue is that after running docker-compose -f docker-compose_test.yml up it just keeps the docker containers running and doesn't move on to the step "Populate db and start testing".

When running docker it actually starts the postgres db and the node server. After that, the command <npm run userSeed> uses sequelize to post json records to the db, and after that the idea is to execute a few integration tests.

Any help would be appreciated, thanks in advance.

 

image: docker:stable

options:
  docker: true

pipelines:
  default:
    - step:
        name: Build & run Docker
        image: python:3.8.1
        services:
          - docker
        caches:
          - docker
          - pip
        script:
          - pip install docker-compose
          - docker-compose -f docker-compose_test.yml build
          - docker-compose -f docker-compose_test.yml up
    - step:
        name: Populate db and start testing
        script:
          - npm run userSeed
          - npm run test

 

1 answer

0 votes
Erez Maadani
Rising Star
March 7, 2023

Hey @Jouni Riimala 

  1. docker-compose up is a blocking command. It will return once the containers are done running or if there is an interrupt. You're probably going to need to add the `-d` or `--detach` flag to run it in the background.
  2. Each step runs in its own container. This means that your tests should be triggered within the same step. Since your tests should wait until your containers are running, I would suggest adding them to your docker-compose file, and then you won't need to run in detached mode.
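
For example, if you go with the detach route, a single step along these lines keeps the containers and the tests together (a sketch only, not a tested configuration; the `server` service name and the npm script names are taken from the question and would need to match your actual compose file, and the db may need a short wait or a healthcheck before the seed runs):

image: docker:stable
options:
  docker: true

pipelines:
  default:
    - step:
        name: Build, run and test
        image: python:3.8.1
        services:
          - docker
        caches:
          - docker
          - pip
        script:
          - pip install docker-compose
          # -d/--detach returns immediately instead of blocking the step
          - docker-compose -f docker-compose_test.yml up -d --build
          # run the seed and tests inside the already-running server container
          - docker-compose -f docker-compose_test.yml exec -T server npm run userSeed
          - docker-compose -f docker-compose_test.yml exec -T server npm run test
          - docker-compose -f docker-compose_test.yml down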
Jouni Riimala
Contributor
March 8, 2023

Hi,

Sorry for the late reply. I have now modified the setup so that I'm just building and running the docker compose file (containing the db and the node server), and in the node server Dockerfile I'm using a command to execute the node server, the seeds and the tests, so the end of the Dockerfile has a CMD with multiple commands.

Now when the pipeline is triggered it just keeps running docker continuously... with no evidence of the seeds and tests being executed. It doesn't seem to work as I would expect.

 

-Jouni

Patrik S
Atlassian Team
March 14, 2023

Hello @Jouni Riimala ,

Could you share your modified bitbucket-pipelines.yml file so we can analyze your current build configuration?

Also, would it be possible to share the content of your docker-compose yml?

When sharing this info here, please make sure to sanitize any sensitive information.

Thank you, @Jouni Riimala !

Patrik S

Jouni Riimala
Contributor
March 14, 2023

Hi Patrik,

My intention is to run the database and server on Bitbucket, seed default records into the postgres db (using sequelize) and execute the tests. Likely my approach is not good, but this is my intention. I would welcome any examples of how to do this correctly.

My latest approach is to embed all the database seeding and testing into the server Dockerfile. This is not working, as it complains about a missing bash module when I try to run it in the terminal (I installed it via npm i bash --save, but it still complains).

Compose file, server dockerfile and pipeline ->

version: "3.9"

services:

    db:

        container_name: node_db

        image: postgres:14.1-alpine

        restart: always

        environment:

          - POSTGRES_USER=postgres

          - POSTGRES_PASSWORD=postgres

          - POSTGRES_DB=postgres

        ports:

          - '5432:5432'

        # volumes:

          # - db:/var/lib/postgresql/data

          # - ./db/init.sql:/docker-entrypoint-initdb.d/create_tables.sql

    server:

        container_name: node server

        hostname: nodeserver

        build:

            context: ./server

        depends_on:

          - db

        ports:

          - "3006:3006"

        environment:

          ACCESS_TOKEN_SECRET: ${ACCESS_TOKEN_SECRET}

          REFRESH_TOKEN_SECRET: ${REFRESH_TOKEN_SECRET}

          SERVER: ${SERVER}

          PGHOST: host.docker.internal

          NODE_ENV: development
 
FROM node:16
WORKDIR /App
COPY config.json ./server/config/
COPY package.json .
COPY package-lock.json .
RUN npm install
COPY . .
EXPOSE 3006
CMD [ "node", "./server.js" ]
CMD ["/bin/bash.sh", "-c", "node", "./server.js" && "npm", "run", "siteSeed" && && "npm", "run", "contractSeed" && "npm", "run", "orderSeed" && "npm", "run", "licenseSeed" && "npm", "run", "userSeed" && "npm", "run", "test"]


And in the pipeline I now have only this, which installs docker-compose and then builds and runs the compose file. As an output it just keeps running docker the whole time, without any additional output, so this approach is not working...

image: docker:stable

options:
  docker: true

pipelines:
  default:
    - step:
        name: Build & run Docker
        image: python:3.8.1
        services:
          - docker
        caches:
          - docker
          - pip
        script:
          - pip install docker-compose
          - docker-compose -f docker-compose_licdb.yml build
          - docker-compose -f docker-compose_licdb.yml up
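
With a CMD like the one sketched after the Dockerfile above, the server container exits once the tests finish, so the step can be told to stop and propagate that exit code instead of blocking on up (again a sketch; --exit-code-from assumes the compose service is named server):

        script:
          - pip install docker-compose
          - docker-compose -f docker-compose_licdb.yml build
          # --exit-code-from implies --abort-on-container-exit: up returns when
          # the server container exits, and the step fails if the tests failed
          - docker-compose -f docker-compose_licdb.yml up --exit-code-from server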
