I'm trying to set up a CI environment to run my Ruby tests on every commit (for now). I'm having some issues because I can't get my setup configured correctly.
I have my pipeline.yml file:
image: ruby:2.5

pipelines:
  default:
    - step:
        caches:
          - bundler
        script:
          # Modify the commands below to build your repository.
          - cd $BITBUCKET_CLONE_DIR
          - echo "apt-get build-essential"
          - apt-get update -qq && apt-get install -y build-essential libpq-dev nodejs ruby-dev libidn11 libidn11-dev yarn
          - echo "bundle install"
          - bundle install --path vendor/bundle
          - echo "migrate/seed/test"
          - rake db:migrate RAILS_ENV=test
          - rake test RAILS_ENV=test
        services:
          - postgres
          - redis

definitions:
  services:
    redis:
      image: redis
    postgres:
      image: postgres
      environment:
        POSTGRES_DB: $db_name
        POSTGRES_USER: $db_user
        POSTGRES_PASSWORD: $db_password
  caches:
    bundler: vendor/bundle
This gets me about 90% of the way to my CI environment, but for some reason my tests aren't passing there, and I want to test locally to figure out why and how to fix it.
To test locally, I have a docker-compose.yml that looks like so:
version: '3'
services:
  db:
    image: postgres
    ports:
      - "5432:5432"
    volumes:
      - ./tmp/db:/var/lib/postgresql/data
  redis:
    image: redis
  web:
    build: .
    command: bundle exec rails s -p 3001 -b '0.0.0.0'
    volumes:
      - .:/src
    ports:
      - "3001:3001"
    depends_on:
      - db
      - redis
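Since the compose file above starts a Rails server rather than running the test suite, one way to mirror the pipeline's steps locally is to run the same commands through Compose. This is a sketch based on the service name (`web`) and commands in the files above, not a verified command:

```shell
# Run the pipeline's migrate + test steps inside the web service container,
# with the db and redis services started automatically via depends_on.
docker-compose run --rm web sh -c \
  "bundle install --path vendor/bundle && \
   rake db:migrate RAILS_ENV=test && \
   rake test RAILS_ENV=test"
```

One common source of divergence worth checking: in Pipelines, service containers (postgres, redis) are reachable on localhost, while in Compose they are reachable by service name (`db`, `redis`), so `config/database.yml` may need to account for both.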
A Dockerfile like so:
FROM ruby:2.5
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev nodejs ruby-dev libidn11 libidn11-dev yarn
RUN mkdir /src
WORKDIR /src
COPY Gemfile /src/Gemfile
COPY Gemfile.lock /src/Gemfile.lock
RUN bundle install
COPY . /src
The Docker setup doesn't match the Bitbucket pipeline 100% (and I need it to), because all my Rails tests pass in the plain Docker container, but not in the Bitbucket pipeline.
Any help on what to change in my Docker setup, or my pipeline setup?
Or, conversely: my Rails tests fail in the Bitbucket Pipelines environment when they attempt to access models that extend other models. Any help fixing a Rails setup for that would be welcome too.
Hi Brett,
Have you seen this help page about debugging locally? It describes some additional parameters that can be passed to Docker to better simulate the Pipelines environment:
https://confluence.atlassian.com/bitbucket/debug-your-pipelines-locally-with-docker-838273569.html
Cheers,
Steven
I have, actually; I followed that guide.
I must have missed how mine differs, because my tests pass locally in Docker, but not in Pipelines.
The reason I ask is that I can't see where you've passed the parameters restricting the type and amount of memory available to your Docker containers (I'm not very familiar with Docker Compose, though). Specifically, to simulate your current Pipelines configuration, each of your services should be limited to 1G of memory:
--memory=1g --memory-swap=1g
--memory-swappiness=0
and your main build container to 2G:
--memory=2g --memory-swap=2g
--memory-swappiness=0
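The restrictions above can be approximated in a Compose file. A sketch, assuming a version "2" compose file, since `mem_limit`/`memswap_limit` are honored there by plain `docker-compose`, whereas in version "3" files resource limits moved under `deploy.resources`, which is applied only in swarm mode:

```yaml
# Sketch: approximate the Pipelines memory limits with docker-compose.
version: '2'
services:
  db:
    image: postgres
    mem_limit: 1g       # service containers: 1G in Pipelines
    memswap_limit: 1g
  redis:
    image: redis
    mem_limit: 1g
    memswap_limit: 1g
  web:
    build: .
    mem_limit: 2g       # main build container: 2G in Pipelines
    memswap_limit: 2g
```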
If you continue to see the same discrepancy between your local environment and Pipelines (after applying these memory restrictions), then it might help if you could provide some more detailed information about the errors you're seeing in Pipelines. E.g., is there any indication that the underlying cause may be a problem connecting to the postgres service?
The memory parameters are something to look into.
It seems like connections to postgres are fine. I can run tests that require the database, which will succeed.
The problem I run into is with Ruby models that extend, or inherit from, other models. When those tests are run in Pipelines, records for those models are not found.
Maybe it has something to do with the Rails fixtures not being loaded into the database for those models? I don't know whether that's an issue with Ruby, Docker, or Pipelines' usage of Ruby and Docker.
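If the models in question use Rails single-table inheritance (one model inheriting from another and sharing its table), note that fixtures for the subclasses live in the base table's fixture file and need an explicit `type` column. A hypothetical sketch (the model and fixture names here are made up for illustration):

```yaml
# test/fixtures/vehicles.yml -- fixtures for an STI hierarchy.
# Subclass records go in the BASE table's fixture file, with an
# explicit `type:` so Rails instantiates the right subclass.
family_car:
  name: Family car
  type: Car        # assuming Car < Vehicle

cargo_truck:
  name: Cargo truck
  type: Truck      # assuming Truck < Vehicle
```

It may also be worth checking that the local run and the Pipelines run load fixtures the same way; the generated `test_helper.rb` declares `fixtures :all`, but if the Pipelines step runs a different task or test helper, fixtures might not be loaded there.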