I have a more detailed question with more attempts here, but in summary, I have a dev docker image which runs all the services I need in order for tests to pass.
docker run -it -p 3000:3000 -p 6379:6379 -p 8983:8983 my_dockerhub/image
./start_services.sh
bundle exec rspec
# everything passes
I'd like to run these tests within Bitbucket Pipelines. I tried using this container directly with this in my bitbucket-pipelines.yml:
image:
  name: my_dockerhub/image
  username: $DOCKER_HUB_USERNAME
  password: $DOCKER_HUB_PASSWORD
  email: $DOCKER_HUB_EMAIL
  # bind ports here somehow?

pipelines:
  branches:
    '{master, develop, bitbucket_pipelines}':
      - step:
          name: Test
          script:
            - ./start_services.sh
            - sleep 30
            - bundle exec rspec
But I get the following error at the `bundle exec rspec` step:
# Errno::ECONNREFUSED:
# Connection refused - connect(2) for "127.0.0.1" port 8983
How can I bind ports within the image so the tests pass?
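As a side note, the fixed `sleep 30` in the script above can be replaced with a poll loop so the tests only start once the service port actually accepts connections. A sketch, assuming bash with `/dev/tcp` redirection available; the host, port, and timeout values are examples, not from my real setup:

```shell
#!/bin/bash
# Poll a TCP port until it accepts connections, up to a timeout in seconds,
# instead of sleeping a fixed 30s. Uses bash's built-in /dev/tcp redirection.
wait_for_port() {
  local host=$1 port=$2 timeout=${3:-30}
  local i
  for ((i = 0; i < timeout; i++)); do
    # The subshell opens fd 3 to the port and closes it on exit.
    if (exec 3<> "/dev/tcp/$host/$port") 2>/dev/null; then
      return 0   # port is accepting connections
    fi
    sleep 1
  done
  return 1       # timed out waiting for the port
}

# Example: wait up to 60s for Solr before running the suite
# wait_for_port 127.0.0.1 8983 60 && bundle exec rspec
```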
I'm running Redis and Solr on custom ports. The thing that finally got it working for me was:
docker run -id -p 3000:3000 -p 6379:6379 -p 8983:8983 -v $PWD:$work_dir $my_image
-d to run in detached mode
-i to keep stdin open, even if detached
Hi @Conor S
Not sure why you need to bind the ports, but I believe another way to execute commands with your docker run is:
- docker run -p 3000:3000 -p 6379:6379 -p 8983:8983 <your_image> /bin/bash -c 'echo yourcommands'
Hope this helps!
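For what it's worth, the useful part of that pattern is that `/bin/bash -c` takes a whole command string, so the service startup and the test run can be chained. The `&&` chaining below is an assumption about how you'd combine your commands, shown here with echo stand-ins rather than the real image:

```shell
# In the pipeline this would look like:
#   docker run -p 3000:3000 -p 6379:6379 -p 8983:8983 <your_image> \
#     /bin/bash -c './start_services.sh && sleep 30 && bundle exec rspec'
# && stops the chain at the first failing command, so the tests only run
# if the services started successfully. Demonstrated with stand-ins:
bash -c 'echo "services up" && echo "running tests"'
bash -c 'false && echo "never printed"' || echo "chain stopped early"
```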