In our pipeline, we build a container image and push it to AWS ECR. We recently expanded our tests in a way that requires them to run inside the container; outside it they won't work. What I'm trying to figure out is how to set up a pipeline build step that builds the image, runs it as a test container, and propagates the results so that if the tests fail, the pipeline fails.
If someone has an example they could point me to that would be awesome.
We have a similar process: we build the container and run the tests inside it.
We use these lines for this:
- export IMAGE_NAME=gcr.io/example:$BITBUCKET_COMMIT
- docker build -t $IMAGE_NAME .
- docker run --add-host host.docker.internal:$BITBUCKET_DOCKER_HOST_INTERNAL -e DATABASE_URL='postgis://test:firstname.lastname@example.org/app_db' $IMAGE_NAME /bin/bash -c "python manage.py check && pylint -j 4 --load-plugins pylint_django apps/ -E && pytest && coverage html"
I can send all the code if you need it.
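To show how the lines above fit into a step, here is a minimal bitbucket-pipelines.yml sketch. The image name, environment variable, and test command are placeholders from the snippet above, not a definitive setup. The key point is that `docker run` exits with the container command's exit status, and Pipelines fails a step as soon as any script line exits nonzero, so a failing test fails the build with no extra wiring.

```yaml
pipelines:
  default:
    - step:
        name: Build image and run tests inside it
        services:
          - docker        # enables the Docker daemon for this step
        script:
          - export IMAGE_NAME=gcr.io/example:$BITBUCKET_COMMIT
          - docker build -t $IMAGE_NAME .
          # docker run returns the exit code of the command it ran;
          # if pytest fails, this line fails, which fails the step.
          - docker run --rm $IMAGE_NAME /bin/bash -c "pytest"
```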
We are excited to announce the open beta program for self-hosted runners. Bitbucket Pipelines Runners are available to everyone. Please try it and let us know your feedback. If you have any issue...