How to maintain the state of a service in Pipelines?

Simon Sattes
Contributor
May 28, 2024

Hi all,

I'm using a mysql service that is required for some integration tests. The service is seeded with test data in the first step. However, every later step seems to re-create the service from scratch, so it no longer contains the pre-filled data, and I would need to seed it again in every step where I want to use the mysql service.

Is there a way to keep the mysql service running during the whole pipeline? Or what would be the best approach?

This is the current pipeline definition (simplified):

 

image: davidzapata/php-composer-alpine:8.2

pipelines:
  default:
    - step:
        name: Prepare environment
        caches:
          - vendor-folder
        services:
          - mysql
        script:
          - composer install --no-plugins --no-scripts --no-interaction
          - php seed-database.php

    - step:
        name: Execute PHPUnit tests
        size: 2x
        services:
          - mysql
        caches:
          - vendor-folder
        script:
          - phpunit --log-junit ./test-reports/junit.xml

    - step:
        name: Execute e2e tests
        services:
          - mysql
        caches:
          - vendor-folder
        script:
          - php run-e2e-tests.php

definitions:
  caches:
    vendor-folder:
      key:
        files:
          - composer.lock
      path: vendor

  services:
    mysql:
      image: mysql:8.0
      variables:
        MYSQL_DATABASE: $MYSQL_DATABASE
        MYSQL_ROOT_PASSWORD: $MYSQL_PASSWORD

1 answer

Answer accepted
Theodora Boudale
Atlassian Team
May 29, 2024

Hi Simon,

Just to give you some context: Pipelines builds run in Docker containers. For every step of your build, a Docker container (the build container) starts using the image you have specified in your bitbucket-pipelines.yml file (in your case, the Docker Hub image davidzapata/php-composer-alpine:8.2), and the repository is cloned into this container. If a step has services, additional Docker containers are created for these services, and they share a network adapter with your build container. The commands of the step's script are then executed, and when the step finishes, all of these Docker containers are destroyed.

This process is repeated for every step of a Pipelines build, so it is not possible to keep the mysql service running across different steps. Please note that it is not just the service container that gets destroyed after every step, but the build container as well.

You could create a database backup during the first step, define the backup file as an artifact, and then use it in subsequent steps to restore the database. Alternatively, it may be quicker to run your seeding script in every step that needs the database.
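
For example, a minimal sketch of the backup-and-restore approach could look like the following. This assumes the build image provides (or can install) the mysql client tools, and db-backup.sql is just a placeholder file name; the service is reachable from the build container on 127.0.0.1:

    - step:
        name: Prepare environment
        services:
          - mysql
        script:
          - composer install --no-plugins --no-scripts --no-interaction
          - php seed-database.php
          # dump the seeded database; install the client first if the image lacks it,
          # e.g. apk add --no-cache mysql-client
          - mysqldump -h 127.0.0.1 -u root -p"$MYSQL_PASSWORD" "$MYSQL_DATABASE" > db-backup.sql
        artifacts:
          - db-backup.sql

    - step:
        name: Execute PHPUnit tests
        services:
          - mysql
        script:
          # restore the dump into the fresh mysql service of this step
          - mysql -h 127.0.0.1 -u root -p"$MYSQL_PASSWORD" "$MYSQL_DATABASE" < db-backup.sql
          - phpunit --log-junit ./test-reports/junit.xml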

You could also remove the first step altogether and run the 'composer install' command in the 'Execute PHPUnit tests' step. Your first step prepares the environment, but keep in mind that this environment is destroyed as soon as the step finishes.
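
As a rough sketch based on your existing configuration (with the seeding moved into the same step as well, since the mysql service also starts empty there):

    - step:
        name: Execute PHPUnit tests
        size: 2x
        services:
          - mysql
        caches:
          - vendor-folder
        script:
          - composer install --no-plugins --no-scripts --no-interaction
          - php seed-database.php
          - phpunit --log-junit ./test-reports/junit.xml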

Please feel free to reach out if you have any questions.

Kind regards,
Theodora

Simon Sattes
Contributor
June 4, 2024

Hi Theodora, 

Thanks for your reply and for confirming that the containers get destroyed after each step. I have now moved the composer install into the PHPUnit step, so it only gets executed once.

Thanks for your ideas!

Simon 

Theodora Boudale
Atlassian Team
June 4, 2024

You are very welcome, Simon. Please feel free to reach out if you ever need anything else!
