Hi,
I'm looking for a solution to use Redis or database data in another step.
Right now I share application files and folders across multiple steps, and this works as expected.
But how can I share data stored in a database or Redis across multiple steps?
Hi @asi_lh,
Every step of a Pipelines build runs in a separate Docker container, and that container (along with any service containers) gets destroyed when the step is finished.
Artifacts are the only way to share data between steps:
Only files that are in BITBUCKET_CLONE_DIR at the end of a step can be configured as artifacts.
What you could do is create a backup file of the database in BITBUCKET_CLONE_DIR during one step and define this backup file as an artifact.
Then, in the following step, you can use this backup file to create the database with the data you had in the previous step.
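A minimal sketch of that approach might look like the following. All names here (the `postgres` service, image, database credentials, and file names) are illustrative assumptions, not from the original thread; a Redis equivalent could dump and restore an RDB file the same way:

```yaml
# Hypothetical bitbucket-pipelines.yml sketch: dump the database into
# BITBUCKET_CLONE_DIR, pass the dump as an artifact, restore it later.
pipelines:
  default:
    - step:
        name: Prepare database
        services:
          - postgres
        script:
          # Seed the service database, then dump it into the clone directory
          # (the working directory of a step is BITBUCKET_CLONE_DIR).
          - psql -h localhost -U testuser -d testdb -f seed.sql
          - pg_dump -h localhost -U testuser testdb > db-backup.sql
        artifacts:
          - db-backup.sql
    - step:
        name: Run tests
        services:
          - postgres
        script:
          # The artifact is available here; restore it into this step's
          # fresh database service before running the tests.
          - psql -h localhost -U testuser -d testdb -f db-backup.sql
          - ./run-tests.sh

definitions:
  services:
    postgres:
      image: postgres:15
      variables:
        POSTGRES_DB: testdb
        POSTGRES_USER: testuser
        POSTGRES_PASSWORD: testpass
```

Note that the second step gets a brand-new `postgres` service container; only the dump file carries the data across.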
Kind regards,
Theodora
Hi,
The other way is to use a single step instead of five separate ones for the tasks you listed in your reply to Aron. This is the recommended way if the first four steps are only needed in order to run the tests.
As I already mentioned, each step of a Pipelines build runs in a separate Docker container and that container gets destroyed when the step is finished. It doesn't make a lot of sense to have a separate step (and thus a new Docker container) to install dependencies and then do nothing with these dependencies in the same step. That would be similar to spinning up a virtual machine, installing dependencies, and then destroying the virtual machine, and spinning up a new one to run your tests.
If you need certain tools (dependencies, database, etc) in order to run tests, then the recommended approach is to install them in the same step where the tests run.
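As a sketch, the combined step could look something like this (the `npm` commands, service names, and scripts are assumptions for illustration, not from the thread):

```yaml
# Hypothetical single-step layout: dependencies, database seeding, and
# tests all run in the same container, so nothing is lost between them.
pipelines:
  default:
    - step:
        name: Build and test
        services:
          - postgres
          - redis
        script:
          - npm ci                                            # install dependencies
          - psql -h localhost -U testuser -d testdb -f seed.sql  # prepare database
          - ./prepare-external-data.sh                        # any other setup
          - npm test                                          # run the tests
```

Because everything runs in one container, the installed dependencies and seeded data are still present when the tests start.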
Kind regards,
Theodora
There is a way to achieve what I want the way I described it: separate steps, without an extra layer of backing up the database, Redis, or anything else.
And I already have that setup, but I'm waiting until next month to get the next free hours ;)
Hi,
Thank you for the update. I'm glad that the setup you have works for you. In case you would like to share data between steps, then artifacts would be the way to go.
Regarding the build minutes, if you have a server where you can run builds, you could also use one of our runners:
Runners allow you to run Pipelines builds on your own infrastructure, you can still see the build logs on Bitbucket's website, and you won’t be charged for the build minutes used by your self-hosted runner.
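Directing a step to a self-hosted runner is done with `runs-on` labels; `self.hosted` and `linux` are standard labels, while any extra label would match one you assigned when registering the runner:

```yaml
# Sketch: this step runs on a registered self-hosted Linux runner
# instead of Bitbucket's cloud infrastructure.
pipelines:
  default:
    - step:
        name: Build on own infrastructure
        runs-on:
          - self.hosted
          - linux
        script:
          - ./build.sh
```

Steps without a `runs-on` section continue to run on Bitbucket's cloud infrastructure and consume build minutes as usual.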
Kind regards,
Theodora
I don't fully understand the use case, but custom caches may fit the bill!
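For completeness, a custom cache is declared under `definitions.caches` and referenced by name in a step. One caveat for this use case: caches persist directories of the build container between pipelines, not the state of service containers, so the data would still need to be dumped to a directory first. The cache name and path below are assumptions for illustration:

```yaml
# Sketch: cache a directory of exported data between builds.
definitions:
  caches:
    dbdumps: dumps        # directory (relative to the clone dir) to cache

pipelines:
  default:
    - step:
        name: Use cached data
        caches:
          - dbdumps
        script:
          # If the cache was restored, 'dumps/' already contains earlier
          # exports; otherwise the script should regenerate them.
          - ls dumps/ || mkdir dumps
```

Also note that caches are meant as a speed-up and can be evicted, so they are less reliable than artifacts for passing data between steps of the same build.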
I want to separate steps by what they do.
Steps:
I don't want steps 2, 3, 4, and 5 combined into one step, because that's not an elegant way. That way, we'd have an ugly name for the combined step: "prepare database + prepare external data + some other + tests". So it is not an elegant way.