As a user, I want to be able to logically group my testing/building/deploying steps.
However, the current step implementation does not seem to support this effectively. Does Bitbucket offer any way to share data (artifacts and/or caches) between steps that doesn't require uploading to and downloading from S3?
Step 1) Build frontend app, run tests
Step 2) Build backend app, run tests
Step 3) Build new docker containers
Step 4) Deploy new frontend/backend assets and containers to server
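For context, the four steps above map to a `bitbucket-pipelines.yml` roughly like the following. This is a minimal sketch, not my actual config: the image, script commands, and artifact paths (`frontend/dist/**`, `backend/build/**`) are placeholders.

```yaml
image: node:18

pipelines:
  default:
    - step:
        name: Build frontend, run tests
        caches:
          - node
        script:
          - npm ci
          - npm test
          - npm run build
        artifacts:
          - frontend/dist/**
    - step:
        name: Build backend, run tests
        caches:
          - node
        script:
          - npm ci --prefix backend
          - npm test --prefix backend
        artifacts:
          - backend/build/**
    - step:
        name: Build docker containers
        services:
          - docker
        caches:
          - docker
        script:
          - docker build -t myapp .
    - step:
        name: Deploy
        deployment: production
        script:
          - ./deploy.sh
```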
As you can see, these steps have considerable overlap. They all need to download the same caches (docker, node modules, etc.), and data/artifacts from steps 1 and 2 (e.g., app.bundle.js, all static assets, etc.) could easily be reused in steps 3 and 4.
Because steps run in total isolation, caches (which can get quite large and time-consuming to download) have to be downloaded each time. Sharing any data between steps (via artifacts) requires an upload to S3 followed immediately by a download. So it ends up being far more efficient to just run everything as one giant step.
Having one giant step is very unfortunate: to see why a build failed, I have to dig through a giant log file without any help from the UI. It also removes the ability to rerun failed steps (i.e., if the tests passed but the deploy failed, I have to rerun the entire pipeline from scratch instead of just the last step).
How are others dealing with this? My situation does not feel unique.
We are excited to announce the open beta program for self-hosted runners. Bitbucket Pipelines Runners are available to everyone. Please try them and let us know your feedback. If you have any issue...
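A self-hosted runner may mitigate the cache-download cost, since caches stay on the runner's local disk between builds. Routing a step to a runner is done with `runs-on` labels; this is a minimal sketch, and the `linux` label plus the script command are assumptions about your runner setup:

```yaml
pipelines:
  default:
    - step:
        name: Build on self-hosted runner
        runs-on:
          - 'self.hosted'
          - 'linux'
        script:
          - ./build.sh
```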