Basic question about sharing docker instances


Hello everyone!

Today I discovered Bitbucket Pipelines, and I find it very interesting.

But I have run into a basic problem that is probably very common.

I have a pipeline configuration that looks like the following:

image: docker/compose:1.29.2

pipelines:
  default:
    - step:
        name: Setup
        script:
          - cp .env.local .env
          - docker-compose up -d
          - echo Setup done
        services:
          - docker
    - step:
        name: Run migrations
        script:
          - docker-compose run project alembic upgrade head
          - echo Migrations done
        services:
          - docker
    - step:
        name: Run tests
        script:
          - docker-compose run project python -m pytest -rP -vv -x
          - echo Tests done
        services:
          - docker

What is the problem? The second step fails with Error: No such container.
I realized I would need to set up the containers again in each step, but is there any way to avoid that?

Is the following a good way to implement what I want?

image: docker/compose:1.29.2

pipelines:
  default:
    - step:
        name: Run migrations
        script:
          - cp .env.local .env
          - docker-compose up -d
          - docker-compose run project alembic upgrade head
          - echo Migrations done
        services:
          - docker
    - step:
        name: Run tests
        script:
          - cp .env.local .env
          - docker-compose up -d
          - docker-compose run project python -m pytest -rP -vv -x
          - echo Tests done
        services:
          - docker

Just for context: the docker-compose file defines a MySQL service, a Firestore emulator, and an image that installs certain Python packages.
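If the migration and test steps do not need to run on separate machines, one alternative (my own sketch, not something suggested in the thread) is to collapse them into a single step, so the containers started by docker-compose up remain available for every command:

```yaml
image: docker/compose:1.29.2

pipelines:
  default:
    - step:
        name: Migrate and test
        script:
          - cp .env.local .env
          - docker-compose up -d
          - docker-compose run project alembic upgrade head
          - docker-compose run project python -m pytest -rP -vv -x
        services:
          - docker
```

This trades step-level isolation for a single environment setup, which avoids the "No such container" error entirely.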

2 answers

1 accepted

Answer accepted
Aron Gombas _Midori_
Community Leader
Sep 29, 2022

Bitbucket Pipelines has a feature called cache, which allows one step to produce an artifact that can be used by later steps:

https://support.atlassian.com/bitbucket-cloud/docs/cache-dependencies/
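For reference, the related artifacts feature is the one that passes files between steps of the same pipeline run, and in the configuration it looks roughly like this (a minimal sketch; the paths and step names are made up, and note that it shares files, not running containers):

```yaml
pipelines:
  default:
    - step:
        name: Build
        script:
          - mkdir -p build
          - echo "hello" > build/output.txt
        artifacts:
          # files matching these globs are carried over to later steps
          - build/**
    - step:
        name: Use
        script:
          # build/output.txt is restored here from the previous step
          - cat build/output.txt
```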

Thanks! This is what I was looking for.

Erez Maadani
Rising Star
Sep 29, 2022

You should consider that each step runs in a different container, possibly on a different host.

The simplest option would be to build the image in step #1, upload it to a registry, and then pull it / use it as the build image in the next step.
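A sketch of that approach in bitbucket-pipelines.yml might look like the following. The registry account, repository variables ($DOCKER_HUB_USER, $DOCKER_HUB_PASSWORD), and image name are all assumptions for illustration:

```yaml
pipelines:
  default:
    - step:
        name: Build and push
        script:
          # credentials come from repository variables (assumed names)
          - docker build -t $DOCKER_HUB_USER/project:latest .
          - echo $DOCKER_HUB_PASSWORD | docker login --username $DOCKER_HUB_USER --password-stdin
          - docker push $DOCKER_HUB_USER/project:latest
        services:
          - docker
    - step:
        name: Run tests
        # use the image pushed above as this step's build container
        image: mydockerhubuser/project:latest
        script:
          - python -m pytest -rP -vv -x
```

Whether this beats re-running docker-compose up in each step depends on how long the image build takes compared to the push/pull round trip.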
