How to push a Docker image to Google Artifact Registry

We have been using `atlassian/aws-ecr-push-image:1.5.0` to push the Docker images built in our pipelines to AWS Elastic Container Registry (ECR).

We now need to set up a similar pipeline to deploy Docker images to Google Artifact Registry.

The listed integrations don't seem to cover GCP Artifact Registry deployment; only App Runner, Storage, and Kubernetes are covered.

Do you have any examples of deploying to Google Artifact Registry? I've been googling, and while there are a few examples out there, they're sparse and not very generic.

1 answer

0 votes
Patrik S Atlassian Team Jul 19, 2022

Hello @Rokkup-Tom ,

I'm afraid we indeed don't have a pipe to push Docker images to Google Artifact Registry (GAR). I went ahead and created a feature request for that functionality, which you can track at the following link:

I would suggest adding your vote there, since that helps both developers and product managers gauge interest. Also, make sure to add yourself as a watcher if you want to receive first-hand updates on that ticket. Please note that all features are implemented with this policy in mind.

We also don't currently have official Bitbucket Pipelines documentation on how to deploy to Google Artifact Registry, but doing some research I found an article that explains step by step how to configure your pipeline to push a Docker image to Google Container Registry (GCR):

Although GCR is being replaced by GAR, this guide can be a good starting point: the syntax doesn't seem to have changed much, and the authentication options are very similar.

You can also refer to GAR's official documentation and compare it with the examples in the article for any syntax or commands that might have changed:
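In the meantime, a plain script step can push to GAR directly without a dedicated pipe. The sketch below is an assumption-heavy example, not an official pipe: the region, project, repository, and image names and the `GCLOUD_API_KEYFILE` secured variable are placeholders you would replace with your own values.

```shell
# Sketch of a bitbucket-pipelines.yml script step (placeholders throughout).
# GCLOUD_API_KEYFILE is assumed to be a secured repository variable holding
# the contents of a service-account JSON key with Artifact Registry access.
IMAGE="us-central1-docker.pkg.dev/my-project/my-repo/my-app:${BITBUCKET_COMMIT}"

# Authenticate Docker against the regional Artifact Registry host
echo "$GCLOUD_API_KEYFILE" | docker login -u _json_key --password-stdin https://us-central1-docker.pkg.dev

# Build and push the image
docker build -t "$IMAGE" .
docker push "$IMAGE"
```

Note that GAR image paths add a repository segment compared to GCR: `gcr.io/PROJECT/IMAGE` becomes `REGION-docker.pkg.dev/PROJECT/REPOSITORY/IMAGE`.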

Hope that helps! Let me know in case you have any questions.

Thank you, @Rokkup-Tom .

Kind regards,

Patrik S

Hey Patrik,

Sorry I missed this email in my inbox. Thank you for raising the feature suggestion! In the meantime I managed to put together some steps that build the Docker image, push it to GAR, and trigger a redeploy of my Cloud Run container, following steps similar to your linked example. I can share the code if it would be useful.

My only problem is that I can't figure out how to pass parameters to the steps. With the ECR push pipe I can reuse the same pipe and just pass in the repository path and image name, but for GAR pushes I've had to duplicate steps to deploy to dev, prod, etc.

I imagine I'd have to develop my own pipe to pass parameters through, correct? The steps seem to be hardcoded (using fixed env vars, for example).

I've added my vote to the Jira ticket, so hopefully it'll gain interest.

Thanks again for your help!


Patrik S Atlassian Team Jul 26, 2022

Hello @Rokkup-Tom ,

You currently can't pass parameters to a pipeline step, but you can use deployment environments to define environment variables that share the same name but hold a different value per environment, and combine that with YAML anchors so you don't have to repeat the same step multiple times.

Please allow me to share an example YML file:



definitions:
  steps: # defining the step that will be referenced multiple times later in the pipeline
    - step: &build-deploy
        name: Build and deploy
        script:
          - echo $MY_VARIABLE # printing the env variable

pipelines:
  default:
    - step:
        <<: *build-deploy
        deployment: "test" # This step will print the value of MY_VARIABLE defined in the deployment environment Test
    - step:
        <<: *build-deploy
        deployment: "staging" # This step will print the value of MY_VARIABLE defined in the deployment environment Staging
    - step:
        <<: *build-deploy
        deployment: "production" # This step will print the value of MY_VARIABLE defined in the deployment environment Production

At the beginning of the example, under definitions, we define a YAML anchor (&build-deploy) containing the template of the step we want to repeat. You can define as many step templates as you want and reference them in the pipelines definition by their alias.

Now, in the pipelines definition, we reference the step anchor multiple times, overriding (<<:) the deployment attribute of each step to point to a different deployment environment. This lets us configure a deployment variable with the same name, MY_VARIABLE, in each of the deployment environments (Repository Settings > Deployments) but with a different value per environment, so each step uses the value from the environment it refers to.

Using the suggestion above, you can give each step its own variable values in a somewhat workaround-ish way and avoid repeating the step multiple times in your YML file.

Let me know if you have any questions.

Thank you, @Rokkup-Tom .

Kind regards,

Patrik S
