You asked, we answered — Bitbucket Pipelines webinar Q&A

This article contains answers to questions from the Q&A portion of the recent "Scale CI/CD workflows faster with Bitbucket Pipelines" webinar:

Pipeline Execution

Q: Do you have plans to allow for parallel deployment steps?

The way deployment locks are implemented prevents the use of parallel deployment steps.

We recently released Environments, which provide all the permission management of deployment environments without locks. We intend to extend this concept to apply the regular deployment features without a highly opinionated lock.

https://www.atlassian.com/blog/bitbucket/evolving-deployments-in-bitbucket-pipelines-concurrency-groups-and-environments

Q: Pipeline ‘stages’ do not support parallel steps - when could we expect that to work?

A stage's principal purpose is to allow multiple deployment steps to be grouped together and treated as a unit.

We are planning new features that may provide the compartmentalization experience you’re looking for. We’ll have more to say about that soon.

Q: Is it possible to create multiple Bitbucket pipeline files? If not, what are some of the best practices to achieve similar functionality to multiple GitHub workflows within a single Bitbucket Pipeline file?

Not at present. We’re looking at this problem in the near future. You can work around this with dynamic pipelines, but that's not the intended long-term solution.

You can define as many pipelines in a bitbucket-pipelines.yml file as you like. You can also define multiple Test, Staging, and Production environments. Each pipeline you define can have an independent trigger.
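For example, a single bitbucket-pipelines.yml can hold a default pipeline, branch-specific pipelines, and manually triggered custom pipelines (the names and scripts below are illustrative):

```yaml
pipelines:
  default:              # runs on every push to any branch
    - step:
        name: CI build
        script:
          - make test
  branches:
    main:               # runs only on pushes to main
      - step:
          name: Deploy to staging
          script:
            - make deploy-staging
  custom:
    nightly-report:     # run manually or via a schedule
      - step:
          name: Nightly report
          script:
            - make report
```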

Q: How can we reuse pipeline steps with YAML anchors and references?

https://support.atlassian.com/bitbucket-cloud/docs/yaml-anchors/
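As a minimal sketch, an anchor lets you define a step once and reuse it across pipelines (step name and scripts are illustrative):

```yaml
definitions:
  steps:
    - step: &build-test
        name: Build and test
        script:
          - npm ci
          - npm test

pipelines:
  branches:
    main:
      - step: *build-test
    develop:
      - step: *build-test
```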

Q: Hello, we are building our Docker images directly in the pipeline and later uploading them to AWS ECR. Is this a good practice? We are seeing long build times because Bitbucket is not correctly using the cached layers available in our base image.

Consider using buildx in runtime v3 (https://docs.docker.com/reference/cli/docker/buildx/build/) together with its caching functionality: https://support.atlassian.com/bitbucket-cloud/docs/enable-and-use-runtime-v3/

In the future, we’ll consider a tighter integration with https://www.atlassian.com/wac/roadmap/cloud/Docker-Image-Registry?status=future;comingSoon&p=b4c45d8d-68 when this is released.
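A rough sketch of the buildx-with-registry-cache approach, assuming runtime v3 is available to the workspace and that $ECR_REPO is a variable you define holding your ECR repository URI (the variable name and tags are illustrative):

```yaml
pipelines:
  default:
    - step:
        runtime:
          cloud:
            version: "3"
        services:
          - docker
        script:
          - docker buildx create --use
          # Pull layer cache from the registry and push the updated
          # cache back, so subsequent builds reuse unchanged layers.
          - >-
            docker buildx build
            --cache-from type=registry,ref=$ECR_REPO:cache
            --cache-to type=registry,ref=$ECR_REPO:cache,mode=max
            -t $ECR_REPO:latest --push .
```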

Q: How exactly did you extract the test report step out of the SonarQube step in order to decouple the Test and SonarQube steps so that they can run in parallel?

Our Test step previously consisted of two make targets, make test and make coverage, which were executed sequentially, whereas the SonarQube step only depended on the output/artifact generated by the make coverage target.

As it turned out, make coverage didn’t depend on the execution of make test, so we refactored this so that the two steps run in parallel:

  • Test: only make test

  • SonarQube: make coverage && sonar
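The refactored pipeline described above can be sketched as follows (the exact make targets mirror the answer; the sonar invocation is illustrative):

```yaml
pipelines:
  default:
    - parallel:
        - step:
            name: Test
            script:
              - make test
        - step:
            name: SonarQube
            script:
              - make coverage && sonar
```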

Q: Why is setting up self-hosted CI/CD on a personal VM complex?

Setting up self-hosted CI/CD on a personal VM is complex for several reasons:

 

Infrastructure Management

• Resource allocation: You need to properly size CPU, memory, and storage for your workloads
• Network configuration: Setting up proper networking, firewalls, and security groups
• Operating system maintenance: Regular updates, patches, and security hardening
• Backup and disaster recovery: Ensuring your CI/CD infrastructure and data are protected

 

Security Challenges

• Access control: Managing SSH keys, user permissions, and service accounts
• Secret management: Securely storing and rotating API keys, passwords, and certificates
• Network security: Configuring VPNs, SSL/TLS, and preventing unauthorized access
• Compliance: Meeting security standards and audit requirements

 

CI/CD Tool Complexity

• Installation and configuration: Setting up tools like Jenkins, GitLab CI, or GitHub Actions runners
• Plugin management: Installing, updating, and configuring necessary plugins
• Pipeline configuration: Writing and maintaining complex build/deploy scripts
• Integration setup: Connecting to version control, artifact repositories, and deployment targets

 

Scalability and Reliability

• High availability: Setting up redundancy and failover mechanisms
• Auto-scaling: Handling variable workloads and resource demands
• Monitoring and alerting: Implementing comprehensive observability
• Performance optimization: Tuning for build speed and resource efficiency

 

Maintenance Overhead

• Regular updates: Keeping all components current and secure
• Troubleshooting: Diagnosing and fixing issues when they arise
• Capacity planning: Monitoring usage and scaling resources appropriately
• Documentation: Maintaining runbooks and procedures

Q: Is it possible to add a condition to a pipe that checks whether the code has changed since the last successful deploy? I have a deployment that builds components; it runs on each release and is time-consuming. It would be nice to run it only when the component's source code was updated.

Is it possible to decouple building the components from deploying them? If so, you could put a condition on the build step so that it only builds when the component's code is changed.

The deployment step could be triggered whenever you want, and it would use the most recent artifact from the build step. You’d probably want to upload the artifact to something like AWS S3.

See this for additional information on conditions: https://support.atlassian.com/bitbucket-cloud/docs/step-options/#Condition
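A sketch of such a build step with a changesets condition (the paths, commands, and artifact patterns are illustrative):

```yaml
pipelines:
  default:
    - step:
        name: Build component
        condition:
          changesets:
            includePaths:
              - "components/my-component/**"
        script:
          - make build-component
        artifacts:
          - dist/**
```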

Q: How to set up secret scanning at global/workspace level?

You would want to create a dynamic pipeline that injects the secret scanning step in every pipeline executed in the workspace.

Learn about dynamic pipelines here: https://support.atlassian.com/bitbucket-cloud/docs/dynamic-pipelines/

Q: How to avoid plain .env files and manage secrets securely?

A simple flow is to use secured variables or a third-party secrets integration plugin and inject them as environment variables. A more complex option with better security is to use our OIDC integration with third parties.

See this for a full explanation: https://support.atlassian.com/bitbucket-cloud/docs/variables-and-secrets/
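For example, a secured repository variable is injected as an environment variable and its value is masked in build logs (the variable name and script are illustrative):

```yaml
pipelines:
  default:
    - step:
        script:
          # DEPLOY_TOKEN is assumed to be defined as a secured
          # repository variable; its value never appears in the log.
          - ./deploy.sh --token "$DEPLOY_TOKEN"
```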

Q: Does Pipelines support Windows workloads?

We support Windows builds via self-hosted runners.

Q: Any plans for Windows support on Atlassian-hosted runners?

This is being actively investigated.

Q: Can you use both own and Atlassian runners in the same pipeline?

Yes, you can configure Bitbucket Pipelines to use self-hosted runners for some steps and Atlassian runners for others. You specify a runs-on configuration for steps using self-hosted runners.

Learn more here: https://support.atlassian.com/bitbucket-cloud/docs/configure-your-runner-in-bitbucket-pipelines-yml/
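A minimal sketch mixing both runner types in one pipeline (labels and scripts are illustrative):

```yaml
pipelines:
  default:
    - step:
        name: Build on a self-hosted runner
        runs-on:
          - self.hosted
          - linux
        script:
          - make build
    - step:
        name: Test on Atlassian infrastructure   # no runs-on, so Atlassian-hosted
        script:
          - make test
```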

Q: Every step is run in Docker, right? Why then should the Docker memory not be 100%?

Every step runs in an isolated container that does not directly use the Docker runtime.

If a Docker service is configured on a step, we spin up a Docker daemon, which requires an additional resource allocation.

Customers can control how much memory is reserved via the yml configuration.
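For example (the memory and size values are illustrative; check the docs for current defaults and limits):

```yaml
definitions:
  services:
    docker:
      memory: 2048        # MB reserved for the Docker daemon

pipelines:
  default:
    - step:
        size: 2x          # doubles the step's total memory allocation
        services:
          - docker
        script:
          - docker build -t my-app .
```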

Q: When will docker buildx be fully supported so that multi-arch builds no longer require declaring a separate manifest?

It is supported in runtime v3: https://support.atlassian.com/bitbucket-cloud/docs/enable-and-use-runtime-v3/
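With runtime v3 enabled, a multi-arch build can be sketched as (the image name is illustrative):

```yaml
pipelines:
  default:
    - step:
        runtime:
          cloud:
            version: "3"
        services:
          - docker
        script:
          - docker buildx create --use
          - docker buildx build --platform linux/amd64,linux/arm64 -t my-org/my-app:latest --push .
```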

Q: Is it possible to build/deploy infrastructure on Azure using Terraform rather than CLI commands? If yes, where can we find documentation?

Yes, it is possible to use Terraform in Bitbucket Pipelines.

A pipeline can be as simple as:

image: hashicorp/terraform:latest
pipelines:
    default:
        - step:
            oidc: true
            script:
                - terraform init
                - terraform validate
                - terraform plan
                - terraform apply -input=false -auto-approve

The oidc: true option provides an OIDC token during the build that can be used for authentication.

https://support.atlassian.com/bitbucket-cloud/docs/integrate-pipelines-with-resource-servers-using-oidc/

https://registry.terraform.io/providers/hashicorp/azuread/latest/docs/guides/service_principal_oidc

Q: Is there a way to improve the Build setup step? The initial step always takes around 4s, but for other steps it increases drastically to 30s+.

Most of the build setup time is spent cloning the repository.

If the repository is very large, consider using a shallower clone.

Disabling LFS, if it is not required for the build, can also help reduce time.
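Both suggestions map to the clone options in bitbucket-pipelines.yml, for example:

```yaml
clone:
  depth: 1        # shallow clone: fetch only the most recent commit
  lfs: false      # skip downloading Git LFS files

pipelines:
  default:
    - step:
        script:
          - make build
```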

Integrations & Extensibility

Q: Are all Bitbucket Pipes from Atlassian or also from 3rd parties?

We have Atlassian-managed pipes and vendor-managed pipes (e.g., Snyk), and you can also create your own pipes.

For an example of a 3rd party pipe see: https://docs.42crunch.com/latest/content/tasks/integrate_bitbucket_pipelines.htm

Q: Can Bitbucket Pipes integrate with Bitbucket Data Center or only Cloud?

Bitbucket Pipelines and Pipes are currently a cloud-only feature.

Q: When will we get conditional steps?

This has been shortlisted for development, but we cannot indicate a timeline yet.

Q: When will we be able to have non-blocking manual steps?

Can you expand on the use case here? You can put the manual step at the point where you want the workflow to come to a terminal stop.

Metrics, Monitoring & Performance

Q: Where can we find pipeline CPU and Memory metrics?

What we demoed was a pre-release feature. We plan to ship this to customers later this year.

Q: What kind of CI/CD analytics are available?

https://support.atlassian.com/bitbucket-cloud/docs/code-insights/

AI Code Review

Q: Is AI code review available for free?

It’s currently in beta and free, along with several other AI tools. Learn more and sign up for early access here.

Q: Is the code review approval feature available on all Bitbucket account tiers?

Yes, code review and approval functionality are available on all tiers as part of your plan.

Q: What models are used for AI code review? Is it dedicated hosting by Bitbucket?

We use a combination of GPT-4o and Sonnet 3.5, and we are always evaluating and updating to newer models to improve performance. The workflow (the running environment) is hosted within Bitbucket Pipelines, but the models are not; we use the providers' hosted models.

Q: We have tried to add the AI Code reviewer, but we found it a bit obtrusive. It posted a lot of comments, many of which were very vague and unimportant, so the good comments drown in this sea of unusable comments. But even more importantly, we found that on every commit to the pull request, the same truckload of comments was made without any regard to what was previously commented. We figured that being able to start the AI code reviewer step manually would help with these issues, but there is apparently no way to trigger a pull request pipeline manually. All in all, our experience with the AI code reviewer is that it sounds a lot better in theory than it actually is. Are we doing something wrong?

  1. Very vague and unimportant:

    1. We are always trying to improve our quality, so there are initiatives now to help Reviewer agents understand more about the codebase context to make comments more relevant.
    2. Has this always been the case, or has it gotten better/worse recently?

  2. Good comments drown in this sea of unusable comments:

    1. We are working on surfacing labels of categories to help people differentiate. It will be there in the next quarter.

  3. Trigger a pull request pipeline manually:

    1. This is coming soon in the coming quarter as well.

  4. To help make comments more tailored, we are now rolling out ‘customization’ to help users instruct the Reviewer agent.

Get started

Read our guide to learn where to create customization files in your repositories, and see example instructions and standards to improve your results. https://rovodevagents-beta.atlassian.net/wiki/external/MWY0OGFmODRmNTYxNGFhMWE5OGQxMzYzYjA4NjQ3OGI

  5. On every commit to the pull request, the same truckload of comments was made:

    1. The Reviewer agent currently comments only on the first commit, not on subsequent commits.

Migration

Q: How to migrate from Jenkins to Bitbucket Pipelines?

For Jenkins, we have CLI tools that convert Jenkins modules to Pipelines syntax. Here is the documentation: https://support.atlassian.com/bitbucket-cloud/docs/how-to-migrate-from-jenkins-to-bitbucket-pipelines/

Q: What are the advantages of switching to Bitbucket Pipelines?

This blog outlines some of the benefits: https://community.atlassian.com/forums/Pipelines-articles/6-reasons-to-modernize-your-CI-CD-with-Bitbucket-Pipelines/ba-p/2724325

Q: How big of an effort is migration?

It depends on your current implementation. We recommend migrating a few of your pipelines to understand the effort required. We also have certified partners who can help you with your migration.

Other

Q: How to integrate with tools like ServiceNow, JSM approval, etc.?

Here’s how you can set up JSM change management with Bitbucket: https://www.atlassian.com/software/jira/service-management/product-guide/getting-started/change-management#how-it-works

Q: Is there an API to query analytical data or DORA metrics?

Atlassian Analytics can be used to get DevOps data for DORA metrics: https://support.atlassian.com/analytics/docs/schema-for-devops-data/

Q: Is Compass included in a business license or sold separately?

Compass is sold as a separate product and requires its own license.

Q: How is Compass different from Bitbucket Cloud?

Bitbucket is a SCM and CI/CD tool. Compass is an internal developer platform that helps software teams manage their software components, get metrics, and more.

Learn more here: https://www.atlassian.com/software/compass

For more information on how Bitbucket and Compass work together, please take a look at:

https://community.atlassian.com/forums/Bitbucket-articles/Bring-more-context-to-your-code-with-Compass/ba-p/2694067

https://community.atlassian.com/forums/Bitbucket-articles/Bitbucket-Compass-So-much-more-together/ba-p/2752746

Q: Does it cost extra to use CI/CD Pipelines?

CI/CD is charged based on build minutes used. There are free minutes included in your plan to help you get started. You can add more minutes to your plan as needed. https://www.atlassian.com/software/bitbucket/pricing
