
bitbucket pipeline with package dependency install

Hey community,

Previously I was installing my required Python packages into a zip folder and committing those files to my Bitbucket repository, where a Bitbucket pipeline would deploy the resources to S3. I realized that I could instead add a step in my pipeline to run pip3 install and install the required packages on each deployment.

What is the best way to configure such a step? I also need to zip the contents into a folder which gets uploaded to S3. I got the following to work:


- step:
    name: Building artifact
    script:
      - apt update && apt install sudo
      - adduser build
      - usermod -aG sudo build
      - su build
      - sudo add-apt-repository ppa:deadsnakes/ppa
      - sudo apt install -y python3.9
      - sudo apt install -y python3-pip
      - python3.9 --version
      - mkdir build
      - cd lambda
      - pip3 install -r requirements.txt


The default Docker image (atlassian/default-image:3) comes with Python 3.8, but I need 3.9 for my use case, and I wasn't able to get it installed unless I added a user.

Is there a better way to get the latest version of Python installed with the default Atlassian Docker image?

1 answer


Hi @dataking,

You should be able to install Python 3.9 with the same commands without creating an extra user; just remove sudo from the add-apt-repository and apt install commands. Pipelines builds run as root, so you don't need sudo to install anything.
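For example, your build step could be trimmed down to something like the sketch below. It keeps the deadsnakes PPA approach from your snippet; the software-properties-common package is included in case the image doesn't already provide the add-apt-repository command:

```yaml
- step:
    name: Building artifact
    script:
      # Running as root: no sudo, adduser, usermod, or su needed
      - apt update && apt install -y software-properties-common
      - add-apt-repository -y ppa:deadsnakes/ppa
      - apt install -y python3.9 python3-pip
      - python3.9 --version
      - cd lambda
      - pip3 install -r requirements.txt
```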

That being said, I'm not sure how your build is going to be affected if you have two versions of python installed.

You could also use the Docker image python:3.9 instead of the Atlassian default one. Or, if you are familiar with creating Docker images, you could build a custom image based on python:3.9 and add commands in the Dockerfile to install any other tools you need for your builds.
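With the python:3.9 image, the version juggling disappears entirely. A minimal sketch (the `-t ../build` flag, which installs the packages into a target directory for later zipping, is an assumption about how you want to lay out your artifact):

```yaml
image: python:3.9

pipelines:
  default:
    - step:
        name: Building artifact
        script:
          - python3 --version
          - cd lambda
          # Install dependencies into a separate build directory
          - pip3 install -r requirements.txt -t ../build
```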

If the Docker image has zip installed, you can zip any directories and/or files that exist in the clone path. The build runs at /opt/atlassian/pipelines/agent/build; this is where the repo is cloned during the build, and any files or directories your commands create in the working directory end up there too. So you can give the zip command relative paths to the files/directories you want to zip.
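Putting it together, a packaging step might look like the following sketch. The directory names (lambda, build) and the lambda.zip artifact name are illustrative assumptions based on your snippet:

```yaml
- step:
    name: Package for S3
    script:
      - apt update && apt install -y zip   # skip if the image already has zip
      - cd lambda
      # Install dependencies alongside your code in a build directory
      - pip3 install -r requirements.txt -t ../build
      - cd ../build
      # Relative path works because everything lives under the clone path
      - zip -r ../lambda.zip .
    artifacts:
      - lambda.zip
```

The artifacts declaration makes lambda.zip available to a later step that uploads it to S3.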

If you have any questions, please feel free to let me know.

Kind regards,
