I can see from https://confluence.atlassian.com/bitbucket/variables-in-pipelines-794502608.html that there is a variable called BITBUCKET_COMMIT
But if this commit was part of multiple changes committed and pushed at once, how do I get all of those commits since the previous value of BITBUCKET_COMMIT?
I'd like to build some custom JSON from the list of changes (commit hash + message) since the last time my pipeline ran.
Hi Mark, welcome to the Atlassian Community.
As you haven't said in your question, I don't know whether you're using Atlassian Bitbucket Server or Cloud; since I normally use Cloud, I'll assume that in the following (though it might not make much of a difference for Server).
Off the top of my head, it's possible to get these commit hashes (SCM revisions) for each pipeline via the Atlassian Bitbucket Cloud REST API.
Check Filter and sort API objects to obtain the commit hash (SCM revision) of the specific build number you're looking for.
You can then make use of git (provided the clone depth in your pipeline is large enough; otherwise extend the shallow clone, e.g. git fetch --shallow-since <date>) to obtain all revisions from the last build to the current one.
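For the git side, the range syntax does the heavy lifting. Here is a minimal sketch; it builds a throwaway repo so it runs standalone, but in a real pipeline you already have the clone and the two hashes and would only run the final git log line:

```shell
# Throwaway repo purely to demonstrate the rev1..rev2 range syntax
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "build 1 commit"
prevHash=$(git rev-parse HEAD)
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "build 2 commit"
currentHash=$(git rev-parse HEAD)
# All commits after prevHash, up to and including currentHash, one per line
git log --pretty=oneline "$prevHash".."$currentHash"
```

The range `prevHash..currentHash` excludes the commit the previous build ran on, which is what you want: each commit appears in exactly one build's log.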
However: this just shows what is technically possible (per your question); there are many different triggers for a pipeline run, and the previous build number alone might not cut it.
So you can add more criteria to the API filter query to locate the actual "last" pipeline you're looking for.
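To make that concrete, here is a sketch of what such a lookup could look like. The endpoint path is my assumption based on the Pipelines API (addressing a single run by build number), and the workspace/repository names are made up:

```shell
# Hypothetical workspace/repo; in a pipeline these come from
# $BITBUCKET_REPO_OWNER and $BITBUCKET_REPO_SLUG
workspace="myteam"
repository="myrepo"
buildNumber=42
prevBuild=$(expr $buildNumber - 1)
# Address a single pipeline run by build number (assumed endpoint shape)...
url="https://api.bitbucket.org/2.0/repositories/$workspace/$repository/pipelines/$prevBuild"
echo "$url"
# ...or list runs newest-first and pick the one you want, e.g.:
# curl -s "https://api.bitbucket.org/2.0/repositories/$workspace/$repository/pipelines/?sort=-created_on" \
#   | jq -r '.values[0].target.commit.hash'
```

The commented curl line shows where a `sort=` (and further `q=` filter criteria, per the "Filter and sort API objects" page) would go if the previous build number alone is not specific enough.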
Thanks for the info, I am using Bitbucket Cloud.
I ended up using git's own log command so my bash script looks something like this:
#!/bin/bash
workspace=$1   # Comes from $BITBUCKET_REPO_OWNER
repository=$2  # Comes from $BITBUCKET_REPO_SLUG
buildNumber=$3 # Comes from $BITBUCKET_BUILD_NUMBER
commitHash=$4  # Comes from $BITBUCKET_COMMIT
gitBranch=$5   # Comes from $BITBUCKET_BRANCH
gitOrigin=$6   # Comes from $BITBUCKET_GIT_HTTP_ORIGIN

# Get Git commit info
if [ "$buildNumber" -eq 1 ]; then
  # First build: log everything reachable from the current commit
  echo "Getting commits for $commitHash"
  git log --pretty=oneline "$commitHash" > git-commits.log
else
  previousbuildNumber=$(expr $buildNumber - 1)
  # Pipelines API endpoint for the previous run, addressed by build number
  prevCommitHashURL="https://api.bitbucket.org/2.0/repositories/$workspace/$repository/pipelines/$previousbuildNumber"
  # -r strips the JSON quotes from the hash
  prevFullHash=$(curl -s -X GET "$prevCommitHashURL" | jq -r '.target.commit.hash')
  prevHash=$(echo "$prevFullHash" | cut -b 1-7)
  commitHash=$(echo "$commitHash" | cut -b 1-7)
  echo "Comparing between $prevHash and $commitHash"
  git log --pretty=oneline "$prevHash".."$commitHash" > git-commits.log
fi
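Since the original goal was custom JSON (hash + message), the git-commits.log produced above can be converted with plain shell. A sketch with hard-coded sample lines so it runs standalone (in practice you'd pipe the log file in instead):

```shell
# Sample lines in the same shape as git log --pretty=oneline output
# (hashes shortened here for readability)
json=$(
printf '%s\n' \
  'abc1234 Fix the build' \
  'def5678 Add a feature' |
while read -r hash message; do
  # Each log line becomes one JSON object
  printf '{"hash":"%s","message":"%s"}\n' "$hash" "$message"
done
)
echo "$json"
```

Note this naive printf approach breaks if a commit message contains double quotes or backslashes; for messages you don't control, building the objects with jq is safer.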
It's possible I may just use the triggering hash to start with, as calling out to the Bitbucket REST API is a call I'd rather not make (I'd need to store credentials for private repos), plus it adds network calls etc.
Sounds reasonable to me. Depending on where you create the log, using git alone for that sounds fair. The remark about shallow cloning might still apply, but that can also be solved with git only; you will then have additional network activity (at least when running on the Bitbucket infrastructure, the default).
It's also possible to tell the pipeline to clone fully ("full"); the default is 50 commits:
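For reference, the clone depth is configured at the top level of bitbucket-pipelines.yml; a minimal sketch:

```yaml
clone:
  depth: full   # clone the entire history; or a number such as 200 (default: 50)
```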
So depending on your needs, this can already be controlled in the bitbucket-pipelines.yml file. Normally it's fine to raise the clone depth a bit from the default 50 for larger releases.