
How to copy snippets from BB server to BB cloud?


We are migrating from a local Bitbucket server to Bitbucket Cloud. We want to copy our snippets from server to cloud. 

Snippets are a third-party add-on in the server product, and a built-in feature in the cloud product.

The current version of Bitbucket Cloud Migration Assistant does not handle snippets.

The first way I thought of is the following:

  1. Use the Snippets add-on API to copy/clone the snippet to a local PC. (see ETA below)
  2. Edit the URL in the snippet's Git configuration file, either manually or using the `git remote` command. (I just made this up. It works for repositories, so why not for snippets?)
  3. Use the BB Cloud API to copy/push/write the snippet to the cloud.

Will that work?

If so, what's the REST API call for pushing to our cloud account?

Is there an easier way? 

Has someone already done this -- and if so, how did you do it?


The Snippets add-on API can be used to retrieve the contents of one or more snippets on the local server. The contents are retrieved as a JSON object, which must then be parsed to recover the contents of each snippet.
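For illustration, here is a minimal sketch of parsing one page of that JSON object. The `values` and `name` keys follow the paged format the add-on returns; the `files` key is an assumption about the payload shape, not confirmed from the add-on's docs.

```python
# Hedged sketch: flatten one page of the Snippets add-on's paged JSON
# into plain dicts. 'values' and 'name' match the paged responses seen
# from the add-on; 'files' is an assumed key for illustration.
def parse_snippet_page(page: dict) -> list[dict]:
    snippets = []
    for value in page.get('values', []):
        snippets.append({'title': value.get('name'),
                         'files': value.get('files', [])})
    return snippets

# Example with a stubbed page, since a live call needs server credentials:
sample_page = {'values': [{'name': 'hello-world', 'files': []}],
               'isLastPage': True}
print(parse_snippet_page(sample_page))
```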

I have not succeeded (yet) at using the BB Cloud API to retrieve or write snippets. However, git commands can be used to pull and push snippets created in BB Cloud.

The second way I thought of is the following:

  1. Use the Snippets add-on API to copy/clone the snippet to a local PC.
  2. Use a script or short program to parse the JSON content and create each snippet as its own folder or directory on the local PC.
  3. Either manually or using the BB Cloud API, create a blank snippet for each snippet in the cloud.
  4. Use the BB Cloud API or Git commands to push/write the snippet to the cloud.
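Step 2 above can be sketched as follows, assuming each parsed snippet is a dict with a title and a list of named files (the real JSON keys may differ):

```python
from pathlib import Path

# Hypothetical snippet shape for illustration; the add-on's actual
# JSON keys may differ.
def materialize_snippet(base: Path, snippet: dict) -> Path:
    """Create one directory per snippet and write its files into it."""
    folder = base / 'saved_snippets' / snippet['title']
    folder.mkdir(parents=True, exist_ok=True)
    for f in snippet.get('files', []):
        (folder / f['name']).write_text(f['content'])
    return folder

demo = {'title': 'demo', 'files': [{'name': 'a.txt', 'content': 'hello'}]}
print(materialize_snippet(Path('.'), demo))
```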

The problem with this approach is that I lose the authorship and creation dates of the original snippet contents.

Also, this seems like a tedious, excessively manual, and error-prone way to migrate the snippets from our local server to the cloud.

Is there a better way?


It looks like I can use the BB Cloud API to create snippets in our workspace, preserving most of the metadata. (Links here for the snippet itself, here for the comments.) I've written a short Python script to do the work. Now I just need to figure out what an `access_token` is, how to obtain one, and how to specify it in the API request. Unfortunately, I'm more dev than IT oriented, so I'm having some difficulty understanding the Authentication Methods documentation. 

I created and am using an app password instead, avoiding the access_token issue completely. API GET requests to both local and cloud servers are working. All that remains is getting the API POST requests working.
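For anyone following along: an app password travels as ordinary HTTP Basic auth, so no `Authorization: Bearer` header is involved. This sketch builds (but does not send) a request just to show the header `requests` generates; the username, app password, and endpoint are placeholders.

```python
# Sketch: an app password is sent as plain HTTP Basic auth, so the
# 'Authorization: Bearer <access_token>' header is not needed.
# Username, app password, and endpoint below are placeholders.
from base64 import b64decode
from requests import Request, Session

session = Session()
session.auth = ('my_username', 'my_app_password')

# Prepare (but don't send) a request to inspect the generated header:
prepared = session.prepare_request(
    Request('GET', 'https://api.bitbucket.org/2.0/user'))
auth_header = prepared.headers['Authorization']
print(auth_header)  # 'Basic <base64 of username:app_password>'
```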


1 answer

Answer accepted

Hey Ray, I'm glad to hear you're making some progress with this. I did want to note that you should actually be able to use normal credentials (well, at least your normal username and an app password) without the need to use a token if you don't want to. Since an app password is treated like a normal password you don't need any extra fields and you would simply leave off the "--header 'Authorization: Bearer <access_token>' \" bit.

I personally was working to create a short python script but it sounds like you beat me to the punch. I'll leave my incomplete copy here though if you want to steal any ideas from it. It does correctly get a list of snippets and it saves them to individual files/folders in your working directory but I hadn't gotten the upload/push part to cloud working yet. If you find yourself stuck though, let me know and I'll spend some more time on it to see if I can get it working. Cheers!
FYI: Sorry for the code block formatting. I broke some of the line spaces.

```python
from pathlib import Path
from dataclasses import dataclass, field
from typing import Generator
from os import getcwd

from requests import Session

SERVER_BASE_URL = ''  # example: https://bitbucket.example.com
SERVER_USERNAME = ''
SERVER_PASSWORD = ''

CLOUD_USERNAME = ''
CLOUD_PASSWORD = ''   # Use an app password
CLOUD_WORKSPACE = ''  # The slug/ID of the workspace (case sensitive)

SERVER_SESSION = Session()
SERVER_SESSION.auth = (SERVER_USERNAME, SERVER_PASSWORD)

CLOUD_SESSION = Session()
CLOUD_SESSION.auth = (CLOUD_USERNAME, CLOUD_PASSWORD)

STARTING_PATH = Path(getcwd())


@dataclass
class SnippetFile:
    name: str
    body: str


@dataclass
class Snippet:
    title: str
    raw_files: list[dict]
    files: list[SnippetFile] = field(default_factory=list)

    def __post_init__(self):
        # The add-on returns each file as a dict; keep just the name and body.
        for file in self.raw_files:
            self.files.append(SnippetFile(name=file.get('name'),
                                          body=file.get('content', '')))


def get_server_snippets(page: int = None) -> Generator[Snippet, None, None]:
    while True:
        params = {'start': page}
        r = SERVER_SESSION.get(
            f'{SERVER_BASE_URL}/rest/snippets/1.0/snippets/browse',
            params=params)
        if r.status_code != 200:
            raise SystemExit('FATAL: Received a bad HTTP status code. Check your '
                             'server base URL and server credentials and try again.')
        r_json = r.json()
        for value in r_json.get('values'):
            snippet = Snippet(title=value.get('name'),
                              raw_files=value.get('files', []))
            yield snippet
        if r_json.get('isLastPage') is True:
            return
        page = r_json.get('nextPageStart')


def save_snippet_locally(snippet: Snippet) -> None:
    p = Path(f'saved_snippets/{snippet.title}')
    p.mkdir(parents=True, exist_ok=True)
    for file in snippet.files:
        with open(f'{p}/{file.name}', 'w+') as out:
            out.write(file.body)


def write_cloud_snippet(snippet: Snippet) -> None:
    url = f'https://api.bitbucket.org/2.0/snippets/{CLOUD_WORKSPACE}'
    headers = {'accept': 'application/json',
               'content-type': 'application/json'}
    payload = {'title': snippet.title,
               'scm': 'git',
               'is_private': True}
    files = {}
    for file in snippet.files:
        file_path = f'{STARTING_PATH}/saved_snippets/{snippet.title}/{file.name}'
        files[file.name] = open(file_path, 'rb')
    r = CLOUD_SESSION.post(url, headers=headers, json=payload, files=files)
    # This is where we see a 400 status code indicating a bad request.
    # I haven't figured this bit out yet.
    # (Editor's note: requests silently drops json= when files= is given,
    # and the forced JSON content-type also overrides the multipart
    # boundary, either of which could plausibly cause the 400.)


def main():
    for snippet in get_server_snippets():
        save_snippet_locally(snippet)
        write_cloud_snippet(snippet)


if __name__ == '__main__':
    main()
```

Hey Michael, thanks for the response! 

I finally figured out how to create and use an app password in lieu of the `Authorization: Bearer` header. It works great.

I got as far today as you did: the GET calls work great, and I can parse the output to recreate the snippets; the POST calls aren't working so well. Thank you for posting your Python code. It does some things better than mine does, so I'll certainly steal from it.

I'll get back to work on the POST calls next week. Thank you again for your help!


Another update:

I can create snippets and add files to them using cURL from the command line. I can even do (or fake) multipart/form-data POST requests using cURL.

I can create an unnamed snippet with multiple files using my Python script. I cannot create named snippets using Python. I cannot yet do multipart/related or multipart/form-data POST requests using Python. 
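For what it's worth, here is a guess at the Python equivalent of the working `curl -F` form: send the title as a form field (`data=`) and each file as a repeated `file` part (`files=`), rather than as a JSON body. The request is built but not sent, and the endpoint and credentials are placeholders.

```python
# Hedged sketch of the multipart/form-data POST that curl -F performs,
# assuming the Cloud endpoint accepts 'title' as a form field and each
# file as a 'file' part. Built but not sent; all values are placeholders.
from requests import Request, Session

session = Session()
session.auth = ('my_username', 'my_app_password')

req = Request('POST',
              'https://api.bitbucket.org/2.0/snippets/my-workspace',
              data={'title': 'My snippet', 'is_private': 'true'},
              files=[('file', ('hello.py', b"print('hello')"))])
prepared = session.prepare_request(req)
content_type = prepared.headers['Content-Type']
print(content_type)  # multipart/form-data; boundary=...
```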

The Bitbucket Cloud REST API documentation, while admirably detailed, is short on examples. 

Hey Ray,

Sorry to hear you're still struggling with this. I've been out for the past few days but I'm happy to look into this some more. Can you provide the curl format (without your specific details of course) since that's working and I'll see if I can adapt it to python. Hopefully you're just missing something easy, I know I get stuck all the time on the little things so a second pair of eyes could definitely help.

As for the documentation, I agree. This topic is actually surprisingly sparse on examples and I've poked some of our internal people about it. With a little luck one of our devs will add to it when he gets a chance.

Hi Michael,

Here's how I ended up copying the snippets over. It's not pretty, but it works.

Bitbucket Cloud REST API: encoding and decoding multipart messages in Python? 

Basically, after more research, I realized that MIME/multipart is old, old tech that everybody already knows about (well, everybody but me) and so there's not much out there about it. So, with the knowledge that I can retrieve the snippets from the local server as JSON objects, and I can write them using os.system calls to cURL, I cobbled together the solution in that posting.
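The os.system/cURL approach can be sketched like this: build the curl command string for one snippet, then hand it to the shell. The `-F` fields mirror how curl constructs multipart/form-data; the workspace, filenames, and credentials are placeholders.

```python
import shlex

# Sketch of the os.system + curl approach described above. The -F fields
# mirror curl's multipart/form-data handling; all values are placeholders.
def build_curl_command(workspace: str, title: str, filenames: list[str],
                       user: str, app_password: str) -> str:
    parts = ['curl', '-u', f'{user}:{app_password}', '-X', 'POST',
             f'https://api.bitbucket.org/2.0/snippets/{workspace}',
             '-F', f'title={title}']
    for name in filenames:
        parts += ['-F', f'file=@{name}']  # curl reads the file contents itself
    return shlex.join(parts)

cmd = build_curl_command('my-workspace', 'My snippet', ['hello.py'],
                         'my_username', 'my_app_password')
print(cmd)  # pass this to os.system(cmd) to actually send it
```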

It doesn't preserve authorship or creation date, so I added a metadata file and a description file with that info.
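The metadata-file workaround can be as simple as dropping a small JSON file next to the snippet's content before pushing; the field names below are illustrative, not part of any API.

```python
import json
from pathlib import Path

# Sketch of the workaround above: the Cloud API stamps its own author
# and creation date, so record the originals in a metadata file saved
# alongside the snippet. Field names here are illustrative.
def write_metadata(folder: Path, author: str, created_on: str) -> None:
    folder.mkdir(parents=True, exist_ok=True)
    meta = {'original_author': author, 'original_created_on': created_on}
    (folder / 'MIGRATION_METADATA.json').write_text(json.dumps(meta, indent=2))

write_metadata(Path('saved_snippets/demo'), 'ray', '2020-12-15')
print((Path('saved_snippets/demo') / 'MIGRATION_METADATA.json').read_text())
```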


Honestly, you and I are in the same boat as I'm not very familiar with MIME/multipart either. I know your final answer is a little hacky but I probably would have ended up somewhere in the same area as my searches weren't yielding much either. Having said that, I am glad to hear that you got it working. I'm definitely going to be pestering our documentation folks to see if we can get better examples though for this in the future.

If you need anything else, just reach out as always and we'll be around to help where we can! Have a great new year and stay safe out there!


Thank you, and thanks for your help, including that code block!
