Does Pipelines run the builds on AWS?

Ben
Rising Star
November 28, 2016

I am curious whether Bitbucket Pipelines runs the actual builds on AWS or not (and, if so, in which region).

If it is within AWS, I would suspect that if we used Amazon ECR as our Docker registry, Docker image downloads to Pipelines would be free (and likely faster than other options), as the traffic would stay within the AWS platform.

(I recognize that even if the answer is yes now, Atlassian would likely not guarantee it forever, but it would be nice to know for now anyway.)

1 answer

1 vote
Steffen Opel _Utoolity_
Community Leader
November 28, 2016

This is partially addressed in an InfoQ interview embedded in Bitbucket Pipelines Provides Continuous Delivery within Atlassian’s Bitbucket Cloud:

Pittet confirmed that Bitbucket Pipelines is implemented atop the Amazon EC2 Container Service (ECS). While this transparent and managed container usage is one of Pipeline’s value propositions, users have also requested the ability to execute builds in their own ECS cluster.

I'm not aware of an official statement regarding the ECS region in use, but given Jim's answer to Where are Bitbucket's data centers located? being "Virginia and California", I would assume Pipelines runs in a nearby AWS region (i.e. one or more of the US regions).

The ability to pull images from Amazon ECR is already a fairly popular feature request, which you might want to watch and vote for accordingly.

However, given the way images are currently referenced in the configuration file, I agree with the reporter's skepticism concerning the workaround proposed by a Bitbucket team member:

I'm pretty sure the eval statement needs to be run by the Bitbucket Pipelines process before it can download the correct Docker image within which to run user supplied code. Is there a way to customize the code run at that stage of the process?
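For context, the proposed workaround presumably amounts to authenticating against ECR from within the build script itself, along these lines (a minimal sketch, not an official recipe; it assumes the build image ships the AWS CLI, that AWS credentials are exposed as Pipelines variables, and uses a placeholder account ID and image name):

```yaml
# Hypothetical bitbucket-pipelines.yml sketch of the eval-based workaround.
# AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are assumed to be configured
# as secured Pipelines variables.
image: library/python:3.5  # the build itself still starts from a public image

pipelines:
  default:
    - step:
        script:
          # 'aws ecr get-login' prints a ready-made 'docker login' command;
          # eval executes it to authenticate the Docker client against ECR.
          - eval $(aws ecr get-login --region us-east-1)
          - docker pull 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest
```

As the reporter points out, this only authenticates after the build container has already started, so it cannot help Pipelines pull the image referenced in the `image:` line itself.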

The variable approach might still be conceptually valid, albeit composed differently, as I have outlined in my answer to Pipelines: Pulling docker images from a Amazon ECR repository. Given the surprising lack of an API for updating Bitbucket Pipelines variables (which implies you would currently need to update the variables with the ECR authentication credentials manually every 12 hours), I have never tried that myself, though ...
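Composed differently, that variable approach might look roughly as follows (again a hypothetical sketch; ECR_TOKEN is an assumed Pipelines variable holding the ECR authorization token, and the registry URL uses a placeholder account ID):

```yaml
# Hypothetical sketch: docker login with manually maintained ECR credentials.
# ECR authorization tokens expire after 12 hours, so ECR_TOKEN would need
# to be refreshed by hand in the Pipelines settings until a variables API exists.
pipelines:
  default:
    - step:
        script:
          # ECR accepts 'AWS' as the username together with the token.
          - docker login -u AWS -p $ECR_TOKEN https://123456789012.dkr.ecr.us-east-1.amazonaws.com
          - docker pull 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest
```

The manual 12-hour refresh is what makes this impractical in its current form.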
