Howdy folks.
There’s a good reason Platform teams across the industry are working towards creating golden paths for their developers. Golden paths remove significant friction for developers configuring environments in accordance with organizational standards, reducing their cognitive load in the process. They also help new hires self-serve and become independent faster. In this article, I’ll show you how to create a Compass template that sets up a Bitbucket pipeline and provisions AWS infrastructure. I'll also include some sample code that you can adjust to suit your needs.
Compass templates are a powerful tool that can significantly reduce the friction developers face when provisioning a new environment for a software component. This is achieved through three simple steps. First, they create a new repository by cloning a starting point repository provided when the template is created. Second, they create a new Compass component for the new repository. Finally, they invoke a webhook and pass it a JSON payload that is configured when the template is created. The webhook handler can be built in a myriad of ways.
Compass templates create a new repository by cloning an existing repository that is set up as a starting point. The starting point repository acts as a standards-compliant base for development projects of a specific kind.
For example, if a team writes Golang code that is executed in AWS Lambda, the starting point repository might be a Golang repository with an AWS SAM template file and a standard Golang lambda main and handler.
If a team writes C++ code that uses CMake, GoogleTest, and Boost, the starting point repository might have a /src directory with a main.cpp, a /tst directory, and a CMakeLists.txt file that imports GoogleTest and Boost and successfully builds with them.
The starting point repository can be tokenized so that the webhook handler can easily find and replace strings in code and configuration files. For example, the name of the project in a CMakeLists.txt file can be set to something like REPLACE_ME.
project(REPLACE_ME VERSION 0.0.1 LANGUAGES CXX)
The webhook handler can replace this with an appropriate value, commit the change, and git push to the new repository. In the example below REPLACE_ME was replaced with “templateTestRun” during a test run of the Compass template.
project(templateTestRun VERSION 0.0.1 LANGUAGES CXX)
The screenshot below shows the starting point repository used in this demo video.
Compass templates create a new Compass component for the new repository. A component is the combined output of a software engineering team, typically tracked in source control. A dedicated team owns, develops, and operates a component. From here, teams can connect the new component to all of their other tools, and add metrics and scorecards. With a Compass component set up, teams can access, manage, and share information about the new component.
The new Compass component starts with a couple of default scorecards.
And metrics.
Setting up a Compass component using a template automatically links the component to the source code repository. This means that events, like deployments, are automatically shown in the Activity feed for that component.
After creating a new repository and a new Compass component, Compass templates invoke a webhook and pass it a JSON payload. The webhook handler can be built to do whatever a team needs. Things such as:
Setting up standards-compliant CI/CD pipelines to build, test, and deploy code to test, staging, and production environments
Provisioning infrastructure that the code depends on, such as AWS DynamoDB tables, S3 buckets, or Kinesis streams.
Compass templates automate the provisioning of a compliant environment for developers when they’re bootstrapping a new project, allowing them to focus on building cool software and solving interesting problems.
The options for how to build the webhook handler are endless. For the demo in the video linked above, we use an AWS Lambda written in Golang behind an AWS API Gateway endpoint. The code leverages the Git Lambda layer for access to Git, the Bitbucket Cloud REST API to configure Bitbucket, and the AWS Go SDK to interact with AWS.
The gameplan of the webhook handler is as follows:
Parse the JSON payload passed to it by the Compass template
Enable Bitbucket pipelines with Bitbucket Cloud REST API
Configure deployments in Bitbucket pipelines with Bitbucket Cloud REST API
Clone the new repository to /tmp using git
Download a preconfigured, tokenized bitbucket-pipelines.yml file from S3 and add it to the new repository
Programmatically scan the CMakeLists.txt file and other config files for REPLACE_ME tokens and replace them with appropriate values
Git add, git commit, git push changes to Bitbucket
Download a tokenized CloudFormation template.yml file from S3 that defines a DynamoDB
Programmatically modify the template.yml file
Use AWS Go SDK to create a new stack with the modified template.yml
In the following sections, I’ll explain one way to do each step. The provided code snippets are for illustrative purposes. To make it production-ready, you’d want to add additional logging, exception handling, and testing. While there are certainly more optimized approaches to writing this code, for this example, I chose to keep it simple. For the sake of brevity, I’ve removed all error handling, logging, and returns from the snippets below.
Here is the signature of the Golang Lambda handler. Documentation about Golang AWS Lambda handler functions can be found here.
func (d *Dependency) Handler(ctx context.Context, request events.APIGatewayProxyRequest) (events.APIGatewayProxyResponse, error) {
The request is an APIGatewayProxyRequest, the structure of which is available here. The handler passes the request body to a function called processRequest.
process_string, process_err := d.processRequest(request.Body, region, aws_account_id)
The request.Body contains the JSON payload from the Compass template.
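For illustration, here’s roughly what that payload might look like. This is a sketch based on the struct defined in processRequest; the field values are hypothetical, and the component id is elided.

```json
{
  "component": {
    "id": "...",
    "name": "templateTestRun",
    "repository": "https://bitbucket.org/atlbettog/testtemplate"
  },
  "parameters": {
    "wonk_param_1": "value1",
    "wonk_param_2": "value2"
  }
}
```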
Here is the signature of processRequest for reference.
func (d *Dependency) processRequest(request_body string, region string, aws_account_id string) (string, error) {
In processRequest, we define a struct that matches the structure of the JSON payload we expect from Compass. Then we unmarshal the request body into an instance of the struct.
type JsonFromCompass struct {
    Component struct {
        Id         string `json:"id"`
        Name       string `json:"name"`
        Repository string `json:"repository"`
    } `json:"component"`
    Parameters struct {
        Wonk_param_1 string `json:"wonk_param_1"`
        Wonk_param_2 string `json:"wonk_param_2"`
    } `json:"parameters"`
}
json_from_compass := JsonFromCompass{}
err := json.Unmarshal([]byte(request_body), &json_from_compass)
From here we can extract a bunch of the information we need for the remaining steps. The json_from_compass.Component.Repository has the structure
https://bitbucket.org/atlbettog/testtemplate
We can grab the workspace and repo slug by splitting the string on “/” and taking the 4th and 5th tokens.
parts := strings.Split(json_from_compass.Component.Repository, "/")
if len(parts) != 5 {
    // the repository URL is malformed. abort.
    return "", errors.New("invalid Bitbucket repository URL")
}
repo_workspace := parts[3]
repo_slug := parts[4]
Now we have most of the information we need to move on with the rest of the steps.
Here is a function that uses the Bitbucket REST API to enable pipelines. It uses the pipelines_config endpoint. The same thing could be done on GitHub, GitLab, or Azure DevOps if you use those tools.
func enablePipelines(repo_workspace string, repo_slug string, aws_access_token string) (string, error) {
    url := fmt.Sprintf("https://api.bitbucket.org/2.0/repositories/%s/%s/pipelines_config", strings.ToLower(repo_workspace), strings.ToLower(repo_slug))
    jsonBodyEnable := []byte(`{"enabled": true}`)
    bodyReaderEnable := bytes.NewBuffer(jsonBodyEnable)
    req, err := http.NewRequest(http.MethodPut, url, bodyReaderEnable)
    req.Header.Add("Authorization", fmt.Sprintf("Bearer %s", aws_access_token))
    req.Header.Add("Accept", "application/json")
    req.Header.Add("Content-Type", "application/json")
    resp, err := http.DefaultClient.Do(req)
    defer resp.Body.Close()
}
This will enable Bitbucket pipelines for the repo created by the Compass template.
Here is a function that uses the Bitbucket REST API to add a new deployment environment. It uses the environments endpoint.
func addDeploymentEnvironment(repo_workspace string, repo_slug string, environment_type string, environment_name string, aws_access_token string) (string, error) {
    url := fmt.Sprintf("https://api.bitbucket.org/2.0/repositories/%s/%s/environments", strings.ToLower(repo_workspace), strings.ToLower(repo_slug))
    jsonBodyCreate := []byte(fmt.Sprintf(`{"environment_type": {"type": "deployment_environment_type", "name": "%s", "rank": 0}, "name": "%s"}`, environment_type, environment_name))
    bodyReaderCreate := bytes.NewBuffer(jsonBodyCreate)
    req, err := http.NewRequest(http.MethodPost, url, bodyReaderCreate)
    req.Header.Add("Authorization", fmt.Sprintf("Bearer %s", aws_access_token))
    req.Header.Add("Accept", "application/json")
    req.Header.Add("Content-Type", "application/json")
    resp, err := http.DefaultClient.Do(req)
    defer resp.Body.Close()
}
It is called multiple times in processRequest to add new environments.
addDeploymentEnvironment(repo_workspace, repo_slug, "Test", "Test us-west-1", aws_access_token)
addDeploymentEnvironment(repo_workspace, repo_slug, "Staging", "Staging us-east-2", aws_access_token)
addDeploymentEnvironment(repo_workspace, repo_slug, "Production", "Production us-west-2", aws_access_token)
addDeploymentEnvironment(repo_workspace, repo_slug, "Production", "Production us-east-1", aws_access_token)
addDeploymentEnvironment(repo_workspace, repo_slug, "Production", "Production ca-central-1", aws_access_token)
When these calls finish we’ll have added five new deployment environments. By default, a repository comes with a Test, Staging, and Production environment. These can be deleted with similar calls to the environments endpoint.
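Deleting one of the defaults follows the same pattern. The sketch below is illustrative, not code from the demo repo: the helper names are hypothetical, and it assumes you have already looked up the environment’s UUID with a GET on the same environments endpoint.

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

// environmentURL builds the Bitbucket Cloud environments endpoint URL.
// With a UUID appended, it addresses a single deployment environment.
func environmentURL(repo_workspace string, repo_slug string, environment_uuid string) string {
	url := fmt.Sprintf("https://api.bitbucket.org/2.0/repositories/%s/%s/environments",
		strings.ToLower(repo_workspace), strings.ToLower(repo_slug))
	if environment_uuid != "" {
		url += "/" + environment_uuid
	}
	return url
}

// deleteDeploymentEnvironment issues a DELETE for a single environment.
func deleteDeploymentEnvironment(repo_workspace string, repo_slug string, environment_uuid string, access_token string) error {
	req, err := http.NewRequest(http.MethodDelete, environmentURL(repo_workspace, repo_slug, environment_uuid), nil)
	if err != nil {
		return err
	}
	req.Header.Add("Authorization", fmt.Sprintf("Bearer %s", access_token))
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	return nil
}

func main() {
	// No network call here; just show the URL the DELETE would hit.
	fmt.Println(environmentURL("MyWorkspace", "TestTemplate", "{env-uuid}"))
}
```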
Thanks to the Git Lambda layer we added, Git commands can be invoked directly. Here’s the clone function. We need to clone into the /tmp directory in the AWS Lambda since the rest of the file system is read-only.
func gitClone(aws_access_token string, repo_from_compass string, file_name string) ([]byte, error) {
    clone_url := fmt.Sprintf("https://x-token-auth:%s@%s.git", aws_access_token, repo_from_compass)
    cmd := exec.Command("git", "clone", clone_url, file_name)
    cmd.Dir = "/tmp"
    return cmd.Output()
}
Here’s the add function.
func gitAdd(file_name string) ([]byte, error) {
    cmd := exec.Command("git", "add", "--all")
    cmd.Dir = fmt.Sprintf("/tmp/%s", file_name)
    return cmd.Output()
}
Git commit and git push follow the same pattern. Next we’ll move on to pulling files down from AWS S3.
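For completeness, here’s what those could look like. This is a sketch following the same pattern as gitClone and gitAdd, not code from the demo repo; repoDir is a small hypothetical helper.

```go
package main

import (
	"fmt"
	"os/exec"
)

// repoDir returns the working directory for a repository cloned into /tmp.
func repoDir(file_name string) string {
	return fmt.Sprintf("/tmp/%s", file_name)
}

// gitCommit commits all staged changes in the cloned repository.
func gitCommit(file_name string, message string) ([]byte, error) {
	cmd := exec.Command("git", "commit", "-m", message)
	cmd.Dir = repoDir(file_name)
	return cmd.Output()
}

// gitPush pushes the commit back to Bitbucket. The token embedded in the
// clone URL during gitClone handles authentication.
func gitPush(file_name string) ([]byte, error) {
	cmd := exec.Command("git", "push")
	cmd.Dir = repoDir(file_name)
	return cmd.Output()
}

func main() {
	fmt.Println(repoDir("testtemplate")) // prints /tmp/testtemplate
}
```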
This code snippet uses the older AWS Golang SDK v1. It is a good idea to update to the AWS Golang SDK v2 for new development.
func (d *Dependency) getFileFromS3(folder string, file_name string) (string, error) {
    file, err := os.Create(fmt.Sprintf("%s/%s", folder, file_name))
    defer file.Close()
    sess, err := session.NewSession(&aws.Config{
        Region: aws.String("us-west-2"),
    })
    downloader := s3manager.NewDownloader(sess)
    _, err = downloader.Download(file,
        &s3.GetObjectInput{
            Bucket: aws.String("YOUR_BUCKET_NAME_HERE"),
            Key:    aws.String(file_name),
        })
}
This function downloads a file from S3 and saves it to the disk. It can be called like this.
_, err := d.getFileFromS3("/tmp", "dynamodb_cf.yml")
I like to follow the KISS principle when writing code, dislike enterprise Java, and like Linux CLI tools. So I used sed like this to replace the tokens in my files.
exec.Command("sed", "-i", fmt.Sprintf("s/REPLACE_ME/%s/g", component_name), fmt.Sprintf("/tmp/%s/CMakeLists.txt", file_name)).Output()
There are other ways of doing this, but easy one-liners are nice, and I didn’t have to use any AbstractSingletonProxyFactoryBeans to achieve it.
Here’s a function to provision a new AWS CloudFormation stack assuming you have a correctly written template.yml file available. I pulled my template.yml down from S3 and modified it with a couple of sed calls before using this.
func (d *Dependency) createDynamoDbTableWithCloudFormation(template_file string, stack_name string) (string, error) {
    content, err := ioutil.ReadFile(template_file)
    template_body := string(content)
    _, err = d.DepCfn.CreateStack(&cloudformation.CreateStackInput{
        TemplateBody: &template_body,
        StackName:    &stack_name,
    })
}
It’s called like this.
_, err = d.createDynamoDbTableWithCloudFormation("/tmp/dynamodb_cf.yml", fmt.Sprintf("dynamodb-cfn-for-%s", component_name))
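For reference, a minimal tokenized template.yml for a DynamoDB table might look like the following. This is a sketch; the actual template used in the demo may define different attributes and properties.

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: DynamoDB table for REPLACE_ME
Resources:
  ComponentTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: REPLACE_ME
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
```

The webhook handler swaps REPLACE_ME for the component name before passing the file to CreateStack.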
In this article, we learned that Compass templates help developers build using a golden path. Then, we looked at what Compass templates do out of the box before diving into how to create a webhook handler that uses Git, the Bitbucket Cloud REST API, and the AWS Golang SDK. Following this example, we were able to create a simple golden path that creates a Bitbucket repository, creates a Compass component, provisions an AWS DynamoDB table, and configures Bitbucket pipelines so that developers are able to focus on building software and solving problems.
If you aren’t already using Compass, try it out. If you’re already using Compass, try setting up a new Compass template that automates the setup and configuration of CI/CD for your most common project type. Check out the links below for additional information.
Compass templates YouTube video: https://youtu.be/79RC9nPoF_8?si=M-vZ0e-GVtx_5Na0
Compass templates webhook handler sample code: https://bitbucket.org/atlbettog/compasstemplatewebhookpublic/src/mainline/
Bitbucket REST API: https://developer.atlassian.com/cloud/bitbucket/rest/intro/#authentication
Compass API: https://developer.atlassian.com/cloud/compass/rest/intro/#about
AWS Golang SDK: https://aws.amazon.com/sdk-for-go/
AWS Lambda Git Layer: https://github.com/lambci/git-lambda-layer
Warren Marusiak