Configure Google Cloud SQL as a service

Hi all,

What I'm trying to do is configure a pipeline that connects to a PostgreSQL Google Cloud SQL instance and runs Django migrations.

I experimented a bit, and so far I haven't been very successful.

I can use the google/cloud-sdk image to connect to the instance, and it seems to run fine with these commands:

- step:
    image: google/cloud-sdk
    script:
      - echo $GOOGLE_CLIENT_SECRET > ./gcloud-api-key.json
      - gcloud auth activate-service-account --key-file gcloud-api-key.json
      - gcloud config set project $CLOUDSDK_CORE_PROJECT
      - apt-get install -y wget
      - wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
      - chmod +x cloud_sql_proxy
      - ./cloud_sql_proxy -instances="[$DEV_GCP_DB_INSTANCE]"=tcp:5432

However, if I run this as a step it won't help much, because it runs as a server:

+ ./cloud_sql_proxy -instances="[$DEV_GCP_DB_INSTANCE]"=tcp:5432
2018/11/20 00:21:38 current FDs rlimit set to 1048576, wanted limit is 8500. Nothing to do here.
2018/11/20 00:21:38 Listening on 127.0.0.1:5432 for [test-django-pg:europe-west1:test-django-db]
2018/11/20 00:21:38 Ready for new connections


So I need to run it as a service. I tried multiple solutions, but so far no luck.
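For comparison, a typical Pipelines service definition is just an image plus variables, with no startup script; a minimal sketch (the postgres image and variable names here are placeholders, not my actual config):

definitions:
  services:
    postgres:
      image: postgres:9.6
      variables:
        POSTGRES_DB: test
        POSTGRES_PASSWORD: test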

Here's the last one:

branches:
  dev:
    - step:
        image: python:3.6.2
        caches:
          - pip
        script:
          - pip install -r requirements.txt
          - python monsite/manage.py migrate

definitions:
  services:
    cloudsql:
      image: google/cloud-sdk
      command:
        - echo $GOOGLE_CLIENT_SECRET > ./gcloud-api-key.json
        - gcloud auth activate-service-account --key-file gcloud-api-key.json
        - gcloud config set project $CLOUDSDK_CORE_PROJECT
        - apt-get install -y wget
        - wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
        - chmod +x cloud_sql_proxy
        - ./cloud_sql_proxy -instances="[$DEV_GCP_DB_INSTANCE]"=tcp:5432
      restart: always

I also tried using the gcr.io/cloudsql-docker/gce-proxy:1.13 image, but that image requires a credential file to be passed as an argument to cloud_sql_proxy, and I wasn't able to make that work.
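For reference, when the proxy binary is run directly, the credential flag would look something like this (reusing the key file written in my first step); the part I couldn't solve is getting that file into the service container in the first place:

./cloud_sql_proxy -instances="$DEV_GCP_DB_INSTANCE"=tcp:5432 \
    -credential_file=./gcloud-api-key.json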

 

Any help welcome!

 

Thanks

2 answers

Answer accepted

I finally found a solution: keep it simple and run the proxy in the background directly in the step:

branches:
  # Deployment to gcp on test server
  dev:
    - step:
        image: python:3.6.2
        caches:
          - pip
        script:
          # First we need to download the gcloud SDK.
          - curl -o /tmp/google-cloud-sdk.tar.gz https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-183.0.0-linux-x86_64.tar.gz
          - tar -xvf /tmp/google-cloud-sdk.tar.gz -C /tmp/
          - /tmp/google-cloud-sdk/install.sh -q
          - source /tmp/google-cloud-sdk/path.bash.inc

          # Activate gcloud, register the secret json and link to the project
          - echo $GOOGLE_CLIENT_SECRET > ./gcloud-api-key.json
          - gcloud auth activate-service-account --key-file gcloud-api-key.json
          - gcloud config set project $CLOUDSDK_CORE_PROJECT

          # Download cloud_sql_proxy and run it as a proxy in the background
          - wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
          - chmod +x cloud_sql_proxy
          - ./cloud_sql_proxy -instances="$DEV_GCP_DB_INSTANCE"=tcp:5432 > cloudsql.log 2>&1 &

          # Migrate, collectstatic, check for superuser, test, coverage etc.
          - pip install -r requirements.txt # Install or upgrade dependencies
          - python monsite/manage.py migrate # Apply database migrations
          - mkdir static # Used to store static files
          - python monsite/manage.py collectstatic --noinput # Collect static files
          - coverage run --source='monsite/' monsite/manage.py test --noinput # Run tests under coverage
          - coverage report # To get the report directly in pipelines

          # Generate app.yaml and push to server

        # Save the proxy log as an artifact
        artifacts:
          - cloudsql.log
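One caveat I'll add: the proxy is launched in the background, so in principle the first migrate could start before it is listening. A small wait-for-port line before the Django commands would guard against that; something like this extra script step (an untested sketch using bash's /dev/tcp):

          - timeout 30 bash -c 'until (echo > /dev/tcp/127.0.0.1/5432) 2>/dev/null; do sleep 1; done' # wait until the proxy accepts connections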

 

You'll need the following repository variables to make it work:

  • $GOOGLE_CLIENT_SECRET: copy-paste the content of the JSON key file from GCP

  • $CLOUDSDK_CORE_PROJECT: your core project

  • $DEV_GCP_DB_INSTANCE: your Cloud SQL instance connection name (not just the instance name: the whole thing with the project name, the region and the instance name, e.g. test-django-pg:europe-west1:test-django-db as in the log above)

Also note that your service account must have the following roles:

  • Storage Admin
  • App Engine Admin
  • Cloud SQL Client / Editor

And the Cloud SQL Admin API must be enabled.
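If you prefer to do that part from the CLI, enabling the API and granting a role look roughly like this (the service account address is a placeholder):

gcloud services enable sqladmin.googleapis.com
gcloud projects add-iam-policy-binding $CLOUDSDK_CORE_PROJECT \
    --member="serviceAccount:pipelines@my-project.iam.gserviceaccount.com" \
    --role="roles/cloudsql.client"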

Answering my own question: this post indicates that Pipelines services don't support the "command" keyword: https://community.atlassian.com/t5/Bitbucket-questions/does-services-in-pipeline-support-quot-COMMAND-quot-keyword/qaq-p/782262

So I need to find a way to make it work either without a command instruction or by using the docker service.

mwatson (Atlassian Team), Nov 20, 2018

It seems there is a docker image for the sql proxy itself that you might be able to use, either directly with the docker service in pipelines, or as the base image for building your own docker container that you can use as a service. See the docs: https://cloud.google.com/sql/docs/postgres/connect-docker
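For reference, the invocation that page describes looks roughly like the following (the key path and instance connection name are placeholders); it is a sketch of the plain docker usage, not something that runs as-is inside Pipelines:

docker run -d \
    -v /path/to/gcloud-api-key.json:/config \
    -p 127.0.0.1:5432:5432 \
    gcr.io/cloudsql-docker/gce-proxy:1.13 /cloud_sql_proxy \
    -instances=my-project:europe-west1:my-instance=tcp:0.0.0.0:5432 \
    -credential_file=/config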
