Google Cloud Storage deploy - KEY_FILE variable?

davidstanton April 6, 2020

Can anyone tell me what the KEY_FILE variable is that has to go in this?

 

I have a private_key_id and a private_key.

 

script:
  - pipe: atlassian/google-cloud-storage-deploy:0.4.0
    variables:
      KEY_FILE: $KEY_FILE
      PROJECT: 'my-project'
      BUCKET: 'my-bucket'
      SOURCE: '.'

 

 

I've tried all sorts of things on this now and nothing gets it working. I get this output:

 

I've tried adding a path to the file: no go.

I've tried adding a variable and pasting the whole JSON file into it: no go.

I've tried adding a variable and pasting the private_key_id into it: no go.

 

There must be someone on these boards who knows about this; it's an Atlassian repo: https://bitbucket.org/atlassian/google-cloud-storage-deploy/src/master/

[screenshot: failing pipeline output in Bitbucket Pipelines]

2 answers

1 accepted

0 votes
Answer accepted
tobo04 April 8, 2020

If I remember correctly, KEY_FILE was the content of that JSON file, base64 encoded.

davidstanton April 9, 2020

OK, so I have to take the JSON file, run it through a base64 encoder, then upload that file to the repo and point $KEY_FILE at it with a variable?

tobo04 April 9, 2020

I think you have to copy the base64-encoded content into a variable. No file.

tobo04 April 9, 2020

But the Dockerfile of that image is public, so you can look it up there.

tobo04 April 9, 2020

https://bitbucket.org/atlassian/google-cloud-storage-deploy/src/master/pipe/pipe.sh

->
echo "${KEY_FILE}" | base64 -d >> /tmp/key-file.json
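So in practice you encode the whole service-account JSON and put the result in the repository variable; the pipe decodes it back into a file, exactly as that line shows. A quick round-trip sketch (the JSON content and file paths here are made-up stand-ins for a real key file):

```shell
# Stand-in for a real service-account JSON (made-up content).
printf '{"private_key_id":"abc","private_key":"xyz"}' > /tmp/key-file.json

# Encode it for the KEY_FILE repository variable.
# -w 0 disables line wrapping (GNU base64; plain `base64` on macOS).
KEY_FILE=$(base64 -w 0 /tmp/key-file.json)

# The pipe decodes it back, as in pipe.sh:
echo "${KEY_FILE}" | base64 -d > /tmp/decoded-key.json

# Verify the round trip is lossless.
cmp /tmp/key-file.json /tmp/decoded-key.json && echo "round trip OK"
```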

davidstanton April 9, 2020

So what does that mean? I'm a complete dumb A$$ at this.

davidstanton April 9, 2020

Thanks very much, I think I have it working properly now. Another question: is there a way to add Google CDN cache invalidation to this, so that if a file changes, it invalidates the cache as well? Not 100% sure yet whether Google auto-invalidates it or not; I need to test more.
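For what it's worth, Cloud CDN serves cached objects until their TTL expires and, as far as I know, does not invalidate automatically when a GCS object changes, so you'd trigger an invalidation from the pipeline after the deploy. This is only a sketch: the url-map name "my-url-map" and the google/cloud-sdk image are assumptions you'd replace with your own.

```yaml
# Sketch only: "my-url-map" is a placeholder for the Cloud CDN url-map
# that fronts the bucket, and this assumes gcloud is in the build image.
- step:
    name: Invalidate Cloud CDN cache
    image: google/cloud-sdk:alpine
    script:
      - echo "${KEY_FILE}" | base64 -d > /tmp/key-file.json
      - gcloud auth activate-service-account --key-file /tmp/key-file.json
      - gcloud compute url-maps invalidate-cdn-cache my-url-map --path "/*"
```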

davidstanton April 9, 2020

I see one problem here already: it seems to upload everything, not just the files that have changed, whereas if I do a transfer from S3 to a Google bucket, it only transfers the files which have changed. In the long run that will save a load of money etc., and it auto-invalidates the changed files, since they get a new date stamp.

 

Is there any way to only upload changed files?
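If the pipe re-uploads everything on each run, one workaround is to skip the pipe and call gsutil rsync yourself, which only transfers files that differ between source and destination. A sketch only, with the bucket name and image as placeholder assumptions:

```yaml
# Sketch: "my-bucket" is a placeholder; assumes the google/cloud-sdk image.
- step:
    name: Sync only changed files to GCS
    image: google/cloud-sdk:alpine
    script:
      - echo "${KEY_FILE}" | base64 -d > /tmp/key-file.json
      - gcloud auth activate-service-account --key-file /tmp/key-file.json
      # -m parallelizes, -r recurses, -c compares checksums instead of mtimes
      - gsutil -m rsync -r -c . gs://my-bucket
```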

minhkha7911 September 23, 2020

Hi, I'm stuck with the KEY_FILE variable now. Do you know how to define it?

Renato Prosofsky December 9, 2020

@davidstanton @minhkha7911 I'm facing the same issue. How did you fix it?

I have already added my base64-encoded KEY_FILE as a repository variable, but when I run it, I get the same message as you.

I have this step in my YML:

- step:
    name: GCP Storage
    script:
      - pipe: atlassian/google-cloud-storage-deploy:0.4.5
        variables:
          KEY_FILE: $KEY_FILE
          PROJECT: 'my-project'
          BUCKET: 'my-bucket'
          SOURCE: 'the path with the APK'

(Of course I have other configurations in my YML, but I copied only the GCP Storage part.)

Renato Prosofsky December 10, 2020

@tobo04 can you help me?

0 votes
Renato Prosofsky December 9, 2020

@davidstanton I'm facing the same issue. How did you fix it?

I have already added my base64-encoded KEY_FILE as a repository variable, but when I run it, I get the same message as you.

I have this step in my YML:

- step:
    name: GCP Storage
    script:
      - pipe: atlassian/google-cloud-storage-deploy:0.4.5
        variables:
          KEY_FILE: $KEY_FILE
          PROJECT: 'my-project'
          BUCKET: 'my-bucket'
          SOURCE: 'the path with the APK'

(Of course I have other configurations in my YML, but I copied only the GCP Storage part.)
