
ERROR: (gcloud.auth.activate-service-account)

Greetings,

I have a service account JSON file located in the root directory of my repository (the pipeline is still in development mode), and when I try to run the pipeline, I get the following error:

ERROR: (gcloud.auth.activate-service-account) Could not read json file /tmp/key-file.json: Expecting value: line 1 column 1 (char 0)

 

Here is my pipe:

- pipe: atlassian/google-app-engine-deploy:0.7.0
  variables:
    KEY_FILE: 'gcloud-api-key.json'

 

I'm not sure why it won't find it.

 

Thanks,

Adam

2 answers


Hello, @Adam_Woloszyn, please recheck the environment variable KEY_FILE. It should be a base64-encoded string, not a filename. So the setting would be

KEY_FILE: $(cat gcloud-api-key.json)

The file you cat should contain proper base64 content rather than the raw JSON. This can be a tricky part, but we have a guide for it:

https://confluence.atlassian.com/bitbucket/use-ssh-keys-in-bitbucket-pipelines-847452940.html#UseSSHkeysinBitbucketPipelines-UsemultipleSSHkeysinyourpipeline , section "Use multiple SSH keys in your pipeline".

You can use the same approach for your key.

 

We will update the documentation to make this clearer in future releases.
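For illustration, here is a minimal sketch of that local-testing setup, assuming the key is the gcloud-api-key.json from the question and the encoded copy is saved as encoded.json (that file name is just an example; any other variables the pipe needs are omitted, matching the snippet in the question):

# encode the key once, e.g. on your machine; on GNU base64, -w 0 disables line wrapping
base64 -w 0 < gcloud-api-key.json > encoded.json

# bitbucket-pipelines.yml
- pipe: atlassian/google-app-engine-deploy:0.7.0
  variables:
    KEY_FILE: $(cat encoded.json)

Keep in mind that encoded.json still contains the credentials, just base64-encoded, so treat it as a secret.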

Greetings,

I appreciate your reply; however, that didn't seem to work:

bash: /opt/atlassian/pipelines/agent/tmp/bashScript4455552900426182708.sh: line 72: unexpected EOF while looking for matching `''

 

I'll keep trying different things though.

Thanks,

Adam

@Adam Woloszyn, could you provide the full log, please? I don't see anything related to JSON parsing here, only a bash error.

The solution I proposed was for local testing:

KEY_FILE: $(cat gcloud-api-key.json)

but you can still refer to the Confluence docs I mentioned to set the environment variable directly in the pipeline (Deployments section, Repository Variables).

It worked with this: KEY_FILE: $(cat encoded.json)

 

But I will read the article.

The weird part is that nothing showed up in my Deployments section after using that pipe. I thought that was the point of it?

Thanks for the help. If you know more about the Deployments section, that'd be helpful too!

 

Thanks,

@Adam Woloszyn, to answer your question, I need to know what exactly has not shown up.

For example, secured variables like KEY_FILE are shown among the Deployments section variables, but masked, and they are not shown in the logs at all.

@Adam Woloszyn, I guess you defined the key file variable as a path to the file where the encoded string lies.

This is not the intended usage. Obtain the KEY_FILE value (and put it into a repository variable) like this:

Run base64 < your-service-account-file.json to get a base64-encoded string. Put that string into a repository (or deployment) variable and name it KEY_FILE.

Regards, Galyna
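A minimal sketch of the repository-variable approach described above, assuming KEY_FILE has been created as a secured repository (or deployment) variable and any other variables the pipe needs are configured as in the question:

# run locally; on GNU base64, -w 0 keeps the output on a single line
base64 -w 0 < your-service-account-file.json

# paste the printed string into a secured repository variable named KEY_FILE,
# then reference it in bitbucket-pipelines.yml:
- pipe: atlassian/google-app-engine-deploy:0.7.0
  variables:
    KEY_FILE: $KEY_FILE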

More of the error:

googlecloudsdk.api_lib.auth.service_account.BadCredentialFileException: Could not read json file /tmp/key-file.json: Expecting value: line 1 column 1 (char 0)

ERROR: (gcloud.auth.activate-service-account) Could not read json file /tmp/key-file.json: Expecting value: line 1 column 1 (char 0)

The strange part is that I can execute these commands just fine in a normal step script. I'm just having issues with the pipe itself.

script:
  - gcloud auth activate-service-account --key-file ./gcloud-api-key.json
  - gcloud config set project project-slug
  - gcloud auth configure-docker --quiet
  - gcloud app deploy --quiet
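For comparison, a minimal sketch of that working step, assuming an image that ships the gcloud CLI (the google/cloud-sdk image and the step name are illustrative). Without such an image, these commands would fail with something like gcloud: command not found.

- step:
    name: Deploy to App Engine
    image: google/cloud-sdk:slim
    script:
      - gcloud auth activate-service-account --key-file ./gcloud-api-key.json
      - gcloud config set project project-slug
      - gcloud auth configure-docker --quiet
      - gcloud app deploy --quiet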


Hi, @Adam Woloszyn.

 

Did you encounter any errors with the manual scripting below?

 

gcloud auth activate-service-account --key-file ./gcloud-api-key.json
gcloud config set project project-slug
gcloud auth configure-docker --quiet
gcloud app deploy --quiet

 

Something like `gcloud: command not found`?

 

Also, I would like to ask whether you resolved the original issue with `atlassian/google-app-engine-deploy:0.7.0`.

Hoping for your reply.

 

Thank you.
