I have the following pipeline:
image: node:10.15.3
pipelines:
  default:
    - step:
        name: Installing
        caches:
          - node
        script:
          - rm -rf package-lock.json
          - rm -rf node_modules
          - npm install
    - step:
        name: Build
        script:
          - npm install --production
        artifacts:
          - build/**
    - step:
        name: Deploying
        script:
          - pipe: atlassian/google-cloud-storage-deploy:0.4.5
            variables:
              KEY_FILE: $KEY_FILE
              PROJECT: $PROJECT
              BUCKET: 'bucketname'
              SOURCE: 'build'
              CACHE_CONTROL: 'max-age=30'
              ACL: 'public-read'
The expected behavior is to deploy everything inside the build folder, but it uploads the build folder itself, so in my storage there is a build folder with everything inside it...

I have tried:
- 'build/**' — it picks up only files and ignores directories
- 'build/' — it behaves the same as 'build': it uploads the build directory along with everything inside it
- 'build/*' — it picks up only files and ignores directories

How can I upload everything inside the build folder, both files and directories?

Thanks
I found the issue. Instead of the pipe, I wrote a gsutil command inside the pipeline to copy the folder recursively, and the problem was resolved.
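The fix described above could look something like the following step (a hedged sketch, not the poster's exact commands: the google/cloud-sdk image, the authentication lines, and the bucket path are assumptions):

```yaml
- step:
    name: Deploying
    image: google/cloud-sdk:alpine  # assumed image that ships gcloud/gsutil
    script:
      # Authenticate with a base64-encoded service-account key
      # ($KEY_FILE and $PROJECT reuse the repository variables from above)
      - echo $KEY_FILE | base64 -d > /tmp/key.json
      - gcloud auth activate-service-account --key-file /tmp/key.json
      - gcloud config set project $PROJECT
      # -r copies recursively; using 'build/*' as the source copies the
      # folder's contents (files and subdirectories) into the bucket root,
      # rather than creating a top-level 'build' folder
      - gsutil -m cp -r build/* gs://bucketname/
```

The key difference from the pipe is that `gsutil cp -r build/*` expands to the folder's contents before copying, so no enclosing build directory is created in the bucket.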