Need Help Uploading a JAR File into S3

navaneethakumar
August 21, 2019

I need to build my Scala code, generate a JAR file using sbt assembly, and upload the JAR to S3.

 

# This is a sample build configuration for Scala.
# Check our guides at https://confluence.atlassian.com/x/5Q4SMw for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: bitbucketpipelines/scala-sbt:scala-2.12

pipelines:
  default:
    - step:
        script: # Modify the commands below to build your repository.
          - sbt assembly
  
    - step:
        name: Deploy
        deployment: production
        script:
          - pipe: atlassian/aws-s3-deploy:0.2.4
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
              S3_BUCKET: 'MyS3Path'
              ACL: 'public-read'
              LOCAL_PATH: '/opt/atlassian/pipelines/agent/build/target/scala-2.11/'

I'm getting the following failure:

aws s3 sync /opt/atlassian/pipelines/agent/build/target/scala-2.11/ s3://MyS3Path/ --acl=public-read
upload failed: target/scala-2.11/resolution-cache/eventstoreingestion/eventstoreingestion_2.11/1.0/resolved.xml.properties to s3://MyS3Path/resolution-cache/eventstoreingestion/eventstoreingestion_2.11/1.0/resolved.xml.properties An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
Completed 5.6 KiB/~2.8 MiB (0 Bytes/s) with ~14 file(s) remaining (calculating...)


Inside the path "/opt/atlassian/pipelines/agent/build/target/scala-2.11/" I have 2 directories and a JAR file. I just need the JAR file to be uploaded to S3.

 

1 answer

1 vote
Oleksandr Kyrdan
Atlassian Team
August 22, 2019

Hi @navaneethakumar ,

You can add EXTRA_ARGS: '--exclude=* --include=*.jar' to upload only the .jar file to S3.
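For example, a sketch of your Deploy pipe with just that one variable added (everything else is taken from your config above):

          - pipe: atlassian/aws-s3-deploy:0.2.4
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
              S3_BUCKET: 'MyS3Path'
              ACL: 'public-read'
              LOCAL_PATH: '/opt/atlassian/pipelines/agent/build/target/scala-2.11/'
              # exclude everything in the target directory, then include only the assembled JAR
              EXTRA_ARGS: '--exclude=* --include=*.jar'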

You can find more details in the AWS docs: Use of Exclude and Include Filters.

Also, in your case it would be good to use artifacts to share the build result between steps.
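For example, the build step could publish the JAR as an artifact so the Deploy step can pick it up (the glob below is an assumption based on your target/scala-2.11 output):

    - step:
        name: Build
        script:
          - sbt assembly
        artifacts:
          # keep only the assembled JAR for the following Deploy step
          - target/scala-2.11/*.jar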


Cheers,
Alex

navaneethakumar
August 22, 2019

Thanks Alex, this is working now. And thanks for the other links too.
