Failing to deploy to AWS Elastic Beanstalk with Spring Boot & Gradle

박성민 May 16, 2023

Hi folks, I'm struggling to set up a pipeline that deploys a Spring Boot application to AWS Elastic Beanstalk.

I'm using Gradle to build the artifact, and it works well on my local machine.

 

Here's my bitbucket-pipelines.yml:


branches:
  master:
    - step:
        name: Build
        script:
          - bash ./gradlew clean build
          - ls -lR *
    - step:
        name: Deploy to Production
        deployment: Production
        trigger: manual
        script:
          - pipe: atlassian/aws-elasticbeanstalk-deploy:1.1.0
            variables:
              AWS_ACCESS_KEY_ID: 'AKIA'
              AWS_SECRET_ACCESS_KEY: '7d+'
              AWS_DEFAULT_REGION: 'ap-northeast-2'
              APPLICATION_NAME: 'goodcare-sample-005'
              ENVIRONMENT_NAME: 'goodcare-sample-005-env'
              ZIP_FILE: './build/libs/*.jar'



The "Build" step always succeeds, but the "Deploy to Production" step fails with the following error:

INFO: Uploading to s3 bucket: goodcare-sample-005-elasticbeanstalk-deployment and key goodcare-sample-005/goodcare-sample-005-16-b686f5be.jar
Traceback (most recent call last):
  File "/pipe.py", line 275, in <module>
    pipe.run()
  File "/pipe.py", line 257, in run
    self.upload()
  File "/pipe.py", line 141, in upload
    s3.upload_file(zip_file, s3_bucket, source_bundle_key)
  File "/usr/local/lib/python3.10/site-packages/boto3/s3/inject.py", line 143, in upload_file
    return transfer.upload_file(
  File "/usr/local/lib/python3.10/site-packages/boto3/s3/transfer.py", line 292, in upload_file
    future.result()
  File "/usr/local/lib/python3.10/site-packages/s3transfer/futures.py", line 103, in result
    return self._coordinator.result()
  File "/usr/local/lib/python3.10/site-packages/s3transfer/futures.py", line 266, in result
    raise self._exception
  File "/usr/local/lib/python3.10/site-packages/s3transfer/tasks.py", line 269, in _main
    self._submit(transfer_future=transfer_future, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/s3transfer/upload.py", line 585, in _submit
    upload_input_manager.provide_transfer_size(transfer_future)
  File "/usr/local/lib/python3.10/site-packages/s3transfer/upload.py", line 244, in provide_transfer_size
    self._osutil.get_file_size(transfer_future.meta.call_args.fileobj)
  File "/usr/local/lib/python3.10/site-packages/s3transfer/utils.py", line 247, in get_file_size
    return os.path.getsize(filename)
  File "/usr/local/lib/python3.10/genericpath.py", line 50, in getsize
    return os.stat(filename).st_size
FileNotFoundError: [Errno 2] No such file or directory: './build/libs/*.jar'

In this situation, what should I check for, and what could solve this problem?

 

I've wasted my whole day on this and really need your help.

Thank you in advance.

1 answer

Answer accepted

Theodora Boudale
Atlassian Team
May 17, 2023

Hi Chris and welcome to the community!

Is the .jar file that you use in the ZIP_FILE variable generated during the first step of your pipeline?

If so, you will need to define this file as an artifact in the first step, so that it becomes available in the next step:

branches:
  master:
    - step:
        name: Build
        script:
          - bash ./gradlew clean build
          - ls -lR *
        artifacts:
          - build/libs/*.jar

Just to give you some context, Pipelines builds run in Docker containers. For every step of your build, a Docker container starts (the build container) using the image you have specified in your bitbucket-pipelines.yml file. The repo is cloned in the build container, and then the commands of the step's script are executed. When the step finishes, this Docker container gets destroyed. The same thing happens then for any subsequent step of your pipeline. So, if you want to share a file generated in one step with subsequent steps, you will need to use an artifact definition.
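To put it all together, here is a minimal sketch of what the complete bitbucket-pipelines.yml could look like with the artifact definition in place. The image line is just an assumption (any image with a JDK that can run the Gradle wrapper will do), and I would also suggest referencing secured repository variables for the AWS credentials rather than hardcoding them in the file:

# Sketch only; the image below is an assumption, not something from your build.
image: eclipse-temurin:17

pipelines:
  branches:
    master:
      - step:
          name: Build
          script:
            - bash ./gradlew clean build
          # Make the built jar available to subsequent steps.
          artifacts:
            - build/libs/*.jar
      - step:
          name: Deploy to Production
          deployment: Production
          trigger: manual
          script:
            # Optional sanity check: confirm the artifact was restored
            # into this step's container before the pipe runs.
            - ls -l build/libs/
            - pipe: atlassian/aws-elasticbeanstalk-deploy:1.1.0
              variables:
                # Reference secured repository variables instead of
                # committing credentials to the repository.
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: 'ap-northeast-2'
                APPLICATION_NAME: 'goodcare-sample-005'
                ENVIRONMENT_NAME: 'goodcare-sample-005-env'
                ZIP_FILE: './build/libs/*.jar'

If the ls command lists the jar, the artifact definition is working and the pipe should be able to find the file.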

Please keep in mind that artifacts are stored for 14 days following the execution of the step that produced them. After this time, the artifacts expire and any manual steps later in the pipeline can no longer be executed.

This is the documentation for artifacts: https://support.atlassian.com/bitbucket-cloud/docs/use-artifacts-in-steps/

Kind regards,
Theodora

박성민 May 18, 2023

Thank you for your support in fixing this problem.

Following your advice, I added the artifacts property, and the pipeline now builds and deploys successfully.

Best regards,

Chris

Theodora Boudale
Atlassian Team
May 22, 2023

Hi Chris,

You are very welcome; it's good to hear that the issue has been resolved.

Please feel free to reach out if you ever need anything else!

Kind regards,
Theodora
