Does the serverless-deploy pipe support upgrading Python runtimes regardless of the Docker image used to deploy?

Sam Mahr March 19, 2024

The serverless-deploy pipe currently uses `python:3.10-slim` for its Docker image.

 

I use the serverless-deploy pipe to deploy serverless applications, and I'm looking to upgrade our Lambda functions to Python 3.12 in AWS. I believe we can probably update the runtime to Python 3.12 and still use this pipe to deploy.

 

Is it safe to choose any runtime regardless of the version of Python the serverless-deploy pipe is using? Should I wait until 3.11 and 3.12 are used? I'd like to stay on the latest stable runtime as much as possible.

 

Thanks

1 answer

1 vote
Oleksandr Kyrdan
Atlassian Team
March 22, 2024

Hi @Sam Mahr 

Thanks for your question!

You can use any runtime supported by the cloud platform:

Multi-Language - Supports Node.js, Python, Java, Go, C#, Ruby, Swift, Kotlin, PHP, Scala, & F#

 

Best regards,
Oleksandr Kyrdan

Sam Mahr April 3, 2024

@Oleksandr Kyrdan Thank you for your reply.

I've been deploying python 3.10 and it works just fine.

I'm having an issue deploying Lambda functions to AWS using the python3.11 runtime with serverless-deploy:1.5.0. I'll paste output and file snippets (I've replaced any URLs with [url], since I'm not sure which URLs are allowed). The first paste is the serverless-deploy output, and the last is the serverless YAML. Again, this all works if I switch everything to Python 3.10.

Pasted Snippets below:

  • Pipelines Output
  • serverless.yml
  • pre execution script
  • requirements.txt
  • bitbucket-pipelines.yml

More info: I'm using Poetry for package management, the requirements file is generated by the poetry-export plugin, and `usePoetry` in `serverless.yml` is set to `false` to ensure `requirements.txt` is used for installation by the serverless-python-requirements plugin.

 

Does this indicate a bug, or do I have my setup wrong? Separate question: when will Python 3.12 be available?

 

Please let me know if I'm missing anything or if you need any additional info. I'll paste the different outputs and files in follow-up replies; I'm having trouble fitting this all in one reply.

Sam Mahr April 3, 2024

Pipelines Output (starting at the deploy command; I'm having issues posting even after omitting HTML):

 

serverless deploy -c https://bitbucket.org/organization/serverlessapp/src/b447a78f5f5dd0c2db581eca47b670e0df145e0e/serverless.yml -s dev --region us-east-2 --verbose

Deploying ServerlessAppTemplate to stage dev (us-east-2)

Generated requirements from /opt/atlassian/pipelines/agent/build/requirements.txt in /opt/atlassian/pipelines/agent/build/.serverless/requirements.txt
Installing requirements from "/root/.cache/serverless-python-requirements/b0d9a8e6e9636fe42dfec025929907c66a55d7d4b47e56429cc20bdc2bf8ecaa_x86_64_slspyc/requirements.txt"
Using download cache directory /root/.cache/serverless-python-requirements/downloadCacheslspyc

× Stack ServerlessAppTemplate-dev failed to deploy (0s)
Environment: linux, node 16.20.2, framework 3.38.0, plugin 7.2.0, SDK 4.5.1
Credentials: Local, environment variables
Docs: docs.serverless.com
Support: forum.serverless.com
Bugs: github.com/serverless/serverless/issues

Error:
Error: `python3.11 -m pip help install` Exited with code 1
at ChildProcess.<anonymous> (/usr/lib/node_modules/serverless-python-requirements/node_modules/child-process-ext/spawn.js:38:8)
at ChildProcess.emit (node:events:513:28)
at ChildProcess.emit (node:domain:489:12)
at maybeClose (node:internal/child_process:1100:16)
at Socket.<anonymous> (node:internal/child_process:458:11)
at Socket.emit (node:events:513:28)
at Socket.emit (node:domain:489:12)
at Pipe.<anonymous> (node:net:301:12)
✖ Deployment failed :(
Skipping cache upload for failed step
Searching for files matching artifact pattern .bitbucket/pipelines/generated/pipeline/pipes/**

Searching for test report files in directories named [test-results, failsafe-reports, test-reports, TestResults, surefire-reports] down to a depth of 4
Finished scanning for test reports. Found 0 test report files.
Merged test suites, total number tests is 0, with 0 failures and 0 errors.
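For what it's worth, my reading of the failure above (an assumption on my part, not something the pipe's docs state): serverless-python-requirements shells out to an interpreter named after the configured runtime (python3.11 → `python3.11`) and probes it with `-m pip help install` before installing, and since the pipe's python:3.10-slim image only ships `python3.10`, that probe exits non-zero. A minimal sketch of that kind of probe logic — `runtime_to_interpreter` and `probe_pip` are hypothetical names for illustration, not the plugin's actual API:

```python
import shutil
import subprocess

def runtime_to_interpreter(runtime: str) -> str:
    """Map a Lambda runtime id like 'python3.11' to an interpreter name.

    Hypothetical helper -- not serverless-python-requirements' real API.
    For Python runtimes the runtime id doubles as the executable name.
    """
    return runtime

def probe_pip(runtime: str) -> bool:
    """Return True if `<interpreter> -m pip help install` would succeed."""
    interpreter = runtime_to_interpreter(runtime)
    if shutil.which(interpreter) is None:
        # Interpreter missing from the image -> the probe fails, which is
        # what "Exited with code 1" in the traceback above would reflect.
        return False
    result = subprocess.run(
        [interpreter, "-m", "pip", "help", "install"],
        capture_output=True,
    )
    return result.returncode == 0

if __name__ == "__main__":
    for runtime in ("python3.10", "python3.11"):
        print(runtime, "->", "ok" if probe_pip(runtime) else "unavailable")
```

On a python:3.10-slim image the second probe would report "unavailable", matching the error I'm seeing.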
Sam Mahr April 3, 2024

Serverless YAML:

service: ServerlessApp
frameworkVersion: ^3.38.0
package:
  individually: true
  patterns:
    - '!config/**'
    - '!lambda_functions/**'
    - '!*.json'
    - '!*.md'
    - '!*.txt'
    - '!.gitignore'
    - '!pytest.ini'
    - '!tests/**'
    - '!*.yml'
    - '!*.yaml'
    - '!Makefile'
    - '!poetry.lock'
    - '!pyproject.toml'

provider:
  name: aws
  runtime: python3.11
  httpApi:
    cors: true
  stage: ${opt:stage, "dev"}
  region: ${opt:region, "us-east-2"}
  deploymentBucket: # don't create a new bucket per deployment, put them all here!
    name: woooooo-hoooooo-serverless-deployments-${self:provider.region}
  deploymentPrefix: Serverless
  environment:
    STAGE: ${self:provider.stage}
    POWERTOOLS_SERVICE_NAME: template

functions: ${file(config/functions.yml)}

resources: ${file(config/resources.yml)}

stepFunctions: ${file(config/state_machines.yml)}

plugins:
  - serverless-python-requirements
  - serverless-step-functions

custom:
  powertoolsLayer: arn:aws:lambda:${opt:region}:017000801446:layer:AWSLambdaPowertoolsPythonV2:67
  pythonRequirements:
    layer: true
    slim: true
    strip: false
    usePoetry: false
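If the mismatch between the pipe image's interpreter and the target runtime turns out to be the cause, one plugin option that might sidestep it (a hedged suggestion; please check the serverless-python-requirements docs for the plugin version in use) is `dockerizePip`, which builds dependencies inside a Lambda-compatible container matching `provider.runtime` instead of with the host interpreter:

```yaml
custom:
  pythonRequirements:
    layer: true
    slim: true
    strip: false
    usePoetry: false
    dockerizePip: true  # build deps in a Lambda-like container matching provider.runtime
```

This assumes Docker is available to the deploy step (the step below does cache docker), so I'm not certain it works inside the pipe's own container.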

Pre Execution Script:

 

apt-get update
apt-get install -y git

# START SSH setup
INJECTED_SSH_CONFIG_DIR="/opt/atlassian/pipelines/agent/ssh"
# The default ssh key with open perms readable by alt uids
IDENTITY_FILE="${INJECTED_SSH_CONFIG_DIR}/id_rsa_tmp"
# The default known_hosts file
KNOWN_HOSTS_FILE="${INJECTED_SSH_CONFIG_DIR}/known_hosts"

mkdir -p ~/.ssh || debug "adding ssh keys to existing ~/.ssh"
touch ~/.ssh/authorized_keys

# If given, use SSH_KEY, otherwise check if the default is configured and use it
if [ "${SSH_KEY}" != "" ]; then
  debug "Using passed SSH_KEY"
  (umask 077 ; echo ${SSH_KEY} | base64 -d > ~/.ssh/pipelines_id)
elif [ ! -f ${IDENTITY_FILE} ]; then
  error "No default SSH key configured in Pipelines."
  exit 1
else
  debug "Using default ssh key"
  cp ${IDENTITY_FILE} ~/.ssh/pipelines_id
fi

if [ ! -f ${KNOWN_HOSTS_FILE} ]; then
  error "No SSH known_hosts configured in Pipelines."
  exit 2
fi

cat ${KNOWN_HOSTS_FILE} >> ~/.ssh/known_hosts
if [ -f ~/.ssh/config ]; then
  debug "Appending to existing ~/.ssh/config file"
fi
echo "IdentityFile ~/.ssh/pipelines_id" >> ~/.ssh/config
chmod -R go-rwx ~/.ssh/
# END SSH setup

npm install --save -g serverless-step-functions

requirements.txt File:
certifi==2024.2.2 ; python_version >= "3.10" and python_version < "4.0"
charset-normalizer==3.3.2 ; python_version >= "3.10" and python_version < "4.0"
idna==3.6 ; python_version >= "3.10" and python_version < "4.0"
requests==2.31.0 ; python_version >= "3.10" and python_version < "4.0"
urllib3==2.2.1 ; python_version >= "3.10" and python_version < "4.0"
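A side note on the markers above: `python_version >= "3.10" and python_version < "4.0"` just tells pip to skip each pin outside that interpreter range, so the same file should serve 3.10, 3.11, and 3.12 unchanged. A rough sketch of how such a bound evaluates (a simplification using version tuples; pip's real marker grammar per PEP 508 is richer):

```python
def version_in_range(version: tuple, low=(3, 10), high=(4, 0)) -> bool:
    """Simplified stand-in for pip evaluating the marker
    'python_version >= "3.10" and python_version < "4.0"'."""
    return low <= version < high

# The pins above apply on 3.10, 3.11, and 3.12, but not on 3.9:
for v in [(3, 9), (3, 10), (3, 11), (3, 12)]:
    print(v, version_in_range(v))
```

So the requirements file itself shouldn't be what blocks a 3.11 deploy.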


Pipelines File:

image: python:3.11

definitions:
  caches:
    poetry-deps:
      key:
        files:
          - poetry.lock
      path: .venv
  steps:
    - step: &audit-step
        name: Audit Packages
        caches:
          - pip
          - poetry-deps
        script:
          - curl -sSL [url]/to/install/poetry | python3 -
          - export PATH="$HOME/.local/bin:$PATH"
          - poetry config virtualenvs.in-project true
          - poetry install
          - poetry self add poetry-audit-plugin
          - poetry audit
    - step: &test-step
        name: Test
        caches:
          - pip
          - poetry-deps
        script:
          - curl -sSL [url]/to/install/poetry | python3 -
          - export PATH="$HOME/.local/bin:$PATH"
          - poetry config virtualenvs.in-project true
          - poetry install
          - poetry run python3 -m pytest --cov=./ --cov-report xml
    - step: &lint-step
        name: Lint code
        caches:
          - pip
          - poetry-deps
        script:
          - curl -sSL [url]/to/install/poetry | python3 -
          - export PATH="$HOME/.local/bin:$PATH"
          - poetry config virtualenvs.in-project true
          - poetry install
          - poetry run flake8 . --extend-exclude=dist,build --show-source --statistics

    - step: &deploy-functions-step
        name: Deploy Functions
        caches:
          - docker
          - node
          - pip
        script:
          - &deploy-functions-script
            pipe: atlassian/serverless-deploy:1.5.0
            variables: &deploy-functions-variables
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              EXTRA_ARGS: "-s $STAGE --region $REGION"
              DEBUG: "true"
              PRE_EXECUTION_SCRIPT: './pipelines_pre_execution.sh'

pipelines:
  default:
    - parallel:
        - step: *audit-step
        - step: *test-step
        - step: *lint-step

  branches:
    main:
      - parallel:
          - step: *audit-step
          - step: *test-step
          - step: *lint-step
      - step:
          <<: *deploy-functions-step
          name: deploy
          deployment: dev
          script:
            - <<: *deploy-functions-script
              variables:
                <<: *deploy-functions-variables
  tags:
    'v*':
      - parallel:
          - step: *audit-step
          - step: *test-step
          - step: *lint-step
      - step:
          <<: *deploy-functions-step
          name: deploy
          deployment: prod
          trigger: manual
          script:
            - <<: *deploy-functions-script
              variables:
                <<: *deploy-functions-variables

 
