
A step does not have the minimum resources needed to run (1024 MB). Services on the current step are consuming 7128 MB


1. Can't paste with formatting, sorry - I tried ..

2. Getting this error:

Configuration error

A step does not have the minimum resources needed to run (1024 MB). Services on the current step are consuming 7128 MB

How do I give some memory to my step 1? I couldn't find it explained anywhere. I tried reducing from 7128 to 6000, but I get the same error, just with 6000 now. If anyone could point me in the right direction, that would be helpful. Thanks.

bitbucket-pipelines.yml

pipelines:
  default: # TODO: add branches
    - step:
        name: ECR login script
        image: python:3.7.3-alpine3.8
        script:
          - pip install awscli
          - echo $(aws ecr get-login --no-include-email --region us-west-2) > ecr.sh
        artifacts:
          - ecr.sh
        caches:
          - pip
    - step:
        name: Build and push docker image
        services:
          - docker
        size: 2x
        script:
          # TODO: should be prod ECR
          - docker build --tag <image>:$BITBUCKET_COMMIT .
          - export AWS_ACCESS_KEY_ID=<access key>
          - sh ecr.sh
          - docker push <image>:$BITBUCKET_COMMIT
        caches:
          - docker

options:
  docker: true

definitions:
  services:
    docker:
      # https://confluence.atlassian.com/bitbucket/use-services-and-databases-in-bitbucket-pipelines-874786688.html#UseservicesanddatabasesinBitbucketPipelines-Servicememorylimits
      memory: 7128

1 answer


Hi @tobias_fielitz

Have you tried using size: 2x in your first step, as you do for the second step?

Bitbucket Pipelines has two memory configurations: 4 GB (size: 1x, which is the default) and 8 GB (size: 2x), as per Configure bitbucket-pipelines.yml (size).
⚠️ Note that size: 2x will double the minutes consumed for your build.
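
As a sketch, adding size: 2x to the first step of the config posted above would look like this (only the relevant lines are shown):

```yaml
pipelines:
  default:
    - step:
        name: ECR login script
        size: 2x   # gives this step 8 GB instead of the default 4 GB (doubles build minutes)
        image: python:3.7.3-alpine3.8
        script:
          - pip install awscli
```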

 

I also recommend checking this document so you are aware of the limitations you will run into in Bitbucket Pipelines:

 

I hope that helps.

I highly doubt that:

pip install awscli
echo $(aws ecr get-login --no-include-email --region us-west-2) > ecr.sh

requires 2x.

I've solved it by building a base image with the expensive stuff (numpy, scipy, pandas) and using that.


Hi @tobias_fielitz

I reviewed your script again after your last message (thank you for drawing my attention to it) and found this:

Run Docker commands in Bitbucket Pipelines - Atlassian Documentation

Add Docker to all build steps in your repository

options:
  docker: true

Note that even if you declare Docker here, it still counts as a service for Pipelines, has a limit of 1 GB memory, and can only be run with two other services in your build step. This setting is provided for legacy support, and we recommend setting it on a step level so there's no confusion about how many services you can run in your pipeline.

The options > docker: true setting adds Docker as a service to all steps, and definitions > services > docker > memory: 7128 configures that Docker service to use a bigger amount of memory. I would say that to fix the issue you just need to remove the options section and declare the docker service inside the second step only.
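
Following that suggestion, the fixed bitbucket-pipelines.yml would look roughly like this. This is a sketch based on the config posted in the question, with the options section removed and the docker service attached only to the build step:

```yaml
pipelines:
  default:
    - step:
        name: ECR login script
        image: python:3.7.3-alpine3.8
        script:
          - pip install awscli
          - echo $(aws ecr get-login --no-include-email --region us-west-2) > ecr.sh
        artifacts:
          - ecr.sh
        caches:
          - pip
    - step:
        name: Build and push docker image
        size: 2x            # 8 GB step, leaving room for the 7128 MB docker service
        services:
          - docker          # docker is now a service on this step only
        script:
          - docker build --tag <image>:$BITBUCKET_COMMIT .
          - export AWS_ACCESS_KEY_ID=<access key>
          - sh ecr.sh
          - docker push <image>:$BITBUCKET_COMMIT
        caches:
          - docker

definitions:
  services:
    docker:
      memory: 7128
```

With no options > docker: true, the first step no longer carries the 7128 MB docker service, so it fits comfortably in the default 1x size.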

Thanks @Daniel Santos, I've also done that.


And it helped?

Besides removing

docker: true

you need to increase the size (2x) for all steps that use a pipe. Why? Because pipes run on Docker too, so you will get the same misleading error.
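
For example, a step running a pipe would also need the bigger size. The pipe name, version, and variables below are just an illustration, not from the original config:

```yaml
- step:
    name: Deploy with a pipe
    size: 2x   # pipes run in Docker, so they count against the same memory limits
    script:
      - pipe: atlassian/aws-s3-deploy:1.1.0   # hypothetical pipe/version for illustration
        variables:
          AWS_DEFAULT_REGION: us-west-2
          S3_BUCKET: my-bucket
          LOCAL_PATH: build
```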

