Build failure due to Docker container exceeding memory limit

I have a recurring issue where my pipelines that use SonarCloud fail with the error message: Container 'docker' exceeded memory limit.

I also don't understand why I need to specify a build image at all, since all I want to do is take the code from my repository and send it to SonarCloud for analysis. I have not been able to analyse a single piece of code, even after following the Bitbucket Pipelines template provided by SonarCloud.

image: python:3.8 # Choose an image matching your project needs

clone:
  depth: full # SonarCloud scanner needs the full history to assign issues properly

definitions:
  caches:
    sonar: ~/.sonar/cache # Caching SonarCloud artifacts will speed up your build
  steps:
    - step: &build-test-sonarcloud
        name: Build, test and analyze on SonarCloud
        caches:
          - pip # See https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.htm
          - sonar
        script:
          - pipe: sonarsource/sonarcloud-scan:1.2.0
            variables:
              SONAR_TOKEN: ${SONAR_TOKEN}
    - step: &check-quality-gate-sonarcloud
        name: Check the Quality Gate on SonarCloud
        script:
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.4
            variables:
              SONAR_TOKEN: ${SONAR_TOKEN}

pipelines:
  branches:
    feature/*:
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud
  pull-requests:
    '**':
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud

I would like to know why I am hitting a memory limit when I am using cloud services, and how I can change my configuration to stay within the restrictions.
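From what I have read in the Bitbucket Pipelines documentation, it seems possible to give the Docker service more memory and/or run the step at a larger size. Something roughly like the sketch below (the values are placeholders and would be merged into my existing definitions section), though I am not sure whether this is the right fix here:

options:
  size: 2x # doubles the memory available to each step (uses more build minutes)

definitions:
  services:
    docker:
      memory: 3072 # allocate more of the step's memory allowance to the Docker service

My understanding is that the service memory value has to fit within the step's total allowance, which depends on the step size, so I am unsure what numbers would actually work for the sonarcloud-scan pipe.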
