I am having a recurring issue where my pipelines that integrate with SonarCloud fail with the error message: Container 'docker' exceeded memory limit.
I also cannot figure out why I am supposed to use a build image in the first place, since all I want to do is take the code from my repository and send it to SonarCloud for analysis. I have not been able to analyse a single piece of code, even after following the Bitbucket Pipelines template provided by SonarCloud:
```yaml
image: python:3.8  # Choose an image matching your project needs

clone:
  depth: full  # SonarCloud scanner needs the full history to assign issues properly

definitions:
  caches:
    sonar: ~/.sonar/cache  # Caching SonarCloud artifacts will speed up your build
  steps:
    - step: &build-test-sonarcloud
        name: Build, test and analyze on SonarCloud
        caches:
          - pip  # See https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.htm
          - sonar
        script:
          - pipe: sonarsource/sonarcloud-scan:1.2.0
            variables:
              SONAR_TOKEN: ${SONAR_TOKEN}
    - step: &check-quality-gate-sonarcloud
        name: Check the Quality Gate on SonarCloud
        script:
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.4
            variables:
              SONAR_TOKEN: ${SONAR_TOKEN}

pipelines:
  branches:
    feature/*:
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud
  pull-requests:
    '**':
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud
```
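From what I have read, Bitbucket Pipelines caps the `docker` service at 1024 MB by default, and a step can be given double resources with `size: 2x`. I tried a variation like the sketch below (the `3072` value is just my guess at a larger limit, not something from the SonarCloud template), but I would like to understand whether this is the right approach:

```yaml
definitions:
  services:
    docker:
      memory: 3072  # assumed value; raises the docker service above its 1024 MB default
  steps:
    - step: &build-test-sonarcloud
        name: Build, test and analyze on SonarCloud
        size: 2x  # doubles the step's memory allowance (consumes extra build minutes)
        script:
          - pipe: sonarsource/sonarcloud-scan:1.2.0
            variables:
              SONAR_TOKEN: ${SONAR_TOKEN}
```

Is raising the service memory like this the intended fix, or does it just mask whatever is consuming the memory in the first place?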
I would like to know why I am hitting a memory limit at all when I am using cloud services, and how I can change my configuration to stay within the restrictions.