Build Docker image with git submodule

I have an ASP.NET Core project that references my library via a git submodule.

Project structure:

-Project Folder:
---Dockerfile
---MY_APP:
------MY_APP.csproj
---MY_SUBMODULE:
------MY_SUBMODULE.csproj
---MY_APP.sln

My Dockerfile:

FROM mcr.microsoft.com/dotnet/sdk:5.0 AS build-env
WORKDIR /app

COPY . ./
RUN dotnet publish MY_APP -c Release -o publish

FROM mcr.microsoft.com/dotnet/aspnet:5.0
WORKDIR /app
COPY --from=build-env /app/publish .

ENTRYPOINT ["dotnet", "MY_APP.dll"]

My pipelines file:
image: mcr.microsoft.com/dotnet/sdk:5.0

pipelines:
  default:
    - step:
        name: Update submodules
        script:
          - apt-get update
          - apt-get install --yes openssh-client
          - git submodule update --init --recursive
    - step:
        name: Build and push
        caches:
          - dotnetcore
          - docker
        script:
          - docker build -f Dockerfile -t IMAGENAME .
          ....
        services:
          - docker
    - step:
        name: Deploy to server branch develop
        .....

 

On my local machine everything builds fine, but the Pipelines build throws an error:
Skipping project "/app/MY_SUBMODULE/MY_SUBMODULE.csproj" because it was not found. I guess I have some problem with the submodule files.

The submodule update step completes successfully, but I don't see the Docker build using the path it clones into: Cloning into '/opt/atlassian/pipelines/agent/build/MY_SUBMODULE'...

So, what is the proper way to build a Docker image in a Bitbucket pipeline for a project with submodules?

 

1 answer

Answer accepted

Hi @rchernigovskikh and welcome to the community!

I believe the culprit might be that the submodule is updated in a different step than the one that builds the Docker image.

Just to give you some context: for every step in a Pipelines build, a Docker container starts. The repo is cloned into that container, the commands of that step's script run, and then the container is destroyed. Another container then starts for the next step, and so on.

If files are generated or downloaded in a certain step, they won't be available to the following steps unless they are defined as artifacts. In your specific example, the submodule directory is empty until git submodule update --init --recursive runs, so the submodule data is fetched during the first step. When that step finishes, its build container is destroyed. A new container then starts for the second step with a fresh clone, so the submodule data fetched in the previous step is no longer there.
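For illustration, an artifacts-based layout could look roughly like the sketch below. This is only a sketch: the MY_SUBMODULE/** glob is my assumption about where the submodule files end up, so adjust it to your actual paths:

pipelines:
  default:
    - step:
        name: Update submodules
        script:
          - apt-get update
          - apt-get install --yes openssh-client
          - git submodule update --init --recursive
        artifacts:
          # Assumed glob: passes the fetched submodule files on to later steps
          - MY_SUBMODULE/**
    - step:
        name: Build and push
        services:
          - docker
        script:
          # The artifact files are restored into the clone directory before this runs
          - docker build -f Dockerfile -t IMAGENAME .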

If you combine the first and second steps into one, the submodule files should be available in the build's clone directory when you build your Docker image.
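With your own commands, the merged step could look something like this (the commands are taken from your file, just placed in a single step):

image: mcr.microsoft.com/dotnet/sdk:5.0

pipelines:
  default:
    - step:
        name: Update submodules, build and push
        caches:
          - dotnetcore
          - docker
        services:
          - docker
        script:
          - apt-get update
          - apt-get install --yes openssh-client
          # Fetch the submodule contents into the clone directory
          - git submodule update --init --recursive
          # The Docker build context now contains MY_SUBMODULE
          - docker build -f Dockerfile -t IMAGENAME .

Note that the docker service must be declared on whichever step runs docker build.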

Kind regards,
Theodora

Thanks for the help!
