Please see my question on StackOverflow. I have a fix, but I don't feel I have an answer regarding using Docker with Bitbucket Pipelines.
I am not sure why my Docker image seems to be ignored. After the Bitbucket Pipelines build completes, I can `ls` into `node_modules` on Bitbucket Pipelines and see missing packages (compared to what is installed when I run the Docker image locally). My npm build step in BitPipes doesn't work with my Docker image due to the missing packages (see the Stack Overflow link above).
Why doesn't the Docker image I use to build locally work in Bitbucket Pipelines? My fix requires that I use bitbucket-pipelines.yml to run `npm install` (instead of relying on the Docker image's npm installs). I am not sure how I have misused Docker, or misunderstood its place in BitPipes.
Hi @Chadd Portwine,
I suggest you run your custom image locally before trying it in Pipelines to ensure it works as expected :)
To answer your question, when you build and push your Docker image, it creates layers using what's specified in your Dockerfile. Since you don't add your package.json, npm doesn't know what to install. So your Dockerfile should look like this:
# Add this line
COPY package.json package.json
# Current commands
RUN npm install
RUN npm install -g firebase-tools
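Putting those pieces together, a complete minimal Dockerfile along these lines might look like the following. This is a sketch: the `node:16` base image tag is an assumption (the thread doesn't say which base image is used), and the rest mirrors the commands discussed above.

```dockerfile
# Assumed base image; substitute whatever Node version you actually use.
FROM node:16

# Copy the dependency manifest into the image first, so npm knows
# what to install when the next layer runs.
COPY package.json package.json

# Install project dependencies (baked into the image's layers at /node_modules).
RUN npm install

# Install the Firebase CLI globally.
RUN npm install -g firebase-tools
```

Because each `RUN` creates a layer from the files present at that point, the `COPY` must come before `npm install` or the image ends up with an empty dependency tree.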
Then, when you run your image, you can see that it has installed correctly:
$ docker run -it --entrypoint /bin/sh test-image
bin ... node_modules ... package-lock.json package.json ... var
$ which firebase
It's important to note that the commands in the Dockerfile are not a simple substitute for the script commands in your bitbucket-pipelines.yml file.
Now, if you use this image in Pipelines, you can run these commands:
- which firebase
- cd /
You should see that firebase has been installed correctly and that there is a node_modules directory at root, which is the working directory. This should still work for your application, since Node's module resolution traverses up the directory tree until it finds a node_modules directory.
This is different to when you have npm install commands in your bitbucket-pipelines.yml file, because there the working directory is /opt/atlassian/pipelines/agent/build instead of root, so node_modules is installed under that build directory.
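To illustrate the two cases, a minimal bitbucket-pipelines.yml using a custom image might look like this (the image name and steps here are hypothetical, not the poster's actual file):

```yaml
# Hypothetical pipeline config using a pre-built custom image.
image: your-dockerhub-user/test-image   # assumed registry path

pipelines:
  default:
    - step:
        script:
          # The repository is cloned into /opt/atlassian/pipelines/agent/build,
          # which is the working directory for these commands.
          - which firebase        # CLI installed globally by the image
          - ls /node_modules      # dependencies baked into the image at root
          - ls node_modules || true  # only exists if npm install runs here
```

The last two lines show the split the answer describes: the image's dependencies live at `/node_modules`, while anything installed by the pipeline's own `npm install` lands under the build directory.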
Hope this helps!
@davina, your information was very helpful. I am able to use a Docker image to install Node and Firebase, and I understand what I was doing wrong.
My mistake - I forgot about `.gitignore` and how that affects the `node_modules` folder in source / Bitbucket Pipelines.
I was looking at my local `node_modules` folder (and building locally, btw ;), which worked.
The `node_modules` in source control, by design, is not in-sync with my local folder.
It was necessary for me to `rm -rf node_modules` and `npm install` using the bitbucket-pipelines.yml. Now BitPipes ends up with the same modules I have installed locally.
This is sort of the point of maintaining the `package.json`, but I got confused.
This is what confused me, and I hope I can explain it.
When Bitbucket Pipelines "spins up" an environment to run a build script, it has already "installed" a `node_modules` folder. And that "pre-installed" `node_modules` folder doesn't match my project's package.json.
So, I found it necessary to delete the `node_modules` folder "pre-installed" by Pipelines. Then, Pipelines uses my package.json to rebuild the `node_modules` folder to my specification.
You are right, I do use `.gitignore` so the `node_modules` folder is not committed. But I still need to `npm install` my own packages in the bitbucket-pipelines.yml file.
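The fix described above amounts to something like this in bitbucket-pipelines.yml (a sketch under the assumptions of this thread; the image tag and build script name are placeholders, not the poster's exact file):

```yaml
image: node:16   # assumed Node image; use whatever version your project needs

pipelines:
  default:
    - step:
        script:
          # Remove any stale node_modules present in the build environment,
          # then reinstall from this repository's package.json.
          - rm -rf node_modules
          - npm install
          - npm run build   # hypothetical build script from package.json
```

This guarantees the dependency tree in the pipeline is rebuilt from package.json rather than inherited from the image or a previous state.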
I hope this helps!
Did you find a solution for this problem? I'm using node:17.0 and I have the same problem. I can't find node_modules and package-lock.json in the Bitbucket pipeline when I try to install from the Dockerfile, but it works fine if I install from the pipeline. Not sure why this is happening.