Updating Drupal with Composer and pipeline

Richard Tremblay May 28, 2020

Hi,

I have 10 websites built with Drupal 8. Each website has 2 environments (TEST & PROD) and 1 repo in Bitbucket, but the repo doesn't contain the Drupal files, only the custom theme (plus some external pages not related to Drupal).

When it's time to update Drupal core and all the modules, it takes a lot of time to connect to each environment and run Composer, so I was wondering if there's a better way to do it with Pipelines (or anything else in Bitbucket)?

Maybe I could add a composer.json to my repo and create a pipeline script that executes when I push to a specific branch (i.e. Composer-Test and Composer-Prod) and runs all the commands directly on the server.

Any suggestions and how-to would be greatly appreciated.

Thanks.

1 answer

Answer accepted
ktomk
Rising Star
May 28, 2020

This sounds like a build for deployment. Bitbucket supports three environments for deployment (test, staging and production), so this looks like a good fit for your TEST & PROD use case.

The build does the composer install, so yes, at least somewhere you should place your `composer.json` file. IIRC Drupal 8 is compatible w/ Composer, go for it.

What does build and deployment mean here?

Instead of connecting to each environment, you let Bitbucket create the packages (see artifacts), and in another step you can let Bitbucket do the deployments.

As you have a test environment, I'd recommend deploying to test automatically and making the deployment to production a manual step, so you can review the test environment first.
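Roughly sketched, such a setup in the bitbucket-pipelines.yml could look like this (the deploy scripts are only placeholders; the deployment names refer to the environments mentioned above):

pipelines:
  default:
    - step:
        name: Build
        image: composer
        script:
          - composer install
        artifacts:
          - vendor/**
    - step:
        name: Deploy to test
        deployment: test
        script:
          - echo "deploy the build to the TEST environment here"
    - step:
        name: Deploy to production
        deployment: production
        trigger: manual
        script:
          - echo "deploy the build to the PROD environment here"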

Keep in mind that for anything you want to do w/ the Bitbucket Cloud Pipelines plugin, you normally need to have every file that is involved in the repository, e.g. composer.json.

If there is a reason why you have not considered adding it to the repository, this might be worth reviewing before continuing.

Richard Tremblay May 29, 2020

Thanks for the information, looks like what I need.

I'm no Bitbucket expert, so I'm a little confused about how this will work, though.

From what I understand, I simply put the composer.json in my repo and add a command in the pipeline to execute composer install/update, which will download all the Drupal 8 files inside Bitbucket Cloud, and after that I can deploy them to my environments (Test first to review, and Prod after)?

I don't want to ask too much, but if you could maybe break it down into the steps I need to accomplish, from there I can try to Google the details and do some tests (I have set up a TEST server with a Bitbucket repo where I can mess up anything and everything safely).

Thank you for your help

ktomk
May 29, 2020

Working step by step is a very good idea, no problem asking.

A good start is PHP with Bitbucket Pipelines in the Bitbucket Support docs.

It shows how to choose the PHP image / version and how to install composer.

Then you do composer install. If that is already all you need, that pipeline step is basically done; give the step an artifacts entry:

- step:
    image: php:7.4
    script:
      - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
      - composer install
      - vendor/bin/phpunit --testsuite unit,integration
    artifacts:
      - vendor/**

The next steps, e.g. deployments, don't need to run composer install any longer, as the artifacts carry over to the next step.
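Just as an illustration, such a follow-up step could look like this (a sketch only; the image choice, user, host and path are placeholders, and the step assumes an SSH key is configured for Pipelines):

- step:
    name: Deploy to test
    deployment: test
    image: php:7.4
    script:
      # vendor/ from the previous step's artifacts is already present, no composer install needed
      - apt-get update && apt-get install -y --no-install-recommends rsync openssh-client
      - rsync -az --delete vendor/ deploy-user@test.example.com:/var/www/my-site/vendor/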

Try it out and feel free to ask any additional questions.

Richard Tremblay June 1, 2020

I really appreciate your help.

If I understand correctly, Docker is a VM, and I can specify which configuration I want by providing a predefined image, which should be something that fits my needs to accomplish some automation and/or fit the server where I will deploy.

And in your example, the pipeline downloads Composer in the Docker container and runs the install command.

At that step I get a message, not sure if it is important:

Do not run Composer as root/super user!


The next thing, I get this error:

the requested PHP extension gd is missing

I have the feeling that I should probably add a line to install the extension before I run composer install, or load a Docker image that already has it installed...?


Now I'm not sure I understand what you do next: the vendor/bin/phpunit and the artifacts part?

ktomk
June 2, 2020

Do not run Composer as root/super user!

More recent versions of Composer should not display the root/super user warning inside Docker containers, but I have not tested on Bitbucket. Locally I could not reproduce the warning. It says that Composer runs as root, which is something you normally don't want if you do a `composer install` locally.

the requested PHP extension gd is missing

If you know that building the software (running the pipeline) does not require that php extension, you can either ignore all platform requirements on install:

composer install --ignore-platform-reqs

or add it specifically to the platform config (config.platform) in the composer.json.
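In the composer.json that would look something like this (the version numbers are only examples and should match your target server):

{
    "config": {
        "platform": {
            "php": "7.3.18",
            "ext-gd": "2.3.0"
        }
    }
}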

/Edit: Using `--ignore-platform-reqs` is not recommended; Composer has some more info about troubleshooting PHP versions & extensions in their Docker image readme.

Alternatively, choose a different Docker image or create your own that has all the software needed. This depends on the base image etc. As you have noticed, the image is what the pipeline runs in.
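If you go the custom-image route, a minimal Dockerfile could be along these lines (just a sketch; it assumes the PNG-only build of gd is enough, more image formats need more libraries):

FROM php:7.4-cli
# gd needs the libpng headers to build (add libjpeg-dev, libfreetype6-dev etc. if needed)
RUN apt-get update \
    && apt-get install -y --no-install-recommends libpng-dev \
    && docker-php-ext-install gd \
    && rm -rf /var/lib/apt/lists/*
# take the composer binary from the official composer image
COPY --from=composer:2 /usr/bin/composer /usr/bin/composer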

Now I'm not sure I understand what you do next: the vendor/bin/phpunit and the artifacts part?

That totally depends on what you need in your specific case. If you don't need to / want to run phpunit tests, then you can skip that part.

The artifacts part is there so that the vendor folder (created by the composer install) is available in the next pipeline steps, e.g. the step where you do the deployment to the test environment.

Richard Tremblay June 2, 2020

Thanks again for your time.

Regarding the Composer root user: if it's just a warning, I guess in this case I don't need to worry about it?

I'm running into all kinds of roadblocks. There must be an image already defined for Drupal 8 that has all the requirements, so I don't have to define every PHP configuration, extension, etc. I can't believe I'm the only one trying to do this in Docker?
I've been searching and reading all morning but I can't figure out what it should be. But maybe I'm wrong about that.

ktomk
June 2, 2020

Do you have a repository with this (or an exemplary) Drupal extension so I could take a look at it? That would help me give more concrete answers, as right now it's pretty general.

There are a lot of docker images available so it sometimes can be hard to find a good one.

And to get things running more easily, I can suggest running the `bitbucket-pipelines.yml` file locally with the `pipelines` runner, a command-line application written in PHP. It doesn't have all the options of Bitbucket Pipelines, but it serves me very well for writing pipelines, testing docker images etc.
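If you want to give it a try, it is on Packagist as ktomk/pipelines; getting started could be as simple as this (assuming Docker and composer are available locally and composer's global bin directory is in your PATH):

# install the runner globally with composer
composer global require ktomk/pipelines
# in the project directory: run the default pipeline of bitbucket-pipelines.yml locally
pipelines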

Richard Tremblay June 2, 2020

I can set up a 'test' repo to work with, but like I said in my first post, none of my repositories contain the Drupal files, only the theme and some external pages not related to Drupal.

First, just to make sure we are on the same page, here is what I thought was possible to do:

1- Add the composer.json to each website repository.

2- Create a branch called Drupal-Test where I update the composer.json and push it to Bitbucket.

3- In Bitbucket, when a new push is sent to the Drupal-Test branch, Pipelines loads a Docker container with an image that fits the Drupal requirements and my Linux server (see the sketch after this list).

4- Run composer to install the latest version of Drupal in the Docker container.

5- Run composer to install/update all the Drupal modules based on the composer.json.

6- SSH/upload the entire structure of files created in Docker to my TEST server.

7- Test the code deployed on my TEST server.

8- If all is good, repeat the process with the Drupal-Prod branch to deploy to my PROD server.
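In bitbucket-pipelines.yml I imagine that would translate into a branch mapping roughly like this (just a sketch of my idea; the deploy commands are placeholders):

pipelines:
  branches:
    Drupal-Test:
      - step:
          image: composer
          script:
            - composer install
            - echo "rsync/ssh the generated files to the TEST server here"
    Drupal-Prod:
      - step:
          image: composer
          script:
            - composer install
            - echo "rsync/ssh the generated files to the PROD server here"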

Does that make sense?

ktomk
June 2, 2020

Some questions back:

1. In your Drupal website projects, do you have a composer.json for the project, or do you just have the theme files in there?

2. How do you install Drupal normally, and how do you add your theme then?

3. IIRC most things in Drupal can be automated with Drush. Are you making use of it?

4. About how many repositories are we talking about?

Richard Tremblay June 2, 2020

1. Right now the composer.json is not in the project repo, only the custom theme.

2. Drupal is installed manually using FTP, and the custom theme is deployed using Bitbucket/Pipelines/SSH.

3. Not using Drush as of now, and not opposed to it if it gets things done, but I read it might not be supported for Drupal anymore.

4. 16 repositories, but only 10 are using Drupal, and those are the ones I want to automate.

ktomk
June 2, 2020

So the composer.json is added (by commit, I assume) on the Drupal-Test branch, which is used to trigger the pipeline, as I understand it.

Would this composer.json be all the same for those 10 Drupal repositories? Where does it come from / what does it look like? (You can redact private information; I'm trying to understand this better, not having been in a Drupal 8 project lately.)

You write you install Drupal manually so far via FTP. When you do that, is it by extracting tar.gz packages or similar and then uploading the files via FTP? Would switching to the composer.json change that? [I guess so, just asking to better understand what the change is]

Richard Tremblay June 2, 2020

So the composer.json is added (by commit, I assume) on the Drupal-Test branch, which is used to trigger the pipeline, as I understand it.

It's not done right now, but that was my idea.

Would this composer.json be all the same for those 10 Drupal repositories? Where does it come from / what does it look like? (You can redact private information; I'm trying to understand this better, not having been in a Drupal 8 project lately.)

I didn't figure all that out yet, but all our websites use the same bunch of modules, plus some specific ones. I could have a default one to start with when I set up a new website, and update it based on the company's needs.

You write you install Drupal manually so far via FTP. When you do that, is it by extracting tar.gz packages or similar and then uploading the files via FTP? Would switching to the composer.json change that? [I guess so, just asking to better understand what the change is]

Downloading the zip, extracting it locally, uploading all the files, and running the install through the web browser.
I don't mind installing Drupal manually when I set up a new website (we don't create new websites that often); it's the maintenance of updating Drupal and updating all the modules and dependencies that is a lot of work... 2 servers per website x 10 websites x once a month... it makes no sense to do that manually. So if I can just run the composer update and push all the new files to the server, I can do it in a few clicks without connecting to each server manually.

Ask all the questions you need; I will be glad to answer/explain if that helps to narrow down the best solution.

ktomk
June 3, 2020

Okay, to me this looks like you just want to:

  1. Do composer install in the first step.
  2. Give that first step vendor/** as artifacts.
  3. Use the official Docker composer image for the first step; specify the PHP version and extensions you expect on the deployment target system in composer.json under config.platform (docker composer image docs). *

Set it up so it runs without errors. When done, add another step for the deployment. This depends on how you deploy, which I don't know.

* For starters, I would not follow the optimal suggestion to create multi-stage Docker images.

So much for the theory. Following the installation instructions for Drupal 8, it is recommended to use composer create-project (and see there about doing a modified Drupal 8 install).

That requires an empty directory. If you want this in the main pipeline clone directory (BITBUCKET_CLONE_DIR), it is not empty. Also, the install fails, as you wrote, because of the required extensions. So: composer create-project with --no-install into a temporary folder, which is then move-merged with a tar-pipe into the clone dir. Then the composer platform configuration is done and after that the install runs:

pipelines:
  default:
    - step:
        image: composer
        script:
          # prepare to install drupal 8, requires an empty directory
          - PRJTMP="$(mktemp -d --tmpdir composer-drupal8.XXXXXX)"
          - composer create-project --no-install drupal/recommended-project "${PRJTMP}"
          # tar-pipe to make the BITBUCKET_CLONE_DIR a drupal/recommended-project
          # (overwrites existing files with the same name)
          - >
            tar cf - --remove-files -C "${PRJTMP%/*}" "${PRJTMP##*/}"
            | tar xf - --strip-components=2
          # the php version, use the one of the target system
          - composer config platform.php 7.3.18
          # drupal 8 requires the gd extension
          - composer config platform.ext-gd 2.3.0
          # the actual composer install of drupal 8 into BITBUCKET_CLONE_DIR
          - composer update --lock
        artifacts:
          - vendor/** # composer
          - web/**    # drupal web folder
        caches:
          - composer

I hope this example is at least readable. Whether or not it fits depends a bit on your directory layout; here it's maybe the worst case, hence the tar-pipe.
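For orientation, with the defaults of drupal/recommended-project the clone directory should end up roughly like this (a sketch, your layout may differ):

composer.json
composer.lock
vendor/    # composer packages -> artifact vendor/**
web/       # drupal docroot    -> artifact web/**
    core/
    modules/
    sites/
    themes/
    index.php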

/E: removed the composer plugin, for Drupal it made no difference.

Richard Tremblay June 4, 2020

I will test that solution and let you know, thanks a lot for your time, it's greatly appreciated!

Richard Tremblay June 4, 2020

The script runs; I just need to make a few modifications and understand how to use my own composer.json, but I think you put me on the right track, and I probably have enough details to Google the missing pieces.

I added the rsync-over-SSH part to deploy to my server. I see content being uploaded, not exactly how I want it yet, but I think it's only a matter of how Drupal is being installed by Composer in the Docker image.

I'll play around with different settings, and I will let you know how it goes.

You sir are a champion!

Thanks again!!
