Hello all,
I am using this method to push changed files to an FTP server:
image: samueldebruyn/debian-git
pipelines:
  default:
    - step:
        deployment: Production
        script:
          - apt-get update
          - apt-get -qq install git-ftp
          - git ftp push --syncroot www -u "$FTP_USERNAME" -p "$FTP_PASSWORD" ftp://$FTP_SRV
But each time it takes up to 30 seconds to pull the image and run the update and install steps, and then only a few seconds for the push itself.
Can you tell me if there is a faster and better way?
Thank you
Hi @Vasilev,
The Bitbucket Pipelines cache will not help in this case.
If you want to save the time spent on the git-ftp installation, you will need to build a Docker image that already includes git-ftp and publish it to Docker Hub. You can then use that image in your pipeline, and the installation steps will no longer be required.
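The approach above could look roughly like this. This is a minimal sketch, not a tested setup: the image name `yourname/debian-git-ftp` is a hypothetical placeholder for a repository under your own Docker Hub account, and it assumes the same base image used in the original pipeline.

```dockerfile
# Hypothetical Dockerfile for yourname/debian-git-ftp
# Bakes git-ftp into the base image so the pipeline skips apt-get entirely.
FROM samueldebruyn/debian-git
RUN apt-get update && \
    apt-get -qq install -y git-ftp && \
    rm -rf /var/lib/apt/lists/*
```

After building and pushing it (`docker build -t yourname/debian-git-ftp . && docker push yourname/debian-git-ftp`), the pipeline shrinks to just the deploy command:

```yaml
image: yourname/debian-git-ftp
pipelines:
  default:
    - step:
        deployment: Production
        script:
          - git ftp push --syncroot www -u "$FTP_USERNAME" -p "$FTP_PASSWORD" ftp://$FTP_SRV
```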
Hi @Daniel Santos,
Yes, yesterday I saw this: Using-Pipelines-with-git-ftp-efficently
Thank you for the answer!