The official SQL Server on Linux Docker image requires at least 2GB of RAM: https://hub.docker.com/r/microsoft/mssql-server-linux/
It appears Pipelines can allocate at most 1GB to a service, however.
While we can run tests against MSSQL in Pipelines, tests that load a large amount of data using bulk inserts cause the server to terminate the connection to the client.
I suspect the service is killing the connection due to lack of memory; all subsequent connections to the service then fail with a "Login timeout expired" message. I can't verify this, since no error information appears in the pipeline log for the service.
Is there any way to relax the memory ceiling on services? It doesn't look like the memory requirement for MSSQL will be relaxed any time soon: https://github.com/Microsoft/mssql-docker/issues/91
It works! Kind of. I can send queries/statements to it with tsql.
But I'd also like to run a DB restore before my integration tests run – is there a way to get files into the Docker image used by the service? If I apt install docker.io && docker ps -a I get
Cannot connect to the Docker daemon at tcp://localhost:2375. Is the docker daemon running?
So how can I make a file available to the service?
Is the build directory automatically available to the service at some path (like with --mount in docker)? Is there a way to expose volumes under services in bitbucket-pipelines.yml?
(sorry for the weird punctuation, this forum keeps complaining that I'm posting links without using the link function :<)
memory: 2048 # default: 1024
and in the pipeline step script:
- printf 'USE master\n%s\ngo\n' "RESTORE DATABASE CI FROM DISK='/stuff/bak.bak' WITH MOVE 'CI' TO '/var/opt/mssql/data/CI.mdf', MOVE 'CI_log' TO '/var/opt/mssql/data/CI_log.ldf'" |tsql -oq -U SA -S "$HOSTNAME" -p 1433 -P lsakjfd
but it fails with "Cannot open backup device '/stuff/bak.bak'. Operating system error 2(The system cannot find the file specified.)."
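For anyone wondering where that memory: 2048 line goes: it sits under the service definition in bitbucket-pipelines.yml. A minimal sketch, assuming a service named mssql and the 2017 image tag (the service name, image tag and step contents are illustrative; the SA password is the throwaway one from the tsql command above):

```yaml
# Sketch only -- names and tags are illustrative, not a verified config.
definitions:
  services:
    mssql:
      image: microsoft/mssql-server-linux:2017-latest
      memory: 2048          # default: 1024
      variables:
        ACCEPT_EULA: Y      # the image refuses to start without these
        SA_PASSWORD: lsakjfd

pipelines:
  default:
    - step:
        services:
          - mssql
        script:
          - printf 'SELECT @@VERSION\ngo\n' | tsql -oq -U SA -S "$HOSTNAME" -p 1433 -P lsakjfd
```

The variables block is how the service container gets its EULA acceptance and SA password as environment variables.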
I think this question is related.
Guess I may have to look into making a private Docker image if there's no way to pass files into services :-/
One workaround is to install MSSQL manually in the main pipeline container. It probably won't work for everyone, since you'll need image-specific install steps, but it worked for us at least (on a Debian stretch image) with some .deb-fiddling.
A relaxed memory ceiling on service containers would make things a lot simpler, though :-/
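Roughly, the install went like this. This is a sketch, not a drop-in recipe: the repo URL and package name follow Microsoft's documented Ubuntu 16.04 install steps, which mostly work on stretch; the SA password is a placeholder, and you'll likely still hit some .deb fiddling:

```yaml
# Sketch: install SQL Server into the build container itself, so it
# shares a filesystem with the build directory. Assumes a Debian
# stretch base image running as root.
pipelines:
  default:
    - step:
        image: debian:stretch
        script:
          - apt-get update && apt-get install -y curl gnupg apt-transport-https
          - curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
          - curl https://packages.microsoft.com/config/ubuntu/16.04/mssql-server-2017.list > /etc/apt/sources.list.d/mssql-server.list
          - apt-get update && apt-get install -y mssql-server
          # Unattended setup; there is no systemd in the container,
          # so start sqlservr directly afterwards and give it time to come up.
          - ACCEPT_EULA=Y MSSQL_SA_PASSWORD=lsakjfd MSSQL_PID=Developer /opt/mssql/bin/mssql-conf -n setup accept-eula
          - /opt/mssql/bin/sqlservr & sleep 30
          # The .bak in the clone directory is now on the same
          # filesystem as the server, so a RESTORE ... FROM DISK
          # like the one above can actually find it.
```

The whole point of doing it this way is the last comment: because the server runs in the build container rather than a service container, it can read the backup file checked out with the repo.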