
Running MSSQL service in Bitbucket Pipelines

The official MSSQL Linux Server Docker image expects at least 2GB of RAM: https://hub.docker.com/r/microsoft/mssql-server-linux/

It appears that Pipelines can allocate at most 1GB to a service, however.

While we can run tests against MSSQL in Pipelines, tests that load a large amount of data using bulk inserts result in the server terminating the connection to the client.

I suspect the service is killing the connection due to lack of memory. All subsequent connections to the service then fail with a "Login timeout expired" message. I can't verify this, since no error information appears in the pipeline log for the service.

Is there any way to relax the memory ceiling on services? It doesn't look like the memory requirement for MSSQL will be relaxed any time soon: https://github.com/Microsoft/mssql-docker/issues/91

5 answers

I have since figured out that you can give more memory to a service by adding "memory: <amount in MB>" to its definition.

Example:


definitions:
  services:
    sqlserver:
      image: microsoft/mssql-server-linux
      memory: 2048
      variables:
        ACCEPT_EULA: Y
        SA_PASSWORD: <YourStrong!Passw0rd>
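
Note that the step running your tests also has to list the service by name; a minimal sketch (the script line is just a placeholder):

pipelines:
  default:
    - step:
        services:
          - sqlserver
        script:
          - ./run-tests.sh # hypothetical test command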

I couldn't find "memory" in the documentation, but it works.

A very simple solution: https://github.com/justin2004/mssql_server_tiny

definitions:
  services:
    mssql:
      image:
        name: justin2004/mssql_server_tiny
      variables:
        ACCEPT_EULA: Y
        SA_PASSWORD: changeIt!

It works! Kind of. I can send queries/statements to it with tsql.

But I'd also like to run a db restore before my integration tests run – is there a way to get files into the Docker image used by the service? If I run apt install docker.io && docker ps -a, I get

Cannot connect to the Docker daemon at tcp://localhost:2375. Is the docker daemon running?

So how can I make a file available to the service?

Is the build directory automatically available to the service at some path (like with --mount in docker)? Is there a way to expose volumes under services in bitbucket-pipelines.yml?



You can build your own Docker image based on this image and include the test data you need in it. Or you can add a step in the pipeline that populates the database with data.
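
A rough sketch of the second approach, assuming a seed.sql script checked into the repository and the FreeTDS tsql client (both are assumptions; any SQL Server client would do):

pipelines:
  default:
    - step:
        services:
          - mssql
        script:
          # assumes a Debian-based build image
          - apt-get update && apt-get install -y freetds-bin
          # seed.sql is hypothetical; tsql batches must each end with a "go" line
          - tsql -oq -U SA -P changeIt! -S "$HOSTNAME" -p 1433 < seed.sql
          - ./run-integration-tests.sh # hypothetical test runner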

I tried

definitions:
  caches:
    stack: ~/.stack
  services:
    mssql:
      image:
        name: justin2004/mssql_server_tiny
      memory: 2048 # default: 1024
      volumes:
        - /opt/atlassian/pipelines/agent/build:/stuff
      variables:
        ACCEPT_EULA: Y
        SA_PASSWORD: lsakjfd

and in the pipeline step script:

- printf 'USE master\n%s\ngo\n' "RESTORE DATABASE CI FROM DISK='/stuff/bak.bak' WITH MOVE 'CI' TO '/var/opt/mssql/data/CI.mdf', MOVE 'CI_log' TO '/var/opt/mssql/data/CI_log.ldf'" | tsql -oq -U SA -S "$HOSTNAME" -p 1433 -P lsakjfd

but it fails with "Cannot open backup device '/stuff/bak.bak'. Operating system error 2(The system cannot find the file specified.)."

I think this question is related.

Guess I may have to look into making a private Docker image if there's no way to pass files into services :-/

One workaround is to install MSSQL manually in the main pipeline. It probably won't work for everyone, as you'll need pipeline-specific install steps, but it worked for us at least (on a Debian Stretch image) with some .deb fiddling; a rough sketch of the idea follows below.

A relaxed memory ceiling on containers would make things a lot simpler, though :-/
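
The commenter doesn't share their exact steps, so this is only a sketch of the general shape on a Debian/Ubuntu-based build image, using Microsoft's package repository (the repository URL, package name, and mssql-conf invocation come from Microsoft's Linux install docs, not from this thread):

pipelines:
  default:
    - step:
        size: 2x # the build container gets far more memory than a service can
        script:
          # add Microsoft's package repository
          - curl -s https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
          - curl -s https://packages.microsoft.com/config/ubuntu/16.04/mssql-server-2017.list > /etc/apt/sources.list.d/mssql-server.list
          - apt-get update && apt-get install -y mssql-server
          # unattended setup; MSSQL_SA_PASSWORD and MSSQL_PID are documented mssql-conf variables
          - MSSQL_SA_PASSWORD='changeIt!' MSSQL_PID=Developer /opt/mssql/bin/mssql-conf -n setup accept-eula
          # no systemd in the build container, so start the server directly and give it time to come up
          - (/opt/mssql/bin/sqlservr &) && sleep 30
          - ./run-integration-tests.sh # hypothetical test runner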

Great, so how are we supposed to test things in MSSQL then if the amount of memory available to services is not enough?

I've hit the same issue, and it looks like the SQL Server container is only ever granted 1GB of memory, no matter the pipeline memory configuration. :(
