We have multiple AWS accounts to keep infrastructure, dev, and prod separate. Currently our Bitbucket server lives in our "infrastructure" account, and I am trying to set up a client in our "dev" account. We have a NAT gateway set up for both the Bitbucket server and the client. The Bitbucket server sits behind a load balancer, and the client is simply trying to clone via SSH through the NAT gateway, but it is not working (SSH timeout).
I tried turning on debug logging, but nothing shows up in the logs, which tells me that perhaps my client is not reaching the server at all.
Question: What is the best practice for setting up Bitbucket in AWS with the requirement that we enable client access from a different AWS account?
I have confirmed that we can clone repositories from a client in our corporate environment that is not in AWS, so we know that cloning works in general. I just can't get cloning to work from a different AWS account.
Thoughts?
Can you check whether the SSH port is open in the security groups?
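For reference, checking and opening SSH ingress with the AWS CLI might look like the sketch below. The security group ID and CIDR are placeholders, not values from this thread:

```shell
# Hypothetical IDs/CIDRs -- substitute your own values.
SG_ID="sg-0123456789abcdef0"   # security group on the Bitbucket server / load balancer
CLIENT_CIDR="10.1.0.0/16"      # source range the dev-account client egresses from

# List any existing ingress rules that cover port 22
aws ec2 describe-security-groups --group-ids "$SG_ID" \
  --query 'SecurityGroups[0].IpPermissions[?ToPort==`22`]'

# Open SSH (port 22) from the client's source range
aws ec2 authorize-security-group-ingress \
  --group-id "$SG_ID" --protocol tcp --port 22 --cidr "$CLIENT_CIDR"
```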
I found the issue. It seems the best way to communicate with Bitbucket is via a public IP. The answer was to use a NAT gateway with an EIP as the default route for the Jenkins slaves, and to whitelist that EIP for SSH on the Bitbucket server side. On a side note, we set up VPC peering so the Jenkins master can connect to the Jenkins slaves via their internal IP addresses, giving the master full control while the slaves communicate as clients to Bitbucket. The only remaining issue is that we now need to set up an SSH gateway so we can connect to the Jenkins slaves from the local VPC for command-line troubleshooting when necessary. This actually makes things a bit more secure, since the Jenkins slaves no longer require public IP addresses, which was part of the goal anyway.
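As a sketch of the whitelisting and SSH-gateway steps described above (the security group ID, EIP, and bastion hostname are all hypothetical placeholders):

```shell
# Hypothetical values -- substitute your own.
BB_SG_ID="sg-0bbserver000000000"   # security group on the Bitbucket server side
NAT_EIP="203.0.113.10"             # EIP attached to the dev account's NAT gateway

# Whitelist SSH from the NAT gateway's EIP only (a /32, not a broad range)
aws ec2 authorize-security-group-ingress \
  --group-id "$BB_SG_ID" --protocol tcp --port 22 --cidr "${NAT_EIP}/32"

# For the troubleshooting path, an SSH gateway (bastion) entry in
# ~/.ssh/config lets you reach the slaves without giving them public IPs:
cat >> ~/.ssh/config <<'EOF'
Host jenkins-slave-*
    ProxyJump bastion.dev.example.com   # hypothetical bastion hostname
EOF
```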