We are building software that finds out which repositories contain a specific file (e.g. .gitignore).
To obtain that file, we send requests to the Bitbucket REST API after parsing the list of repositories (https://api.bitbucket.org/2.0/repositories).
The rate limit is fairly high, but we need to ensure that, if something goes wrong and the limits are hit, we can resume our process from the last successful request.
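To be concrete about what "resume from the last successful request" means for us: since the Bitbucket API paginates results via a `next` URL in each response, it should be enough to persist that URL after every successful page. A minimal sketch (the `checkpoint.json` filename and the helper names are our own, hypothetical choices):

```python
import json
import os

CHECKPOINT = "checkpoint.json"  # hypothetical checkpoint file


def save_checkpoint(next_url):
    """Persist the URL of the next page after each successful request."""
    with open(CHECKPOINT, "w") as f:
        json.dump({"next": next_url}, f)


def load_checkpoint(default="https://api.bitbucket.org/2.0/repositories"):
    """Resume from the last saved page, or start over from the beginning."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["next"]
    return default
```

On restart, the crawler would call `load_checkpoint()` and continue from there instead of re-listing everything from page one.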
We have implemented a simple solution, but it is really hard to test.
We cannot exhaust our quota (60k HTTPS requests/hour, and 1k requests/hour on the repositories endpoint) every time we want to check whether the implemented solution is correct (and it is a really bad idea to waste server resources on these tests alone).
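One direction we have considered is stubbing out the HTTP layer so the recovery logic can be exercised without any real requests. For example, if the crawl loop takes a `get` callable, a mock can feed it a few fake pages and then a simulated 429 rate-limit response (all names and the dict-shaped fake responses below are hypothetical, not our real code):

```python
from unittest import mock


def crawl(get, start_url, visited):
    """Walk paginated results; record each successful page; stop on a 429.

    Returns the URL to retry from when rate-limited, or None when done.
    """
    url = start_url
    while url:
        resp = get(url)
        if resp["status"] == 429:  # rate limited: this URL was NOT successful
            return url
        visited.append(url)        # "last successful request" advances here
        url = resp.get("next")
    return None


# Fake API: two good pages, then a simulated rate-limit hit.
fake_get = mock.Mock(side_effect=[
    {"status": 200, "next": "p2"},
    {"status": 200, "next": "p3"},
    {"status": 429},
])
```

Running `crawl(fake_get, "p1", [])` lets us assert that exactly the successful pages were recorded and that the crawler hands back `"p3"` as the resume point, with zero calls to the real API.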
Do you have any ideas on how to manage this problem?