Security and Compliance in Data Center: Rate Limiting

Welcome back to our series on Data Center features for stronger security and compliance! Today I’d like to share a bit about rate limiting.

I’m sure you’ve noticed that threats to your instance can have significant negative impacts. Whether they degrade your application’s performance or take it down completely, these threats cost your agency resources and frustrate your staff. Rate limiting, another great Data Center feature, acts as a self-protection mechanism that keeps your instance safe. Not only does it help prevent brute-force attacks, but it also helps prevent the more common, accidental instance degradation caused by internal actors.

When automated integrations or scripts send requests to your instance in huge bursts, they can affect the software’s stability, leading to drops in performance or even downtime. Put simply, rate limiting controls how many external REST API requests both automations and users can make, as well as how often they can make them. It relies on the token bucket algorithm: each user gets a balance of tokens, each request spends a token, and the balance is replenished at a steady rate over time, which keeps the load on your instance predictable.
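To make the idea concrete, here is a minimal token bucket sketch in Java. It illustrates the general algorithm only; the capacity and refill rate are made-up values for the example, not Data Center’s actual defaults or implementation.

```java
import java.time.Instant;

/**
 * Minimal token bucket sketch. A user starts with a full balance of tokens
 * (the allowed burst size); each request spends one token, and tokens are
 * refilled at a steady rate. Requests that arrive when the bucket is empty
 * are throttled (typically answered with HTTP 429).
 */
public class TokenBucket {
    private final long capacity;        // maximum tokens, i.e. allowed burst size
    private final double refillPerSec;  // tokens added back per second
    private double tokens;              // current balance
    private Instant lastRefill;

    public TokenBucket(long capacity, double refillPerSec) {
        this.capacity = capacity;
        this.refillPerSec = refillPerSec;
        this.tokens = capacity;
        this.lastRefill = Instant.now();
    }

    /** Returns true if the request may proceed, false if it should be throttled. */
    public synchronized boolean tryConsume() {
        refill();
        if (tokens >= 1) {
            tokens -= 1;
            return true;
        }
        return false;
    }

    /** Top the balance back up based on how much time has passed since the last call. */
    private void refill() {
        Instant now = Instant.now();
        double elapsedSec = (now.toEpochMilli() - lastRefill.toEpochMilli()) / 1000.0;
        tokens = Math.min(capacity, tokens + elapsedSec * refillPerSec);
        lastRefill = now;
    }
}
```

With a bucket of, say, `new TokenBucket(50, 5.0)`, a script could burst up to 50 requests at once but is then held to roughly 5 requests per second until it slows down.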

Beyond reducing downtime, rate limiting also enhances your agency’s security and adds new levels of control. With custom configurations, admins can tune limits to individual users’ needs, or simply add users to an allow-list so they bypass all restrictions. Jira and Confluence also offer a block-listing capability that prevents specific users from making any requests at all, which I’ll discuss further in next week’s post.
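For illustration only, here is one way an allow-list bypass might sit in front of per-user buckets, reusing the TokenBucket sketch above. The class and parameter names are hypothetical and are not the actual Data Center configuration API.

```java
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Illustrative per-user rate limiter with an allow-list bypass.
 * Allow-listed users skip rate limiting entirely; everyone else
 * draws from their own token bucket with a shared default limit.
 */
public class RateLimiter {
    private final Set<String> allowList;
    private final Map<String, TokenBucket> buckets = new ConcurrentHashMap<>();
    private final long defaultCapacity;
    private final double defaultRefillPerSec;

    public RateLimiter(Set<String> allowList, long capacity, double refillPerSec) {
        this.allowList = allowList;
        this.defaultCapacity = capacity;
        this.defaultRefillPerSec = refillPerSec;
    }

    /** Decide whether a REST request from this user should be served. */
    public boolean allowRequest(String userKey) {
        if (allowList.contains(userKey)) {
            return true; // allow-listed users bypass all restrictions
        }
        TokenBucket bucket = buckets.computeIfAbsent(
                userKey, k -> new TokenBucket(defaultCapacity, defaultRefillPerSec));
        return bucket.tryConsume();
    }
}
```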

To learn more about how you can protect your Data Center instance from threats, check out this video or comment your questions below!
