
Is there a way to limit the number of issues a scheduled Automation Rule can modify?


My question is in the context of post-migration activities (from Server to Cloud), where bulk edits are required on data in migrated custom fields.

Other users might want to do bulk edits for other reasons.

In this case we need to update 5k to 6k issues.


I know it is possible to do this using bulk edit - you just need to repeat the process (6 times in my case), updating 1000 issues each run. This is pretty tedious, and more so for larger numbers of issues.

Instead I created a scheduled automation rule to update issues matching a specific JQL. This gets throttled and results in errors on the Automation page, but at least it does update the issues. Not sure how the Cloud Ops team feel about this...


Can you make a JQL query that returns a maximum number of rows? Something like SQL's TOP or LIMIT clauses.

After reading the documentation I have not found a way to do this.

I guess you could do this using the REST API but that seems overkill for a simple search and replace operation.
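For what it's worth, JQL itself has no LIMIT clause, but the Jira Cloud REST search endpoint caps results per request via its `maxResults` parameter, so the batching can live in the client rather than the query. A minimal Python sketch, assuming a hypothetical site URL and auth header (the `/rest/api/3/search` endpoint and its `jql`/`startAt`/`maxResults` parameters are real; everything else is a placeholder):

```python
import json
import urllib.parse
import urllib.request

BASE = "https://your-site.atlassian.net"  # placeholder site URL


def search_url(jql: str, start_at: int = 0, max_results: int = 100) -> str:
    """Build a Jira Cloud issue-search URL.

    maxResults caps the rows returned per request, which is the closest
    equivalent to a SQL LIMIT; startAt pages through the rest.
    """
    params = urllib.parse.urlencode(
        {"jql": jql, "startAt": start_at,
         "maxResults": max_results, "fields": "key"})
    return f"{BASE}/rest/api/3/search?{params}"


def fetch_batch(jql: str, auth_header: str,
                start_at: int = 0, max_results: int = 100):
    """Fetch one batch of matching issues (requires valid credentials)."""
    req = urllib.request.Request(
        search_url(jql, start_at, max_results),
        headers={"Authorization": auth_header,
                 "Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["issues"]
```

Calling `fetch_batch` in a loop, bumping `start_at` by `max_results` each time, would walk the whole result set in controlled chunks, but at that point you are writing the script the Automation rule was meant to replace.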

Has anyone found a better solution?

3 answers

1 accepted

Answer accepted

John Funk
Community Leader
Apr 17, 2023

Hi Dwight,

If there is something unique about the issues, then you can limit your query that way. For example, are they all in the same project, or spread across multiple projects? You could also add a created-date restriction and run the query to see how many issues it returns.

Bill Sheboy
Rising Star
Apr 17, 2023

Hi @Dwight Holman 

Yes, and to add to John's answer:

How frequently are you running that scheduled trigger rule?  I ask because that trigger (with JQL) is limited to 100 issues at a time (subject to what else happens in your rule with related issues).  If you watch your processing time, you may be able to tune the schedule to process 100 issues every X minutes such that you do not exceed your service limits.  Please look here for more information:

Kind regards,

@Bill Sheboy - initially my automation rule frequency was hourly, meaning the issues would update in a few hours (6-7 hours). I increased the interval to 2 hours given how long the first run took. Rather slow, but it did update the issues with less tedium.

Regardless of the frequency/interval, the automation rule still gets throttled (at about 1000 issues per run). We will need to do this for our actual (production) Cloud migration, so I was hoping there was an easy way to improve the process.

If I could create an automation rule that processes only 100 issues per 'run' it would avoid these alerts, and I could use a higher frequency.

Bill Sheboy
Rising Star
Apr 18, 2023

Thanks for that information, Dwight. 

It appears something has changed, as I thought the scheduled trigger was already limited to 100 issues at a time, and that throttling only happened when either the rule ran too frequently or it referenced too many related issues per triggered issue.

John's approach sounds like a good alternative, such as checking the create date and building your own algorithm to limit processing.  (For example, use the hour from {{now}} as a proxy for the number of days/months/etc. in the past to check in the JQL.)  That would break up the processing into 24 chunks.
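To illustrate Bill's {{now}} idea outside of Jira, here is a sketch of the arithmetic: the current hour (0-23) selects one of 24 created-date windows, so each hourly run of the rule only touches a slice of the backlog. The 30-day window size and the JQL field values are assumptions to tune for your own data:

```python
from datetime import datetime, timedelta


def chunk_window(now: datetime, window_days: int = 30):
    """Map the current hour (0-23) to one of 24 created-date windows.

    Hour 0 handles the most recent window, hour 1 the window before
    that, and so on, so an hourly rule sweeps the whole history in a day.
    """
    end = now - timedelta(days=now.hour * window_days)
    start = end - timedelta(days=window_days)
    return start, end


def chunk_jql(now: datetime, window_days: int = 30) -> str:
    """Render the window as a JQL created-date restriction (sketch only)."""
    start, end = chunk_window(now, window_days)
    return f'created >= "{start:%Y-%m-%d}" AND created < "{end:%Y-%m-%d}"'
```

In a real rule the equivalent dates would be computed with smart values rather than Python, but the partitioning logic is the same: no single run ever sees the full 5-6k issues.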


Hi @Bill Sheboy - thanks for your suggestion of using {{now}} in a JQL query to limit the number of issues. That would help in our situation.


I subsequently found two JAC tickets that seem related, although it doesn't look like either directly suggests enhancing Automation. I'll send a request.

JRACLOUD-42307 Automatically Edit 1000 Issues Continuously if Issues to be Edited is more than 1000

JRACLOUD-68133 Cannot use JQL In Escalation Service because of failed: Add JQL LIMIT support

I'm voting for these.

@John - the issues are in multiple projects; some have a few hundred issues to update and others have thousands.

I'm using a JQL filter in the automation rule so it:

  1. Only selects issues matching specific custom-field text. Before any issues are modified, this query matches about 5600 issues.
  2. Sorts the matching issues by last updated date (oldest first). However, because our situation is a migration, all of the issues were modified within a short period of time (to inject the custom field data we're changing), so the sort doesn't help much :-\

Neither of these prevents the automation rule from exceeding the limits.
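Combining that filter with Bill's hour-based window might look something like the following JQL. The field name and values are placeholders, and the date bounds would need to be derived from the hour of {{now}} by the rule itself, as Bill describes:

```
"Migration Marker" ~ "legacy-value"
  AND created >= "2023-03-19" AND created < "2023-04-18"
  ORDER BY updated ASC
```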
