JQL Tricks Plugin kills memory

Well, it's official. We have been able to isolate our out-of-memory issues to the JQL Tricks plugin, specifically the hasSubtasks() function. With this plugin enabled and this function in use, I can exhaust my 8 GB heap, and GC will not recover it. Simply by disabling this plugin, I cannot run JIRA out of memory regardless of how many people I have hitting it as hard as they can for hours.

Any ideas?  Anyone else seeing this kind of erratic behavior with the JQL Tricks plugin?

5 answers

Sorry to hear about the issue that you are facing. I am the original author of the plugin and I can only sympathize!

I have seen issues with some of these functions in large JIRA instances, and there is nothing we can do about it in the plugin code. In fact, we have implemented a number of features, such as limiting functions to selected users or selected projects, to reduce the impact.

However, the main bottleneck was in the JIRA code itself. We reported it to Atlassian, and they have made drastic improvements in later versions of JIRA. See JRA-36368 for all the details about this issue. As mentioned in the ticket, this is not an issue with the JQL Tricks plugin; rather, it is an issue with the JQL function plugin module.

There is still room for improvement, but upgrading to JIRA 6.3.4 or higher will definitely improve the performance of these functions a lot. And if that doesn't help, you might want to disable the hasSubtasks module and instead query the indexes directly, as follows:

jqltField = subtasks

Please contact support at j-tricks dot com if you want more help.

Jobin, thank you for the quick response. My concern is not really slow performance, but that the plugin appears to be leaking memory. Or is this related to the DB connections leak (https://jira.atlassian.com/browse/JRA-39769)? I am trying to replicate this in my dev environment, but I only have about 100 issues there, and I'm not sure that reproduces the issues my prod server hits with 20k+. Thanks, Bill
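One way to tell a genuine leak from ordinary heap pressure is to watch the old generation across full GCs with `jstat`. A minimal sketch only: the 90% threshold, the helper name, and the single-Tomcat-process `pgrep` lookup are my assumptions, not something from this thread.

```shell
# old_gen_stuck: read `jstat -gcutil <pid> 5000` output on stdin and report
# whether the old generation (column O) ever drops below 90%. If it never
# does, even while FGC keeps climbing, memory is not being reclaimed.
old_gen_stuck() {
  awk 'NR > 1 && $4 + 0 < 90 { ok = 1 }
       END { print (ok ? "old gen is being reclaimed" : "old gen stays >90%: possible leak") }'
}

# On a live instance (assumes a JDK on the PATH and one Tomcat process):
#   jstat -gcutil "$(pgrep -f catalina | head -n 1)" 5000 | old_gen_stuck
# For offline analysis, a heap dump of live objects can be captured with:
#   jmap -dump:live,format=b,file=jira-heap.hprof "$(pgrep -f catalina | head -n 1)"
```

If the report says the old generation never comes down after full GCs, a heap dump is the next step; otherwise the instance is probably just undersized for the query load.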

We haven't seen any evidence of memory leaks yet. If you find anything, please raise a ticket in our JIRA and we will be happy to assist. Again, it would make sense to test it in JIRA 6.3.4+ because of the other fixes.

Changing the garbage collector to G1 helped one of our customers recently.
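For anyone trying Matt's suggestion: on a Tomcat-based JIRA install, the collector is switched via the JVM arguments in bin/setenv.sh. A sketch only; the pause-time target is illustrative, and any GC change should be load-tested before production use.

```shell
# In <jira-install>/bin/setenv.sh -- append G1 flags to the JVM arguments.
# -XX:+UseG1GC           switch from the default collector to G1
# -XX:MaxGCPauseMillis   soft pause-time goal (200 ms here is illustrative)
JVM_SUPPORT_RECOMMENDED_ARGS="${JVM_SUPPORT_RECOMMENDED_ARGS} -XX:+UseG1GC -XX:MaxGCPauseMillis=200"
```

JIRA needs a restart to pick up the new arguments, so this is not an option for instances that cannot take an outage window.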

Thanks, Matt, for your suggestion. Changing the garbage collector would also be a good alternative, but we can't use it without performance testing. As a quick solution, we are now disabling these functions (hasSubtasks & epicsWhereIssueIn) to avoid any future risk and will follow up with the vendor for a permanent solution.

JVM: 1.8
OS: Linux 2.6.32-279.el6.x86_64
JIRA version: 6.3.1 build 6329
JQL Tricks plugin: 5.3.3

Hi @Jobin Kuruvilla [Go2Group],

We also faced this issue last week when users executed the following JQL Tricks function:

key in epicsWhereIssueIn("")

Underlying slow JQL query log entries (multiple attempts):

2015-03-08 06:48:37,086 http-bio-8080-exec-15 INFO gkhan2 408x5190x27 ew14ri /issues/ [issue.search.providers.LuceneSearchProvider_SLOW] JQL query '' produced lucene query '*:*' and took '536' ms to run.
2015-03-08 06:48:37,175 http-bio-8080-exec-31 INFO gkhan2 408x5189x26 ew14ri /issues/ [issue.search.providers.LuceneSearchProvider_SLOW] JQL query '' produced lucene query '*:*' and took '1007' ms to run.
2015-03-08 06:48:37,175 http-bio-8080-exec-66 INFO gkhan2 408x5192x21 ew14ri /issues/ [issue.search.providers.LuceneSearchProvider_SLOW] JQL query '' produced lucene query '*:*' and took '735' ms to run.
2015-03-08 06:56:56,289 http-bio-8080-exec-24 INFO gkhan2 416x139x1 jehkjv /secure/QueryComponent!Jql.jspa [issue.search.providers.LuceneSearchProvider_SLOW] JQL query '' produced lucene query '*:*' and took '744' ms to run.
2015-03-08 06:57:05,864 http-bio-8080-exec-4 INFO gkhan2 417x141x2 jehkjv /rest/issueNav/1/issueTable [issue.search.providers.LuceneSearchProvider_SLOW] JQL query '' produced lucene query '*:*' and took '1736' ms to run.
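As a side note, slow-query lines like these can be summarized quickly to see which requests are worst. A sketch assuming the LuceneSearchProvider_SLOW line format shown above; the helper name is mine, and on a real instance you would pipe atlassian-jira.log through it instead of the sample lines.

```shell
# slow_ms: print the "took 'N' ms" durations of slow JQL queries on stdin,
# sorted ascending (format as in the LuceneSearchProvider_SLOW lines above).
slow_ms() {
  grep 'LuceneSearchProvider_SLOW' |
    sed -E "s/.*took '([0-9]+)' ms.*/\1/" |
    sort -n
}

# Example: feed it two lines from the excerpt above.
printf '%s\n' \
  "... [issue.search.providers.LuceneSearchProvider_SLOW] JQL query '' produced lucene query '*:*' and took '536' ms to run." \
  "... [issue.search.providers.LuceneSearchProvider_SLOW] JQL query '' produced lucene query '*:*' and took '1736' ms to run." |
  slow_ms
# prints 536 then 1736
```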

This hung the JVM (frequent full GCs occurred), and it crashed with an out-of-memory error after 5-7 minutes. Disabling certain functions may fix this issue, but do you have a list of the suspected JQL Tricks functions? Our JIRA instance has the following statistics:

Custom fields: 380

Users: 18,062 (10,578 currently active)

JIRA version: 6.3.12

JIRA Tomcat heap size: 16 GB

JQL Tricks plugin version: 5.3.3


Keep in mind, we cannot afford an unscheduled outage (not even a 15-minute restart), so we are looking for a fail-safe solution.

Please help.




Unfortunately, https://jira.atlassian.com/browse/JRA-39375 remains unresolved, so there is nothing that can be done on the plugin side to improve things. You might want to vote or comment on that ticket.

Thank you for pointing me to this bug. I have voted and started a conversation there. I will also take this issue up with Atlassian premium support to get exclusive focus on it, and I will reach out if we need your help to improve anything on the plugin side. Thanks once again.

Hi @Jobin Kuruvilla [Go2Group],

As an immediate next step, I want to disable the epicsWhereIssueIn("") function, but I did not find it on the JQL Tricks function configuration page (see the screenshot below). Can you please suggest how we can disable this function?


[Screenshot: JQL T function config.png]



This is something we can help with. Reach out to support at j-tricks dot com and I'm sure someone will respond.

I have logged the ticket and received a response explaining how we can disable these functions. Thank you for your support and guidance.
