Too many open files exception: reaching 16000 open files

Geoffrey_Jennings February 21, 2019

We are having to restart our Jira instance weekly to avoid maxing out the open file limit. When looking into it, we are seeing these open files in the lsof output:

[screenshot of lsof output showing many open pipes]

We are trying to see what is opening these pipes and/or how to stop them from just hanging.
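A quick way to watch how close a process is getting to its limit is to compare its descriptor count against the limit in /proc. This is a sketch, assuming Linux with a /proc filesystem; it defaults to the current shell's own PID for demonstration, so in a real check you would substitute the Jira JVM's PID (e.g. from `pgrep -f jira`, a hypothetical match pattern that may need adjusting for your setup):

```shell
#!/bin/sh
# Sketch: compare a process's open-descriptor count against its soft limit.
# PID defaults to this shell's own PID; pass the Jira JVM's PID instead.
PID=${1:-$$}

# Each entry in /proc/PID/fd is one open file descriptor
FD_COUNT=$(ls /proc/"$PID"/fd | wc -l)

# Field 4 of the "Max open files" line in /proc/PID/limits is the soft limit
FD_LIMIT=$(awk '/Max open files/ {print $4}' /proc/"$PID"/limits)

echo "PID $PID: $FD_COUNT open descriptors (soft limit $FD_LIMIT)"
```

Run periodically (e.g. from cron), this can show whether the descriptor count climbs steadily toward the limit or spikes suddenly.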

1 answer

1 vote
Andy Heinzer
Atlassian Team
Atlassian Team members are employees working across the company in a wide variety of roles.
February 21, 2019

We have seen this kind of performance problem frequently when Jira is running under the root account on Linux/Unix based systems. This is something we don't recommend, because we know it tends to cause performance problems like this one.

I would recommend creating a separate user to run Jira. See the KB: How to set the user 'jira' to run in Linux

It might also help to take a closer look at the KB Loss of functionality due to too many open files errors, as it offers some additional workarounds. However, even that KB notes that we do not recommend running Jira as root, for this very reason.
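One of the workarounds that KB discusses is raising the per-user open-file limit. As a rough sketch on a typical Linux system (the limit values below are purely illustrative, not a sizing recommendation):

```shell
#!/bin/sh
# Sketch: check the current soft open-file limit for the user running Jira.
ulimit -n

# On most Linux distributions the per-user limit can be raised in
# /etc/security/limits.conf (takes effect on the user's next login), e.g.:
#   jira  soft  nofile  16384
#   jira  hard  nofile  32768
```

Raising the limit buys headroom, but if descriptors are leaking (e.g. pipes never being closed), the new limit will eventually be hit as well, so diagnosing what holds the files open is still worthwhile.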

Noni Khutane September 6, 2019

We are facing the same issue and Jira is run by the Jira user.

Andy Heinzer
Atlassian Team
September 6, 2019

Hi @Noni Khutane 

In the original post we can see that Jira is running as root, and we know that doing this can contribute to hitting the open files limit prematurely.

In your case, I think there would still be value in trying to follow the steps in the KB Loss of functionality due to too many open files errors. I would be interested in learning some more details about the specifics of your environment such as:

  1. Which flavor of linux/unix is this?
  2. What version of the OS is this?
  3. What output do you get when running the following command as the jira user:
     lsof +L1 > open_files.txt

This command is also in the Diagnosis section of that KB. If you can share that output with us the next time this happens, we can better understand which files/locations are open when the limit is reached, and with more detail about your environment we can better guide the troubleshooting steps needed here.
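Once you have that dump, a quick way to see what dominates it is to tally the lsof columns. This is a sketch, assuming lsof is installed and writes the standard column layout (COMMAND in column 1, TYPE in column 5; pipes show up with TYPE FIFO):

```shell
#!/bin/sh
# Sketch: run the KB's diagnostic and summarize what is holding files open.
# Note: `lsof +L1` lists open files whose link count is below 1, i.e.
# files that were deleted while still held open.
lsof +L1 > open_files.txt 2>/dev/null || : > open_files.txt

# Top file types (column 5 of lsof output is TYPE; NR > 1 skips the header)
awk 'NR > 1 {print $5}' open_files.txt | sort | uniq -c | sort -rn | head

# Top owning commands (column 1 is COMMAND)
awk 'NR > 1 {print $1}' open_files.txt | sort | uniq -c | sort -rn | head
```

If one command or one file type (such as FIFO) accounts for most of the entries, that narrows down which component is leaking descriptors.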

Cheers,

Andy
