7 Replies Latest reply on Dec 16, 2019 10:54 AM by ctomc

    Possible file handle leak in WildFly 12+

    garethwilson

      Hi All,

       

      I am trying to identify if behaviour that I am observing is correct or if WildFly is leaking file handle descriptors.

       

      During our standard performance testing after upgrading from WildFly 11 to 14, we ran into an issue with too many open files. After digging into things a bit more, it looks like it is actually the number of pipes that WildFly has open that is increasing.

       

      To help reproduce the problem, I have created a simple JSF 2.2 application that contains a large image (100 MB, to simplify testing). I can make this available if required.

       

      I am retrieving the image using the standard JSF resource URL: /contextroot/javax.faces.resource/css/images/big-image.png.xhtml

      And have also tried adding omnifaces and using the unmapped resource handler URL: /contextroot/javax.faces.resource/css/images/big-image.png

       

      Adding Omnifaces did not change the behaviour I am seeing, and I have only included it as we first thought that it might have been a contributing factor.

       

       

      Behaviour I am seeing

      • WildFly starts and jstack reports that it has two threads matching "default task-*", two being the default value for task-core-threads
      • If I send in 5 concurrent requests for my large image, 3 new "default task-*" threads are spawned to serve the requests, and 3 new Linux pipes are also created (see the reproduction sketch after this list).
      • If I stop my requests and wait for 2 minutes (the default value for task-keepalive), 3 of the threads will be removed. The pipes remain open.
      • Periodically - I believe about every 4.5 minutes - some kind of clean-up occurs and the pipes that were left over from the step above are removed.
      • However... if one of the original 2 worker threads is removed, e.g. task-1, task-3 and task-4 are removed, leaving task-2 and task-5, the pipe associated with task-1 is never cleaned up.
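
      For reference, a minimal sketch of how the load can be driven and the pipes counted, assuming a Linux host with curl, lsof and pgrep available (the host/port and the jboss-modules.jar PID lookup are illustrative, not part of the attachments):

      # Fire 5 concurrent requests for the large image in my sample application
      for i in 1 2 3 4 5; do
        curl -s -o /dev/null "http://localhost:8080/contextroot/javax.faces.resource/css/images/big-image.png.xhtml" &
      done
      wait

      # Count the pipe descriptors currently held by the WildFly process
      WILDFLY_PID=$(pgrep -f jboss-modules.jar | head -n 1)
      lsof -p "$WILDFLY_PID" | grep -c pipe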

       

      Over time these pipes add up and as far as I can tell they are never removed. Is this a leak somewhere, and if so where? JSF? WildFly? Undertow?

       

       

      Things I have tried

      • I have used this tool to try and identify the file leaks: http://file-leak-detector.kohsuke.org/
      • WildFly 14, 17 and 18 - All exhibit the same behaviour
      • With and without Omnifaces (2.7 and 3.3) - Changes the stack trace from the file leak tool, but effectively the same behaviour
      • Changing the min and max threads to be the same (see the CLI sketch after this list) - this prevents handles building up, but I'd rather not go down this route
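
      For the last item, pinning the thread counts can be done with jboss-cli along these lines; a minimal sketch, where the worker name "default" and the value 16 are illustrative:

      # Pin core and max task threads of the default io worker to the same value, then reload
      $JBOSS_HOME/bin/jboss-cli.sh -c --command="/subsystem=io/worker=default:write-attribute(name=task-core-threads, value=16)"
      $JBOSS_HOME/bin/jboss-cli.sh -c --command="/subsystem=io/worker=default:write-attribute(name=task-max-threads, value=16)"
      $JBOSS_HOME/bin/jboss-cli.sh -c --command="reload"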

       

       

      Additional Information

       

      I am running WildFly on CentOS 7.7; we have seen the same behaviour on RHEL 7.7.

      Java version is openjdk version "1.8.0_222"

       

      I have attached

      • lsof output of a freshly restarted WildFly 18 with my sample application (no omnifaces) deployed
      • lsof output after a few cycles of 5 concurrent requests, a pause of a few minutes, then another 5 concurrent requests. I waited 5 minutes after the test to let any clean-up happen.
      • A sample stacktrace from the file-leak-detector tool
      • jstack output showing which threads are still active after the cycles of requests

       

      Thanks!

        • 1. Re: Possible file handle leak in WildFly 12+
          jaikiran

          Hello Gareth,

           

          Welcome to the forums.

           

          I think it's better to repost that question here. Additionally, please also include the output of the "lsof" command against this process to show us which file handles seem to be leaking.

          • 2. Re: Possible file handle leak in WildFly 12+
            jaikiran

            Also, it would be very helpful if you could test and get the results (of lsof) against WildFly 18, since that's the latest released version, which makes it easier for us to debug and compare against the code.

            • 3. Re: Possible file handle leak in WildFly 12+
              smarlow

              It sounds like you're seeing some resources being used and possibly kept open for reuse (e.g. a socket is kept open), which is typical + expected behavior. If you put a constant load on the server for a long period of time and get out-of-memory or out-of-file-handle failures, that would likely be a leak.

               

              I see nothing interesting in the after.txt; it is easier to read if you sort on the name column. I am attaching the after.txt sorted with just the name column (because that was easy to do with "vim" + "sort"). I was looking to see whether any particular jars are loaded repeatedly. I do see that most jars are mentioned twice, which doesn't seem likely to indicate a leak.
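
              For anyone wanting to repeat that, a quick sketch that pulls out the NAME column and counts repeats (assuming after.txt is the raw lsof output attached above):

              # Print just the NAME column (last field), then count how often each name appears
              awk '{print $NF}' after.txt | sort | uniq -c | sort -rn | head -n 20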

               

              I don't see much difference between fresh.txt + after.txt.

               

              I also don't see much of interest in threads.txt or stack-trace.txt.

               

              Scott

              • 4. Re: Possible file handle leak in WildFly 12+
                garethwilson

                Hi Scott,

                 

                I initially thought that it was caching of resources, and that maybe with the new threading implementation that was added in WildFly 12 we would just need to tune our ulimit (currently set to 4096 files for the user that runs WildFly).
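
                As a side note, the open-file limit that actually applies to the running process can be confirmed directly; a small sketch, assuming Linux procfs and using jboss-modules.jar as one way to find the WildFly PID:

                # Show the limit in effect for the running WildFly process (may differ from the shell's ulimit)
                WILDFLY_PID=$(pgrep -f jboss-modules.jar | head -n 1)
                grep "Max open files" /proc/$WILDFLY_PID/limits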

                 

                I agree with you that the open files all look pretty normal; however, I think you might have missed my pointer towards it being the number of pipes that is growing.

                 

                The before file shows 14 open pipes (2 handles per pipe)

                cat fresh.txt | grep pipe | wc -l

                28

                The after file shows 19 open pipes

                cat after.txt | grep pipe | wc -l

                38

                 

                That was after only about 15 minutes of me sending load at my sample application.

                When we run our regression suite (a Selenium-grid-based setup with 20 runner threads), we see pipes increasing at a rate of about 2 or 3 a minute over the one-hour duration of the test.
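
                A rough sketch of how that growth can be tracked over a run (the one-minute interval and the PID lookup are illustrative):

                # Sample the pipe count of the WildFly process once a minute for an hour
                WILDFLY_PID=$(pgrep -f jboss-modules.jar | head -n 1)
                for i in $(seq 1 60); do
                  echo "$(date '+%H:%M:%S') $(lsof -p $WILDFLY_PID | grep -c pipe)"
                  sleep 60
                done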

                 

                Cheers,

                Gareth

                • 5. Re: Possible file handle leak in WildFly 12+
                  gjaekel

                  Dear Gareth,

                  It looks like I'm facing the same issue; see File handle (pipe/selector) pseudo-leak: the triples of two pipes and one selector grow without limit until the next run of the finalisation task of the garbage collector.
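
                  To make that connection visible, one can ask the JVM for a GC and re-count the descriptors; a small sketch, assuming the JDK's jcmd is available (the PID lookup is illustrative):

                  # Count pipes, request a GC (finalization follows), then count again
                  WILDFLY_PID=$(pgrep -f jboss-modules.jar | head -n 1)
                  lsof -p $WILDFLY_PID | grep -c pipe
                  jcmd $WILDFLY_PID GC.run
                  sleep 5
                  lsof -p $WILDFLY_PID | grep -c pipe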

                   

                  greetings

                  Guido

                  • 6. Re: Possible file handle leak in WildFly 12+
                    tshah.gr

                    Hi All ,

                     

                    I am facing the same issue. My OS is CentOS 7 and I am using WildFly 18. Although I have increased the file limit to 65k, after 10 to 15 hours WildFly fails with an exception such as:

                     

                    Caused by: java.nio.file.FileSystemException: /opt/wildfly/standalone/deployments: Too many open files in system
                        at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
                        at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
                        at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
                        at sun.nio.fs.UnixFileSystemProvider.newDirectoryStream(UnixFileSystemProvider.java:427)
                        at java.nio.file.Files.newDirectoryStream(Files.java:589)
                        at org.jboss.as.server.deployment.scanner.FileSystemDeploymentService.listDirectoryChildren(FileSystemDeploymentService.java:1358)
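
                    Note that the "Too many open files in system" wording usually points at the kernel-wide file table (fs.file-max) being exhausted rather than just the per-process limit; a quick check, assuming a standard Linux /proc layout:

                    # System-wide handle usage: allocated handles, free handles, and the fs.file-max ceiling
                    cat /proc/sys/fs/file-nr
                    sysctl fs.file-max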

                    • 7. Re: Possible file handle leak in WildFly 12+
                      ctomc

                      First, check what files are open that are eating up your file limit.

                      On Linux, use the "lsof" command, which will tell you which process has which open file handles.

                      From there we can see what files are open and what the reason might be.
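
                      For example, something along these lines groups the open descriptors of the WildFly process by lsof TYPE so the biggest consumer (e.g. FIFO/pipe entries) stands out; the PID lookup is illustrative:

                      # Group the process's open descriptors by the lsof TYPE column and count them
                      WILDFLY_PID=$(pgrep -f jboss-modules.jar | head -n 1)
                      lsof -p $WILDFLY_PID | awk '{print $5}' | sort | uniq -c | sort -rn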