6 Replies Latest reply on Jun 24, 2008 11:56 PM by dhinojosa

    Too many open files jeopardizing Seam application

    dhinojosa

      On my production server, out of nowhere, I get an error message 'Too many open files' and it knocks my server out for 10 minutes. I believe I have narrowed it down to this Jira Issue. I was just wondering if anyone else has been getting this issue with their Seam app + JBoss 4.2.2.GA, and if so, whether you have a workaround. Also, if you are getting this issue and it is bringing down your server once a day for 10 minutes or more, please post a comment on the issue stressing the importance of getting it fixed.


      Example of stack trace here:


      2008-06-08 13:29:07,284 ERROR [org.apache.tomcat.util.net.JIoEndpoint] Socket accept failed
      java.net.SocketException: Too many open files
              at java.net.PlainSocketImpl.socketAccept(Native Method)
              at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:384)
              at java.net.ServerSocket.implAccept(ServerSocket.java:453)
              at java.net.ServerSocket.accept(ServerSocket.java:421)
              at org.apache.tomcat.util.net.DefaultServerSocketFactory.acceptSocket(DefaultServerSocketFactory.java:61)
              at org.apache.tomcat.util.net.JIoEndpoint$Acceptor.run(JIoEndpoint.java:309)
              at java.lang.Thread.run(Thread.java:619)
      2008-06-08 13:29:07,285 ERROR [org.apache.tomcat.util.net.JIoEndpoint] Socket accept failed
      java.net.SocketException: Too many open files
              at java.net.PlainSocketImpl.socketAccept(Native Method)
              at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:384)
              at java.net.ServerSocket.implAccept(ServerSocket.java:453)
              at java.net.ServerSocket.accept(ServerSocket.java:421)
              at org.apache.tomcat.util.net.DefaultServerSocketFactory.acceptSocket(DefaultServerSocketFactory.java:61)
              at org.apache.tomcat.util.net.JIoEndpoint$Acceptor.run(JIoEndpoint.java:309)
              at java.lang.Thread.run(Thread.java:619)
      2008-06-08 13:29:08,375 WARN  [org.jboss.deployment.scanner.URLDeploymentScanner]
       Scan URL, caught java.io.IOException: Could not list directory '/usr/local/jboss-4.2.2.GA/server/default/deploy',
       reason unknown
      
      
      
      

        • 1. Re: Too many open files jeopardizing Seam application
          ctomc

          Hi,


          I have 2 applications in production on RHEL 4 with JBoss 4.2.2.GA and Seam 2.0.x and have no issues like that. The apps have been running with more than 2 months' uptime so far; the only downtime is publishing a new version of the app.


          When this error occurs, check what handles are open. If you are running the server on Linux you can get that with the lsof command; on Windows, google for the 'handle' command-line application.


          When you see who is keeping the handles, or what files are open, you can narrow the problem down to your application, JBoss AS, or Seam.
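

          For example, a minimal sketch on Linux (<jboss-pid> is a placeholder, not a value from this thread -- substitute your own JBoss PID, for instance from jps -l):


          # count the descriptors the JBoss process currently holds
          lsof -p <jboss-pid> | wc -l
          # group them by file name to see which handles are piling up
          lsof -p <jboss-pid> | awk '{print $NF}' | sort | uniq -c | sort -rn | head -20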


          But in any case post your results.


          cheers,
          tomaz

          • 2. Re: Too many open files jeopardizing Seam application

            I don't have this issue, but one thing that might help prevent it is reducing the scanning frequency of the deployment directory (see the sketch at the end of this reply).


            By default JBoss is configured 'optimized' for development, not production. I guess you need to buy the 'total package' from Red Hat to get a production-optimized profile; however, I'm not the best qualified to comment on that.


            However, the JBoss wiki has/had a page with tips & tricks for fine-tuning the JBoss profile (sorry, I have no time to look it up right now; maybe I'll come back to this later).
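

            As a rough sketch of the scan-frequency tweak (assuming the stock 4.2.x layout and that the URLDeploymentScanner's ScanPeriod attribute still defaults to 5000 ms -- check your own conf/jboss-service.xml and adjust the install path before running this):


            # back up the profile configuration first
            cd /usr/local/jboss-4.2.2.GA/server/default/conf
            cp jboss-service.xml jboss-service.xml.bak
            # raise the hot-deploy scan period from 5 s to 60 s (the value is in milliseconds)
            sed -i 's|<attribute name="ScanPeriod">5000</attribute>|<attribute name="ScanPeriod">60000</attribute>|' jboss-service.xml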

            • 3. Re: Too many open files jeopardizing Seam application
              toby.tobias.hill.gmail.com

              We had similar problems. They were only showing up on the Linux setups of my colleagues, not on my XP setup. I realized that the number of files allowed to be open simultaneously on their Ubuntu installations was rather low, only 1024. I sent them the following (which I frankly don't know the details of - I am not a Linux guy - but I do know how to google), which apparently made them happy: ulimit -n 10000


              So in short: Try increasing this if it is too low on your machine.
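

              A minimal sketch of checking and raising that limit (assuming bash on Linux; the 10000 is just the value quoted above, and the limits.conf lines are an assumption about making it permanent for a 'jboss' user):


              # show the current per-process open-file limit for this shell
              ulimit -n
              # raise it for this shell and anything started from it (e.g. JBoss)
              ulimit -n 10000
              # to make it permanent, add lines like these to /etc/security/limits.conf:
              #   jboss  soft  nofile  10000
              #   jboss  hard  nofile  10000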

              • 4. Re: Too many open files jeopardizing Seam application
                dhinojosa

                I'll let you know.....this sounds good.  Also, Tomaz's response led me in the right direction too.

                • 5. Re: Too many open files jeopardizing Seam application
                  toby.tobias.hill.gmail.com
                  • 6. Re: Too many open files jeopardizing Seam application
                    dhinojosa

                    So, anyone can try this in their Linux shell.  List your processes and find the main PID for JBoss using jps (the Java process status tool).


                    # jps -l
                    32218 sun.tools.jps.Jps
                    280 org.jboss.Main
                    



                    Then pass that PID to lsof (list open files):


                    # lsof -p 280 | wc -l
                    



                    You will see the number of open files rise dramatically depending on the popularity of your server.  I can tell ya, I have been looking at this for a while....that number ain't going down.  Resources are not being closed...BAD JBOSS (not Seam) PROGRAMMERS!


                    Now you can see the limit by doing a


                    # sysctl -a
                    



                    and look for the fs.file-max key, which is the number of file handles that can be opened and maintained by the Linux kernel.  That, I believe, is the overall limit on open files (can someone verify?).
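

                    For comparison, a quick sketch (the per-process limit set with ulimit is a separate ceiling from the kernel-wide fs.file-max, so it is worth checking both):


                    # kernel-wide ceiling on file handles
                    cat /proc/sys/fs/file-max
                    # allocated handles, unused handles, and the maximum
                    cat /proc/sys/fs/file-nr
                    # per-process limit for the shell that started JBoss
                    ulimit -n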


                    So what you have is a ticking time bomb: the number of open files held by JBoss, alongside the other well-behaved processes, will slowly (or rapidly) increase until fs.file-max is reached, at which time, I can only speculate, Linux will just say screw it and start killing off old file handles, which is why you will see 5-6 minutes of downtime.
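

                    If you want to watch that trend instead of guessing, a minimal monitoring sketch (assuming PID 280 as above and a writable /tmp; adjust both as needed):


                    # append a timestamped open-file count for the JBoss process once a minute
                    while true; do
                        echo "$(date '+%F %T')  $(lsof -p 280 | wc -l)" >> /tmp/jboss-open-files.log
                        sleep 60
                    done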