3 Replies Latest reply on Jun 26, 2014 2:44 PM by Alexandre Nikolov

    Error: "AsyncCacheWriter is dead!"

    Alfie Kirkpatrick Newbie

      Hi, I've started load testing our app and after some time I'm seeing entries like the following in the log:

       

      17-02-2014 06:28:04,859 ERROR [AsyncStoreCoordinator-sessions] org.infinispan.persistence.async.AsyncCacheWriter  - ISPN000055: Unexpected error in AsyncStoreCoordinator thread. AsyncCacheWriter is dead!

      org.infinispan.util.concurrent.TimeoutException: ISPN000233: Waiting on work threads latch failed: java.util.concurrent.CountDownLatch@6909506a[Count = 5]

              at org.infinispan.persistence.async.AsyncCacheWriter$AsyncStoreCoordinator.workerThreadsAwait(AsyncCacheWriter.java:297)

              at org.infinispan.persistence.async.AsyncCacheWriter$AsyncStoreCoordinator.run(AsyncCacheWriter.java:254)

              at java.lang.Thread.run(Thread.java:662)

       

      // more entries, about one every 10-20 seconds

       

      17-02-2014 07:42:57,724 ERROR [AsyncStoreCoordinator-audit] org.infinispan.persistence.async.AsyncCacheWriter  - ISPN000055: Unexpected error in AsyncStoreCoordinator thread. AsyncCacheWriter is dead!

      org.infinispan.util.concurrent.TimeoutException: ISPN000233: Waiting on work threads latch failed: java.util.concurrent.CountDownLatch@620ee765[Count = 25]

              at org.infinispan.persistence.async.AsyncCacheWriter$AsyncStoreCoordinator.workerThreadsAwait(AsyncCacheWriter.java:297)

              at org.infinispan.persistence.async.AsyncCacheWriter$AsyncStoreCoordinator.run(AsyncCacheWriter.java:254)

              at java.lang.Thread.run(Thread.java:662)

       

      This happens for a couple of the most heavily used caches. All the later entries show Count = 25.
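
      For reference, ISPN000233 is the async-store coordinator timing out on a java.util.concurrent.CountDownLatch that each worker thread is supposed to count down when it finishes; Count = 25 means none of the 25 workers ever did. A minimal JDK-only sketch of that failure state (the class and variable names here are mine, not Infinispan's):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class LatchTimeoutDemo {
    public static void main(String[] args) throws InterruptedException {
        int workers = 25; // one per async-store service thread
        CountDownLatch latch = new CountDownLatch(workers);

        // If no worker thread ever calls latch.countDown() -- e.g. all of
        // them are stuck writing to the store -- a timed await() expires
        // with the count unchanged, which is the state the log reports
        // as "CountDownLatch@...[Count = 25]".
        boolean allFinished = latch.await(100, TimeUnit.MILLISECONDS);
        System.out.println("allFinished=" + allFinished
                + " remaining=" + latch.getCount());
        // prints: allFinished=false remaining=25
    }
}
```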

       

      The caches are configured with large modification queues (200,000 entries for one, 100,000 for the other) and 25 service threads each. The concurrency level is set to 300.
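
      For reference, this is roughly how those settings are expressed with the programmatic API in 6.0.x. This is a sketch rather than my exact config: addSingleFileStore() is a stand-in here, since the real store behind it is Mongo.

```java
import org.infinispan.configuration.cache.Configuration;
import org.infinispan.configuration.cache.ConfigurationBuilder;

// Sketch of the async-store settings described above (Infinispan 6.0.x
// programmatic API); the single-file store is a placeholder.
Configuration cfg = new ConfigurationBuilder()
    .locking()
        .concurrencyLevel(300)
    .persistence()
        .addSingleFileStore()
            .async()
                .enable()
                .modificationQueueSize(200_000)
                .threadPoolSize(25)
    .build();
```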

       

      My app is also consuming all available heap over time, which I'm digging into, but these errors start appearing while there is still a reasonable amount of free heap.

       

      Any ideas or advice on what to look for would be appreciated! I'm going to try reducing the queue length and easing off the load a little to see if it helps.

       

      Many thanks, Alfie.

        • 1. Re: Error: "AsyncCacheWriter is dead!"
          Alfie Kirkpatrick Newbie

          Update: I disabled async mode and the memory consumption is now stable. Caches reach max entries quite quickly and heap stops growing at this point.

           

          Unfortunately my 99.9th percentile response time has also tanked (due to the Mongo store occasionally skipping a beat).

          • 2. Re: Error: "AsyncCacheWriter is dead!"
            Alfie Kirkpatrick Newbie

            I ended up rolling my own much simpler async cache writer implementation (with some additional write flow control I needed). I'd still like to know whether I hit a bug in ISPN or something else.
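
            For anyone curious, the core of it is nothing more than a bounded queue drained by a single writer thread -- producers block when the queue fills up, which is the flow control. A stripped-down, JDK-only sketch (names are illustrative, not my production code, and the backing map stands in for the real Mongo store):

```java
import java.util.AbstractMap;
import java.util.Map;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch of a flow-controlled async writer; class and
// method names are illustrative, not from Infinispan.
public class SimpleAsyncWriter<K, V> {
    private final BlockingQueue<Map.Entry<K, V>> queue;
    private final Map<K, V> store; // stand-in for the real backing store
    private final Thread drainer;
    private volatile boolean running = true;

    public SimpleAsyncWriter(Map<K, V> store, int capacity) {
        this.store = store;
        this.queue = new ArrayBlockingQueue<>(capacity);
        this.drainer = new Thread(() -> {
            try {
                // Keep draining until stop() is called AND the queue is empty.
                while (running || !queue.isEmpty()) {
                    Map.Entry<K, V> e = queue.poll(100, TimeUnit.MILLISECONDS);
                    if (e != null) {
                        store.put(e.getKey(), e.getValue());
                    }
                }
            } catch (InterruptedException ignored) {
                // shutting down
            }
        }, "simple-async-writer");
        this.drainer.start();
    }

    // Flow control: put() blocks when the queue is full, so callers slow
    // down instead of the modification queue growing without bound.
    public void write(K key, V value) throws InterruptedException {
        queue.put(new AbstractMap.SimpleImmutableEntry<>(key, value));
    }

    public void stop() throws InterruptedException {
        running = false;
        drainer.join();
    }

    public static void main(String[] args) throws InterruptedException {
        Map<String, String> backing = new ConcurrentHashMap<>();
        SimpleAsyncWriter<String, String> w = new SimpleAsyncWriter<>(backing, 1024);
        w.write("session:1", "alice");
        w.stop(); // drains the queue before returning
        System.out.println(backing.get("session:1")); // prints: alice
    }
}
```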

             

            Alfie.

            • 3. Re: Error: "AsyncCacheWriter is dead!"
              Alexandre Nikolov Newbie

              I am getting the same error, but in my case the heap usage is small - about 200 MB, which is right around the average usage for the application. And I am getting it on a cache that holds only about 500 key/value pairs, though some entries are updated very often - up to 10 times per second. I also use asynchronous writes.

              I am using Infinispan 6.0.2.Final.

               

              I am also using a cache listener that looks something like this:

              @Listener(sync = false)
              public class CacheListener {

                @CacheEntryModified
                public void entryUpdated(CacheEntryModifiedEvent event) {
                  if (!event.isPre()) {
                    // act after the update
                  }
                }
              }


              It appears that this exception does not affect application functionality.