4 Replies Latest reply on Mar 14, 2013 11:35 AM by marcb

    TimeoutException, Unable to acquire lock on key

    marcb

      Hello

       

      While doing a clear() (via Hot Rod) on two caches, one with 1 million entries and the other with 2 million entries, both backed by a CassandraCacheStore, I get the following exception (the client call is sketched below, after the stack trace):

       

      2013-03-12 11:16:17,321 - ERROR - HotRodServerWorker-2 - [org.infinispan.interceptors.InvocationContextInterceptor:handleAll@148] - ISPN000136: Execution error
      org.infinispan.util.concurrent.TimeoutException: Unable to acquire lock after [10 seconds] on key [ByteArrayKey{data=ByteArray{size=16, hashCode=d55956e9, array=0x033e0d3132383935..}}] for requestor [Thread[OOB-6,AccountFinder,dl380x2382-8473,5,main]]! Lock held by [Thread[HotRodServerWorker-4,5,main]]
              at org.infinispan.util.concurrent.locks.LockManagerImpl.lock(LockManagerImpl.java:213)
              at org.infinispan.util.concurrent.locks.LockManagerImpl.acquireLockNoCheck(LockManagerImpl.java:196)
              at org.infinispan.interceptors.locking.AbstractLockingInterceptor.lockKey(AbstractLockingInterceptor.java:114)
              at org.infinispan.interceptors.locking.NonTransactionalLockingInterceptor.visitClearCommand(NonTransactionalLockingInterceptor.java:99)
              at org.infinispan.commands.write.ClearCommand.acceptVisitor(ClearCommand.java:60)
              at org.infinispan.interceptors.base.CommandInterceptor.invokeNextInterceptor(CommandInterceptor.java:118)
              at org.infinispan.interceptors.base.CommandInterceptor.handleDefault(CommandInterceptor.java:132)
              at org.infinispan.commands.AbstractVisitor.visitClearCommand(AbstractVisitor.java:77)
              at org.infinispan.commands.write.ClearCommand.acceptVisitor(ClearCommand.java:60)
              at org.infinispan.interceptors.base.CommandInterceptor.invokeNextInterceptor(CommandInterceptor.java:118)
              at org.infinispan.interceptors.base.CommandInterceptor.handleDefault(CommandInterceptor.java:132)
              at org.infinispan.commands.AbstractVisitor.visitClearCommand(AbstractVisitor.java:77)
              at org.infinispan.commands.write.ClearCommand.acceptVisitor(ClearCommand.java:60)
              at org.infinispan.interceptors.base.CommandInterceptor.invokeNextInterceptor(CommandInterceptor.java:118)
              at org.infinispan.statetransfer.StateTransferInterceptor.handleTopologyAffectedCommand(StateTransferInterceptor.java:216)
              at org.infinispan.statetransfer.StateTransferInterceptor.handleWriteCommand(StateTransferInterceptor.java:194)
              at org.infinispan.statetransfer.StateTransferInterceptor.visitClearCommand(StateTransferInterceptor.java:161)
              at org.infinispan.commands.write.ClearCommand.acceptVisitor(ClearCommand.java:60)
              at org.infinispan.interceptors.base.CommandInterceptor.invokeNextInterceptor(CommandInterceptor.java:118)
              at org.infinispan.interceptors.InvocationContextInterceptor.handleAll(InvocationContextInterceptor.java:128)
              at org.infinispan.interceptors.InvocationContextInterceptor.handleDefault(InvocationContextInterceptor.java:92)
              at org.infinispan.commands.AbstractVisitor.visitClearCommand(AbstractVisitor.java:77)
              at org.infinispan.commands.write.ClearCommand.acceptVisitor(ClearCommand.java:60)
              at org.infinispan.interceptors.InterceptorChain.invoke(InterceptorChain.java:343)
              at org.infinispan.commands.remote.BaseRpcInvokingCommand.processVisitableCommand(BaseRpcInvokingCommand.java:61)
              at org.infinispan.commands.remote.SingleRpcCommand.perform(SingleRpcCommand.java:70)
              at org.infinispan.remoting.InboundInvocationHandlerImpl.handleInternal(InboundInvocationHandlerImpl.java:101)
              at org.infinispan.remoting.InboundInvocationHandlerImpl.handleWithWaitForBlocks(InboundInvocationHandlerImpl.java:122)
              at org.infinispan.remoting.InboundInvocationHandlerImpl.handle(InboundInvocationHandlerImpl.java:86)
              at org.infinispan.remoting.transport.jgroups.CommandAwareRpcDispatcher.executeCommandFromLocalCluster(CommandAwareRpcDispatcher.java:247)
              at org.infinispan.remoting.transport.jgroups.CommandAwareRpcDispatcher.handle(CommandAwareRpcDispatcher.java:220)
              at org.jgroups.blocks.RequestCorrelator.handleRequest(RequestCorrelator.java:484)
              at org.jgroups.blocks.RequestCorrelator.receiveMessage(RequestCorrelator.java:391)
              at org.jgroups.blocks.RequestCorrelator.receive(RequestCorrelator.java:249)
              at org.jgroups.blocks.MessageDispatcher$ProtocolAdapter.up(MessageDispatcher.java:598)
              at org.jgroups.JChannel.up(JChannel.java:707)
              at org.jgroups.stack.ProtocolStack.up(ProtocolStack.java:1020)
              at org.jgroups.protocols.pbcast.STATE_TRANSFER.up(STATE_TRANSFER.java:178)
              at org.jgroups.protocols.FRAG2.up(FRAG2.java:181)
              at org.jgroups.protocols.FlowControl.up(FlowControl.java:400)
              at org.jgroups.protocols.pbcast.GMS.up(GMS.java:896)
              at org.jgroups.protocols.pbcast.STABLE.up(STABLE.java:245)
              at org.jgroups.protocols.UNICAST.up(UNICAST.java:414)
              at org.jgroups.protocols.pbcast.NAKACK2.handleMessage(NAKACK2.java:721)
              at org.jgroups.protocols.pbcast.NAKACK2.up(NAKACK2.java:574)
              at org.jgroups.protocols.BARRIER.up(BARRIER.java:126)
              at org.jgroups.protocols.VERIFY_SUSPECT.up(VERIFY_SUSPECT.java:143)
              at org.jgroups.protocols.FD.up(FD.java:253)
              at org.jgroups.protocols.FD_SOCK.up(FD_SOCK.java:288)
              at org.jgroups.protocols.MERGE2.up(MERGE2.java:205)
              at org.jgroups.protocols.Discovery.up(Discovery.java:359)
              at org.jgroups.protocols.TP.passMessageUp(TP.java:1263)
              at org.jgroups.protocols.TP$IncomingPacket.handleMyMessage(TP.java:1825)
              at org.jgroups.protocols.TP$IncomingPacket.run(TP.java:1798)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
              at java.lang.Thread.run(Unknown Source)
      2013-03-12 11:16:17,325 - DEBUG - HotRodServerWorker-2 - [org.infinispan.server.core.logging.Log$class:debug@50] - Exception caught
      org.infinispan.server.hotrod.HotRodException: org.infinispan.util.concurrent.TimeoutException: Unable to acquire lock after [10 seconds] on key [ByteArrayKey{data=ByteArray{size=16, hashCode=d55956e9, array=0x033e0d3132383935..}}] for requestor [Thread[OOB-6,AccountFinder,dl380x2382-8473,5,main]]! Lock held by [Thread[HotRodServerWorker-4,5,main]]
              at org.infinispan.server.hotrod.HotRodDecoder.createServerException(HotRodDecoder.scala:214)
              at org.infinispan.server.core.AbstractProtocolDecoder.decode(AbstractProtocolDecoder.scala:77)
              at org.infinispan.server.core.AbstractProtocolDecoder.decode(AbstractProtocolDecoder.scala:47)
              at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:500)
              at org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435)
              at org.infinispan.server.core.AbstractProtocolDecoder.messageReceived(AbstractProtocolDecoder.scala:387)
              at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
              at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
              at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
              at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:107)
              at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:313)
              at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:88)
              at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
              at java.lang.Thread.run(Unknown Source)
      Caused by: org.infinispan.util.concurrent.TimeoutException: Unable to acquire lock after [10 seconds] on key [ByteArrayKey{data=ByteArray{size=16, hashCode=d55956e9, array=0x033e0d3132383935..}}] for requestor [Thread[OOB-6,AccountFinder,dl380x2382-8473,5,main]]! Lock held by [Thread[HotRodServerWorker-4,5,main]]
              at org.infinispan.util.concurrent.locks.LockManagerImpl.lock(LockManagerImpl.java:213)
              at org.infinispan.util.concurrent.locks.LockManagerImpl.acquireLockNoCheck(LockManagerImpl.java:196)
              at org.infinispan.interceptors.locking.AbstractLockingInterceptor.lockKey(AbstractLockingInterceptor.java:114)
              at org.infinispan.interceptors.locking.NonTransactionalLockingInterceptor.visitClearCommand(NonTransactionalLockingInterceptor.java:99)
              at org.infinispan.commands.write.ClearCommand.acceptVisitor(ClearCommand.java:60)
              at org.infinispan.interceptors.base.CommandInterceptor.invokeNextInterceptor(CommandInterceptor.java:118)
              at org.infinispan.interceptors.base.CommandInterceptor.handleDefault(CommandInterceptor.java:132)
              at org.infinispan.commands.AbstractVisitor.visitClearCommand(AbstractVisitor.java:77)
              at org.infinispan.commands.write.ClearCommand.acceptVisitor(ClearCommand.java:60)
              at org.infinispan.interceptors.base.CommandInterceptor.invokeNextInterceptor(CommandInterceptor.java:118)
              at org.infinispan.interceptors.base.CommandInterceptor.handleDefault(CommandInterceptor.java:132)
              at org.infinispan.commands.AbstractVisitor.visitClearCommand(AbstractVisitor.java:77)
              at org.infinispan.commands.write.ClearCommand.acceptVisitor(ClearCommand.java:60)
              at org.infinispan.interceptors.base.CommandInterceptor.invokeNextInterceptor(CommandInterceptor.java:118)
              at org.infinispan.statetransfer.StateTransferInterceptor.handleTopologyAffectedCommand(StateTransferInterceptor.java:216)
              at org.infinispan.statetransfer.StateTransferInterceptor.handleWriteCommand(StateTransferInterceptor.java:194)
              at org.infinispan.statetransfer.StateTransferInterceptor.visitClearCommand(StateTransferInterceptor.java:161)
              at org.infinispan.commands.write.ClearCommand.acceptVisitor(ClearCommand.java:60)
              at org.infinispan.interceptors.base.CommandInterceptor.invokeNextInterceptor(CommandInterceptor.java:118)
              at org.infinispan.interceptors.InvocationContextInterceptor.handleAll(InvocationContextInterceptor.java:128)
              at org.infinispan.interceptors.InvocationContextInterceptor.handleDefault(InvocationContextInterceptor.java:92)
              at org.infinispan.commands.AbstractVisitor.visitClearCommand(AbstractVisitor.java:77)
              at org.infinispan.commands.write.ClearCommand.acceptVisitor(ClearCommand.java:60)
              at org.infinispan.interceptors.InterceptorChain.invoke(InterceptorChain.java:343)
              at org.infinispan.commands.remote.BaseRpcInvokingCommand.processVisitableCommand(BaseRpcInvokingCommand.java:61)
              at org.infinispan.commands.remote.SingleRpcCommand.perform(SingleRpcCommand.java:70)
              at org.infinispan.remoting.InboundInvocationHandlerImpl.handleInternal(InboundInvocationHandlerImpl.java:101)
              at org.infinispan.remoting.InboundInvocationHandlerImpl.handleWithWaitForBlocks(InboundInvocationHandlerImpl.java:122)
              at org.infinispan.remoting.InboundInvocationHandlerImpl.handle(InboundInvocationHandlerImpl.java:86)
              at org.infinispan.remoting.transport.jgroups.CommandAwareRpcDispatcher.executeCommandFromLocalCluster(CommandAwareRpcDispatcher.java:247)
              at org.infinispan.remoting.transport.jgroups.CommandAwareRpcDispatcher.handle(CommandAwareRpcDispatcher.java:220)
              at org.jgroups.blocks.RequestCorrelator.handleRequest(RequestCorrelator.java:484)
              at org.jgroups.blocks.RequestCorrelator.receiveMessage(RequestCorrelator.java:391)
              at org.jgroups.blocks.RequestCorrelator.receive(RequestCorrelator.java:249)
              at org.jgroups.blocks.MessageDispatcher$ProtocolAdapter.up(MessageDispatcher.java:598)
              at org.jgroups.JChannel.up(JChannel.java:707)
              at org.jgroups.stack.ProtocolStack.up(ProtocolStack.java:1020)
              at org.jgroups.protocols.pbcast.STATE_TRANSFER.up(STATE_TRANSFER.java:178)
              at org.jgroups.protocols.FRAG2.up(FRAG2.java:181)
              at org.jgroups.protocols.FlowControl.up(FlowControl.java:400)
              at org.jgroups.protocols.pbcast.GMS.up(GMS.java:896)
              at org.jgroups.protocols.pbcast.STABLE.up(STABLE.java:245)
              at org.jgroups.protocols.UNICAST.up(UNICAST.java:414)
              at org.jgroups.protocols.pbcast.NAKACK2.handleMessage(NAKACK2.java:721)
              at org.jgroups.protocols.pbcast.NAKACK2.up(NAKACK2.java:574)
              at org.jgroups.protocols.BARRIER.up(BARRIER.java:126)
              at org.jgroups.protocols.VERIFY_SUSPECT.up(VERIFY_SUSPECT.java:143)
              at org.jgroups.protocols.FD.up(FD.java:253)
              at org.jgroups.protocols.FD_SOCK.up(FD_SOCK.java:288)
              at org.jgroups.protocols.MERGE2.up(MERGE2.java:205)
              at org.jgroups.protocols.Discovery.up(Discovery.java:359)
              at org.jgroups.protocols.TP.passMessageUp(TP.java:1263)
              at org.jgroups.protocols.TP$IncomingPacket.handleMyMessage(TP.java:1825)
              at org.jgroups.protocols.TP$IncomingPacket.run(TP.java:1798)
              ... 3 more
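
      For reference, the client side is just a plain Hot Rod clear() call, roughly like this minimal sketch (host, port and cache name are placeholders for our environment):

      import org.infinispan.client.hotrod.RemoteCache;
      import org.infinispan.client.hotrod.RemoteCacheManager;

      public class ClearCaches {
          public static void main(String[] args) {
              // Placeholder address and cache name; we point this at one of the Hot Rod servers.
              RemoteCacheManager rcm = new RemoteCacheManager("dl380x2382", 11222);
              RemoteCache<Object, Object> cache = rcm.getCache("accountCache");
              cache.clear(); // this is the call that ends up as the ClearCommand in the stack trace above
              rcm.stop();
          }
      }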
      

       

      Each cache is configured as distributed and synchronous, with 2 owners.

      The cache store is configured as synchronous, and there are no locking parameters configured on the cache; that is, it is the default configuration for Infinispan 5.2.0.
      After some googling, it seems the issue should be solved by disabling lock striping - but according to the documentation, lock striping is disabled by default? A sketch of the relevant configuration follows.
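
      For completeness, the namedCache definition looks roughly like this (a sketch; the cache name is a placeholder and the Cassandra store properties are omitted):

      <namedCache name="accountCache">
         <clustering mode="distribution">
            <sync/>                  <!-- synchronous replication -->
            <hash numOwners="2"/>    <!-- 2 owners per entry -->
         </clustering>
         <!-- no <locking/> element, i.e. Infinispan 5.2.0 defaults -->
         <loaders>
            <!-- synchronous CassandraCacheStore configured here (properties omitted) -->
         </loaders>
      </namedCache>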

       

      There are 3 Infinispan nodes in the cluster. Sometimes one of them reports this error, sometimes two of them report it.

       

      Any ideas?

       

      EDIT: I get the same error with only 1 node in the cluster too...

        • 1. Re: TimeoutException, Unable to acquire lock on key
          marcb

          After explicitly setting useLockStriping=false, we no longer get the locking errors - I guess the default value is still "true"?
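
          For reference, the only change was to add an explicit locking element to the namedCache (a sketch):

          <locking useLockStriping="false"/>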

           

          Instead we get Cassandra timeouts, yay!
          But they might be solvable...

          • 2. Re: TimeoutException, Unable to acquire lock on key
            marcb

            Actually... cancel that. We still get the lock error sometimes, but not as often.

            • 3. Re: TimeoutException, Unable to acquire lock on key
              marcb

              Uhm, I'm getting the lock error only rarely now; it's hard to reproduce. So instead of clear() I decided to try retrieving the entire keySet and calling size() on it (sketched below)...
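
              Roughly, the client side of that workaround looks like this minimal sketch (host, port and cache name are placeholders for our environment):

              import java.util.Set;
              import org.infinispan.client.hotrod.RemoteCache;
              import org.infinispan.client.hotrod.RemoteCacheManager;

              public class CountEntries {
                  public static void main(String[] args) {
                      // Placeholder address and cache name, as in the earlier sketch.
                      RemoteCacheManager rcm = new RemoteCacheManager("dl380x2382", 11222);
                      RemoteCache<Object, Object> cache = rcm.getCache("accountCache");
                      Set<Object> keys = cache.keySet(); // runs the bulk "get all keys" MapReduce task on the server
                      System.out.println("entries: " + keys.size());
                      rcm.stop();
                  }
              }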

               

              And then I get the exception below... Is there any way to increase the timeout for the distributed tasks?

               

              2013-03-12 15:21:00,111 - DEBUG - HotRodServerWorker-2 - [org.infinispan.distexec.mapreduce.MapReduceTask$MapTaskPart:execute@925] - Invoking MapCombineCommand [keys=[], taskId=b85656ca-ff44-466f-b7e0-d859b70128ef] on dl380x2383-17555
              2013-03-12 15:21:00,113 - DEBUG - HotRodServerWorker-2 - [org.infinispan.distexec.mapreduce.MapReduceTask$MapTaskPart:execute@928] - Invoked MapCombineCommand [keys=[], taskId=b85656ca-ff44-466f-b7e0-d859b70128ef] on dl380x2383-17555
              2013-03-12 15:21:00,113 - DEBUG - HotRodServerWorker-2 - [org.infinispan.distexec.mapreduce.MapReduceTask$MapTaskPart:execute@925] - Invoking MapCombineCommand [keys=[], taskId=b85656ca-ff44-466f-b7e0-d859b70128ef] on dl380x2384-47864
              2013-03-12 15:21:00,113 - DEBUG - HotRodServerWorker-2 - [org.infinispan.distexec.mapreduce.MapReduceTask$MapTaskPart:execute@928] - Invoked MapCombineCommand [keys=[], taskId=b85656ca-ff44-466f-b7e0-d859b70128ef] on dl380x2384-47864
              2013-03-12 15:21:00,158 - DEBUG - transport-thread-6 - [org.infinispan.distexec.mapreduce.MapReduceTask$MapTaskPart:invokeMapCombineLocallyForLocalReduction@938] - Invoking MapCombineCommand [keys=[], taskId=b85656ca-ff44-466f-b7e0-d859b70128ef] locally
              2013-03-12 15:21:00,789 - DEBUG - transport-thread-6 - [org.infinispan.loaders.cassandra.CustomizedCassandraCacheStore:loadAllKeys@337] -
              2013-03-12 15:21:02,824 - DEBUG - Timer-3,AccountFinder,dl380x2382-4794 - [org.jgroups.protocols.FD$Monitor:run@425] - dl380x2382-4794: sending are-you-alive msg to dl380x2383-17555
              2013-03-12 15:21:05,825 - DEBUG - Timer-2,AccountFinder,dl380x2382-4794 - [org.jgroups.protocols.FD$Monitor:run@425] - dl380x2382-4794: sending are-you-alive msg to dl380x2383-17555
              2013-03-12 15:21:08,827 - DEBUG - Timer-4,AccountFinder,dl380x2382-4794 - [org.jgroups.protocols.FD$Monitor:run@425] - dl380x2382-4794: sending are-you-alive msg to dl380x2383-17555
              2013-03-12 15:21:11,827 - DEBUG - Timer-5,AccountFinder,dl380x2382-4794 - [org.jgroups.protocols.FD$Monitor:run@425] - dl380x2382-4794: sending are-you-alive msg to dl380x2383-17555
              2013-03-12 15:21:14,827 - DEBUG - Timer-2,AccountFinder,dl380x2382-4794 - [org.jgroups.protocols.FD$Monitor:run@425] - dl380x2382-4794: sending are-you-alive msg to dl380x2383-17555
              2013-03-12 15:21:15,123 - DEBUG - HotRodServerWorker-2 - [org.infinispan.server.core.logging.Log$class:debug@50] - Exception caught
              org.infinispan.CacheException: java.util.concurrent.ExecutionException: org.infinispan.CacheException: org.jgroups.TimeoutException: timeout sending message to dl380x2383-17555
                      at org.infinispan.distexec.mapreduce.MapReduceTask.execute(MapReduceTask.java:352)
                      at org.infinispan.distexec.mapreduce.MapReduceTask.execute(MapReduceTask.java:619)
                      at org.infinispan.server.hotrod.util.BulkUtil.getAllKeys(BulkUtil.java:55)
                      at org.infinispan.server.hotrod.AbstractEncoder1x.writeResponse(AbstractEncoder1x.scala:132)
                      at org.infinispan.server.hotrod.HotRodEncoder.encode(HotRodEncoder.scala:69)
                      at org.jboss.netty.handler.codec.oneone.OneToOneEncoder.doEncode(OneToOneEncoder.java:66)
                      at org.jboss.netty.handler.codec.oneone.OneToOneEncoder.handleDownstream(OneToOneEncoder.java:59)
                      at org.jboss.netty.channel.Channels.write(Channels.java:704)
                      at org.jboss.netty.channel.Channels.write(Channels.java:671)
                      at org.jboss.netty.channel.AbstractChannel.write(AbstractChannel.java:248)
                      at org.infinispan.server.core.AbstractProtocolDecoder.writeResponse(AbstractProtocolDecoder.scala:179)
                      at org.infinispan.server.hotrod.HotRodDecoder.customDecodeKey(HotRodDecoder.scala:160)
                      at org.infinispan.server.core.AbstractProtocolDecoder.decodeKey(AbstractProtocolDecoder.scala:121)
                      at org.infinispan.server.core.AbstractProtocolDecoder.decode(AbstractProtocolDecoder.scala:71)
                      at org.infinispan.server.core.AbstractProtocolDecoder.decode(AbstractProtocolDecoder.scala:47)
                      at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:500)
                      at org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435)
                      at org.infinispan.server.core.AbstractProtocolDecoder.messageReceived(AbstractProtocolDecoder.scala:387)
                      at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
                      at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
                      at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
                      at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:107)
                      at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:313)
                      at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:88)
                      at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
                      at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
                      at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
                      at java.lang.Thread.run(Unknown Source)
              Caused by: java.util.concurrent.ExecutionException: org.infinispan.CacheException: org.jgroups.TimeoutException: timeout sending message to dl380x2383-17555
                      at java.util.concurrent.FutureTask$Sync.innerGet(Unknown Source)
                      at java.util.concurrent.FutureTask.get(Unknown Source)
                      at org.infinispan.distexec.mapreduce.MapReduceTask$TaskPart.get(MapReduceTask.java:832)
                      at org.infinispan.distexec.mapreduce.MapReduceTask.executeMapPhaseWithLocalReduction(MapReduceTask.java:477)
                      at org.infinispan.distexec.mapreduce.MapReduceTask.execute(MapReduceTask.java:350)
                      ... 27 more
              Caused by: org.infinispan.CacheException: org.jgroups.TimeoutException: timeout sending message to dl380x2383-17555
                      at org.infinispan.util.Util.rewrapAsCacheException(Util.java:541)
                      at org.infinispan.remoting.transport.jgroups.CommandAwareRpcDispatcher.invokeRemoteCommand(CommandAwareRpcDispatcher.java:186)
                      at org.infinispan.remoting.transport.jgroups.JGroupsTransport.invokeRemotely(JGroupsTransport.java:515)
                      at org.infinispan.remoting.rpc.RpcManagerImpl.invokeRemotely(RpcManagerImpl.java:175)
                      at org.infinispan.remoting.rpc.RpcManagerImpl.invokeRemotely(RpcManagerImpl.java:197)
                      at org.infinispan.remoting.rpc.RpcManagerImpl.invokeRemotely(RpcManagerImpl.java:254)
                      at org.infinispan.remoting.rpc.RpcManagerImpl.access$000(RpcManagerImpl.java:80)
                      at org.infinispan.remoting.rpc.RpcManagerImpl$1.call(RpcManagerImpl.java:288)
                      at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
                      at java.util.concurrent.FutureTask.run(Unknown Source)
                      ... 3 more
              Caused by: org.jgroups.TimeoutException: timeout sending message to dl380x2383-17555
                      at org.jgroups.blocks.MessageDispatcher.sendMessage(MessageDispatcher.java:390)
                      at org.infinispan.remoting.transport.jgroups.CommandAwareRpcDispatcher.processSingleCall(CommandAwareRpcDispatcher.java:301)
                      at org.infinispan.remoting.transport.jgroups.CommandAwareRpcDispatcher.invokeRemoteCommand(CommandAwareRpcDispatcher.java:179)
                      ... 11 more
              2013-03-12 15:21:15,127 - ERROR - HotRodServerWorker-2 - [org.infinispan.server.core.logging.Log$class:logErrorBeforeReadingRequest@98] - ISPN005009: Unexpected error before any request parameters read
              org.infinispan.CacheException: java.util.concurrent.ExecutionException: org.infinispan.CacheException: org.jgroups.TimeoutException: timeout sending message to dl380x2383-17555
                      at org.infinispan.distexec.mapreduce.MapReduceTask.execute(MapReduceTask.java:352)
                      at org.infinispan.distexec.mapreduce.MapReduceTask.execute(MapReduceTask.java:619)
                      at org.infinispan.server.hotrod.util.BulkUtil.getAllKeys(BulkUtil.java:55)
                      at org.infinispan.server.hotrod.AbstractEncoder1x.writeResponse(AbstractEncoder1x.scala:132)
                      at org.infinispan.server.hotrod.HotRodEncoder.encode(HotRodEncoder.scala:69)
                      at org.jboss.netty.handler.codec.oneone.OneToOneEncoder.doEncode(OneToOneEncoder.java:66)
                      at org.jboss.netty.handler.codec.oneone.OneToOneEncoder.handleDownstream(OneToOneEncoder.java:59)
                      at org.jboss.netty.channel.Channels.write(Channels.java:704)
                      at org.jboss.netty.channel.Channels.write(Channels.java:671)
                      at org.jboss.netty.channel.AbstractChannel.write(AbstractChannel.java:248)
                      at org.infinispan.server.core.AbstractProtocolDecoder.writeResponse(AbstractProtocolDecoder.scala:179)
                      at org.infinispan.server.hotrod.HotRodDecoder.customDecodeKey(HotRodDecoder.scala:160)
                      at org.infinispan.server.core.AbstractProtocolDecoder.decodeKey(AbstractProtocolDecoder.scala:121)
                      at org.infinispan.server.core.AbstractProtocolDecoder.decode(AbstractProtocolDecoder.scala:71)
                      at org.infinispan.server.core.AbstractProtocolDecoder.decode(AbstractProtocolDecoder.scala:47)
                      at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:500)
                      at org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435)
                      at org.infinispan.server.core.AbstractProtocolDecoder.messageReceived(AbstractProtocolDecoder.scala:387)
                      at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
                      at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
                      at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
                      at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:107)
                      at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:313)
                      at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:88)
                      at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
                      at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
                      at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
                      at java.lang.Thread.run(Unknown Source)
              Caused by: java.util.concurrent.ExecutionException: org.infinispan.CacheException: org.jgroups.TimeoutException: timeout sending message to dl380x2383-17555
                      at java.util.concurrent.FutureTask$Sync.innerGet(Unknown Source)
                      at java.util.concurrent.FutureTask.get(Unknown Source)
                      at org.infinispan.distexec.mapreduce.MapReduceTask$TaskPart.get(MapReduceTask.java:832)
                      at org.infinispan.distexec.mapreduce.MapReduceTask.executeMapPhaseWithLocalReduction(MapReduceTask.java:477)
                      at org.infinispan.distexec.mapreduce.MapReduceTask.execute(MapReduceTask.java:350)
                      ... 27 more
              Caused by: org.infinispan.CacheException: org.jgroups.TimeoutException: timeout sending message to dl380x2383-17555
                      at org.infinispan.util.Util.rewrapAsCacheException(Util.java:541)
                      at org.infinispan.remoting.transport.jgroups.CommandAwareRpcDispatcher.invokeRemoteCommand(CommandAwareRpcDispatcher.java:186)
                      at org.infinispan.remoting.transport.jgroups.JGroupsTransport.invokeRemotely(JGroupsTransport.java:515)
                      at org.infinispan.remoting.rpc.RpcManagerImpl.invokeRemotely(RpcManagerImpl.java:175)
                      at org.infinispan.remoting.rpc.RpcManagerImpl.invokeRemotely(RpcManagerImpl.java:197)
                      at org.infinispan.remoting.rpc.RpcManagerImpl.invokeRemotely(RpcManagerImpl.java:254)
                      at org.infinispan.remoting.rpc.RpcManagerImpl.access$000(RpcManagerImpl.java:80)
                      at org.infinispan.remoting.rpc.RpcManagerImpl$1.call(RpcManagerImpl.java:288)
                      at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
                      at java.util.concurrent.FutureTask.run(Unknown Source)
                      ... 3 more
              Caused by: org.jgroups.TimeoutException: timeout sending message to dl380x2383-17555
                      at org.jgroups.blocks.MessageDispatcher.sendMessage(MessageDispatcher.java:390)
                      at org.infinispan.remoting.transport.jgroups.CommandAwareRpcDispatcher.processSingleCall(CommandAwareRpcDispatcher.java:301)
                      at org.infinispan.remoting.transport.jgroups.CommandAwareRpcDispatcher.invokeRemoteCommand(CommandAwareRpcDispatcher.java:179)
                      ... 11 more
              2013-03-12 15:21:17,577 - DEBUG - Timer-2,AccountFinder,dl380x2382-4794 - [org.jgroups.protocols.UNICAST:reapIdleConnections@845] - dl380x2382-4794: removed expired connection for dl380x2383-17555 (64962 ms old) from recv_table
              
              • 4. Re: TimeoutException, Unable to acquire lock on key
                marcb

                In case anyone has the same problem: the solution to the first problem was to explicitly set useLockStriping in namedCache/locking to false.
                The solution to the second problem was to increase replTimeout in namedCache/clustering/sync. Roughly, the cache definition ended up like the sketch below.
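
                A sketch of the final namedCache definition (the cache name and the replTimeout value are just examples; pick a timeout that covers the clear()/keySet() time on your data):

                <namedCache name="accountCache">
                   <clustering mode="distribution">
                      <sync replTimeout="60000"/>   <!-- raised from the default so the remote clear()/bulk-keys calls have time to finish -->
                      <hash numOwners="2"/>
                   </clustering>
                   <locking useLockStriping="false"/>
                   <loaders>
                      <!-- synchronous CassandraCacheStore configured here (properties omitted) -->
                   </loaders>
                </namedCache>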