16 Replies · Latest reply on Feb 26, 2013 7:16 PM by Randall Hauch
      • 15. Re: Webdav internal caches - how to disable them ?
        pjakub Newbie

        Yes, with a different storage location too.

        I either use file storage from the config, or:

            {
                "type" : "cache",
                "dataCacheName" : "dataCache",
                "metadataCacheName" : "metadataCache"
            }

        with the same or different locations.
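        For context, a fragment like the one above is the "binaryStorage" part of a ModeShape 3.x repository JSON configuration. A fuller sketch of the two alternatives being compared might look like the following (the cache names and the directory path are placeholders, not values taken from this thread):

        ```json
        "storage" : {
            "cacheName" : "persistentRepository",
            "binaryStorage" : {
                "type" : "file",
                "directory" : "target/binaries"
            }
        }
        ```

        versus storing binaries in Infinispan caches:

        ```json
        "storage" : {
            "cacheName" : "persistentRepository",
            "binaryStorage" : {
                "type" : "cache",
                "dataCacheName" : "dataCache",
                "metadataCacheName" : "metadataCache"
            }
        }
        ```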

        It's still not working as I expected: I need to perform a write before the changes become visible on the second node.

         

        Have you tried running my code?

        • 16. Re: Webdav internal caches - how to disable them ?
          Randall Hauch Master

          I just took your example to replicate MODE-1830, and I uploaded a new version with some changes and best practices for configuring clustered ModeShape and Infinispan in a JavaSE environment. Have a look to see if it helps.
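          For readers following along, clustering in a ModeShape 3.x repository configuration is enabled with a "clustering" section along the lines of the sketch below; the cluster name and the JGroups file name here are illustrative placeholders, not the values from the example referenced above:

          ```json
          "clustering" : {
              "clusterName" : "modeshape-cluster",
              "channelConfiguration" : "jgroups-config.xml"
          }
          ```

          The Infinispan caches backing the repository also need to be configured in a clustered mode (e.g., replicated) in their own configuration file, since the repository-level clustering setting alone does not make the caches talk to each other.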

           

          Note that MODE-1830 appears to be a problem with how ModeShape handled events originating from other nodes in the cluster. The core problem is that when changes were made in one process (say Process A), events were correctly sent across the cluster but the other processes (e.g., Processes B, C, ...) did not properly use the events to purge the changed items from the local cache. Obviously this means that changes did not become visible until other changes were made to the same nodes. Kind of a big deal. But interestingly, ModeShape still applied the changes correctly to the persistent storage -- IOW, no data was lost, which I think is why we didn't really see it in any of our testing. But I think it does explain many of the problems people have been having lately with clustering. See the issue for the details.
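           To make the failure mode concrete, here is a minimal sketch of the purge-on-remote-event pattern described above. The class and method names are illustrative, not ModeShape's actual internals: a node keeps a local cache, and when a change-set event arrives from another process it must evict the changed entries, otherwise readers keep seeing the stale values until something else touches those nodes.

           ```java
           import java.util.Map;
           import java.util.Set;
           import java.util.concurrent.ConcurrentHashMap;

           // Illustrative sketch (NOT ModeShape's real code): a per-process read
           // cache that must be purged when change events arrive from other
           // cluster members. The bug described above amounted to this purge
           // step not happening, so local reads stayed stale even though the
           // changes were correctly written to persistent storage.
           public class ClusterCacheListener {

               // Local read cache: node key -> cached content.
               private final Map<String, String> localCache = new ConcurrentHashMap<>();

               public void put(String key, String value) {
                   localCache.put(key, value);
               }

               public String get(String key) {
                   return localCache.get(key);
               }

               // Invoked for a change set originating in another process.
               // Evicting the changed keys forces the next read to go back to
               // persistent storage and see the new values.
               public void onRemoteChanges(Set<String> changedKeys) {
                   changedKeys.forEach(localCache::remove);
               }

               public static void main(String[] args) {
                   ClusterCacheListener node = new ClusterCacheListener();
                   node.put("/files/doc.txt", "v1");
                   // A remote process changes /files/doc.txt; its event arrives:
                   node.onRemoteChanges(Set.of("/files/doc.txt"));
                   // The stale entry is gone; the next read must re-fetch.
                   System.out.println(node.get("/files/doc.txt")); // prints "null"
               }
           }
           ```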

           

          The good news is that I've created a pull request and we should be able to merge this tomorrow AM. And we're ready to release 3.1.3.Final tomorrow, too, so if you're running into similar problems please give 3.1.3 a try as soon as possible.
