I am trying to perform a bulk load of data into an initially empty repository. With real-world data I need to load on the order of 1-5M nodes, but I am hitting an OutOfMemoryError after loading fewer than 100K nodes. My configuration is:
- Java 6, JVM: -Xms1024m -Xmx1024m -XX:MaxPermSize=512m
- JBoss AS 7.1
- ModeShape 3.1
- Berkeley DB cache loader for Infinispan
- Infinispan eviction enabled and max-entries set to 500
- An asynchronous stateless EJB, with Bean Managed Transactions, using an injected UserTransaction.
- ModeShape repository injected via @Resource
My code loops: it starts a transaction, creates 1000 nodes, calls session.save(), commits the transaction, and begins the next one. After a certain amount of time, an OutOfMemoryError is thrown (see the attached log from JBoss).
I have also tried a variation in which, in addition to the session.save() and transaction commit every 1000 nodes, I also call session.logout() and then log in again for the next batch. I get the same results.
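For reference, the batching loop described above looks roughly like this. This is a minimal sketch, not the full attached EJB: the JNDI name, the parent node, and the per-node payload are placeholders for what is in my real code.

```java
import javax.annotation.Resource;
import javax.ejb.Asynchronous;
import javax.ejb.Stateless;
import javax.ejb.TransactionManagement;
import javax.ejb.TransactionManagementType;
import javax.jcr.Node;
import javax.jcr.Repository;
import javax.jcr.Session;
import javax.transaction.UserTransaction;

@Stateless
@TransactionManagement(TransactionManagementType.BEAN)
public class BulkLoader {

    private static final int BATCH_SIZE = 1000;

    @Resource(mappedName = "java:/jcr/sample") // placeholder JNDI name
    private Repository repository;

    @Resource
    private UserTransaction utx;

    @Asynchronous
    public void load(int totalNodes) throws Exception {
        Session session = repository.login();
        try {
            Node parent = session.getRootNode();
            utx.begin();
            for (int i = 1; i <= totalNodes; i++) {
                parent.addNode("node" + i); // real node content omitted
                if (i % BATCH_SIZE == 0) {
                    session.save();  // persist this batch
                    utx.commit();
                    utx.begin();     // start the next transaction
                }
            }
            session.save();          // persist any remainder
            utx.commit();
        } finally {
            session.logout();
        }
    }
}
```

The logout/login variation simply replaces the commit step with a session.logout() followed by a fresh repository.login() before the next batch begins.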
Increasing the heap just delays the OOM. Watching in jconsole, I see the memory consumption (PS Old Gen) increase steadily throughout the load.
The code for the EJB is attached.
Am I doing something wrong, or is there a memory leak? If you like, I can create a JIRA issue and attach my EJB and complete configuration so the problem can be reproduced.