You're trying to materialise too much data. In this instance, retrieving 36000 entities is going to break something quickly, particularly under load. Is there any particular reason you need all that data retrieved at once?
One thing you can do is use a session bean as a helper that retrieves only a reference (the primary key) to each row in the table, and then look up the entity by that reference when you actually need to use it. Is that possible in your case?
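To illustrate the idea, here is a minimal plain-Java sketch of the "keys first, entities on demand" pattern. The class and method names are hypothetical, not from any EJB API; in a real EJB 2.x version the key list would come from a finder that selects only primary keys, and the loader would call the home interface's findByPrimaryKey.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Sketch: hold cheap primary keys for the whole table, but only
// materialise an entity when a caller actually asks for that row.
class LazyEntityList<K, E> {
    private final List<K> keys;           // primary keys only - cheap to hold
    private final Function<K, E> loader;  // loads one entity when asked
    private final Map<K, E> cache = new HashMap<>();

    LazyEntityList(List<K> keys, Function<K, E> loader) {
        this.keys = keys;
        this.loader = loader;
    }

    int size() {
        return keys.size();
    }

    // Materialise (and cache) a single row on demand instead of
    // loading all 36000 up front.
    E get(int i) {
        return cache.computeIfAbsent(keys.get(i), loader);
    }
}
```

With this shape, iterating the full key list costs almost nothing, and only the rows you touch are ever loaded from the database.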
I guess I could do the session bean thing, but I want to avoid it if I can. For some of the operations I'm doing, I need to go against the whole dataset, so I can't avoid retrieving it all.
Currently, I have this working in Orion and I'm not having any problems at all with Oracle. I guess what I'm trying to figure out is what JBoss is doing that blows up the Oracle shared memory pool. 36000 rows is nothing to Oracle.
Also, I have this working fine with JBoss and Firebird, so I just can't figure out what it is with Oracle. I'm having a hard time finding anybody else who has this problem, so I imagine it's something I'm doing. Or am I wrong and this is a known issue? Thanks
OK. If you absolutely need to do it, then you will need to tune Oracle for load performance. I've heard rules of thumb that say your SGA should be 1/3 of your RAM, or 1/2 if you can manage it - so 400 MB is minuscule. I have seen Sun e-commerce configurations with 2 GB allocated. On some boxen, you will also need to adjust your kernel parameters.
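As a rough illustration of what that tuning looks like (the values below are examples only, not recommendations - size them to your own RAM), the relevant knobs live in Oracle's init.ora, and on some systems in the kernel's shared-memory limits:

```
# init.ora fragment - example values, not a recommendation
shared_pool_size = 134217728    # 128 MB shared pool
db_block_buffers = 16384        # buffer cache, counted in database blocks
sort_area_size   = 1048576      # 1 MB per-session sort area

# /etc/system on Solaris - raise the max shared memory segment
# so a larger SGA can be allocated in one piece
set shmsys:shminfo_shmmax = 1073741824
```

A restart of the instance (and, for the kernel setting, a reboot) is needed for these to take effect.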
A good book on tuning will come in handy - I guess this is why we have a demand for Oracle DBAs. Your issue boils down to a memory fragmentation error in the shared pool as Oracle incrementally tries to allocate blocks. Unfortunately, with great power comes great responsibility, to quote a famous line.