-
1. Re: Teiid is throwing java.lang.OutOfMemoryError while loading maximum data.
shawkins Jul 2, 2019 9:51 AM (in response to pranitag)
> We have also tried to increase values of below teiid properties
Generally you don't need to adjust properties until you have identified the root cause of the issue.
> As we have checked, it's working with 16 GB RAM but not working with 7 GB RAM.
Did you capture a heap dump on the out-of-memory error?
> Would you please provide a generic solution for this issue?
See Diagnosing Issues · GitBook - there unfortunately isn't a simple switch to toggle. You will need to start with the heap dump to determine what is holding the memory. If it is, for example, a driver, then you'll need to see what can be done to address the issue there. If it's Teiid, then the query plan and probably a debug log will be the next step.
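For reference, the JVM can write the heap dump for you when the error happens, using the standard HotSpot flags -XX:+HeapDumpOnOutOfMemoryError and -XX:HeapDumpPath=/path/to/dumps (the path is just a placeholder). For a server that is already running, jmap can capture one on demand, e.g. jmap -dump:live,format=b,file=heap.hprof <pid>. The resulting .hprof file can then be opened in a tool such as Eclipse MAT or VisualVM to see what is actually holding the memory.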
-
2. Re: Teiid is throwing java.lang.OutOfMemoryError while loading maximum data.
pranitag Jul 22, 2019 9:09 AM (in response to shawkins)
Thanks Steven,
So we found the issue: it is caused by fetching a huge amount of data from the database, and Teiid goes out of memory.
We need a suggestion on how we can avoid this scenario.
We are using Teiid to fetch data from different DB sources. At any given time there might be 10 to 20 requests to fetch data from a source. Teiid goes out of memory in the following scenarios:
1. The source table size is huge.
- We tried to set the batch size in our application, but that is not helping. We use a batch size of 10K records (a simplified sketch of what we do is at the end of this post).
2. There are multiple concurrent requests for data extraction.
Please let us know how we can avoid the out-of-memory issue.
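To illustrate what we mean by batch size, here is a simplified sketch of the kind of extract we run. This is not our real code; the connection URL, credentials, and table name are placeholders only:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ExtractSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details - VDB name, host, port and credentials are illustrative only.
            String url = "jdbc:teiid:myVdb@mm://localhost:31000";
            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 Statement stmt = conn.createStatement()) {
                // Our "batch size": ask the driver to pull 10K rows per round trip
                // instead of materializing the whole result set at once.
                stmt.setFetchSize(10_000);
                try (ResultSet rs = stmt.executeQuery("SELECT * FROM source_schema.big_table")) {
                    while (rs.next()) {
                        // process/write one row at a time
                    }
                }
            }
        }
    }

Even with the fetch size set this way, we still hit the out-of-memory error when the source table is huge or when several extracts run at once.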
-
3. Re: Teiid is throwing java.lang.OutOfMemoryError while loading maximum data.
shawkins Jul 22, 2019 9:25 AM (in response to pranitag)
I'm sorry, there isn't any new information to go on here.
The attached log is very incomplete. It only shows that there is a GC overhead error - it does not show anything about the actively processing Teiid plans.
You need to identify what is holding the memory and/or provide a more complete log/reproducer. Please reread my last comment.
> We tried to set the batch size in our application, but that is not helping. We use a batch size of 10K records.
Again, until you identify what is occurring, changing configuration values is not likely to help. Increasing batch sizes, for example, may not be desirable with extremely wide results containing things like inline LOBs, as the working memory for even a single batch will be quite high.
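To put rough, purely illustrative numbers on that: if each row carries a 100 KB inline LOB, a single 10K-row batch already amounts to roughly 1 GB of working memory, before counting any concurrent queries.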