Hello All,
I have to stream CSV data of up to 50,000 rows and around 500 columns, which can easily exceed 500 MB.
Can you tell me how to stream such a large data set, or is there a way to flush it in parts, say around 1,000 CSV rows at a time?
Regards,
Satya
Hi Satya,
There is an issue reported in JIRA about memory leaks with huge data streams: https://jira.jboss.org/browse/RF-7019. So the best approach is to create a servlet and redirect the download requests to it, writing the rows out in chunks and flushing periodically instead of building the whole response in memory.
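A minimal sketch of the chunked-flush idea, as plain Java so it is self-contained. The class and method names (CsvStreamer, streamCsv) and the placeholder cell data are mine, not from this thread. In an actual servlet you would set response.setContentType("text/csv"), add a Content-Disposition header, and pass response.getWriter() as the Writer, so that each flush() pushes a chunk of rows to the client instead of accumulating 500 MB server-side:

```java
import java.io.IOException;
import java.io.StringWriter;
import java.io.Writer;

public class CsvStreamer {

    // Writes CSV row by row, flushing every flushEvery rows so the
    // response is sent to the client in chunks rather than buffered whole.
    // In a servlet, 'out' would be response.getWriter().
    static void streamCsv(Writer out, int rows, int cols, int flushEvery)
            throws IOException {
        StringBuilder line = new StringBuilder();
        for (int r = 0; r < rows; r++) {
            line.setLength(0);                       // reuse the buffer per row
            for (int c = 0; c < cols; c++) {
                if (c > 0) line.append(',');
                line.append("cell_").append(r).append('_').append(c); // placeholder data
            }
            line.append('\n');
            out.write(line.toString());
            if ((r + 1) % flushEvery == 0) {
                out.flush();                         // push this chunk to the client
            }
        }
        out.flush();                                 // push any trailing partial chunk
    }

    public static void main(String[] args) throws IOException {
        // Demo against an in-memory writer; a servlet would use the response writer.
        StringWriter sw = new StringWriter();
        streamCsv(sw, 5000, 10, 1000);
        System.out.println(sw.toString().split("\n").length);
    }
}
```

With this pattern the servlet only ever holds one row (plus the container's output buffer) in memory, regardless of how many rows the export has; the flush interval of 1,000 rows matches the chunk size you asked about and can be tuned freely.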