I do have a similar problem; I will try to debug it as I need it for the file attachments in the forum.
Thanks for letting us know and confirming that bug (it's in the SourceForge bug tracker as well).
I found the problem: it's in the Apache library commons-upload.jar.
As we use the latest release, the bug is not fixed, so I did a quick and (not so) dirty fix so we can upload files without totally smashing the machine.
You can get the new jar file here.
Just replace the one in your $NUKES_HOME/thirdparty/apache-commons/lib with the one I am providing, and let me know if you see some improvement.
commons-fileupload.jar from Apache, of course (not commons-upload.jar as I wrote).
(I wish we were allowed to edit our posts...)
Thanks for your help, Thomas, but the new commons-fileupload.jar library doesn't resolve my problem. No improvement for me.
And I don't understand, because in the html module of Nukes, file upload works fine...
Do you see some improvement with your commons-fileupload library?
I do see a big improvement; the html module is my test case, by the way, and it allocates way too much memory for a form.
I used JProfiler to track the memory and realized that the Apache library was repeatedly allocating big amounts of memory. Looking into the source code, I saw that for each field of the form (any id, text field, file...) it was allocating the size of the threshold!
(the threshold indicates the size above which the library stores a field in a file instead of in memory)
Our threshold is pretty high because we don't want the files written to disk, so it was allocating way too much memory. For example, when I submitted a 50KB file to the html module, I got about a 55MB increase in memory; that's enough to make my computer swap a lot.
After the quick fix I still see a 20MB increase (much better, though still a lot).
If you have many fields in your form (and I repeat, ANY field counts), the memory allocated will be way too much.
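To put numbers on it: with a per-field pre-allocation equal to the threshold, the blow-up is just fields × threshold. A rough sketch (the ~4.5 MB threshold here is an assumption, not the actual Nukes setting):

```java
// Back-of-the-envelope for the memory blow-up described above.
// The threshold value is assumed; the real Nukes setting may differ.
public class UploadMemoryEstimate {
    public static void main(String[] args) {
        long threshold = 4500000;  // bytes pre-allocated per form field (assumed)
        int fields = 12;           // e.g. 6 file fields + 6 text fields
        long allocated = threshold * fields;
        // ~51 MB, in the same ballpark as the 55 MB increase reported above
        System.out.println(allocated / (1024 * 1024) + " MB");
    }
}
```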
After the fix I looked around some more and found someone complaining about the same thing and proposing a patch, but the Apache guys didn't want to use it...
I forgot to give the exact source change, so here it is:
In DeferredFileOutputStream there is this constructor:
public DeferredFileOutputStream(int threshold, File outputFile)
{
    this.outputFile = outputFile;
    memoryOutputStream = new ByteArrayOutputStream(threshold);
    currentOutputStream = memoryOutputStream;
}
That's where it allocates 'threshold' bytes whatever the size of the field. I just removed the threshold argument, so Java gives the default 32 bytes and allocates more only if needed.
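For clarity, here is a minimal, self-contained sketch of the fixed constructor (a standalone illustration, not the full Commons class; the field names follow the snippet above):

```java
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.OutputStream;

// Minimal sketch of the fix described above, not the real Commons class.
public class DeferredFileOutputStreamSketch {
    private final File outputFile;
    private final ByteArrayOutputStream memoryOutputStream;
    private final OutputStream currentOutputStream;

    public DeferredFileOutputStreamSketch(int threshold, File outputFile) {
        this.outputFile = outputFile;
        // Fixed: the no-arg constructor starts at the 32-byte default and
        // grows only if a field actually needs more, instead of
        // pre-allocating 'threshold' bytes for every single form field.
        memoryOutputStream = new ByteArrayOutputStream();
        currentOutputStream = memoryOutputStream;
    }

    public static void main(String[] args) {
        // Even a huge threshold no longer causes an up-front allocation.
        new DeferredFileOutputStreamSketch(50 * 1024 * 1024, new File("unused.tmp"));
        System.out.println("constructed without pre-allocating the threshold");
    }
}
```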
Indeed, the html module has the same problem.
In my form I have 6 file fields and 6 text fields! That's a lot!!
Is the only solution for me to limit the number of fields?
Because when I use the library you fixed, the problem is not solved for me.
I would like to thank you for your help!
I am surprised you see no improvement. Did you really recompile using the new library? (Do a clean build first.)
Yes, in that case 12 fields is a lot :)
You can also try to decrease the maximum size of a file; I think the default is 4 MB.
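For the record, commons-fileupload exposes this limit as setSizeMax() on the upload handler (the hidden MAX_FILE_SIZE form field mentioned later in the thread is a PHP convention that the Java library does not read). The check it performs boils down to something like this illustrative sketch (class and method names are mine, not the library's):

```java
// Illustrative only: what a server-side max-size check amounts to.
// In commons-fileupload the real enforcement is configured via setSizeMax().
public class SizeLimitSketch {
    static final long SIZE_MAX = 100 * 1024; // ~100 KB limit (assumed value)

    static boolean accept(long contentLength) {
        return contentLength <= SIZE_MAX;
    }

    public static void main(String[] args) {
        System.out.println(accept(50 * 1024)); // a 50 KB upload passes
        System.out.println(accept(5000000));   // a ~5 MB upload is rejected
    }
}
```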
OK, yesterday I recompiled with the new library (indeed, I had forgotten to recompile it). ;)
It seemed to work fine...
But I am surprised because today it doesn't work. :-( Indeed, no improvement in memory use. I don't understand.
Can you tell me how to decrease the maximum size of a file?
I already have this in my form:
<input type="hidden" name="MAX_FILE_SIZE" value="100000"/>
Thanks a lot in advance! :)
It's OK, it works fine again...
Pfffff... It's a hard fight between me and my computer!! ;)
But I still have my problem when I want to update a file in my database.
If someone can help me, that would be very kind...