Memory Issues with enormous workbook

Sep 18, 2012 at 12:02 AM
Edited Sep 18, 2012 at 12:05 AM

I am creating a workbook with approximately 60 worksheets. Basically, I have medical data, and I am splitting it into different worksheets based on filters on the claims. The resulting workbook is 44 MB in .xlsx format; the total unzipped size is 363 MB.

As my process is running, I see the working-set memory usage climb to 6 GB. Why are we using almost 16 times as much memory as the size of the resulting spreadsheet? I am using the most recent version of ClosedXML (I downloaded a new version today). I shut off events during workbook creation (via EventTracking.Disabled). I am running in 64-bit mode. I am not sure what else I can do to cut down on usage. To check where the memory issue was, I commented out the spreadsheet creation logic and just ran through the database; my app never used more than about 50 MB of memory.
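For reference, disabling event tracking is done when the workbook is constructed. A minimal sketch, assuming the `XLEventTracking` constructor overload available in ClosedXML versions of this era (the file name and cell contents are illustrative only):

```csharp
using ClosedXML.Excel;

// Create the workbook with event tracking disabled, as mentioned above.
// This skips the bookkeeping ClosedXML normally does on every cell change.
using (var wb = new XLWorkbook(XLEventTracking.Disabled))
{
    var ws = wb.Worksheets.Add("Claims");
    ws.Cell(1, 1).Value = "ClaimId";
    wb.SaveAs("claims.xlsx");
}
```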

Is there any way to flush to disk every 1,000 records or so? As far as I can tell, I can only flush to disk when I am done; if I could do it incrementally, I think that would help. These worksheets are built somewhat randomly; in other words, I may write to any one of the dozens that are open. I could always redesign my app to do one at a time, but before I go through that effort, I would like to see what other options there are.
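One way to sketch the "one at a time" redesign mentioned above is to group the records by their destination worksheet first, then populate each sheet sequentially instead of hopping between dozens of open sheets. Note that ClosedXML still holds the whole workbook in memory until `SaveAs`, so this mainly removes the random-access pattern rather than the peak memory. The names here (`Claim`, `SheetKey`, `LoadClaims`) are hypothetical, not from the original post:

```csharp
using System.Linq;
using ClosedXML.Excel;

// Assumed: LoadClaims() returns an IEnumerable<Claim>, and each Claim
// has a SheetKey (the filter that decides which worksheet it belongs to).
var claims = LoadClaims();

using (var wb = new XLWorkbook(XLEventTracking.Disabled))
{
    // Build each worksheet completely before moving to the next one.
    foreach (var group in claims.GroupBy(c => c.SheetKey))
    {
        var ws = wb.Worksheets.Add(group.Key);
        int row = 1;
        foreach (var claim in group)
            ws.Cell(row++, 1).Value = claim.Amount;
    }
    wb.SaveAs("claims.xlsx");
}
```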

Thanks in advance for any advice you can give me.

Coordinator
Sep 18, 2012 at 4:21 AM

There isn't a memory issue.

"These spreadsheets are built somewhat randomly" That right there is the reason why 1Mb on disk != 1Mb on memory. All the niceties that the library provides comes at a price.

If you want the fastest way to write a file using the least amount of memory, then use SAX.
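In the .NET world, the SAX-style approach means bypassing ClosedXML and writing the worksheet XML forward-only with the Open XML SDK's `OpenXmlWriter`, so rows are streamed to disk instead of being held as objects. A minimal sketch (file name, sheet name, and row count are illustrative):

```csharp
using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

using (var doc = SpreadsheetDocument.Create("claims.xlsx", SpreadsheetDocumentType.Workbook))
{
    var workbookPart = doc.AddWorkbookPart();
    var worksheetPart = workbookPart.AddNewPart<WorksheetPart>();

    // Stream rows directly to the worksheet part; nothing accumulates in memory.
    using (var writer = OpenXmlWriter.Create(worksheetPart))
    {
        writer.WriteStartElement(new Worksheet());
        writer.WriteStartElement(new SheetData());
        for (int r = 1; r <= 100000; r++)
        {
            writer.WriteStartElement(new Row());
            writer.WriteElement(new Cell
            {
                DataType = CellValues.Number,
                CellValue = new CellValue(r.ToString())
            });
            writer.WriteEndElement(); // Row
        }
        writer.WriteEndElement(); // SheetData
        writer.WriteEndElement(); // Worksheet
    }

    // Register the streamed worksheet in the workbook.
    workbookPart.Workbook = new Workbook(
        new Sheets(new Sheet
        {
            Id = workbookPart.GetIdOfPart(worksheetPart),
            SheetId = 1,
            Name = "Claims"
        }));
    workbookPart.Workbook.Save();
}
```

The trade-off is exactly the one described above: you give up ClosedXML's conveniences (cell addressing in any order, styles, formulas-as-objects) in exchange for near-constant memory use.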