
Creating Large Files - Any Suggestions?

Dec 11, 2012 at 5:51 PM

I have set up some reports utilizing ClosedXML to get my data into Excel files. My question, or topic for discussion, is: are there any recommendations you can make on working with and creating Excel files with large amounts of data? I have some reporting that could be creating sheets with, for example, 900k records. In some instances I will also need to create multiple tabs, so the large data easily becomes a concern.

My process puts the data into a DataTable from my SQL Server database, and here is an abbreviated excerpt (in VB) of how I'm creating the Excel workbook with ClosedXML:

' declarations
Private mCurrWorkBook As XLWorkbook
Private mCurrWorkSheet As IXLWorksheet
Private msSaveFileName As String = "Myrpt.xlsx"

' create the workbook (event tracking disabled to reduce overhead)
mCurrWorkBook = New XLWorkbook(XLEventTracking.Disabled)

' create the worksheet
mCurrWorkSheet = mCurrWorkBook.Worksheets.Add("MySheet")

' add the data to the worksheet
Dim rangeWithData = mCurrWorkSheet.Cell(3, 1).InsertData(MyDT.AsEnumerable())

' save the workbook to the server
PhysicalPath = HttpContext.Current.Server.MapPath(cRepDir & msSaveFileName)
mCurrWorkBook.SaveAs(PhysicalPath)

This can of course become an issue of possible memory problems when adding the data to the worksheet (specifically when running the "InsertData(MyDT.AsEnumerable())" statement), especially when multiple users are running reports on the site at the same time. Are there any better methods to populate the data to the worksheet? This is obviously building the workbook in memory and then saving it to the server hard disk; is it possible to build this directly on the server hard disk instead of in memory? Thanks for any feedback or suggestions you may have, and for creating such a great tool and making it available to us.

Dec 11, 2012 at 6:26 PM

It's always a tradeoff. If you want the fastest way to write a file using the least amount of memory, then use SAX.
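For anyone landing here later: "SAX" in this context usually refers to the streaming writer in the Open XML SDK (OpenXmlWriter), not a ClosedXML feature. A minimal VB sketch of that approach, assuming the DocumentFormat.OpenXml package and a DataTable of string values (the names `WriteLargeSheet`, `path`, and `dt` are made up for illustration), might look like this:

```vb
' Streams rows to disk one at a time instead of building the
' whole worksheet in memory (requires the DocumentFormat.OpenXml package).
Imports System.Data
Imports DocumentFormat.OpenXml
Imports DocumentFormat.OpenXml.Packaging
Imports DocumentFormat.OpenXml.Spreadsheet

Module SaxExport
    Sub WriteLargeSheet(path As String, dt As DataTable)
        Using doc = SpreadsheetDocument.Create(path, SpreadsheetDocumentType.Workbook)
            Dim wbPart = doc.AddWorkbookPart()
            Dim wsPart = wbPart.AddNewPart(Of WorksheetPart)()

            ' Stream the worksheet part row by row
            Using writer = OpenXmlWriter.Create(wsPart)
                writer.WriteStartElement(New Worksheet())
                writer.WriteStartElement(New SheetData())
                For Each dr As DataRow In dt.Rows
                    writer.WriteStartElement(New Row())
                    For Each item In dr.ItemArray
                        writer.WriteElement(New Cell() With {
                            .DataType = CellValues.String,
                            .CellValue = New CellValue(item.ToString())})
                    Next
                    writer.WriteEndElement() ' Row
                Next
                writer.WriteEndElement() ' SheetData
                writer.WriteEndElement() ' Worksheet
            End Using

            ' Wire the workbook part up to reference the streamed sheet
            Using writer = OpenXmlWriter.Create(wbPart)
                writer.WriteStartElement(New Workbook())
                writer.WriteStartElement(New Sheets())
                writer.WriteElement(New Sheet() With {
                    .Name = "MySheet",
                    .SheetId = 1UI,
                    .Id = wbPart.GetIdOfPart(wsPart)})
                writer.WriteEndElement() ' Sheets
                writer.WriteEndElement() ' Workbook
            End Using
        End Using
    End Sub
End Module
```

The tradeoff mentioned above is real: you lose ClosedXML's convenient object model (styles, ranges, formulas) and have to emit the raw OpenXML elements yourself, but memory use stays roughly constant no matter how many rows you write.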