Question Memory Problems - DataTables

rapitts

Member
Joined
Sep 24, 2008
Messages
5
Programming Experience
10+
I'm getting a major headache using DataTables in VB.NET (Framework 2.0 and 3.5).

I'm basically reading some data from SQL into a DataTable, then passing that DataTable to another thread that writes the data out to a COM object, before clearing the rows and repeating the loop.

At the moment the application's memory usage climbs past 230 MB and at some point it crashes because it ran out of memory.

I'm sure the GC isn't cleaning up the DataTable or DataRows.


The code basically performs the following (is there another way to force the removal of the DataRows/DataTable from memory?):



VB.NET:
' Create an array of new rows, one per tag.
Dim Ddrows(aMsrTags.Length - 1) As DataRow
For ii = 0 To aMsrTags.Length - 1
    Ddrows(ii) = dtDataTable.NewRow()
Next

' Read from SQL and copy the data into the DataRows.

' Add the rows to the DataTable.
For ii = 0 To Ddrows.Length - 1
    dtDataTable.Rows.Add(Ddrows(ii))
Next
dtDataTable.AcceptChanges()

' Pass the DataTable to the worker thread. DumpDataTable merges
' dtDataTable into the thread's own table - maybe it's keeping a
' reference here and preventing a clean-up?
mythread.DumpDataTable(dtDataTable)

' Remove all the rows that were added.
dtDataTable.Rows.Clear()
dtDataTable.AcceptChanges()

' Force garbage collection.
GC.Collect()
GC.WaitForPendingFinalizers()
GC.Collect()

The process then repeats (this may occur 10,000+ times); on average there are about 5,000 records in the DataTable on each loop.
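To show what I mean about the merge possibly holding a reference, the worker side does something like this (the body of DumpDataTable here is a sketch from memory; _threadTable and WriteRowsToCom are stand-in names, not my actual code):

VB.NET:
' Merge copies the incoming rows into the worker's own table, so unless
' that table is cleared as well, every merged DataRow stays rooted even
' after the caller clears dtDataTable.
Public Sub DumpDataTable(ByVal incoming As DataTable)
    _threadTable.Merge(incoming)

    WriteRowsToCom(_threadTable)  ' stand-in for the actual COM output

    ' Without these two lines the merged rows accumulate forever.
    _threadTable.Rows.Clear()
    _threadTable.AcceptChanges()
End Sub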

I'm using Framework v2.0 and have even tried v3.5.

Any help please.....

Thanks

Ron
 
I've found my problem.

If I add up the number of bytes per row and multiply by the number of records, it comes to a few megabytes per loop; it's possible the GC isn't fast enough to clean up, causing the out-of-memory problems.
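As a rough sanity check with the numbers from this thread: 5,000 records × 7 rows each is 35,000 DataRows per loop, and at around 100 bytes of cell data per row (an assumed figure) that's roughly 3.5 MB allocated on every pass, before any DataRow object overhead.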

It looks like using a DataTable is out of the question, which is a pity. A memory profiler confirmed that a lot of objects (Double/Int32/DataRow) still exist.
 
How many records are you processing?

Also, try reusing a single DataRow variable in your loop instead of an array of multiple rows.

VB.NET:
Dim row As DataRow

For ii = 0 To aMsrTags.Length - 1
    ' Create, populate, and add one row at a time.
    row = dtDataTable.NewRow()
    row("FieldName1") = "blah1"
    row("FieldName2") = "blah2"

    dtDataTable.Rows.Add(row)
Next ii

dtDataTable.AcceptChanges()
 
Thanks for the reply.

It could be processing 500,000 records in total.

The problem is I'm converting each flat record into 7 data rows (hence the need for an array of rows). I don't think the GC can keep up with the clean-up operation in the background.
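Roughly what the fan-out looks like (the column names and the flatRecord lookup are made up for illustration, not my real schema):

VB.NET:
' Each flat record fans out into seven rows, one per measurement tag.
Dim tags() As String = New String() {"Tag1", "Tag2", "Tag3", "Tag4", "Tag5", "Tag6", "Tag7"}
For Each tag As String In tags
    Dim row As DataRow = dtDataTable.NewRow()
    row("TagName") = tag
    row("Value") = flatRecord(tag)   ' illustrative lookup on the flat record
    dtDataTable.Rows.Add(row)
Next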

I'm going to change some of the logic to use arrays and see how that goes.

I guess if I did something like dtDataTable = Nothing on each loop it might perform better, but that requires building the data structure up again.
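If dropping the table each loop does help, something like DataTable.Clone would avoid rebuilding the schema by hand (just a sketch):

VB.NET:
' Clone() copies the columns and constraints but no rows, so this swaps
' in an empty table with the same schema and lets the old table (and
' every DataRow it rooted) become collectable.
Dim fresh As DataTable = dtDataTable.Clone()
dtDataTable.Dispose()
dtDataTable = fresh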
 

Heyya rapitts:

500,000 records per import? Wow, that is a lot of records, but it still should not cause the memory leak you are seeing. Besides the memory, how long does this import processing take to complete?

I also work with a lot of data imports and understand the need to optimize speed and memory efficiency while processing them.

Can you put together a generic example project with dummy data that works against a test database (Northwind or something similar), plus a replica of your import file, and upload it?
 
For half a million rows, you shouldn't even be thinking about storing them in memory on the client machine. If client machines were good at storing half a million rows, why would we bother with servers with massive amounts of memory, and dedicated DB software like Oracle?

Your thread that is pumping data out to COM should be using a DataReader and pumping the data itself, one row at a time, from a firehose cursor. You should NOT be using a DataAdapter (a wrapped-up DataReader) to fill a DataTable before writing it out elsewhere. DataTables are for storing a low number of rows that will be displayed or edited and sent back to the DB. Dumps and reports should run WITHOUT caching on the client machine.
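Something along these lines (the connection string, query, and WriteToCom call are placeholders for your actual code):

VB.NET:
Imports System.Data.SqlClient

' Stream rows with a forward-only, read-only reader instead of caching
' them all in a DataTable; only one row is held in memory at a time.
Using conn As New SqlConnection(connectionString)
    conn.Open()
    Using cmd As New SqlCommand("SELECT TagName, Value FROM MsrData", conn)
        Using reader As SqlDataReader = cmd.ExecuteReader()
            While reader.Read()
                WriteToCom(reader.GetString(0), reader.GetDouble(1))
            End While
        End Using
    End Using
End Using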
 