Efficiency Regarding Accessing RowState Information

TerryNewton

I have a somewhat related question. I found the posted answers very informative and they resolved my immediate issue, but since I was already accessing the deleted rows in a different fashion, it led me to ask what the 'best' way of accessing the deleted rows is.
Recommended code:
VB.NET:
Dim dt As System.Data.DataTable = e.dtSeriesStandardFeature.GetChanges()

For x As Integer = 0 To dt.Rows.Count - 1 Step 1
    MsgBox(dt.Rows(x).Item("fuid_Series"))
Next

I used the following approach instead, pulling directly from the base DataTable:

VB.NET:
Dim DR As DataRow
For Each DR In TargetTable.Rows
    Select Case DR.RowState
        Case DataRowState.Deleted   'do delete
            DoDelete(DR)
        Case DataRowState.Added     'do add
            DoAdd(DR)
        Case DataRowState.Modified  'do update
            DoUpdate(DR)
    End Select
Next
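One gotcha when iterating the base table like this: a row whose RowState is Deleted no longer exposes its current values, so reading DR("fuid_Series") from it throws a DeletedRowInaccessibleException; you have to ask for the Original version explicitly. A minimal sketch of what DoDelete could do (DoDelete and the column name are just the placeholders from the snippet above):

VB.NET:
Private Sub DoDelete(ByVal DR As DataRow)
    ' A deleted row only exposes its Original version; reading the current
    ' value would throw a DeletedRowInaccessibleException.
    Dim key As Object = DR("fuid_Series", DataRowVersion.Original)
    MsgBox("Deleting row with key " & key.ToString())
End Sub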

The method recommended above (and in other places) accesses a separate DataTable built specifically for the deleted rows. The question is one of efficiency. What overhead is involved in using that separate DataTable? Is the table already in existence, with GetChanges() only returning a reference to it, or is it built on demand when GetChanges() is called?
 

I believe a tool called Reflector may answer your question. I can't answer it directly, but I propose that MS could solve the problem in one of two ways; as always, the trade-off is:

lean, but slow
fat, but fast

Suppose three separate lists were kept, holding references to the rows in a DataTable that are added, updated or deleted. The lists are kept up to date: every time a row's state changes it is shunted to the appropriate list. If we ask the DataTable for all the added rows, all it has to do is enumerate that list and shove every row into a new DataTable (see the sketch below).
This approach is fat, because it carries the memory overhead of the lists.
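Purely as an illustration of the "fat" bookkeeping idea (this is not how DataTable is actually implemented; the class and method names are made up):

VB.NET:
Imports System.Data
Imports System.Collections.Generic

' Illustration only: per-state lists are kept in sync as rows change,
' so asking for the changed rows is just a cheap copy at the end.
Public Class ChangeLists
    Private ReadOnly addedRows As New List(Of DataRow)
    Private ReadOnly modifiedRows As New List(Of DataRow)
    Private ReadOnly deletedRows As New List(Of DataRow)

    ' Call this whenever a row's state changes: shunt it to the right list.
    Public Sub RowStateChanged(ByVal row As DataRow)
        addedRows.Remove(row)
        modifiedRows.Remove(row)
        deletedRows.Remove(row)
        Select Case row.RowState
            Case DataRowState.Added : addedRows.Add(row)
            Case DataRowState.Modified : modifiedRows.Add(row)
            Case DataRowState.Deleted : deletedRows.Add(row)
        End Select
    End Sub

    ' "GetChanges" then just enumerates the ready-made list into a new table.
    Public Function GetAdded(ByVal schema As DataTable) As DataTable
        Dim result As DataTable = schema.Clone() ' same columns, no rows
        For Each row As DataRow In addedRows
            result.ImportRow(row)
        Next
        Return result
    End Function
End Class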

Suppose instead that each row retains only its own RowState, and when we ask for the rows with state X the entire table is scanned and a new one is built (again, sketched below).
Thus, if we ask in turn for all updated, then deleted, then added rows, it requires three full table scans.
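Again as an illustration only (a hand-rolled stand-in, not the real GetChanges), the "lean" version would rescan the whole table on every request:

VB.NET:
Imports System.Data

Module LeanScan
    ' Build a result table on demand by scanning every row and copying
    ' only the ones whose RowState matches the requested state.
    Function GetRowsInState(ByVal source As DataTable, ByVal state As DataRowState) As DataTable
        Dim result As DataTable = source.Clone() ' same schema, no rows
        For Each row As DataRow In source.Rows
            If row.RowState = state Then
                result.ImportRow(row) ' ImportRow keeps the row's state and versions
            End If
        Next
        Return result
    End Function
End Module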


I believe the approach MS adopted is the latter, because of the way everything "feels" given what I know. I also don't find table scanning to be particularly slow: it recently took me a few hundred milliseconds to iterate a table of more than 100,000 rows. A table of several million rows should take a low number of seconds (but if you're crazy enough to keep a several-million-row DataTable on a desktop PC client instead of a server with 16 gigs of RAM and 4 CPUs, then you need examining).

I can tell you that GetChanges() returns a DataTable containing all changed rows. If you supply GetChanges() with an argument specifying the desired states, you'll get a table containing only rows in the state(s) specified. I personally feel that other areas of a program are better starting points for profiling and streamlining, so I just say:
MyTableAdapter.Update(myOriginalDataTableWithChanges)

It saves the ballache of merging in changes that come back from the database, especially if those changes affect cascading relations.
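For completeness, the state-filtered form looks like this (dtSeriesStandardFeature, MyTableAdapter and myOriginalDataTableWithChanges are just the names used earlier in the thread):

VB.NET:
' GetChanges returns Nothing when there are no changes, so check before using it.
Dim deletedAndModified As DataTable = _
    dtSeriesStandardFeature.GetChanges(DataRowState.Deleted Or DataRowState.Modified)

If deletedAndModified IsNot Nothing Then
    MsgBox(deletedAndModified.Rows.Count & " rows deleted or modified")
End If

' Or skip GetChanges entirely and let the adapter work out each row's state.
MyTableAdapter.Update(myOriginalDataTableWithChanges)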
 