If you use the SQLDataAdapter and fetch 1000 rows to a dataset, you can make changes to the dataset locally, then a simple call to sqldataadapter.Update will apply all the changes for you.
Do not confuse the function call tableAdapter.Update() with the word UPDATE in SQL.
tableAdapter.Update() causes the supplied datatable to be enumerated. All rows that have a RowState of Added are passed to the internal adapter's INSERT command, all rows with a RowState of Modified are passed to its UPDATE command, and all rows with a RowState of Deleted are passed to its DELETE command.
This behaviour can be evidenced by removing the INSERT, UPDATE and DELETE commands from the adapter in design view, then attempting to pass in a series of datatables that contain only added, modified or deleted rows respectively. In each case an exception will be encountered stating that the relevant command is not defined.
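To make that concrete, here is a rough sketch of the behaviour. The table Customers, its Id and Name columns and the connection string are placeholders of mine, not anything from the original post, and I am letting a SqlCommandBuilder supply the three commands that the designer would normally generate:

using System.Data;
using System.Data.SqlClient;

class RowStateDemo
{
    static void Main()
    {
        // Placeholder connection string - substitute your own.
        using (SqlConnection conn = new SqlConnection("..."))
        {
            SqlDataAdapter da = new SqlDataAdapter("SELECT Id, Name FROM Customers", conn);
            // Generates the INSERT, UPDATE and DELETE commands from the SELECT.
            SqlCommandBuilder builder = new SqlCommandBuilder(da);

            DataTable table = new DataTable();
            da.Fill(table);                              // assumes at least two rows come back

            DataRow added = table.NewRow();
            added["Id"] = 999;                           // assumes Id is not an identity column
            added["Name"] = "New customer";
            table.Rows.Add(added);                       // RowState = Added    -> INSERT command

            table.Rows[0]["Name"] = "Renamed customer";  // RowState = Modified -> UPDATE command
            table.Rows[1].Delete();                      // RowState = Deleted  -> DELETE command

            da.Update(table);  // enumerates the rows and runs the matching command for each one
        }
    }
}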
What this call is doing I am not sure of, and I've never tried to add rows to the dataset.
You do not add rows to a dataset. A dataset is a group of datatables, possibly with relations between them. It is an intelligent collecting container for data storage containers; it is not in itself a data container. Please make yourself firm on this distinction, as it can be highly confusing for other developers if you use incorrect terminology.
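A small illustration of the distinction (the table and column names here are invented purely for the example):

using System.Data;

class DataSetVersusDataTable
{
    static void Main()
    {
        DataSet ds = new DataSet("Shop");

        DataTable customers = new DataTable("Customers");
        customers.Columns.Add("Id", typeof(int));
        customers.Columns.Add("Name", typeof(string));

        ds.Tables.Add(customers);          // a dataset collects datatables (and relations)
        customers.Rows.Add(1, "Alice");    // rows are added to a datatable, never to the dataset
        // ds.Rows does not exist: the dataset itself has no row collection of its own
    }
}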
My guess is that it is an update statement.
UPDATE is the SQL keyword for updating existing data; it cannot be used to insert data into a table. The command for inserting data into an SQL table starts with the SQL keyword INSERT.
But how this works with rows that are new or have been deleted, I'm not sure.
Again, the confusion here seems to arise from Microsoft's choice of the word Update() when what they mean is "persist this datatable to the back-end data storage". Update() causes a blend of INSERT, UPDATE and DELETE queries to be run, depending on the RowStates of the rows in the supplied datatable.
It may be acting as you said, making a single SQL statement for each action it has to take.
I can guarantee you that further investigation will arrive at the same conclusion as the one I present here.
After experience with other databases, I am still just shocked that SQL2000 doesn't have something a little more efficient. Even in mySQL there is the functionality to execute the statement:
'insert into table values(1,2,3),(2,3,4), (3,4,5)'
I do not believe your suspicions of inefficiency are well grounded or justified in this case. If that syntax is allowed in mySQL it is non-standard SQL and unlikely to find support from other vendors. Additionally, it is unlikely that such a statement will be an efficient approach. The statement could reach several megabytes or more in length, especially in the hands of an inexperienced programmer using basic string building techniques. A performance problem is hit before we have even passed the query to the database: the entire query must be sent and stored while the parser decodes the whole lot, compiles it and executes it. Given that the number of values tagged onto the list can vary, it may not be an optimizable query either, and will require interpretation each time it is run.
There are ways to replicate the statement you post using standard SQL:
INSERT INTO x
SELECT 1,2,3
UNION ALL
SELECT 2,3,4
UNION ALL
SELECT 3,4,5
but such an approach is inadvisable for the very same reasons I gave before: the query length, the database resources used in parsing, and the inability to cache the plan all contribute to this being a very lame query indeed.
You might also like to consider the fact that serial transmission is not slow; the IDE interface used by hard disks has grown from being a glorified parallel port into an interface that has reached its data transmission rate ceiling. The successor to IDE operates on a serial rather than a parallel methodology.
I presume you are pursuing this line of questioning because you have measured the elapsed time and recorded the client memory requirements in the following two scenarios:
start timing
open db connection
for a loop of 100,000 iterations
    insert a value into a table using a parameterized, prepared query
loop
close the connection
end timing

start timing
open db connection
for a loop of 100,000 iterations
    append another set of values to your single INSERT INTO x VALUES (value),(value),(value)...(100,000th value) statement
loop
execute the query
close the connection
end timing
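Purely as a sketch of the first scenario above (the table x, its single int column val and the connection string are placeholders I have invented; timing is done crudely with DateTime.Now):

using System;
using System.Data.SqlClient;

class TimedParameterisedInsert
{
    static void Main()
    {
        DateTime start = DateTime.Now;                    // start timing

        using (SqlConnection conn = new SqlConnection("..."))
        {
            conn.Open();                                  // open db connection

            SqlCommand cmd = new SqlCommand("INSERT INTO x (val) VALUES (@val)", conn);
            cmd.Parameters.Add("@val", System.Data.SqlDbType.Int);
            cmd.Prepare();                                // compiled once, executed many times

            for (int i = 0; i < 100000; i++)              // loop of 100,000 iterations
            {
                cmd.Parameters["@val"].Value = i;         // only the small parameter value changes
                cmd.ExecuteNonQuery();
            }
        }                                                 // close the connection

        Console.WriteLine("Elapsed: {0}", DateTime.Now - start);   // end timing
    }
}

The second scenario would be the same loop used only to append to one enormous string, which is then fired at the server in a single ExecuteNonQuery call at the end.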
I'm curious to know your results.
When you are executing 1000's
On a grammar note, it's 1000s. When pluralising, an apostrophe NEVER appears before the S. In written words it would be thousands, not thousand's. The rules do not change just because numbers are used. No shop's should make claim's that they have 1000's of DVD's and CD's in their sale's.
of calls from an application from locations around the country, it would be nice if you only had to worry about 1 SQL statement timing out, instead of any of the 1000's..
I don't understand what point is being made here. You're more likely to cause a query to fail if you submit 25 megabytes of information tagged onto the end of it using some "efficient" syntax than if you repeatedly call with a small amount of information.
As an analogy, when you're downloading a file from the internet, would you prefer your computer to wait until it has downloaded the entire file before it tells you whether it failed or succeeded and how fast/how much, or do you like to be able to see how much of the file has been downloaded so far and at what rate it is transferring?
Is your line of questioning related to an actual problem you are facing, or is it idle speculation and/or wondering if you can fix something that you are not sure is broken?