MySQL slow write

Lev Blaivas
Member · Joined: Nov 28, 2007 · Messages: 8 · Location: Cyprus · Programming Experience: 1-3
I have a database from which I am copying two tables to a MySQL database. The problem I am having is that it takes 45 minutes to save 75,000 rows, and the database can grow to more than 200,000 entries, which would take an unacceptable amount of time to write. Additionally, some of the existing rows change daily, so I need to update the database several times a day, which would consume most of that time at the current performance level. As far as I understand, there is some problem with the way I write data to the table (currently I am flushing the existing table and rewriting all 75,000 entries). Another solution I can see is to identify the rows that have been edited, but I cannot get that information from the original database; I would need to compare the retrieved table to the one already stored, and I cannot think of a way to do that efficiently. Does anyone have any suggestions on how I can overcome this problem?
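One common way to find changed rows when the source database cannot tell you, sketched here in Python for illustration (the names `row_hash` and `diff_rows` are made up for this example): keep a hash of each row's values keyed by primary key, and compare the previous snapshot's hashes to the new one. Only the rows whose hashes differ need an UPDATE, and new/missing keys become INSERTs/DELETEs.

```python
import hashlib

def row_hash(row):
    """Hash a row's values so changed rows can be detected cheaply.
    `row` is a tuple of column values (the primary key is the dict key)."""
    data = "\x1f".join(str(v) for v in row).encode("utf-8")
    return hashlib.sha256(data).hexdigest()

def diff_rows(old, new):
    """Compare two {primary_key: row_tuple} snapshots.
    Returns (inserted_keys, updated_keys, deleted_keys)."""
    old_hashes = {k: row_hash(v) for k, v in old.items()}
    new_hashes = {k: row_hash(v) for k, v in new.items()}
    inserted = [k for k in new_hashes if k not in old_hashes]
    deleted = [k for k in old_hashes if k not in new_hashes]
    updated = [k for k in new_hashes
               if k in old_hashes and new_hashes[k] != old_hashes[k]]
    return inserted, updated, deleted
```

Storing only the hashes (not full row copies) keeps the comparison cheap even for hundreds of thousands of rows, at the cost of one extra pass over the retrieved data.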
 
How are you doing it now? Can you show some code?
 
I set all the records' states to "added", and then, after I have all the data ready, I call the adapter.Update method, which takes 45 minutes to finish.

I tried using the adapter.Insert method, calling it for each row, but it took even longer to complete.

I am using "Connector/Net 5.1.2" to connect to the MySQL DB
 
I solved the problem by manually generating multi-row INSERT statements, inserting 100 rows at a time, and sending them directly to the MySQL server, bypassing the .NET-generated table adapter.
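The batching approach described above can be sketched roughly as follows; this is a Python illustration (the function name `build_batch_inserts` is invented here), using `%s` placeholders so the database connector, not string concatenation, escapes the values:

```python
def build_batch_inserts(table, columns, rows, batch_size=100):
    """Build multi-row INSERT statements, `batch_size` rows per statement.
    Returns a list of (sql, flat_parameter_list) pairs."""
    col_list = ", ".join(columns)
    # One "(%s, %s, ...)" group per row; the connector fills in the values.
    row_placeholder = "(" + ", ".join(["%s"] * len(columns)) + ")"
    statements = []
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        values = ", ".join([row_placeholder] * len(batch))
        sql = f"INSERT INTO {table} ({col_list}) VALUES {values}"
        params = [v for row in batch for v in row]
        statements.append((sql, params))
    return statements
```

Each (sql, params) pair would then be sent with a single `cursor.execute(sql, params)` call, so a 75,000-row load becomes 750 round trips instead of 75,000.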
 
Sounds like your MySQL server is getting bogged down maintaining undo logs for a transaction that is 75,000 inserts long.

As you have found, there are better ways of bulk-loading data into a database. In Oracle I would use SQL arrays and send 1,000 rows at a time; this way I can load 750,000 rows in 3 minutes. Without them, calling a stored procedure one row at a time, it takes around 18 minutes.
 
The data adapter also has an UpdateBatchSize property; it defaults to one DB call per row update.
Setting UpdateBatchSize to 0 will cause the DataAdapter to use the largest batch size the server can handle. Setting it to 1 disables batch updates, as rows are sent one at a time.

Executing an extremely large batch could decrease performance, so you should test for the optimum batch size setting before deploying your application.
It looks like MySQL support for this starts with Connector/Net 5.2. (ref)
 
Thanks a lot, guys! I'll try playing around with these settings to maximize performance, but only after I finish the program; my deadline is next week =).
 