SqlBulkCopy Timeout...

JaedenRuiner

Well,

this hasn't made any sense to me, but for some reason SqlBulkCopy keeps timing out where the same data run as a traditional INSERT INTO statement through ExecuteNonQuery() works fine.

I've tried playing with the timeout on the SqlBulkCopy object, but to no avail. Any suggestions for debugging this?
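(For what it's worth, the bulk path and the timeout tweak look roughly like this; a trimmed sketch with a placeholder connection string and SQL variable, not the actual parser code. BulkCopyTimeout is in seconds and defaults to 30; 0 means no limit.)
VB.NET:
' Trimmed sketch of the bulk path and the timeout tweak (connStr and selectSql are placeholders).
Using src As New SqlConnection(connStr), cmd As New SqlCommand(selectSql, src)
    src.Open()
    Using reader As SqlDataReader = cmd.ExecuteReader()
        Using bulk As New SqlBulkCopy(connStr)   ' separate connection for the destination
            bulk.DestinationTableName = "[lst_Customers]"
            bulk.BulkCopyTimeout = 600           ' seconds; default is 30, 0 means no limit
            bulk.WriteToServer(reader)
        End Using
    End Using
End Using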

(This is the SQL statement I'm running, and believe me when I say it is not the most complicated SQL statement I've run through SqlBulkCopy, but for some reason it always times out on this one.)

VB.NET:
SELECT DISTINCT 	
	[A].[Customer n°], [A].[Search name], [A].[Street], [A].[City], 
	[A].[State], [A].[ZIP-code], [A].[Phone], [A].[E-Mail], 
	[B].[ReturnCode]
FROM 	[xlImport1] AS [A], 
	[xlImport2] AS [B]
WHERE 	([A].[Customer n°] = [B].[CORBI ID]) AND 
	([A].[Customer n°] NOT IN (
		SELECT 	[lst_Customers].[Cust_Id]
		FROM 	[lst_Customers]
	)) AND 
	([B].[ReturnCode] IN (
		SELECT 	[lst_CorbiLoc].[ID]
		FROM 	[lst_CorbiLoc]
	))
The Insert statement that works:
VB.NET:
INSERT INTO [lst_Customers]
	([Cust_Id], [Name], [Addr1], [City], [State], [Zip], [Phone], [Email], [ReturnCode])
SELECT DISTINCT 	
	[A].[Customer n°], [A].[Search name], [A].[Street], [A].[City], 
	[A].[State], [A].[ZIP-code], [A].[Phone], [A].[E-Mail], 
	[B].[ReturnCode]
FROM 	[xlImport1] AS [A], 
	[xlImport2] AS [B]
WHERE 	([A].[Customer n°] = [B].[CORBI ID]) AND 
	([A].[Customer n°] NOT IN (
		SELECT 	[lst_Customers].[Cust_Id]
		FROM 	[lst_Customers]
	)) AND 
	([B].[ReturnCode] IN (
		SELECT 	[lst_CorbiLoc].[ID]
		FROM 	[lst_CorbiLoc]
	))

As you might suspect, my SqlParser keeps these split up: based on a _bulk Boolean flag, it either feeds the SELECT statement to a SqlDataReader and bulk copies into the same destination table the INSERT statement targets, or, with the bulk flag turned off, it runs the INSERT statement as an ExecuteNonQuery(). This is how I've done ALL of the SQL in the application, and we're talking about 50 to 100 different statements generated in real time by this parsing engine. Everything works; it's only on this one that the SqlBulkCopy times out and I have to use the ExecuteNonQuery() method.
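In rough terms, the dispatch inside the parser looks like this (a simplified sketch; _bulk, the SQL strings, the destination table, and the two helper names are stand-ins, not the parser's real members):
VB.NET:
' Simplified sketch of the dispatch; every name below is a stand-in for the parser's internals.
If _bulk Then
    ' Interactive path: feed the SELECT to a SqlDataReader and stream it into the
    ' destination table with SqlBulkCopy.WriteToServer().
    RunBulkCopy(_selectSql, "[lst_Customers]")
Else
    ' Non-interactive path: run the full INSERT INTO ... SELECT via ExecuteNonQuery().
    RunNonQuery(_insertSql)
End If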

Thanks
 
hrm...
same problem...

I switched the SQL over to:
VB.NET:
INSERT INTO [lst_Customers]
	([Cust_Id], [Name], [Addr1], [City], [State], [Zip], [Phone], [Email], [ContactName], [ReturnCode])
SELECT 	
	[A].[Pool #], [A].[Customer Name], [A].[Address], 
	[A].[City], [A].[State], [A].[Zip], 
	[A].[Contact Phone], [A].[Contact Email], [A].[Contact Name], 
	[B].[ReturnCode]
FROM 	[xlImport1] AS [A], 
	[xlImport2] AS [B]
WHERE 	([A].[Pool #] = [B].[CORBI ID]) AND 
	(NOT EXISTS (
		SELECT 	1
		FROM 	[lst_Customers] AS [Z]
		WHERE 	([A].[Pool #] = [Z].[Cust_Id])
	)) AND 
	(EXISTS (
		SELECT 	1
		FROM 	[lst_CorbiLoc] AS [Y]
		WHERE 	([B].[ReturnCode] = [Y].[ID])
	))
and still the SqlBulkCopy times out.
 
Would it work if you ran subqueries and then joined? (I'm not sitting near a SQL Server so I can't test.)

VB.NET:
SELECT DISTINCT 	
	[A].[Customer n°], [A].[Search name], [A].[Street], [A].[City], 
	[A].[State], [A].[ZIP-code], [A].[Phone], [A].[E-Mail], 
	[b].[ReturnCode]

FROM

(
   [xlImport1] AS [A] INNER JOIN [xlImport2] AS [b] ON [A].[Customer n°] = [b].[CORBI ID]
)
LEFT JOIN
(
   SELECT [lst_Customers].[Cust_Id] FROM [lst_Customers] GROUP BY [lst_Customers].[Cust_Id]
) AS UNWANTEDCUSTOMERS ON [A].[Customer n°] = UNWANTEDCUSTOMERS.[Cust_Id]
LEFT JOIN
(
   SELECT [lst_CorbiLoc].[ID] FROM [lst_CorbiLoc] GROUP BY [lst_CorbiLoc].[ID]
) AS WANTEDCORBILOCS ON [b].[ReturnCode] = WANTEDCORBILOCS.[ID]

WHERE
   UNWANTEDCUSTOMERS.[Cust_Id] IS NULL
AND
   WANTEDCORBILOCS.[ID] IS NOT NULL
 
Well, the SQL Server wouldn't have a problem with it, for sure.

The issue is the parsing engine. Because of all the database manipulation I do in the background, there are QUITE a few Insert/Update/Select Into/Delete statements being executed back and forth. I'm taking XLS data and shifting it into a database, processing it for errors and other flags, exporting the errors and other data back out to specific XLS files, and then inserting the validated table data into the 'actual' table of the database for the purpose the application is there to serve: it evaluates and updates specific records of that table, with FK relations to another table, based on running sums that divide between records and trigger an email to the client. It's rather annoying at times.

What I had to do was write an entire SqlParser() class that not only lets me generate full SQL statements from shared and parameter-driven commands, but also parses text strings entered by the user. Once parsed, the SqlParser can execute the SQL statement via SqlBulkCopy() (for interactive progress updates) or ExecuteNonQuery() for non-interactive commands (Create, Delete, Update, Drop).
The Hurdles so far:
  • Excel columns are not intrinsically converted, so I have to load the schema of the destination table and generate the CONVERT() functions for each XLS column that doesn't match up (rough sketch after this list).
  • SqlBulkCopy will not accept an ad hoc query (OPENROWSET()) as the destination table, so all exports have to be done via ExecuteNonQuery().
  • Not to mention all the quirks and foibles involved in the base XLS files being processed.
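For the first hurdle, the schema load plus CONVERT() generation looks roughly like this (a sketch, not the real parser code; cn is an open SqlConnection, xlsColumnFor() is a hypothetical helper mapping a destination column back to its XLS header, and length/precision handling is left out):
VB.NET:
' Sketch: read the destination schema and wrap each XLS column in CONVERT() so the
' generated SELECT delivers the types the destination expects.
' cn = open SqlConnection; xlsColumnFor() = hypothetical XLS-header lookup; lengths omitted.
Dim selectParts As New List(Of String)
Using cmd As New SqlCommand("SELECT TOP 0 * FROM [lst_Customers]", cn)
    Using rdr As SqlDataReader = cmd.ExecuteReader(CommandBehavior.SchemaOnly)
        For Each row As DataRow In rdr.GetSchemaTable().Rows
            Dim destCol As String = CStr(row("ColumnName"))
            Dim destType As String = CStr(row("DataTypeName"))   ' e.g. varchar, int, datetime
            selectParts.Add(String.Format("CONVERT({0}, [A].[{1}]) AS [{2}]", _
                                          destType, xlsColumnFor(destCol), destCol))
        Next
    End Using
End Using
Dim selectList As String = String.Join(", ", selectParts.ToArray())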

Mostly, I don't like exception trapping in the main application. Call it a quirk, but I prefer Booleans, which means everything within the parser is trapped, controlled, and converted, so my main app simply goes:
VB.NET:
If CMD.Execute() Then
    ' next step
Else
    My.Application.LogError(CMD.Exception)   ' this also pops up the msg box to tell me what the error was
End If

This, of course, handles not only string-parsing errors but also execution errors, and it is built for progress updates across multi-command sequences. My progress form is based on a "MyCommand" base class, of which the parser is a descendant, as are all my other interactive processing classes. So, to be prudent, I wrote it so that if the SqlBulkCopy initially fails due to a timeout it retries with ExecuteNonQuery(), which always seems to work (a rough sketch of that fallback is below, after the query). I was more or less curious as to why SqlBulkCopy() bails on that statement when:
VB.NET:
SELECT DISTINCT 	
	[A].[In Pool], [A].[Location], [A].[X-Ref], 
	[A].[In Pool1], [A].[Location1], [A].[X-Ref1], 
	[A].[Ship to] AS [ShipTo], [A].[X-Ref2] AS [Cust_ID], [A].[In Pool2], 
	[A].[Name 1] AS [Cust_Name], [A].[Your refer#] AS [BOL_ID], 
	[A].[Matl Xref] AS [Material_ID], 
	[A].[Date] AS Date, 
	Sum([A].[Quantity]) AS [Quantity], [A].[BUn], 
	[M].[Description] AS [Material Description]
INTO [xlRexam]
FROM 	OPENROWSET('Microsoft.Jet.OLEDB.4.0',
		  'Excel 8.0;Database=C:\db\rexam3.xls', [Sheet1$]) AS [A], 
	[lst_Material] AS [M]
WHERE 	([A].[Matl Xref] = [M].[Mat_ID]) AND 
	([Your refer#] <> '')
GROUP BY 	
	[A].[In Pool], [A].[Location], [A].[X-Ref], [A].[In Pool1], [A].[Location1], 
	[A].[X-Ref1], [A].[Ship to], [A].[X-Ref2], [A].[In Pool2], 
	[A].[Name 1], [A].[Your refer#], [M].[Description], [A].[Matl Xref],
	[A].[Date], [A].[BUn]
(which, for the SqlBulkCopy, gets broken into the destination table "xlRexam" and the SELECT statement above minus the INTO) works just fine. That statement handles around 5,000-30,000 records, whereas the earlier one in this thread that causes SqlBulkCopy() to fail only handles about 130. Granted, for that size I don't really need SqlBulkCopy(), but I can't always be assured the size will be so small.
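The fallback I mentioned looks roughly like this (a simplified sketch of the parser's Execute(); _connStr, _selectSql, _insertSql, and _destTable are stand-ins for what the parser builds, and the real code stores the exception for My.Application.LogError() rather than matching on the message text):
VB.NET:
' Simplified sketch of the bulk-copy-then-fallback logic; field names are stand-ins.
Public Function Execute() As Boolean
    Try
        Using src As New SqlConnection(_connStr), cmd As New SqlCommand(_selectSql, src)
            src.Open()
            Using reader As SqlDataReader = cmd.ExecuteReader()
                Using bulk As New SqlBulkCopy(_connStr)
                    bulk.DestinationTableName = _destTable
                    bulk.WriteToServer(reader)
                End Using
            End Using
        End Using
        Return True
    Catch ex As SqlException When ex.Message.Contains("Timeout")
        ' Bulk copy timed out; fall back to the plain INSERT INTO ... SELECT, which always gets through.
        Using cn As New SqlConnection(_connStr), cmd As New SqlCommand(_insertSql, cn)
            cn.Open()
            Return cmd.ExecuteNonQuery() >= 0
        End Using
    End Try
End Function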

I've thought of using some DataSet/DataTable/DataRow cyclic processing, but that is one record at a time. For the queries that generate 19 rows it may be faster, but for the queries that generate 4,000 rows it isn't.

*sigh* C'est la vie. I'll keep playing with every available method of working the data, and slowly but surely it will all work itself out.

Thanks
 
I've had timeout errors returned in the past that wound up being bad data values (a value not fitting the field, a field missing from the XML file, nulls, etc.) instead of a more fitting error message. If you're sure your query is working properly, I'd suggest looking at the record it breaks on, to see whether you can eliminate this as a potential problem.
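One quick way to narrow that down (just a sketch, untested against your setup, with a placeholder connection string and an already-open reader): drop the batch size to 1 and hook SqlRowsCopied, so a bad record shows up as "stopped after row N" instead of a generic timeout.
VB.NET:
' Diagnostic sketch only: one row per batch is slow, but it points at the failing record.
Private Sub OnRowsCopied(ByVal sender As Object, ByVal e As SqlRowsCopiedEventArgs)
    Console.WriteLine("Copied {0} rows so far", e.RowsCopied)
End Sub

Private Sub BulkCopyWithProgress(ByVal connStr As String, ByVal reader As SqlDataReader)
    Using bulk As New SqlBulkCopy(connStr)
        bulk.DestinationTableName = "[lst_Customers]"
        bulk.BatchSize = 1      ' commit every row so a failure isolates a specific record
        bulk.NotifyAfter = 1    ' raise SqlRowsCopied after each row
        AddHandler bulk.SqlRowsCopied, AddressOf OnRowsCopied
        bulk.WriteToServer(reader)
    End Using
End Sub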
 
But wouldn't bad data, or some other error of that sort, kick out on the direct INSERT INTO SQL statement too? If ExecuteNonQuery() works (and I've verified the data in the table post-process), shouldn't the BulkCopy work as well?
 
I wasn't sure whether you had tried both ways with the exact same data or not; I was just offering a possibility to consider for the process of elimination.
 
