Question: SQL Performance

JeffandTamara · New member
Joined: Oct 20, 2008 · Messages: 2 · Programming Experience: 3-5
Hello:
I have an MS SQL stored procedure that runs from VB. There are two parts to the SQL portion of the stored procedure: the first deletes a temporary table and then builds it back. The table has about 166 records, and 166 records are affected.

The next part does the same thing, but updates and inserts into what I want to be a permanent table. This table currently has 166 records. When the process is done, it has affected 27,556 records. In other words, it looks like it took every record and affected every record.

The first part, deleting the table and writing it back, happens very quickly; the second part takes a while longer. I am afraid that as the table gets larger, it will become really slow. Any ideas on what I might do differently to properly handle building the permanent table? I would really appreciate any ideas. Oh yeah, the primary key for both tables is Lot Number.

Here is some of the Logic & code I am using...

SQL:
Delete Table
Start LOOP.....

    IF EXISTS (SELECT * FROM dbo.TempTable
               WHERE dbo.TempTable.LOT_NUMBER = @lot_no)
    BEGIN
        UPDATE dbo.TempTable
        SET LOT_NUMBER = @lot_no,
            CASTID = @cast_id,
            QCI_NO = @qci,
            ...
    END
    ELSE
    BEGIN
        INSERT INTO dbo.TempTable (LOT_NUMBER, CASTID, QCI_NO, ....
                           VALUES (@lot_no, @cast_id, @qci, ......
    END

    IF EXISTS (SELECT * FROM dbo.PermTable
               WHERE dbo.PermTable.LOT_NUMBER = @lot_no)
    BEGIN
        UPDATE dbo.PermTable
        SET CASTID = @cast_id,
            QCI_NO = @qci,
            ...
    END
    ELSE
    BEGIN
        INSERT INTO dbo.PermTable (LOT_NUMBER, CASTID, QCI_NO, ....
                           VALUES (@lot_no, @cast_id, @qci, ......
    END
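
If the temp table already holds every lot for the run, a set-based upsert avoids the row-by-row loop entirely and only touches the rows that actually need changing. A sketch, assuming both tables share the columns shown (the column list is abridged, as in the original, and would need to be filled out for the real schema):

SQL:
-- Update lots that already exist in the permanent table,
-- joining on the LOT_NUMBER primary key.
UPDATE p
SET    p.CASTID = t.CASTID,
       p.QCI_NO = t.QCI_NO
FROM   dbo.PermTable p
JOIN   dbo.TempTable t ON t.LOT_NUMBER = p.LOT_NUMBER;

-- Insert only the lots that do not exist yet.
INSERT INTO dbo.PermTable (LOT_NUMBER, CASTID, QCI_NO)
SELECT t.LOT_NUMBER, t.CASTID, t.QCI_NO
FROM   dbo.TempTable t
WHERE  NOT EXISTS (SELECT 1 FROM dbo.PermTable p
                   WHERE p.LOT_NUMBER = t.LOT_NUMBER);

Because both statements are keyed on LOT_NUMBER, the affected-row counts should stay proportional to the number of lots rather than growing with every pass through a loop.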
 
Resolved:

Looking around on this forum and other places, I was able to come up with a "fix" for my concerns.
The main issue was in the update procedure: it had affected 27,556 records in a database that only had 166 records.

What I added to resolve it was a WHERE clause at the end of the UPDATE command, "WHERE LOT_NUMBER = @lot_no"...
Affected records for that procedure are now 166.
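
With the WHERE clause in place, each pass touches only the one row for the current lot instead of the whole table (column list abridged as in the original):

SQL:
UPDATE dbo.PermTable
SET    CASTID = @cast_id,
       QCI_NO = @qci
WHERE  LOT_NUMBER = @lot_no;   -- scopes the update to a single lot

Without that WHERE clause, every execution of the UPDATE rewrote all 166 rows, which is why the affected-row count ballooned to 27,556 (roughly 166 rows x 166 loop iterations).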

I also added a "SET NOCOUNT ON".
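
SET NOCOUNT ON normally goes at the top of the procedure body; it stops SQL Server from sending an "N rows affected" message back to the client after every statement, which adds up inside a loop. A minimal sketch (the procedure name here is just a placeholder, not the real one):

SQL:
CREATE PROCEDURE dbo.BuildLotTables   -- hypothetical name for illustration
AS
BEGIN
    SET NOCOUNT ON;   -- suppress per-statement row-count messages
    -- ... loop and UPDATE/INSERT logic goes here ...
END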

Both changes together made a really noticeable improvement in database performance. I can't really explain the whys and wherefores, but it seemed to help a lot.
 