How to import an Excel worksheet

gilbre

New member · Joined: May 21, 2009 · Messages: 1 · Programming Experience: 1-3
I am looking for the best way to import an Excel spreadsheet using ADO.NET. Here is the breakdown:
The information will get exported to Excel by another company. Should I have them start at PK_ID 100,000 or something like that, and maintain the key in the export for all the different data tables in SQL?
The spreadsheet will have over 100 columns that will need to maintain referential integrity throughout 9 tables. I also have to check one field in particular in the spreadsheet and compare it to the data table, and if it exists in the data table in SQL then it is an update instead of an insert. So SqlBulkCopy won't work...

Is using a data table the way to go? Any suggestions would be helpful.

I was thinking of maybe selecting the entire spreadsheet into a data table and then distributing the info into the 9 tables. Would a stored procedure be best? I can use return @@ID.

Any examples of code would be great, as I am no expert on this type of coding.
 
The spreadsheet will have over 100 columns that will need to maintain referential integrity throughout 9 tables.
So the data from a single spreadsheet having 100 columns will be distributed among 9 tables?

I also have to check one field in particular in the spreadsheet and compare it to the data table, and if it exists in the data table in SQL then it is an update instead of an insert.
This sounds like a good candidate for your primary key. Further, don't bother checking whether the row exists before you send it; just choose the more common need (INSERT or UPDATE) and do that first, doing the other in the try/catch handler. E.g. if you think 75% of the time the row will exist, do the UPDATE first, and if 0 rows are updated, do the INSERT. If you think that 90% of the time the row won't exist, do the INSERT and then do the UPDATE in the catch handler (a key-violation error will occur when you try to insert the same record twice).
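For example, here's a rough sketch of the UPDATE-first version, assuming a SQL Server target (you mentioned SqlBulkCopy) and a made-up Customers table keyed on CustomerCode:

Imports System.Data.SqlClient

Module UpsertSketch

    ' Sketch of the UPDATE-first approach. The table (Customers), key column
    ' (CustomerCode) and data column (CustomerName) are made-up names.
    Sub UpsertRow(connStr As String, code As String, name As String)
        Using con As New SqlConnection(connStr)
            con.Open()

            Using cmd As New SqlCommand("UPDATE Customers SET CustomerName = @name WHERE CustomerCode = @code", con)
                cmd.Parameters.AddWithValue("@name", name)
                cmd.Parameters.AddWithValue("@code", code)

                ' ExecuteNonQuery returns the number of rows affected;
                ' 0 means the row isn't there yet, so fall back to an INSERT.
                If cmd.ExecuteNonQuery() = 0 Then
                    Using ins As New SqlCommand("INSERT INTO Customers (CustomerCode, CustomerName) VALUES (@code, @name)", con)
                        ins.Parameters.AddWithValue("@code", code)
                        ins.Parameters.AddWithValue("@name", name)
                        ins.ExecuteNonQuery()
                    End Using
                End If
            End Using
        End Using
    End Sub

End Module

The INSERT-first variant is the same idea reversed: try the INSERT, and catch the key violation to do the UPDATE instead.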

Is using a data table the way to go?
I'd say yes. DataTables can easily hold 65,536 rows (the maximum an .xls worksheet supports) without being too much of a problem; just make sure you run this operation on a machine with a good amount of memory. If you don't have that, use a DataReader and read the records one at a time.
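If it helps, here's a rough sketch of pulling the whole sheet into a DataTable. The file path, sheet name and Jet 4.0 connection string are assumptions; check ConnectionStrings.com (mentioned below) for the string that matches your Excel version:

Imports System.Data
Imports System.Data.OleDb

Module ExcelLoadSketch

    ' Pull the whole worksheet into a DataTable in one go.
    ' Path, sheet name and provider string are assumptions.
    Function LoadSheet() As DataTable
        Dim connStr As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Imports\Export.xls;" & _
                                "Extended Properties=""Excel 8.0;HDR=Yes"""

        Dim table As New DataTable()
        Using adapter As New OleDbDataAdapter("SELECT * FROM [Sheet1$]", connStr)
            adapter.Fill(table) ' one DataRow per spreadsheet row, one DataColumn per spreadsheet column
        End Using
        Return table
    End Function

End Module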

Would a stored procedure be best?
If your database driver supports bulked transfer (the Oracle drivers can take arrays as parameters, so hundreds of values can be transferred in one go), this wouldn't be necessary. If your DB doesn't have anything like that, then you can get a performance improvement by e.g. concatenating 20 values together in a fixed-width string, passing it to a stored procedure and then cutting the string apart again in the sproc; there's a rough sketch of the client side below. I use this trick at work to load a 1-million-row file into the DB every night in a situation where I cannot use bulked transfer because the driver doesn't support it. Compared to inserting one by one, which goes at ~900 rows/sec, this method does ~3,000 rows/sec.

If speed is not an issue, reduce your coding headache and just do it one row at a time.
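To illustrate the fixed-width idea only (the UploadBatch sproc, the 10-character width and the single column are all invented for the sketch; the sproc would use SUBSTRING to cut the values back out), the client side might look like this:

Imports System.Collections.Generic
Imports System.Data
Imports System.Data.SqlClient
Imports System.Text

Module BatchedUploadSketch

    ' Pack a batch of values into one fixed-width string and hand it to a
    ' hypothetical stored procedure in a single round trip.
    Sub SendBatch(connStr As String, codes As List(Of String))
        Dim packed As New StringBuilder()
        For Each code As String In codes
            packed.Append(code.PadRight(10)) ' fixed width: 10 characters per value
        Next

        Using con As New SqlConnection(connStr)
            con.Open()
            Using cmd As New SqlCommand("UploadBatch", con)
                cmd.CommandType = CommandType.StoredProcedure
                cmd.Parameters.AddWithValue("@codes", packed.ToString())
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub

End Module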

I can use return @@ID.
The data you're loading already has a suitable ID.

Any examples of code would be great, as I am no expert on this type of coding.
ConnectionStrings.com has the info you need to connect to an Excel sheet as though it were a DB. The DW2 links in my signature have advice on database donkey work in general; start with the Creating a Simple Data App one.
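For completeness, here's a minimal sketch of the DataReader route mentioned above, reading the sheet one row at a time through the same kind of assumed Jet 4.0 connection string (path and sheet name made up):

Imports System.Data.OleDb

Module ExcelReaderSketch

    ' Read the worksheet row by row instead of loading it all into memory.
    Sub ReadRowByRow()
        Dim connStr As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Imports\Export.xls;" & _
                                "Extended Properties=""Excel 8.0;HDR=Yes"""

        Using con As New OleDbConnection(connStr)
            con.Open()
            Using cmd As New OleDbCommand("SELECT * FROM [Sheet1$]", con)
                Using reader As OleDbDataReader = cmd.ExecuteReader()
                    While reader.Read()
                        ' Check the key column here and decide INSERT vs UPDATE,
                        ' then push the row out to the nine SQL Server tables.
                        Dim keyValue As Object = reader(0)
                    End While
                End Using
            End Using
        End Using
    End Sub

End Module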
 