Question Global Connection Object

Tom

Well-known member
Joined
Aug 23, 2005
Messages
746
Programming Experience
5-10
Two-part question:

Should a single global connection object in a module be used for the life of the application, rather than a ton of local connections?

And if so, should the connection state remain open, or be constantly re-opened and closed after each query?

I'm running an app that will have very few users connected to this specific database, so room in the connection pool shouldn't be an issue.

Speed is the priority in this program. The individual queries will be small: return a few records, update the record(s), and print a report.

Although each query is rather small, this will be done tens of thousands of times per day. So returning the record(s), updating, and sending to the printer as fast as possible is the priority.

I know that it is recommended to use local connection objects and to open and close them as quickly as possible. But this does seem to conflict with the time and resources it takes to keep re-creating objects and establishing new connections.

I think it would be faster to have one global connection that remains open, even though that seems not to be the recommended way.

Looking for a detailed response on the pros and cons of both ways.
 
It's considered best practice to open the connection as late as possible in your code and to close it as soon as possible. Connection pooling will make sure that your connection to the database is reused.

A couple of links for further reading.

We strongly recommend that you always close the connection when you are finished using it so that the connection will be returned to the pool. You can do this using either the Close or Dispose methods of the Connection object, or by opening all connections inside a using statement in C#, or a Using statement in Visual Basic. Connections that are not explicitly closed might not be added or returned to the pool. For more information, see using Statement (C# Reference) or How to: Dispose of a System Resource for Visual Basic.

SQL Server Connection Pooling (ADO.NET)

How to: Dispose of a System Resource
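As a sketch of what that looks like in VB, here is the open-late/close-early pattern with Using blocks. This is illustrative only: the connection string, query, and the scannedCode variable are placeholders, not taken from the thread.

```vbnet
' Requires Imports System.Data.SqlClient at the top of the file.
' Connection string, query, and scannedCode are placeholder names.
Dim connectionString As String = _
    "Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=True"

Using connection As New SqlConnection(connectionString)
    Using command As New SqlCommand( _
            "SELECT * FROM Items WHERE Barcode = @code", connection)
        command.Parameters.AddWithValue("@code", scannedCode)
        connection.Open()  ' typically just borrows an open connection from the pool
        Using reader As SqlDataReader = command.ExecuteReader()
            While reader.Read()
                ' process the record
            End While
        End Using
    End Using
End Using ' closed here and returned to the pool
```

The Using blocks guarantee Dispose is called even if an exception is thrown, which is what returns the connection to the pool.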
 
It's considered best practice to open the connection as late as possible in your code and to close it as soon as possible. Connection pooling will make sure that your connection to the database is reused.

Agreed, and I do understand the recommended way. However, my personal results just don't show it as efficient to create a new object and a new connection every 5-10 seconds rather than leaving a connection open.

Hoping to find something that actually proves this wrong rather than just simply stating that it is wrong.
 
Matt, the link you provided now argues for the opposite. This one suggests using a single connection object and keeping it open.
 
So the reason to keep one connection around is to avoid the expense of re-creating a connection, but the ADO library is doing this for you. There is always one available in the pool.

Also note that connections are automatically closed after a certain amount of time of inactivity. So even if you think you're keeping a connection open for the duration of an app, it is actually being opened and closed implicitly anyway.

Seems to be suggesting that you create an object and reuse it. It would also suggest that even if you think you're leaving the connection open you're not.
 
Seems to be suggesting that you create an object and reuse it. It would also suggest that even if you think you're leaving the connection open you're not.

Yes, opening and closing isn't as much of an issue, or as time consuming, as actually creating a new object in the first place with each scan.

For example, the program will work along with a barcode scanner. After each item is scanned, the program will query the associated data.

With a global connection (whether or not it remains open or is re-opened with each scan), this executes a lot faster than creating a new connection object and opening it after each item is scanned.

But as your first post suggests, and as I already know, this isn't the recommended way.

So I'm trying to find more concrete proof as to the best way of doing it, rather than having the program create thousands of unnecessary connection objects each day just because "it is recommended" while at the same time it decreases performance.
 
The disconnected architecture of ADO.Net uses the connection pool by default. The pool's job is to manage actual connections to the database. In its current implementation it basically keeps one or more connections open for the lifetime of the application unless certain conditions occur. You can read about these conditions in the help, but the most apparent one is that if some kind of exception occurs, the pool creates a new connection for the next request. Normally, when you create a "new connection" in code, the pool just returns the same currently open database connection. So you see, the "disconnected architecture" does not necessarily mean that database connections are actually closed; it means that ADO.Net is designed to keep a local cache of data retrieved from the database, and, as far as the application needs to be concerned, the connection is closed. Using ADO.Net as opposed to the ADO/connected architecture usually means much better performance, since the connected one would generate much more network traffic.

Creating a "connection object" is not very expensive, but you don't need to: you can create one (or more) connection objects and use them each time to "open" and "close" the connection to the database. This is also what the wizard does when generating data access classes for a strongly typed DataSet with a TableAdapter; it keeps a single connection object declared private in the class, and the db interaction methods of the TA open and close it at will.
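A rough sketch of that pattern, hand-written rather than wizard-generated (the class, table, and column names are made up for illustration):

```vbnet
Imports System.Data.SqlClient

' One private connection object per class; each method opens and
' closes it around the actual database work, as the TA methods do.
Public Class ItemDataAccess
    Private connection As New SqlConnection("<your connection string>")

    Public Function GetItem(ByVal barcode As String) As DataTable
        Dim table As New DataTable()
        Using command As New SqlCommand( _
                "SELECT * FROM Items WHERE Barcode = @code", connection)
            command.Parameters.AddWithValue("@code", barcode)
            connection.Open()   ' normally reuses a pooled database connection
            Try
                table.Load(command.ExecuteReader())
            Finally
                connection.Close() ' returns it to the pool
            End Try
        End Using
        Return table
    End Function
End Class
```

The SqlConnection instance is created once, but the expensive part (the real database connection) lives in the pool either way.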

Learning to use the visual designers for data access is recommended, as is inspecting the code they generate to understand more, and of course frequently consulting the help documentation for anything you're curious about.
 
Thanks for the reply, John. Again, I do completely understand and have researched the recommended way; however, there doesn't seem to be any evidence that this way is accurate. The below isn't to debate you personally, just to discuss where I see differences from the recommended way.

The disconnected architecture of ADO.Net uses the connection pool by default. The pool's job is to manage actual connections to the database. In its current implementation it basically keeps one or more connections open for the lifetime of the application unless certain conditions occur. You can read about these conditions in the help, but the most apparent one is that if some kind of exception occurs, the pool creates a new connection for the next request.

Yes, if the connection is closed, it will open a new connection from the pool. The same will be done whether it is a global connection that was closed or a brand-new connection. So the above doesn't prove any advantage to constantly creating new connections.

Likewise, whether using a global connection or creating separate new connections, they all still use the same connection string to the database and therefore use the same pool.

Using ADO.Net as opposed to the ADO/connected architecture usually means much better performance, since the connected one would generate much more network traffic.

Creating a "connection object" is not very expensive, but you don't need to: you can create one (or more) connection objects and use them each time to "open" and "close" the connection to the database. This is also what the wizard does when generating data access classes for a strongly typed DataSet with a TableAdapter; it keeps a single connection object declared private in the class, and the db interaction methods of the TA open and close it at will.

I would disagree with the above statement. Connecting to the database is one of the most expensive operations a server-based application might need to do.

In the overall picture, the expense of initiating a new object and opening it may look trivial, but compared to re-opening a global connection object that has already been initiated previously, creating a new object is slower and seems less efficient.

I do agree with opening and closing the connection object as needed, just for good coding practice, but for clarification on this specific app, there may only be a total of 12 users connected at most. So it's not so much an issue of the number of users connected at a time.

Learning to use the visual designers for data access is recommended, as is inspecting the code they generate to understand more, and of course frequently consulting the help documentation for anything you're curious about.

Again, this is something I fully understand and have researched; I just have a difference of opinion. The main point of this thread is looking for proof that my opinion is wrong here, so that I can go forward with using the recommended way.

Picture a loop that runs 10,000 times per day; in each loop it queries the db and returns a small result of 1-20 records. Can you honestly say that creating, instantiating, and opening 10k new connection objects inside the loop is going to run faster and use fewer resources than creating the object once at form or module level?

If creating new objects is more efficient, then it should be possible to prove it rather than just quote it.
 
I would disagree with the above statement. Connecting to the database is one of the most expensive operations a server-based application might need to do.
I'm not talking about creating a connection to the database; that is the job of the pool, if it needs to, which it normally doesn't, as I have explained. When I say "connection object" I mean the ADO.Net connection class instance, for example a SqlConnection. This is exactly what I think your initial confusion is about, and it is why I replied to this thread.

I do agree with opening and closing the connection object as needed, just for good coding practice, but for clarification on this specific app, there may only be a total of 12 users connected at most. So it's not so much an issue of the number of users connected at a time.
Again, the confusion is between VB.Net connections and database connections. As I have explained, you practically end up with the same number of database connections, and they stay open even if you close the VB.Net connection.
Picture a loop that runs 10,000 times per day; in each loop it queries the db and returns a small result of 1-20 records. Can you honestly say that creating, instantiating, and opening 10k new connection objects inside the loop is going to run faster and use fewer resources than creating the object once at form or module level?
Yes. You would not consume more resources, because a VB.Net memory object created in a loop is disposed of every iteration (and the compiler will allocate stack space only once for the loop), creating the VB.Net object is not expensive, the database connection stays open, and you benefit from getting a new database connection from the pool when needed without having to manage this yourself. But as mentioned, you'd normally declare the connection object at a higher level, keep the reference, and just open/close it to interact with the current pool.
 
Thanks again for your reply, John. Yes, I agree with everything you are saying, including that the memory should manage itself properly. I was more worried about the additional time it takes to create multiple objects rather than reuse an existing one.

Although I would fully expect using one connection 10,000 times to be faster than creating 10,000 connections used once each, I have proven (at least to myself) that I am wrong and the "recommended way" is faster.

The attachment is a sample speed test I made. It first measures the milliseconds to complete 10,000 queries using an existing form-level connection, and then tests the same with creating a new connection object with each iteration of the loop.

Surprisingly, creating the 10,000 connections was (slightly) faster than using the single connection. So I stand corrected and will create thousands of connections, I guess :eek:
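For anyone without the attachment, the shape of the comparison was roughly as follows. This is a sketch, not the attached code; formConnection and connectionString are assumed to already exist, and the actual query is elided.

```vbnet
' Requires Imports System.Data.SqlClient and Imports System.Diagnostics.
Dim watch As New Stopwatch()

' Test 1: reuse the single form-level connection object.
watch.Start()
For i As Integer = 1 To 10000
    formConnection.Open()
    ' ... run the query here ...
    formConnection.Close()
Next
watch.Stop()
MessageBox.Show("Shared connection: " & watch.ElapsedMilliseconds & " ms")

' Test 2: create a brand-new SqlConnection object every iteration.
watch.Reset()
watch.Start()
For i As Integer = 1 To 10000
    Using localConnection As New SqlConnection(connectionString)
        localConnection.Open()
        ' ... run the query here ...
    End Using
Next
watch.Stop()
MessageBox.Show("New connection each time: " & watch.ElapsedMilliseconds & " ms")
```

Note that both loops open and close on every iteration, so both are hitting the same pooled database connection, which is exactly the point made in the next reply.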
 

Attachments

  • sqlConnectionTest.zip
    16.4 KB · Views: 37
There is no difference between those tests, apart from the SqlConnection object created in the local test: both tests open and close the SqlConnection for each iteration, and both use the same underlying open database connections. To explain: the global SqlConnection you create in Form Load is opened (which makes the pool open a database connection for that connection string) and then closed in the Finally section; this global connection object is assigned to a TA, and the TA will, upon Fill, open the SqlConnection if closed, and only close it if it was closed when Fill was called.

My impression of what you wanted to compare would be to open the global SqlConnection in form Load and leave it open until FormClosing or the like. TA.Fill will then let the SqlConnection object remain open between fills. The global test would then save the step of pulling the existing database connection from the pool, but you have to add code to catch exceptions and open new connections in that case.

To add another test, try setting Pooling=False in the connection string; each Dispose/Close of the SqlConnection object will then actually close the database connection, and you should see a significant decrease in performance.
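For reference, that is just one more keyword in the connection string (server and database names below are placeholders):

```vbnet
' With Pooling=False every Close/Dispose tears down the real database
' connection, so each iteration pays the full connection-establishment cost.
Dim noPooling As String = _
    "Data Source=MyServer;Initial Catalog=MyDb;" & _
    "Integrated Security=True;Pooling=False"
```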

A side note: you will get much better performance by using multi-threading than DoEvents. DoEvents should be avoided except for some cases of UI update.
 
There is no difference between those tests, apart from the SqlConnection object created in the local test: both tests open and close the SqlConnection for each iteration, and both use the same underlying open database connections. To explain: the global SqlConnection you create in Form Load is opened (which makes the pool open a database connection for that connection string) and then closed in the Finally section; this global connection object is assigned to a TA, and the TA will, upon Fill, open the SqlConnection if closed, and only close it if it was closed when Fill was called.

My impression of what you wanted to compare would be to open the global SqlConnection in form Load and leave it open until FormClosing or the like. TA.Fill will then let the SqlConnection object remain open between fills. The global test would then save the step of pulling the existing database connection from the pool, but you have to add code to catch exceptions and open new connections in that case.

You are correct; opening and closing the object with each use wasn't much of a concern. I know that I shouldn't hold/lock that open connection to a db even though I could get away with it, since it's a very small number of users. I was more concerned about whether to use one form/global-level connection or to create a new one in each loop.

Although I agree with the above, I've also tested and noticed that if, for example, the global connection is left open for the entire loop, it actually executes even slower (a little bit of a surprise, that).

To add another test, try setting Pooling=False in the connection string; each Dispose/Close of the SqlConnection object will then actually close the database connection, and you should see a significant decrease in performance.

Yes, I am aware of this and have a nice book example showing exactly that.

A side note: you will get much better performance by using multi-threading than DoEvents. DoEvents should be avoided except for some cases of UI update.

Do you think it is necessary in this situation? Use is limited to updating the interface's loop counter progress and status bar. Threading just for this seems like it may be more complicated than needed.
 
A side note: you will get much better performance by using multi-threading than DoEvents. DoEvents should be avoided except for some cases of UI update.
Do you think it is necessary in this situation? Use is limited to updating the interface's loop counter progress and status bar. Threading just for this seems like it may be more complicated than needed.

Actually, yes. The Application.DoEvents() call takes quite a lot of resources at every iteration. An alternative may be to call it only if "intCounter Mod 10 = 0". You can tweak the modulus to get the desired balance of performance and UI responsiveness. If the rest of your loop executes fast enough and your UI is complicated enough, you will see a nice performance increase.

You can run this on a thread using a simple BackgroundWorker! It is nowhere near as complicated as it may seem. Add it to the form in the designer, handle an event, and you're done! There are a few pitfalls you need to know about, but it is pretty simple to get started.
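A minimal sketch of what that looks like (the control names StartButton, CounterLabel, and BackgroundWorker1 are assumed, as is a BackgroundWorker dropped on the form in the designer):

```vbnet
' The loop runs off the UI thread; ReportProgress updates the counter
' label without any DoEvents calls.
Private Sub StartButton_Click(ByVal sender As Object, ByVal e As EventArgs) _
        Handles StartButton.Click
    BackgroundWorker1.WorkerReportsProgress = True
    BackgroundWorker1.RunWorkerAsync()
End Sub

Private Sub BackgroundWorker1_DoWork(ByVal sender As Object, _
        ByVal e As System.ComponentModel.DoWorkEventArgs) _
        Handles BackgroundWorker1.DoWork
    For i As Integer = 1 To 10000
        ' ... query, update and print here ...
        BackgroundWorker1.ReportProgress(i)
    Next
End Sub

Private Sub BackgroundWorker1_ProgressChanged(ByVal sender As Object, _
        ByVal e As System.ComponentModel.ProgressChangedEventArgs) _
        Handles BackgroundWorker1.ProgressChanged
    ' Safe to touch controls here: ProgressChanged runs on the UI thread.
    CounterLabel.Text = e.ProgressPercentage.ToString()
End Sub
```

The main pitfall: never touch form controls from inside DoWork; pass data out through ReportProgress or the RunWorkerCompleted event instead.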

By the way, the difference between your two tests comes from the fact that they run right after one another. Whichever test runs second will probably be the faster one, no matter which it is.
 
Actually, yes. The Application.DoEvents() call takes quite a lot of resources at every iteration. An alternative may be to call it only if "intCounter Mod 10 = 0". You can tweak the modulus to get the desired balance of performance and UI responsiveness. If the rest of your loop executes fast enough and your UI is complicated enough, you will see a nice performance increase.

You can run this on a thread using a simple BackgroundWorker! It is nowhere near as complicated as it may seem. Add it to the form in the designer, handle an event, and you're done! There are a few pitfalls you need to know about, but it is pretty simple to get started.

By the way, the difference between your two tests comes from the fact that they run right after one another. Whichever test runs second will probably be the faster one, no matter which it is.

I did experiment with switching the order of the called subs back and forth and got similar results.

If either of you has an example of these alternatives to DoEvents, I would be interested in seeing it.
 