ASP.NET/ADO.NET: Handling many database connections inside a .NET Object?

We have a .NET object that does a lot of reading/writing with the database. Throughout the life cycle of this object (or the asp page that uses it), it may hit the database with a query/update anywhere from 1 to 10 times.

Instead of opening and closing a database connection every time the object needs to hit the database, it simply opens a connection to the database during instantiation and then closes the connection during the object's termination event. Is that a bad practice?

The assumption was that since the object hits the database every time it's instantiated (and usually multiple times), it would be better to just open a connection at the beginning of the object's life and then close it at the end.

The other alternative is to open and close the database connection before and after each query/operation.

What's the best practice here in order to maximize performance?

**Update:** Thanks for the tips, guys. Can anyone say more about the method of opening/closing a connection in an object's instantiation/termination events and the repercussions of doing so?


Open and close the connection as needed. ADO.NET has built-in connection pooling that works well. You will not notice any performance issues unless you are doing this in a loop with thousands of opens and closes.

Edit: See "Should I persist a SqlConnection in my data access layer?" for more information on the pitfalls of connection persistence.

Open and close each time. Open immediately (as close as possible) before the line of code that actually performs the database operation, and close immediately afterwards. When you do it this way, Close() does NOT actually close the connection; it just releases it back into the connection pool, where it sits and waits for the next request for a connection with the same connection string. You are not incurring the overhead of actually recreating the connection each time.
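The open-late/close-early pattern above can be sketched as follows. This is a minimal illustration, not code from the question: the class name, connection string, and `Customers` table are hypothetical placeholders.

```csharp
using System.Data.SqlClient;

public class CustomerRepository
{
    private readonly string _connectionString;

    public CustomerRepository(string connectionString)
    {
        // Store only the connection string, never an open connection.
        _connectionString = connectionString;
    }

    public int GetCustomerCount()
    {
        // Open as late as possible, close as early as possible.
        // The using blocks dispose both objects when the method exits;
        // disposing the connection just returns it to the pool.
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Customers", conn))
        {
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}
```

Because pooled connections are keyed by connection string, every call here reuses the same physical connection under the hood as long as the pool isn't exhausted.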

The only issue is if you are making so many connection attempts asynchronously that you exceed the maximum number of connections in the pool... and there are solutions to that issue as well, for example by throttling the work through the System.Threading.ThreadPool class.
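For reference, the pool limits mentioned above are controlled through the connection string itself. A sketch (server, database, and the specific limit values are assumptions, not defaults from the question):

```
Server=myServer;Database=myDb;Integrated Security=true;Min Pool Size=5;Max Pool Size=100;
```

If all 100 pooled connections are in use, the next `Open()` call blocks until one is released or the connection timeout elapses.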

To add to the trust-in-connection-pooling argument: holding the connection open for longer than needed can actually decrease overall performance, because the connection pool can't hand that connection to other components needing a DB connection.

So, yes, open and close the connection as needed. Though it'd be even faster if you could batch your queries into a single exec call.
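Batching, as suggested above, can be sketched by putting several statements into one command so they share a single round trip and a single open/close. The table, column, and parameter names here are hypothetical:

```csharp
using System.Data.SqlClient;

public static class OrderBatch
{
    public static int ShipAndCount(string connectionString, int orderId)
    {
        // Two statements, one round trip: the batch runs the UPDATE and
        // then returns the SELECT's scalar result.
        const string batchSql =
            "UPDATE Orders SET Status = 'Shipped' WHERE Id = @id; " +
            "SELECT COUNT(*) FROM Orders WHERE Status = 'Shipped';";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(batchSql, conn))
        {
            cmd.Parameters.AddWithValue("@id", orderId);
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}
```

Stored procedures or table-valued parameters are other common ways to collapse several operations into one call, depending on how related the queries are.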

Even though Conn.Close() does not actually close the connection, calling it executes "sp_reset_connection" at the server, and even though sp_reset_connection is a lightweight stored procedure, it generates some network traffic. So, for example, I would not recommend closing and opening the connection inside a loop.
