Is a dynamic sql stored procedure a bad thing for lots of records?

I have a table with almost 800,000 records, and I am currently using dynamic SQL to generate the query on the back end. The front end is a search page that takes about 20 parameters and, depending on whether a parameter was chosen, appends an " AND ..." clause to the base query. I'm curious whether dynamic SQL is the right way to go (it doesn't seem like it, because it runs slowly). I am contemplating just creating a denormalized table with all my data. Is this a good idea, or should I build the query all at once instead of building it piece by piece with dynamic SQL? Last thing: is there a way to speed up dynamic SQL?

Answers


It is more likely that your indexing (or lack thereof) is causing the slowness than the dynamic SQL.

What does the execution plan look like? Is the same query slow when executed in SSMS? What about when it's in a stored procedure?

If your table is an unindexed heap, it will perform poorly as the number of records grows, regardless of the query. A dynamic query can actually perform better as the table's nature changes, because a dynamic query is more likely to have its query plan re-evaluated when it's not in the cache. This is not normally an issue (and I would not classify it as a design advantage of dynamic queries) except in the early stages of a system, when SPs have not been recompiled, statistics and query plans are out of date, and the volume of data has just drastically changed.

Not the static one yet. I have run it with the dynamic query, but it does not suggest any optimizations. If I ran it with the static query and it gave suggestions, would applying them affect the dynamic query? – Xaisoft (41 mins ago)

Yes, the dynamic query (EXEC (@sql)) is probably not going to be analyzed unless you analyze a workload file. – Cade Roux (33 mins ago)

When you have a search query across multiple tables that are joined, the columns with indexes need to be the search columns as well as the primary key/foreign key columns - but it depends on the cardinality of the various tables. The tuning analyzer should show this. – Cade Roux (22 mins ago)
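As a sketch of the kind of indexing being described (the table and column names here are hypothetical, just for illustration): index both the foreign key columns used in joins and the columns the search actually filters on, for example:

```sql
-- Hypothetical schema: Articles joined to Authors on AuthorId,
-- with searches filtering on PublishedDate.

-- Index the foreign key column used in the join:
CREATE NONCLUSTERED INDEX IX_Articles_AuthorId
    ON dbo.Articles (AuthorId);

-- Index a heavily searched column; INCLUDE commonly selected
-- columns so the query can be satisfied from the index alone:
CREATE NONCLUSTERED INDEX IX_Articles_PublishedDate
    ON dbo.Articles (PublishedDate)
    INCLUDE (AuthorId, Title);
```

Whether these are the right indexes depends on the cardinality and query patterns the tuning analyzer reveals.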


I'd just like to point out that if you use this style of optional parameters:

AND (@EarliestDate is Null OR PublishedDate < @EarliestDate)

The query optimizer will have no idea whether the parameter is there or not when it produces the query plan. I have seen cases where the optimizer makes bad choices in these cases. A better solution is to build the sql that uses only the parameters you need. The optimizer will make the most efficient execution plan in these cases. Be sure to use parameterized queries so that they are reusable in the plan cache.
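A minimal sketch of that approach (column names borrowed from the examples below): build only the clauses you actually need, but pass the values through sp_executesql rather than concatenating them, so each distinct shape of the query gets one parameterized, reusable cached plan:

```sql
DECLARE @sql nvarchar(max) =
    N'SELECT ArticleId, Title FROM dbo.Articles WHERE 1 = 1';

IF @AuthorId IS NOT NULL
    SET @sql += N' AND AuthorId = @AuthorId';
IF @EarliestDate IS NOT NULL
    SET @sql += N' AND PublishedDate < @EarliestDate';

-- Values travel as parameters, not as literals in the string,
-- so the plan cache can reuse one plan per query shape.
EXEC sp_executesql @sql,
    N'@AuthorId int, @EarliestDate datetime',
    @AuthorId = @AuthorId,
    @EarliestDate = @EarliestDate;
```

Because only the relevant predicates are present, the optimizer can pick the best plan for each combination of supplied parameters.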


As previous answer, check your indexes and plan.

It's not clear from your wording whether you are using a stored procedure. A stored procedure creates a query plan when run and keeps that plan until recompiled. With varying SQL, you may be stuck with a bad query plan. You could do several things:

1) Add WITH RECOMPILE to the SP definition, which will cause a new plan to be generated with every execution. This includes some overhead, which may be acceptable.

2) Use separate SPs, depending on the parameters provided. This will allow better query plan caching.

3) Use client generated SQL. This will create a query plan each time. If you use parameterized queries, this may allow you to use cached query plans.
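Option 1 above, sketched against the GetArticlesByAuthor example used elsewhere on this page (column names are illustrative):

```sql
-- WITH RECOMPILE: a fresh plan is generated on every execution,
-- so a plan compiled for one parameter combination is never
-- reused for a very different one. The trade-off is paying the
-- compilation overhead on each call.
CREATE PROCEDURE GetArticlesByAuthor (
    @AuthorId int,
    @EarliestDate datetime = NULL )
WITH RECOMPILE
AS
    SELECT ArticleId, Title
    FROM dbo.Articles
    WHERE AuthorId = @AuthorId
      AND (@EarliestDate IS NULL OR PublishedDate < @EarliestDate);
```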


The only difference between "dynamic" and "static" SQL is the parsing/optimization phase. Once those are done, the query will run identically.

For simple queries, this parsing phase plus the network traffic turns out to be a significant percentage of the total transaction time, so it's good practice to try and reduce these times.

But for large, complicated queries, this processing is overall insignificant compared to the actual path chosen by the optimizer.

I would focus on optimizing the query itself, including perhaps denormalization if you feel that it's appropriate, though I wouldn't do that on a first go around myself.

Sometimes the denormalization can be done at "run time" in the application, using cached lookup tables for example, rather than maintaining this on the database.


I'm not a fan of dynamic SQL, but if you are stuck with it, you should probably read this article: http://www.sommarskog.se/dynamic_sql.html He goes into depth on the best ways to use dynamic SQL and the issues using it can create.

As others have said, indexing is the most likely culprit. One thing people often forget is to put an index on the FK fields. Since a PK creates an index automatically, many assume an FK will as well. Unfortunately, creating an FK does not create an index. So make sure that any fields you join on are indexed.
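For example (table and constraint names are hypothetical): declaring the foreign key only enforces integrity; you still have to create the index on it yourself:

```sql
-- The FK constraint enforces referential integrity
-- but does NOT create any index:
ALTER TABLE dbo.Articles
    ADD CONSTRAINT FK_Articles_Authors
    FOREIGN KEY (AuthorId) REFERENCES dbo.Authors (AuthorId);

-- Index the FK column explicitly so joins on it can seek
-- instead of scanning:
CREATE NONCLUSTERED INDEX IX_Articles_AuthorId
    ON dbo.Articles (AuthorId);
```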

There may be better ways to create your dynamic SQL, but without seeing the code it is hard to say. I would at least look to see if it is using subqueries and replace them with derived table joins instead. Also, any dynamic SQL that uses a cursor is bound to be slow.


If the parameters are optional, a trick that's often used is to create a procedure like this:

CREATE PROCEDURE GetArticlesByAuthor (
    @AuthorId int,
    @EarliestDate datetime = Null )
AS
   SELECT  * --not in production code!
   FROM Articles
   WHERE AuthorId = @AuthorId
   AND (@EarliestDate is Null OR PublishedDate < @EarliestDate)

There are some good examples of queries with optional search criteria here: How do I create a stored procedure that will optionally search columns?
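One common refinement of this pattern (covered at length in the Sommarskog material linked in another answer; not part of the original procedure above) is to add OPTION (RECOMPILE), so the optimizer sees the actual parameter values on each call and can discard the unused branch of the @EarliestDate IS NULL test at compile time:

```sql
CREATE PROCEDURE GetArticlesByAuthor (
    @AuthorId int,
    @EarliestDate datetime = NULL )
AS
    SELECT ArticleId, Title  -- name real columns, not *
    FROM Articles
    WHERE AuthorId = @AuthorId
      AND (@EarliestDate IS NULL OR PublishedDate < @EarliestDate)
    OPTION (RECOMPILE);  -- plan built per call with known values
```

This trades recompilation cost on every execution for a plan that matches the parameters actually supplied.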


As noted, if you are doing a massive query, indexes are the first bottleneck to look at. Make sure that heavily queried columns are indexed. Also, make sure that your query checks all indexed parameters before it checks unindexed parameters. This makes sure that the results are filtered down using indexes first, and the slow linear search is done only if it has to be. So if col2 is indexed but col1 is not, it should look as follows:

WHERE col2 = @col2 AND col1 = @col1

You may be tempted to go overboard with indexes as well, but keep in mind that too many indexes can cause slow writes and massive disk usage, so don't go too crazy.

I avoid dynamic queries if I can for two reasons. One, they do not save the query plan, so the statement gets compiled each time. The other is that they are hard to manipulate, test, and troubleshoot. (They just look ugly).

I like Dave Kemp's answer above.


I've had some success (in a limited number of instances) with the following logic:

CREATE PROCEDURE GetArticlesByAuthor (    
    @AuthorId int,    
    @EarliestDate datetime = Null 
    ) AS   

SELECT SomeColumn
FROM Articles   
WHERE AuthorId = @AuthorId   
AND @EarliestDate is Null
UNION
SELECT SomeColumn
FROM Articles   
WHERE AuthorId = @AuthorId   
AND PublishedDate < @EarliestDate

If you are trying to optimize to below the 1-second range, it may be important to gauge approximately how long it takes to parse and compile the dynamic SQL relative to the actual query execution time:

 SET STATISTICS TIME ON;

and then execute the dynamic SQL string "statically" and check the "Messages" tab. I was surprised by these results for a ~10-line dynamic SQL query that returns two rows from a 1M-row table:

 SQL Server parse and compile time: 
    CPU time = 199 ms, elapsed time = 199 ms.

 (2 row(s) affected)

 SQL Server Execution Times:
     CPU time = 0 ms,  elapsed time = 4 ms.

Index optimization is unlikely to move that 199 ms barrier much (except perhaps through some analysis/optimization work that is itself included in the compile time).

However, if the dynamic SQL uses parameters or is repeated, the compiled results may be cached (see Caching Query Plans), which would eliminate the compile time. It would be interesting to know how long cache entries live, how large the cache gets, whether plans are shared between sessions, etc.
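Some of those questions can be poked at directly with the standard plan-cache DMVs; a rough sketch:

```sql
-- Inspect cached plans for dynamic SQL. usecounts shows how often
-- a cached plan has been reused, which tells you whether your
-- parameterized dynamic SQL is actually sharing plans; objtype
-- distinguishes ad hoc strings from prepared/parameterized ones.
SELECT TOP (20)
    cp.usecounts,
    cp.objtype,        -- 'Adhoc', 'Prepared', 'Proc', ...
    cp.size_in_bytes,
    st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
ORDER BY cp.usecounts DESC;
```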
