Chris Webb's BI Blog

Analysis Services, MDX, PowerPivot, DAX and anything BI-related

SSRS and SSAS-Sourced Parameter Dataset Performance Issues


Ahhh, Reporting Services and Analysis Services integration – a never-ending source of pain and frustration! Although I shouldn’t complain, because it’s been quite lucrative for me in terms of consultancy work…

Anyway, recently I was tuning an SSRS report based on an SSAS cube for a customer. Looking at Profiler I could see a number of slow queries being executed, which was strange given that it was a fairly simple report. It turned out that during the design phase this report had had a number of parameters created in the SSAS query designer that were later deleted; however, when you delete a parameter in the SSAS query designer, BIDS does not delete the hidden dataset that the parameter is bound to. What’s worse, when an SSRS report is executed all dataset queries are also executed, even if the datasets aren’t used anywhere in the report, so you get the overhead of running extra queries. It’s an easy mistake to make, and in this case the execution of unused datasets was adding several seconds to the execution time of the report.

You can reproduce this problem very easily by creating a simple report based on the Adventure Works cube. Once you’ve created the data source, open the query designer, drag the Internet Sales Amount measure onto the grid, drag the Customer hierarchy from the Customer dimension onto the filter pane and check the Parameters box:

image

Now close the query designer and reopen it, then remove the Customer hierarchy from the filter pane, close the query designer again and delete the report parameter. When you Preview the report you’ll see the following in Profiler:

image

The highlighted row shows the hidden dataset is being executed. What you need to do to fix this is to right-click on your data source and check the Show Hidden Datasets option:

image

You’ll then see the offending, automatically-generated, hidden dataset and you can delete it:

image

Luckily, BIDS Helper has functionality to find unused datasets in your reports for you:
http://bidshelper.codeplex.com/wikipage?title=Dataset%20Usage%20Reports&referringTitle=Home

And there’s more! What I found really interesting about this parameter dataset query was how long it was taking to execute. In this example it took 2.5 seconds even on a warm cache, which seems like a very long time to me, even allowing for the large number of members on the Customer hierarchy. Once the report is deployed that goes down to a consistent 2.1 seconds, and when I run the same query through SQL Management Studio it goes down to 1.5 seconds. Why the difference in execution times? I’m not sure, but I suspect it’s a combination of the connection string properties used and the use of a flattened rowset. In any case, 1.5 seconds is still slow, and that’s certainly not good if you actually do want to use a query like this in a dataset bound to a parameter.

Luckily, if our parameter datasets are causing performance problems, we can usually rewrite the queries involved to make them faster. Here’s the original query from the parameter in the example:

WITH
MEMBER [Measures].[ParameterCaption] AS
[Customer].[Customer].CURRENTMEMBER.MEMBER_CAPTION
MEMBER [Measures].[ParameterValue] AS
[Customer].[Customer].CURRENTMEMBER.UNIQUENAME
MEMBER [Measures].[ParameterLevel] AS
[Customer].[Customer].CURRENTMEMBER.LEVEL.ORDINAL
SELECT
{[Measures].[ParameterCaption],
[Measures].[ParameterValue],
[Measures].[ParameterLevel]}
ON COLUMNS ,
[Customer].[Customer].ALLMEMBERS
ON ROWS
FROM [Adventure Works]

If we decide that we can make do without the All Member and the level-based indenting that goes on in the parameter dataset (this is an attribute hierarchy, after all, so there’s just one level), we can use the following query in the dataset instead:

WITH MEMBER MEASURES.DUMMY AS NULL
SELECT
{MEASURES.DUMMY}
ON COLUMNS ,
[Customer].[Customer].[Customer].MEMBERS
DIMENSION PROPERTIES MEMBER_CAPTION, UNIQUE_NAME
ON ROWS
FROM [Adventure Works]

Once the first query above has been replaced with the second, and the report parameter has been hooked up to use the new fields, Profiler shows that the time taken to execute the parameter dataset has gone down to around 0.7 seconds:

image

That is, of course, almost 2 seconds faster than the original query in Preview mode and almost 1.5 seconds faster than the original query in the deployed report. Not a lot on its own but certainly noticeable, and if you have more than one large parameter the cumulative gain could be quite significant. If you create a separate OLEDB connection to the cube and use the second query in a dataset, the execution time is even faster, going down to around 0.45 seconds:

image
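In case you want to try this yourself: by a separate OLEDB connection I mean a report data source of type OLE DB that uses the MSOLAP provider, rather than the built-in Analysis Services data source type. A minimal connection string – assuming a local server and the Adventure Works sample database, so adjust the Data Source and Initial Catalog values to match your own environment – would look something like this:

Provider=MSOLAP;Data Source=localhost;Initial Catalog=Adventure Works DW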

Incidentally, some of the more astute readers may be asking why I need to include MEASURES.DUMMY in the query above when I could use an empty set on the columns axis instead. Two reasons: one, if you use an empty set on columns with the OLEDB connection you get no rows returned; two, I noticed that when the query was being executed in SSRS a Query Subcube event was raised, suggesting measure data was being requested from the cube – this didn’t happen when I ran the query in SQL Management Studio. I suspect both problems are something to do with SSRS using a flattened rowset, so I’ll investigate and post back here when I get an answer.
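For reference, the empty-set variant I’m talking about – which runs fine in SQL Management Studio but returns no rows through the OLEDB connection – simply replaces the dummy calculated measure with an empty set on the columns axis:

SELECT
{}
ON COLUMNS ,
[Customer].[Customer].[Customer].MEMBERS
DIMENSION PROPERTIES MEMBER_CAPTION, UNIQUE_NAME
ON ROWS
FROM [Adventure Works]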

Written by Chris Webb

March 2, 2010 at 12:26 am

Posted in Reporting Services

14 Responses


  1. Your suggestion to delete unused datasets applies equally to SQL datasets. It appears that RS fires every dataset, whether they are used or not.

    Richard

    March 2, 2010 at 4:51 am

  2. Great stuff Chris, I work with SSRS and SSAS on most of my consultancy jobs and it is indeed astounding to see how badly SSRS and SSAS work together. Indeed good for us but bad for customers who are report builders. It should be a great match but we have to hack it at every turn! The rumor is that this will improve much with SQL 11, which I hope :)

    Kasper

    March 3, 2010 at 8:32 am

  3. They’ve been saying that the integration will improve ever since I saw the first beta of SSRS back in, ohhhh, 2002 I think. I’ll believe it when I see it!

    Chris

    March 3, 2010 at 3:15 pm

  4. I second that one Chris. Slowly waiting for this integration to improve as well. Nice catch on the parameter dataset. Crazy. Thanks for sharing your tips.

    Dan

    March 10, 2010 at 10:28 am

  5. Can you provide any info on what SSRS does in the background to process a report from an OLAP cube that uses query parameters? Our report performance is very poor (compared with Excel 2007 pivot tables). For example, we have a report that has 5 query parameters. We have no unused datasets. When we profile the report, the SQL trace shows that each query parameter dataset is executed twice in a row. Then once they’ve all been executed, it executes each of them one more time before it begins the main dataset query. Now, each of these query parameters do not take much time, but the main dataset for this particular report will take nearly 22 seconds. We’ve tried comparing the times in Profiler to the SSRS execution log, but we cannot get them to tie out. And the execution log shows that the greater amount of time appears to be in TimeDataRetrieval (vs. TimeProcessing and TimeRendering). Yet we cannot find any information to point us to how SSRS is processing the data, datasets, reports, etc. themselves to figure out what is causing the poor performance. Just wondering if you can provide any insight on what it is doing? Thanks

    N

    March 24, 2010 at 1:50 pm

  6. Which version of SSRS are you using, N? I seem to remember seeing SSRS trying to validate queries before it ran reports – and because MDX doesn’t support any kind of query validation, this involved SSRS running the queries to validate them and then running them again. If your main dataset always takes 22 seconds to run, then it sounds like the query needs some tuning. Do you have lots of calculations on the cube or in the WITH clause of the query? My guess is that for some reason the calculations are not being cached – which could be due to this issue: http://cwebbbi.spaces.live.com/blog/cns!7B84B0F2C239489A!3057.entry

    Chris

    March 24, 2010 at 3:15 pm

  7. We are using SSRS 2005 Standard Edition (SP3). Yes, we’re thinking the main dataset needs some tuning. This is where our issue lies. How do we determine where the issue is within the dataset? This is where we are ‘stuck’ and have been searching for answers. I write the reports (and am new to it as well, not real familiar with MDX code), not the cube. The MDX query for the SSRS report is generated by SSRS. I do not believe I have any calculations in the query (if there are, then they were generated by SSRS). Our thought, here, is that all these calculations are in the cube. So we should, essentially, just be able to ‘drag & drop’ the dataset query onto the SSRS report for display. I’d copy the SSRS-generated code here, but I don’t know if that’s appropriate. I will forward the link you included to those here who build the cube to see if they can determine if the caching would help. Thanks, and any other insight would be greatly appreciated.

    N

    March 24, 2010 at 5:14 pm

  8. thanks chris!

    Elizabeth

    March 26, 2010 at 4:21 am

  9. Hello Chris, is there a way to avoid this autogeneration of the hidden datasets? I have several datasets that should all use the same parameter but go over different cubes. But I want one specific cube to be the source for the hidden dataset and not the last one I put in the parameter. Greetings from Zurich ;-)

    Ernesto

    September 30, 2010 at 2:48 pm

  10. You can’t avoid the autogeneration of hidden datasets, but you can delete them and change the parameters to use the datasets you create.

    Chris

    September 30, 2010 at 11:54 pm

  11. Can’t believe how badly SSRS and SSAS interact. Do the dev teams not like each other?

    I have gone down the route of trying to report from cubes and am having all sorts of problems.

    Excellent article btw, I am a noob to this and the steps above really helped me troubleshoot a problem I had.

    Chris

    December 15, 2010 at 10:14 am

