Chris Webb's BI Blog

Analysis Services, MDX, PowerPivot, DAX and anything BI-related

Archive for December 2010

Sixth Blog Birthday

Every year on December 30th (or thereabouts) I write a blog post summarising what I’ve been up to this year, to mark this blog’s birthday. Today is my blog’s sixth birthday which, to be honest, I find pretty amazing – but then every year I’m amazed at how long I’ve kept at this! I only wish I could apply the same kind of willpower to making cash (I’d be a billionaire by now) or dieting, exercise or anything else…

From a blog point of view, the big change this year was moving away from Windows Live Spaces, where I’d started out, to WordPress. I’d been wanting to make the move for ages but had been too lazy to do so, and also unwilling to lose all my old posts and traffic; luckily when MS made the decision to retire Live Spaces they made sure it was very easy to migrate all my old content to WordPress and for that I’m very grateful – it almost makes up for the shabby treatment Live Spaces users had been getting in the preceding years.

I also got one of my biggest ever days from a blog traffic point of view during the controversy over the future of SSAS at PASS; I’m not even going to link to that particular post because it still gets plenty of hits and it’s something I’d prefer to move on from. It’s better to look forward to the undoubtedly cool stuff that will be coming with Denali rather than brood over any disappointment about what has happened, and one thing I’m certainly relieved about is that in 2011 there will be lots of cool stuff to blog about – at some points in the last year I’ve struggled to find anything new or interesting to blog about for SSAS (although as Greg Galloway has just shown, there are still lots of interesting SSAS topics to blog about if you look). Bring on BISM!

Professionally things go from strength to strength, and I’m going to be busier in the first half of 2011 than I’ve ever been, even in January and February, which are usually relatively quiet months for me. Again, with significant new technology arriving in the BI stack in Denali I would expect a lot of consultancy and training work to follow; I’m going to be expanding my training activities in other ways next year too, and I’ll be making a separate announcement about that here any day now. SQLBits also continues to grow and grow, and I’m really excited about the plans for SQLBits 8 in Brighton this April. Since it doesn’t look like there’s going to be a PASS Europe this year, I would say that anyone who’s serious about SQL Server in Europe should plan to come to what’s going to be the biggest and best SQL conference in Europe bar none; and even if you live outside Europe but fancy a vacation, why not come too? We have an ever-increasing number of US attendees and speakers!

Anyway, it’s time for me to go and do some real work and stop being self-indulgent. I hope everyone reading this has a great 2011. Thanks for reading!

Written by Chris Webb

December 30, 2010 at 1:25 pm

Posted in Uncategorized

New Trace Events in SSAS 2008 R2 CU5

Greg Galloway has just pointed out to me a new KB article outlining some major improvements/changes to trace events that are coming in CU5 for SSAS 2008 R2:
http://support.microsoft.com/kb/2458438/

This represents part of the new monitoring functionality that will be coming to SSAS in Denali. These changes will allow us to:

  • Monitor what happens during the execution of the MDX Script in more detail
  • Track resource usage for individual commands (this alone is very interesting)
  • Track locks more effectively

These changes are going to be very useful to anyone running a high-end SSAS implementation.

Written by Chris Webb

December 23, 2010 at 2:24 pm

Posted in Analysis Services

SQLBits 8 – Beside the Sea!

The dates for the next SQLBits have been announced: it’s going to be on April 7th-9th at the Grand Hotel in Brighton (for those of you outside the UK, Brighton is a very cool town on the south coast of England, very easy to reach from London and Gatwick Airport). More details, as always, are to be found at http://www.sqlbits.com/

We’ve also posted most of the videos from SQLBits 7, which are available to view for free from the SQLBits site at the link above. Stay tuned for more announcements about the exciting stuff we have planned for Brighton!

Written by Chris Webb

December 23, 2010 at 2:15 pm

Posted in Events

Google Books Ngram Viewer

This has been blogged about quite a bit over the last few days, but if you’re looking to kill some time at work with something vaguely BI-related then take a look at Books Ngram Viewer, the latest thing on Google Labs:

http://ngrams.googlelabs.com/

It’s pretty simple: it allows you to visualise the number of occurrences of words in books held in Google Books by year of publication, and therefore track the popularity of ideas and concepts over time (well, that’s the theory). You can even download the source data – if I’ve got some spare time it might be interesting to see this data in PowerPivot…!

There are already some interesting examples of things you can see with this data out there, for example on the Information Is Beautiful site, and on the Google Blog post announcing it. I can’t resist adding one more, though: Microsoft vs Oracle vs IBM vs Google – notice how IBM peaks in about 1988 then declines; Oracle and Microsoft have a similar peak in 2004 and then decline; while Google is still on the up…

Written by Chris Webb

December 20, 2010 at 11:37 pm

Posted in Google

Dryad and DryadLINQ in Beta

I’ve just seen the news on the Windows HPC blog that Dryad and DryadLINQ are now in beta:
http://blogs.technet.com/b/windowshpc/archive/2010/12/17/dryad-beta-program-starting.aspx

If you’re wondering what Dryad is, Daniel Moth has blogged a great collection of introductory links; and Jamie and I have been following Dryad for some time too. Although it seems like its integration with SSIS has fallen by the wayside, its relevance for very large-scale ETL problems remains: it will allow you to crunch terabytes or petabytes of data in a highly parallel way. And given that you can now join Windows Azure nodes to your HPC cluster, it sounds like it’s a solution that can scale out via the cloud, making it even more exciting. I hope the people working on Microsoft’s regular and cloud BI solutions are talking to the HPC/Dryad teams.

Written by Chris Webb

December 18, 2010 at 9:27 pm

Posted in ETL

Raising your own Errors in Dynamic Security

Last week I was working with a customer that is going to implement dynamic security using the CustomData function: they have their own web-based reporting app and want to use the CustomData connection string property to pass information about the user from the app back to a role in SSAS, which then dynamically generates an allowed set for dimension security based on that information. You can do something similar in the Adventure Works cube by doing the following:

  • Create a new role, called “Role”, and give it access to the Adventure Works cube
  • Go to the Dimension Data tab, select the Country attribute on the Customer cube dimension, and choose the ‘Deselect all members’ radio button
  • Go to the Advanced tab and enter the following MDX in the Allowed Member Set textbox:
    {STRTOMEMBER("[Customer].[Country].&[" + CUSTOMDATA() + "]", CONSTRAINED)}
  • Deploy the solution
  • Go to SQL Management Studio and click the MDX button to open a new MDX query window
  • In the connection dialog that appears, click the Options button, go to the Additional Connection Parameters tab, and then enter:
    Roles=Role;Customdata=Australia
  • Ignore what you see in the metadata pane (it’s a different connection), watch out for this bug, and note that when you run the following query, only the country Australia is returned:

    SELECT {} ON 0,
    [Customer].[Country].[Country].MEMBERS ON 1
    FROM [Adventure Works]
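
Incidentally, when the connection comes from the reporting app itself rather than from Management Studio, the same information just goes on the app’s connection string. A rough sketch (the server and database names here are only placeholders) would be something like:

Data Source=localhost;Initial Catalog=Adventure Works DW 2008R2;Roles=Role;CustomData=Australia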

Once this was working, the next question the customer had was what happens if you forget to pass in the CustomData connection string property, or you pass in an invalid value (in this case, a country that doesn’t exist on the hierarchy). Well, the good news is that after you connect you can’t run any queries; the bad news is that all you get is an unhelpful error message like this:

The ‘Country’ attribute in the ‘Customer’ dimension has a generated dimension security expression that is not valid.
DimensionPermission (1, 2) The restrictions imposed by the CONSTRAINED flag in the STRTOMEMBER function were violated.

Not good for debugging, and a bit confusing for your users if they see it. Also, in some implementations of dynamic security (for example if you’re not using the Constrained flag or using a Filter) instead of getting an error you’ll be able to query the cube but just see no data in it, which is even more confusing.
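
For example, a Filter-based allowed set along these lines (just a sketch of the general approach, not necessarily the exact expression you’d use) returns an empty set whenever Customdata is missing or doesn’t match a country name:

FILTER(
  [Customer].[Country].[Country].MEMBERS,
  [Customer].[Country].CURRENTMEMBER.NAME = CUSTOMDATA()
)

With an expression like this the user connects successfully but sees no members and no data, and gets no error to explain why.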

So what we want to be able to do is display our own error messages in these situations as well as forcing a lock-out, and we can do this by using the MDX Error() function (I always knew I’d find a use for it someday!). In our example if we want to trap situations where the developer has forgotten to pass in the Customdata connection string property we can use the following MDX in our role:

IIF(
  CUSTOMDATA() = NULL,
  ERROR("You forgot to pass in the Customdata connection string property!"),
  {STRTOMEMBER("[Customer].[Country].&[" + CUSTOMDATA() + "]")}
)

Now, when you connect you’ll see a slightly more helpful error message. Here’s what Excel shows when you open a connection using the role but don’t supply anything in Customdata:

[Screenshot: the custom error message displayed in Excel]

Of course you’d probably want to handle more complex scenarios than this, so here’s a more complex example that handles missing and invalid Customdata values:

IIF(
  CUSTOMDATA() = NULL,
  ERROR("You forgot to pass in the Customdata connection string property!"),
  IIF(
    COUNT(
      INTERSECT(
        {STRTOMEMBER("[Customer].[Country].&[" + CUSTOMDATA() + "]")},
        [Customer].[Country].[Country].MEMBERS
      )
    ) = 0,
    ERROR("The country " + CUSTOMDATA() + " doesn't exist on the Country hierarchy of the Customer dimension"),
    {STRTOMEMBER("[Customer].[Country].&[" + CUSTOMDATA() + "]", CONSTRAINED)}
  )
)
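
To test the invalid-value branch from SQL Management Studio you can use the same Additional Connection Parameters trick as before, for example:

Roles=Role;Customdata=Norway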

And here’s what you get in Excel with this if you pass the value “Norway” using Customdata:

[Screenshot: the custom error message for an invalid country displayed in Excel]

This may all seem like window dressing, but it can save you time later on when you’re trying to debug security problems over the phone with users or other developers.

Written by Chris Webb

December 15, 2010 at 7:20 pm

Posted in MDX

Subscribing to Traces

There are a few blog posts out there that describe how to define and run server-side traces for SSAS (such as this one), but recently I was looking into how to subscribe to a server-side trace that’s already running, and while there’s a good thread on the forum I didn’t find much other information on the subject. Time for a blog post, then.

First of all, you need a trace running on the server before you can subscribe to one. Luckily, you can find out what traces are running by querying the discover_traces DMV as follows:

select * from $system.discover_traces

Unless you’ve specifically turned it off, you’ll see at least one trace running on each SSAS instance: the flight recorder trace, which has the ID FlightRecorder. You can then subscribe to a trace by executing an XMLA command like this:

<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/">
  <Header>
  </Header>
  <Body>
    <Execute xmlns="urn:schemas-microsoft-com:xml-analysis">
      <Command>
        <Subscribe xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <Object xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
            <TraceID>FlightRecorder</TraceID>
          </Object>
        </Subscribe>
      </Command>
      <Properties>
        <PropertyList>
        </PropertyList>
      </Properties>
    </Execute>
  </Body>
</Envelope>

Or, alternatively, anywhere you can run an MDX query (for example from an MDX query window in SQL Management Studio) you can run just the Subscribe:

<Subscribe xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
 <Object xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <TraceID>FlightRecorder</TraceID>
 </Object>
</Subscribe>

Once you’ve executed either of the above, the command will run indefinitely until you cancel the session it’s running on (it may look like it’s hanging, but it isn’t). To cancel the session you’ll need to find its SPID, which again you can find by querying the discover_commands DMV:

select * from $system.discover_commands order by command_start_time
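
If the server is busy this can return a lot of rows, so it can be easier to bring back just the columns needed to identify the Subscribe (assuming the standard session_spid, command_text and command_start_time columns here):

select session_spid, command_text, command_start_time
from $system.discover_commands
order by command_start_time

The row whose command_text contains the Subscribe gives you the session_spid value to use as the SPID.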

You can then take the SPID and execute an XMLA Cancel command like this one:

<Cancel xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <SPID>1887</SPID>
</Cancel>

Once you’ve done this you’ll see the data from your trace returned at last:

[Screenshot: the trace data returned in the query results window]

I had thought this might be an easy-ish way of getting trace data into PowerPivot for analysis, but unfortunately as soon as you kill the session the subscribe is running on you see an error in the PowerPivot window. But it will work in SSRS if you use an OLEDB connection and can find an easy way of killing a session from inside an SSRS report – I’m sure it’s possible, using this technique, but it won’t be elegant. To be honest, though, in most cases saving the trace to a table in SQL Server, or to a file, will be more convenient if you want to use trace data for analysis.

Written by Chris Webb

December 8, 2010 at 11:30 pm

Posted in Analysis Services
