Chris Webb's BI Blog

Analysis Services, MDX, PowerPivot, DAX and anything BI-related

Archive for February 2005

Comparison of Oracle and Microsoft OLAP


A typically intelligent and even-handed comparison of Oracle and Microsoft's forthcoming OLAP offerings is available here from Mark Rittman (whose blog is also well worth a read). From a technical perspective I agree with everything he says, particularly about query language being a key differentiating feature. A little-known fact about Analysis Services is that you can query it using SQL, but it only supports a very basic dialect; it would have been interesting to see how users would have reacted if AS's SQL support had been even slightly better than it is. Would it have affected the uptake of MDX? Possibly, since it might have discouraged users from investing the time and effort needed to learn the language, which in turn would mean they never discovered the benefits it offers over SQL. On the other hand, it would have been useful for SQL-centric BI tools like Business Objects to be able to query an AS cube natively; in the case of Business Objects, although their support for MDX and Analysis Services has actually been pretty strong, I've always felt it's somewhat peripheral, never properly integrated into their product line.
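To give a flavour of the kind of thing MDX makes easy but a flat SQL SELECT against a cube does not, here's a minimal sketch (the cube, dimension and measure names are hypothetical): a calculated member comparing each period with the one before it, which takes a single expression in MDX but a self-join and a fair amount of care in SQL.

/* cube, dimension and measure names below are hypothetical */
WITH MEMBER Measures.[Sales Growth] AS
'(Measures.[Sales Amount]) - (Measures.[Sales Amount], Time.CurrentMember.PrevMember)'
SELECT {Measures.[Sales Amount], Measures.[Sales Growth]} ON COLUMNS,
Time.[Month].Members ON ROWS
FROM [Sales]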

Written by Chris Webb

February 25, 2005 at 11:58 am

Posted in On the internet

Maestro


Interesting nugget from Barbara Darrow here, regarding the next release of the scorecard tools that are currently available as Solution Accelerators. I’m prepared to accept that the features it will contain will be ‘useful and startling’ (when was the last time that a software vendor told you anything else about a forthcoming version?) but to me the bigger news is the fact that it might ship bundled with SQL Server itself.

I already knew from reports such as this that MS had decided that the previous 'free download' approach to marketing the Solution Accelerators was a failure. That didn't surprise me: MS relies heavily on partners to sell and implement BI solutions, and in my experience they were very often lukewarm in their response to the Solution Accelerators. Why? Partly because these companies had existing relationships with vendors such as Proclarity who had their own tools with similar, and often more sophisticated, functionality; partly because they were sceptical about the long-term future of a free tool (and let's face it, the second report above vindicates this scepticism); and partly because the accelerators weren't promoted much, so no-one knew about them. Bundling these tools with the database itself would automatically generate more visibility for them among partners and, more importantly, customers (why do I need to buy Proclarity? I just want what this free tool on the SQL Server CD gives me…) and would help give the impression that MS is offering a much more rounded overall BI solution, incorporating client as well as server components.

So where does this leave the partners? Well, there will always be a market for more sophisticated client tools than the ones that MS offers, and MS's approach of building more and more of the basic functionality into its own products spurs them on to improve their own offerings and make more of an effort selling them. Despite the fact that customers would prefer to deal with only one vendor when buying a BI solution (Marco Russo's comments here are pretty typical, I think), the partner/third-party client tool model is unlikely to go away. MS just likes to keep them on their toes, that's all…

Written by Chris Webb

February 25, 2005 at 10:36 am

Posted in On the internet

MDX ‘Automated Unit Testing’


My colleague Jon Axon sent round a very interesting mail at work this morning, detailing a new method he’s using for testing during cube development. Rather than try to paraphrase it, here’s what he sent:

When I'm modelling cubes, I generally tend to create a basic model which produces correct figures, and then tweak it for performance, e.g. refine calculations or play around with dimension settings. This is similar to code refactoring, and an important part of it is constant repetition of unit tests to ensure you don't alter the behaviour of the cube as you improve performance, i.e. that the numbers stay the same. You could always hack some ADOMD code to run the same query on your new cube iteration and on an established baseline cube and compare cellsets, but this is a pain. However, thanks to a couple of MDX tricks you can actually go a surprisingly long way towards producing a generic framework for this purely within an MDX query, as shown below.

The gist of it is that you produce a fixed baseline cube with correct behaviour, and then copy-and-paste this within the same database to create a version of the cube for refining; only the latter is subsequently altered. The two cubes don't have to have identical structure (e.g. identical dimensionality); however, you can only compare values of cells whose context is defined by identically named members in both cubes (e.g. you can't compare measure M in one cube to measure N in another). The template is as follows; only the sections marked as USER DEFINED need to be changed, i.e. a couple of set/member definitions at the top and the cube name at the bottom of the query (unfortunately MDX syntax doesn't allow us to group these items into a single location). Think of the user-defined section as parameters for a procedure; these are:

TestDomain: a set expression defining the subcube (minus measures) whose values you want to compare across your two target cubes.

TestMeasures: the set of measures to be compared within the context of the subcube defined above.

TargetCube: a string specifying the name of the cube (within the same database as the cube in the FROM clause) against which the comparison is to be made.

DisplayDetails: a string "Y" or "N" indicating whether executing the query should display a detailed report i.e. the results of the comparison on a cell-by-cell basis, or whether an "executive summary" should be given i.e. a single cell indicating complete success or at least one failure (the first such failure will be indicated).

Here is an example:

 

WITH

/* USER DEFINED SECTION */
SET TestDomain as 'NonEmptyCrossjoin(Except(DimA.Members, {DimA.DefaultMember}),
Except(DimB.Members, {DimB.DefaultMember}),
Except(DimC.Members, {DimC.DefaultMember}))'
SET TestMeasures as '{BadCalc}'
MEMBER Measures.TargetCube as '"TestCubeDeltaA"'
MEMBER Measures.DisplayDetails as '"Y"'
/* END USER DEFINED SECTION */

SET FinalMeasures as 'Except(TestMeasures, {TargetCube, CurrentTuple, Indicator, Result, DisplayDetails})'
MEMBER Measures.CurrentTuple as '")"'
CELL CALCULATION Calc for '({Measures.CurrentTuple})' as
'Iif(CalculationCurrentPass() = 129, "(", "") +
Iif(Dimensions(129 - CalculationCurrentPass()) IS Measures, "",
Dimensions(129 - CalculationCurrentPass()).CurrentMember.UniqueName +
Iif(CalculationCurrentPass() = (130 - Dimensions.Count), "", ",")) +
CStr(CalculationPassValue(Measures.CurrentTuple, -1, RELATIVE))',
CALCULATION_PASS_NUMBER = 129, CALCULATION_PASS_DEPTH = 128,
CONDITION = 'CalculationCurrentPass() > (129 - Dimensions.Count)'
MEMBER Measures.Indicator as
'Iif(Len(Generate(FinalMeasures,
Iif(FinalMeasures.Current = LookupCube(CStr(TargetCube), "{{" + CStr(CurrentTuple) + "} * {" +
TupleToStr(FinalMeasures.Current) + "}}.Item(0)"), "", "x"))) > 0,
"MISMATCH!", "OK")'
MEMBER Measures.Result as
'Iif(Count(Filter(TestDomain, CStr(Indicator) <> "OK") as failures) = 0, "All OK",
"First failure: " + TupleToStr(failures.Item(0)))'

SELECT
StrToSet(Iif(UCase(CStr(DisplayDetails)) = "Y", "Union(FinalMeasures, {Indicator, Result})", "{Result}")) on 0,
StrToSet(Iif(UCase(CStr(DisplayDetails)) = "Y", "TestDomain", "{TestDomain.Item(0).Item(0)}")) on 1

FROM
/* USER DEFINED SECTION */
TestCube
/* END USER DEFINED SECTION */

One interesting "geeky" point about this, aside from any useful purpose it may have, is that it demonstrates a bona fide situation in which Chris' generic "currentmember from every dimension" cell calculation (see his blog) is necessary to work around MDX limitations; the semantics of LookupCube() are such that, to my knowledge, there is no other way of accomplishing this generically.

Unfortunately, due to both the cell calculation and the LookupCube(), performance can decrease spectacularly as the subcube size increases (particularly when DisplayDetails is set to "Y"), so try to restrict use to relatively small cellsets (as one would expect of unit tests) rather than entire cubes at once. I've made no particular effort to tune any of this template, so you may well be able to improve it dramatically.

You can of course also accomplish a lot of this with less performance overhead by merging two such target cubes into a single virtual cube and comparing measures directly, but this imposes further restrictions, e.g. only one custom rollup formula can be in operation at a time on a specific dimension in a virtual cube, whereas by using LookupCube() across two base cubes you can employ separate rollups on each (in fact this is exactly the situation for which I originally came up with this).
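As a rough sketch of what that virtual cube alternative might look like (the cube and measure names below are made up, and it assumes the baseline cube's measure has been given a distinct name in the virtual cube), the comparison then reduces to a simple calculated member, with no LookupCube() or cell calculations needed:

/* virtual cube and measure names below are hypothetical */
WITH MEMBER Measures.Indicator AS
'Iif(Measures.[BadCalc] = Measures.[BadCalc Baseline], "OK", "MISMATCH!")'
SELECT {Measures.[BadCalc], Measures.[BadCalc Baseline], Measures.Indicator} ON COLUMNS,
NonEmptyCrossjoin(Except(DimA.Members, {DimA.DefaultMember}),
Except(DimB.Members, {DimB.DefaultMember})) ON ROWS
FROM [TestVirtualCube]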

NB: if you get cells reporting COM errors, this is typically because you are requesting something in the LookupCube() that can't be found; check your TestDomain and TestMeasures definitions to ensure they are common to both cubes.

Written by Chris Webb

February 23, 2005 at 11:34 am

Posted in MDX

MDX Webcast


I’ve just come across a new webcast available on Technet on ‘Common MDX Mistakes and Solutions in AS2K’, available here (for some reason the sound on the webcast wasn’t working when I viewed it, but you can download the transcript separately). It has got some really good content in it – I certainly learned a few things. Highlights include more details on how the mysterious Cache Policy connection string property works, the good news that an MDX Debugger will be available for download soon for AS2K (I’ve seen an early version and it is pretty useful), and the fact that some VBA functions are now supported natively in MDX so any queries which use them will perform better.
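As a quick illustration of the VBA function point (the cube, dimension and measure names below are hypothetical), a calculated member like the following relies on a VBA string function, so native support should mean queries like this no longer pay the performance penalty of calling out to the external function library:

/* cube, dimension and measure names below are hypothetical */
WITH MEMBER Measures.[Month Short Name] AS
'Left(Time.CurrentMember.Name, 3)'
SELECT {Measures.[Month Short Name]} ON COLUMNS,
Time.[Month].Members ON ROWS
FROM [Sales]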

Written by Chris Webb

February 23, 2005 at 11:27 am

Posted in MDX

PASS presentation


I’m going to be giving a presentation at the 2005 PASS European Conference in May this year on new cube design features in AS2005. This will cover stuff like the new attribute-based dimension model, the new dimension types (for example many-to-many, reference, fact and role-playing) and measure expressions. Although I’ve had some experience with these features through the course of my beta testing I would be very interested in hearing from anyone else who has used them too, especially people who have tested them in real-world scenarios, so we can compare notes. I’d like to be able to get as many useful tips and tricks into my presentation as possible and avoid just rehashing what’s already in BOL! So please either leave a comment or mail me on cwebb_olap.removethisbit@hotmail.com if you can help me…

Written by Chris Webb

February 22, 2005 at 3:03 pm

Posted in Analysis Services

BI Blogs site


Just had an email from Scott Mitchell, drawing my attention to his new site, http://www.biblogs.com/, which is a good place to catch up with the latest entries on various BI-related blogs.

 

Written by Chris Webb

February 21, 2005 at 2:37 pm

Posted in On the internet

Raising your own errors in Yukon


Another thing I’ve noticed in Books Online – the new ERROR function, which allows you to raise your own errors. For example in AdventureWorks, try the following query:

WITH MEMBER MEASURES.TEST AS
IIF(MEASURES.[INTERNET ORDER COUNT]>2000
, MEASURES.[INTERNET ORDER COUNT]
, ERROR("TOO LOW!")
)
SELECT {MEASURES.[INTERNET ORDER COUNT], MEASURES.TEST} ON 0,
[DATE].[CALENDAR TIME].[ALL PERIODS].CHILDREN ON 1
FROM
[DIRECT SALES]

There's a section in BOL on error handling which isn't filled in yet, so I guess this is probably only the tip of the iceberg as far as this area of functionality is concerned, but it's interesting nonetheless.

Written by Chris Webb

February 18, 2005 at 3:25 pm

Posted in Analysis Services
