SSIS Methods to Process SSAS Tabular Partitions

I’ve been working on a way to process SSAS Tabular partitions using SSIS. At my current company we need to use SSIS to process our tabular models, whether it’s a full process on the database or just a partition. This blog focuses on processing a partition because I only want to process the most current data into my model, so I base the partition on a date range. I’m using the Adventure Works 2012 tabular model for the examples in this article. I want to focus on 2 methods; either one can be used, and the choice comes down to your comfort with the technology.

The 2 SSIS methods to process the tabular model:

  • C# Script using AMO
  • SSIS Analysis Services Execute DDL Task

The first thing to do is look at how an SSAS Tabular partition is processed manually, because that shows exactly what these 2 methods need to reproduce in SSIS.

1. Get the XMLA (XML for Analysis) script from SQL Server Management Studio. Connect to the SSAS Tabular instance, right-click the table that contains the partition, and select Partitions.

ssas1

Select the partition you want and click the Process button.

ssas2

Select the Process Mode, in our case Process Full, then select the Script option and choose Script Action to New Query Window.

ssas3

This gives you the XMLA script that you can execute to process the selected partition.

<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessFull</Type>
  <Object>
    <DatabaseID>AdventureWorks Tabular Model SQL 2012</DatabaseID>
    <CubeID>Model</CubeID>
    <MeasureGroupID>Internet Sales_fdac9a13-4019-4773-b193-7cca3a4883eb</MeasureGroupID>
    <PartitionID>Internet Sales_eda26cf6-7bda-42ad-b31b-d7dc3ec08e6f</PartitionID>
  </Object>
</Process>

2. Create a C# Script Task within SSIS that generates the XMLA script, driven by variables so that it stays dynamic.
Create several variables so the script can be changed without editing code, because you may switch databases, models, or partitions.

ssas5

In the Script Task, add the ServerID, DatabaseID, CubeID, PartitionID, Type, and MeasureGroupID variables to the ReadOnlyVariables, and add the XMLA variable (along with the ScriptError variable used in the catch block) to the ReadWriteVariables.

The following code snippet builds the XMLA dynamically from those variables in the C# script and writes it to the XMLA variable.

public void Main()
{
    // Requires "using System.Text;" at the top of the script for StringBuilder.
    try
    {
        // Build the same XMLA that SSMS generated, substituting the package variables.
        StringBuilder xmla = new StringBuilder();
        xmla.AppendLine("<Process xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">");
        xmla.AppendLine("<Type>" + Dts.Variables["Type"].Value.ToString() + "</Type>");
        xmla.AppendLine("<Object>");
        xmla.AppendLine("<DatabaseID>" + Dts.Variables["DatabaseID"].Value.ToString() + "</DatabaseID>");
        xmla.AppendLine("<CubeID>" + Dts.Variables["CubeID"].Value.ToString() + "</CubeID>");
        xmla.AppendLine("<MeasureGroupID>" + Dts.Variables["MeasureGroupID"].Value.ToString() + "</MeasureGroupID>");
        xmla.AppendLine("<PartitionID>" + Dts.Variables["PartitionID"].Value.ToString() + "</PartitionID>");
        xmla.AppendLine("</Object>");
        xmla.AppendLine("</Process>");

        // Hand the script to the XMLA read/write variable for the next step.
        Dts.Variables["XMLA"].Value = xmla.ToString();
    }
    catch (Exception e)
    {
        // Capture the error message in a package variable for logging.
        Dts.Variables["ScriptError"].Value = e.Message;
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}

The idea is to use a StringBuilder to assemble the same XMLA script, insert all the variables in the right places, and write the result to the XMLA variable for use in the next piece.
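
If string concatenation feels brittle, the same document can be built with LINQ to XML. This is just a minimal sketch of the idea, assuming System.Xml.Linq is referenced in the script project; the variable names match the ones above.

// Requires "using System.Xml.Linq;" in the script.
XNamespace ns = "http://schemas.microsoft.com/analysisservices/2003/engine";

// Build the <Process> element; child elements inherit the default namespace from the root.
XElement process = new XElement(ns + "Process",
    new XElement(ns + "Type", Dts.Variables["Type"].Value.ToString()),
    new XElement(ns + "Object",
        new XElement(ns + "DatabaseID", Dts.Variables["DatabaseID"].Value.ToString()),
        new XElement(ns + "CubeID", Dts.Variables["CubeID"].Value.ToString()),
        new XElement(ns + "MeasureGroupID", Dts.Variables["MeasureGroupID"].Value.ToString()),
        new XElement(ns + "PartitionID", Dts.Variables["PartitionID"].Value.ToString())));

Dts.Variables["XMLA"].Value = process.ToString();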

3. Method 1: C# Script using AMO

With the XMLA script created, you just need a way to execute it. First, add the Microsoft.AnalysisServices.XMLA assembly to your script references so you can access the Microsoft.AnalysisServices.Xmla namespace. This can be a pain in the butt to find, but on my machine it was under C:\windows\assembly\gac_msil\microsoft.analysisservices.xmla, and I picked the latest version of microsoft.analysisservices.xmla.dll.

ssas8

ssas10
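
Once the reference is added, bring the namespace into the script with a using directive; these two cover what the code below needs:

// Add alongside the script's existing using directives.
using Microsoft.AnalysisServices.Xmla;  // XmlaClient
using System.Text;                      // StringBuilder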

Finally, you extend the script that generates the XMLA and execute it through the XMLA client. You instantiate the XmlaClient, connect using the ServerID variable, send the command, and disconnect from the server.

public void Main()
{
    // Requires "using System.Text;" and "using Microsoft.AnalysisServices.Xmla;".
    try
    {
        // Build the XMLA exactly as before.
        StringBuilder xmla = new StringBuilder();
        xmla.AppendLine("<Process xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">");
        xmla.AppendLine("<Type>" + Dts.Variables["Type"].Value.ToString() + "</Type>");
        xmla.AppendLine("<Object>");
        xmla.AppendLine("<DatabaseID>" + Dts.Variables["DatabaseID"].Value.ToString() + "</DatabaseID>");
        xmla.AppendLine("<CubeID>" + Dts.Variables["CubeID"].Value.ToString() + "</CubeID>");
        xmla.AppendLine("<MeasureGroupID>" + Dts.Variables["MeasureGroupID"].Value.ToString() + "</MeasureGroupID>");
        xmla.AppendLine("<PartitionID>" + Dts.Variables["PartitionID"].Value.ToString() + "</PartitionID>");
        xmla.AppendLine("</Object>");
        xmla.AppendLine("</Process>");

        Dts.Variables["XMLA"].Value = xmla.ToString();

        // Connect to the SSAS Tabular instance, send the XMLA, and disconnect.
        XmlaClient clnt = new XmlaClient();
        clnt.Connect(Dts.Variables["ServerID"].Value.ToString());
        clnt.Send(xmla.ToString(), null);
        clnt.Disconnect();
    }
    catch (Exception e)
    {
        // Capture the error message in a package variable for logging.
        Dts.Variables["ScriptError"].Value = e.Message;
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}

This is one way of processing the partition, and my preferred option if you feel comfortable with C# and Analysis Management Objects (AMO).
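
If you would rather stay in the higher-level AMO object model than use XmlaClient, the send step can also be done with the Server class. This is only a sketch, assuming a reference to Microsoft.AnalysisServices and that ServerID holds the instance name; the helper method name is mine.

using Microsoft.AnalysisServices;

// Hypothetical helper: execute the generated XMLA through the AMO Server object.
public void SendXmlaWithAmo(string serverName, string xmlaScript)
{
    Server server = new Server();
    server.Connect(serverName);
    try
    {
        // Execute returns an XmlaResultCollection; the Messages report warnings or errors.
        XmlaResultCollection results = server.Execute(xmlaScript);
        foreach (XmlaResult result in results)
        {
            foreach (XmlaMessage message in result.Messages)
            {
                System.Diagnostics.Debug.WriteLine(message.Description);
            }
        }
    }
    finally
    {
        server.Disconnect();
    }
}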

4. Method 2: Analysis Services Execute DDL Task

This method uses the SSIS task designed to execute XMLA scripts. Again, the original C# script generates the XMLA. You make a connection to the SSAS Tabular instance via a connection string and point the task's DDL source at the XMLA variable, so the generated script is passed to the task.

ssas12

Once the connection and the XMLA variable are set in the task, you are good to go. You now have a second method to execute the XMLA script, using an SSIS Analysis Services Execute DDL Task.

ssas11

SSAS Tabular Documentation Tool Part 1: The Overview

Do you have a need for real-time SSAS Documentation?

BISM Tabular models have started to play a crucial role in Microsoft’s data warehouse designs as new technologies like Power View and Excel pivot tables with PowerPivot have become the front-end presentation layers for our data, and we need a way to understand the objects in our tabular models. There is always a story about our data that needs to be told beyond the naming conventions we give each field or object. We look at the data and ask what a value means when all we have is a simple field name like Order Quantity. The Tabular Documenter is a tool that extracts all the information from any BISM tabular model within SQL Server Analysis Services and allows the user to document it. Tabular models can be built in Visual Studio and deployed to SSAS, or built in Excel PowerPivot and deployed from SharePoint to SSAS. The Tabular Documenter adds a layer of documentation that supports these efforts and makes the designs clearer and easier to understand.

Here is screen shot of the Application:

tabdoc1

The application connects to a single instance of a BISM tabular model, and the user interface uses a tab control with the following subject areas: Dimensions, Measure Groups, and Relationships. Each tab contains a control with a spreadsheet-like, Excel-style view of the information for each object. The user can select a row, which enables a description textbox for adding a description. This data can be saved back to a database and edited whenever desired.

This application is designed to run as a stand-alone application and also as an Excel add-in. The add-in provides access to the stand-alone application from the Excel ribbon.

Stay tuned for my next post, SSAS Tabular Documentation Tool Part 2: The Design, which will look at the design of the application and all its features. I am currently adding support for Multidimensional cubes, which I will cover in that post.

TabularDocumenterSplash

TimeXtender – Lookups and Relationships

TimeXtender is a new tool I’ve been using to speed up our ETL development. Check out the site: TimeXtender

There are two ways you can add a lookup and have relationships configured in TimeXtender:

Method 1 is to drag the column down from the lookup table to the main table and set the join on the keys that match between the 2 tables. For example, we have ProductSubcategory and Product, where ProductSubcategoryID is the key between the lookup table ProductSubcategory and the main table Product. This is considered a left outer join, which will send rows that don’t match into the Warning table.

lookup

Method 2 is to drag the key from the lookup table to the foreign key of the main table. For this example, we drag ProductCategoryID from the ProductCategory table to ProductCategoryID in the ProductSubCategory table. This gives the option to create a relation.

relation

We can see the relationship created for the ProductCategoryID below:

created_relation

Now we want to drag the Name column from ProductCategory to the ProductSubCategory table, and TimeXtender asks if we want to use the relation we just created to join the tables for this field.

catnamerelation

We can see that the join uses the same keys as the relation, which now gives us some leverage to change how that join acts.

finalcatnamejoin

Lastly, we can change the way the join behaves by right-clicking the relationship and changing the options as follows:

RelationView

Finally, here are the 3 relationship settings defined in terms of joins:
1. Error (Inner Join)
2. Warning (Left Outer Join)
3. Relation (Full Outer Join)

Tabular AMO 2012

This is a great article on CodePlex about creating a tabular model using AMO 2012. I am in the process of designing methods to extract information from the BISM tabular model using AMO and XML. I wanted to share this as I begin to analyze this method further. I will be posting some code here and there to show the differences between Multidimensional and Tabular AMO.

Link: Tabular AMO 2012
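
As a small illustration of what the 2012 AMO object model exposes for a tabular database, here is a minimal sketch (not the article's code) that lists tables, partitions, and dimensions. The server and database names are placeholders, and it assumes a reference to Microsoft.AnalysisServices.

using System;
using Microsoft.AnalysisServices;

class TabularMetadataSketch
{
    static void Main()
    {
        Server server = new Server();
        server.Connect("localhost\\TABULAR"); // placeholder instance name

        Database db = server.Databases.FindByName("AdventureWorks Tabular Model SQL 2012");

        // In AMO, a tabular model surfaces as a single cube whose measure groups map to tables.
        foreach (Cube cube in db.Cubes)
        {
            foreach (MeasureGroup mg in cube.MeasureGroups)
            {
                Console.WriteLine("Table (measure group): " + mg.Name);
                foreach (Partition p in mg.Partitions)
                {
                    Console.WriteLine("  Partition: " + p.Name);
                }
            }
        }

        foreach (Dimension dim in db.Dimensions)
        {
            Console.WriteLine("Dimension: " + dim.Name);
        }

        server.Disconnect();
    }
}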

SQL Saturday #174 Charlotte BI Edition

What a great success SQL Saturday #174 was in Charlotte. My presentation on “Programmatically managing the BI Stack” went well and I was asked to post some links to my code and my PowerPoint presentation. My links can be found in my menu above under Code and Presentations.

Here are the links:
Mod Package Code
AMO Cube Documenter
SQL Saturday #174 Programmatically managing the BI Stack Presentation