
How to Automate Processing of Azure Analysis Services Models

I’ve been working on a proof of concept for a customer that involved using Azure Analysis Services as a cache for some data in an Azure SQL Data Warehouse instance. One of the pieces I worked on was scheduling automatic processing of the Azure AS database. I found the following documentation on the process, but the screenshots of the Azure portal are out of date and the instructions contain some errors. I also found this very extensive project for partition management in Azure AS, but it was overkill for my purposes; I was only interested in the basics.


Those resources led me to write this post. Below, I walk through creating an Azure Function App to automatically refresh my Azure Analysis Services model, correcting a few errors in the original article and updating the screenshots along the way.

If you’re new to Azure Analysis Services, take a look at this documentation. For the purposes of this post, I’m going to assume you have a basic understanding of Analysis Services.

I created a Tabular Model with a connection to an Azure SQL Database with one table that had a couple columns. I visualized the data in Power BI so I could verify the data was being refreshed after I processed the Azure AS database.


The Steps for Automating Processing of an Azure Analysis Services Model

1. Create an Azure Function App

The first step is to create an Azure Function App. Navigate to the Azure portal and create a Function App.


2. Create a new Function

After you’ve created the Function App, you need to add a new Timer function. Click the + button next to Functions, select Timer, and click Create this function.


3. Configure the Timer

Give your Timer a name by filling in the Timestamp parameter name text box. The default name is myTimer, but you can change it. Just remember the name you choose, because we’ll need it later.


The Schedule text box expects a CRON expression that defines the days and times the function should execute. Click the Documentation link to read more about CRON expressions. The expression I’m using executes the function every 4 hours, every day. Click Save when you’re done.
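For reference, Azure Functions timer triggers use a six-field NCRONTAB expression ({second} {minute} {hour} {day} {month} {day-of-week}) rather than the five-field Unix cron format. A sketch of an expression that fires every 4 hours on the hour, every day:

```
0 0 */4 * * *
```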

4. Configure the Function App

Download the latest client libraries for Analysis Services. This needs to be done on your local machine so you can then copy these files to your Azure Function App.

After you’ve downloaded the client libraries, the DLLs can be found in C:\Program Files (x86)\Microsoft SQL Server\140\SDK\Assemblies. The two files you need are:

C:\Program Files (x86)\Microsoft SQL Server\140\SDK\Assemblies\Microsoft.AnalysisServices.Core.DLL
C:\Program Files (x86)\Microsoft SQL Server\140\SDK\Assemblies\Microsoft.AnalysisServices.Tabular.DLL

This step is important because the documentation in Azure references the 130 assemblies, which will not work. You need the version 140 assemblies; otherwise you’ll get errors.


Select your function, TimerTriggerCSharp1, and expand the View files window on the far right of your screen.

Now you need to add a folder called “bin”. To do this, select your Function App, click Platform features, and under Development Tools, click Advanced tools (Kudu).


In Kudu, click Debug console and select Cmd. Navigate to the site\wwwroot\TimerTriggerCSharp1 folder and add the “bin” folder there by clicking the + button.
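If you prefer typing to clicking, the same folder can be created directly in the Kudu Cmd console; a sketch, assuming the default wwwroot layout and the function name used in this post:

```
cd D:\home\site\wwwroot\TimerTriggerCSharp1
mkdir bin
```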


Once you’ve added the “bin” folder, go back over to the Azure portal and select your function TimerTriggerCSharp1.

On the right under View files, navigate to the bin folder. Then click the Upload button to add the two previously mentioned DLLs to the bin folder.


You should see the two DLLs in your bin folder now.


5. Add the connection string for your Azure Analysis Services database

Click the name of your Function App, then select Platform features. Select Application settings under General Settings.


Now we need to add our connection string under the Connection strings section. You’ll need your AAS server name and a user ID and password that has access to the AAS database.

You can find your AAS server name by navigating to your AAS database in the Azure portal and copying the value found under Server name:


Your connection string should look like this:

Provider=MSOLAP;Data Source=<your aas server>;Initial Catalog=<aas database name>;User ID=<your username>;Password=<your password>
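As a hypothetical filled-in example (the region, server name, database name, and user here are made up; your server name will start with asazure://):

```
Provider=MSOLAP;Data Source=asazure://southcentralus.asazure.windows.net/myaasserver;Initial Catalog=MyTabularModel;User ID=someuser@contoso.com;Password=<your password>
```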

Fill in the Name textbox with a name for your connection string (remember this, we’ll need it later) and paste your connection string in the Value text box:


Click Save near the top.

6. Add your code

Select TimerTriggerCSharp1.


To programmatically process the tabular model, we’ll leverage Analysis Services Management Objects (AMO). If you’re new to AMO, I’d suggest starting here.

Paste in the following code (you can also download the .cs file I used with the script here). The placeholders in angle brackets are the pieces you need to change for your own function app. The commented-out RequestRefresh lines show alternative processing operations you can use instead. I’m processing the whole model with a ProcessFull (RefreshType.Full) operation, so I’ve left the first RequestRefresh line uncommented.

#r "Microsoft.AnalysisServices.Tabular.DLL"
#r "Microsoft.AnalysisServices.Core.DLL"
#r "System.Configuration"

using System;
using System.Configuration;
using Microsoft.AnalysisServices.Tabular;

// The TimerInfo parameter name must match the Timestamp parameter name you chose in step 3
public static void Run(TimerInfo <TheNameOfYourTimer>, TraceWriter log)
{
    log.Info($"C# Timer trigger function started at: {DateTime.Now}");
    try
    {
        Microsoft.AnalysisServices.Tabular.Server asSrv = new Microsoft.AnalysisServices.Tabular.Server();
        var connStr = ConfigurationManager.ConnectionStrings["<YourConnectionStringName>"].ConnectionString;
        asSrv.Connect(connStr);
        Database db = asSrv.Databases["<YourDatabaseName>"];
        Model m = db.Model;
        db.Model.RequestRefresh(RefreshType.Full);              // Mark the whole model for refresh
        //m.RequestRefresh(RefreshType.Full);                   // Alternative: mark the model for refresh via the Model object
        //m.Tables["Date"].RequestRefresh(RefreshType.Full);    // Alternative: mark only one table for refresh
        db.Model.SaveChanges();                                 // Commit, which executes the refresh
        asSrv.Disconnect();
    }
    catch (Exception e)
    {
        log.Info($"C# Timer trigger function exception: {e.ToString()}");
    }
    log.Info($"C# Timer trigger function finished at: {DateTime.Now}");
}


Click the Save button at the top.

7. Time to test your function app!

Click the Run button at the top to test the function app.


After clicking Run, the Logs window should show the start and finish messages from the function.
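Based on the log.Info calls in the script, successful output in the Logs window should look roughly like this (your timestamps will differ):

```
C# Timer trigger function started at: 12/4/2017 9:50:31 PM
C# Timer trigger function finished at: 12/4/2017 9:50:43 PM
```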


And now my Power BI report is updated!


Troubleshooting

Here are some things to check if you run into trouble.

  1. Verify your connection string is correct.
  2. Verify that you’ve correctly referenced your timer, connection string, and database name in your script.
  3. Make sure you used the latest version of the DLLs.

I had to do a bit of troubleshooting to get this working, and some of it was trial and error. Keep poking at it, and leave a comment below if you have any questions.

Resources

Processing Azure Analysis Services with Function Apps: https://azure.microsoft.com/en-us/blog/automating-azure-analysis-services-processing-with-azure-functions/?v=17.23h

Automated partition management with Azure Analysis Services whitepaper and sample codes: https://azure.microsoft.com/en-us/blog/azure-as-automated-partition-management/?v=17.23h

17 thoughts on “How to Automate Processing of Azure Analysis Services Models”

    1. That’s a good question. I haven’t tried that yet but I know from experience that you can’t use a SQL Agent Job to process an AAS database using a stored credential, proxy account, and an Analysis Services command. But the SSIS Analysis Services command would be something to try.

  1. Question: when I am at the “create bin folder in TimerTriggerCSharp1” step, I’m only able to create a file; I don’t have an option to create a folder. Am I missing something?

    1. Sorry for the confusion. I updated the blog post to include the additional steps. My apologies!

    1. My apologies for leaving that step out. You can do that piece in Kudu. Click on your Function App and select Platform features. Under Development Tools, select Advanced Tools (Kudu). Under Debug console, select Cmd. Navigate to site > wwwroot > TimerTriggerCSharp1 and then click the + button to add the bin folder. Sorry for not being clearer. I’ll update the blog post shortly.

  2. This seems like way too much work for the most basic task of processing the model. Any idea whether it will change in the future?

    1. Yes, this is a bit of work, but it’s nice to be able to use AMO. Using Azure runbooks is much easier; I should do a blog on that.

  3. My code executed successfully in less than 1 second, but my model isn’t refreshed. I get this error:

    2017-12-04T21:50:33.110 C# Timer trigger function exception: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> Microsoft.IdentityModel.Clients.ActiveDirectory.AdalException: unknown_user_type: Unknown User Type
       at Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext.RunAsyncTask[T](Task`1 task)
       at Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext.AcquireToken(String resource, String clientId, UserCredential userCredential)
       --- End of inner exception stack trace ---

  4. Hi Dustin, thanks for this article, very good! I have done everything that you explain, but when I click the Run button, nothing happens. I don’t even receive error messages.
