
Azure Data Factory Refreshing Azure Analysis Services Model

Are you looking for an easy way to refresh your Azure Analysis Services models and partitions from Azure Data Factory? In this blog post, I show how easy it is to add this capability to an Azure Data Factory pipeline.

This tutorial will help you build a pipeline that allows you to asynchronously refresh any Azure Analysis Services model using parameters. Once you finish this tutorial, you’ll have a pipeline that you can use and extend for more specific needs. 

Post contents 

  • Asynchronous execution vs synchronous execution 
  • Giving Azure Data Factory access to Azure Analysis Services 
  • Creating the reusable pipeline 

Asynchronous execution vs synchronous execution 

  • Asynchronous execution – you trigger a refresh of a model, but you don't learn its final status. You get the response to the REST API call, not the outcome of the refresh itself. Most REST APIs work this way. 
  • Synchronous execution – when you trigger a refresh, the response doesn't come back until the execution finishes, so you do know the status of the execution. 

Can you build a workaround in Azure Data Factory? Yes, and I will cover it in an upcoming post, but let's build something reusable first. 

In addition, keep in mind that a synchronous execution keeps the pipeline running while it waits, so you pay more for your Azure Data Factory pipeline runs. 
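To make the distinction concrete, here is a small Python sketch of the two patterns against the Analysis Services refresh endpoint. This is illustrative only (the pipeline we build below uses an ADF Web activity, not Python); the 202 Accepted response with a Location header pointing at the refresh operation, and the `inProgress` status value, follow the Azure Analysis Services asynchronous refresh REST API.

```python
import json
import time
import urllib.request

BASE = "https://{region}.asazure.windows.net/servers/{server}/models/{model}/refreshes"

def refresh_url(region: str, server: str, model: str) -> str:
    """Build the refreshes endpoint for a given model."""
    return BASE.format(region=region, server=server, model=model)

def trigger_refresh(url: str, token: str, body: dict) -> str:
    """Asynchronous pattern: POST the refresh and return immediately.
    The API answers 202 Accepted with a Location header identifying
    the refresh operation; we do NOT know its outcome yet."""
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.headers["Location"]  # URL to poll for status

def wait_for_refresh(operation_url: str, token: str, interval: int = 30) -> str:
    """Synchronous wrapper: poll the operation until it reaches a
    terminal status, then return that status."""
    while True:
        req = urllib.request.Request(
            operation_url, headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            status = json.load(resp)["status"]
        if status != "inProgress":
            return status        # e.g. "succeeded" or "failed"
        time.sleep(interval)     # every poll adds pipeline run time (and cost)
```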

Giving Azure Data Factory access to Azure Analysis Services 

Firstly, you need to give Azure Data Factory access to your Azure Analysis Services server so it can perform these operations. We'll do this using its managed service identity. 

There isn't a straightforward way to find the right identity in the Azure portal without getting confused, so let's use PowerShell. 

Azure Data Factory has a managed identity created in the backend that you can use to access Analysis Services. 

Pre-requisite 

Install-Module Az 

Execute this command in PowerShell, then run the script below and copy its output. You can download a copy of the script from here.

# This script returns the Azure Data Factory MSI so you can give it access to your service
# Install the Az module if it is not available
# Install-Module Az
# Pre-requisite: connect to your Azure account
# Connect-AzAccount
$AzureDataFactoryName = ""
$ResourceGroupName = ""

$TenantId = (Get-AzDataFactoryV2 -ResourceGroupName $ResourceGroupName -Name $AzureDataFactoryName).Identity.TenantId
$PrincipalId = (Get-AzDataFactoryV2 -ResourceGroupName $ResourceGroupName -Name $AzureDataFactoryName).Identity.PrincipalId
$ApplicationId = (Get-AzADServicePrincipal -ObjectId $PrincipalId).ApplicationId

# Copy the following user and give it access in Azure Analysis Services
Write-Host "app:$ApplicationId@$TenantId"

Then, you need to give Azure Data Factory access to Analysis Services. 

Give Azure Data Factory access to Analysis Services

In your Azure Analysis Services server, go to "Security" and click "Add," then paste the user returned by the script. 

Finally, don’t forget to save it. 

Creating the reusable pipeline 

Azure Data Factory can refresh Azure Analysis Services tabular models, so let’s create a pipeline. You can download a copy of the pipeline from here.

First, create the required parameters to make the pipeline reusable across different models. Don’t forget to define a generic name for your pipeline. 

Drag and drop a Web activity into the pipeline. To refresh the model, I use the Azure Analysis Services REST API. 

Configure the Web activity as follows, using the parameters we created earlier. Copy these strings. 

  1. REST API endpoint 
@concat('https://',pipeline().parameters.Region,'.asazure.windows.net/servers/',pipeline().parameters.ServerName,'/models/',pipeline().parameters.ModelName,'/refreshes') 
  2. HTTP Body 
@concat(
'{
    "Type": "',pipeline().parameters.RefreshType,'",
    "CommitMode": "transactional",
    "MaxParallelism": 10,
    "RetryCount": 2
}'
)
  3. MSI resource 
https://*.asazure.windows.net 
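For concreteness, here is what those expressions evaluate to for sample parameter values (Region = "westeurope", ServerName = "myserver", ModelName = "MyModel", RefreshType = "full" are all illustrative, not real resources), with a quick Python check that the rendered body parses as valid JSON, since the API will reject a malformed body:

```python
import json

# Sample parameter values (illustrative only)
region, server, model, refresh_type = "westeurope", "myserver", "MyModel", "full"

# What the @concat URL expression in the pipeline evaluates to
url = f"https://{region}.asazure.windows.net/servers/{server}/models/{model}/refreshes"

# What the @concat body expression renders
body = (
    '{'
    f'"Type": "{refresh_type}",'
    '"CommitMode": "transactional",'
    '"MaxParallelism": 10,'
    '"RetryCount": 2'
    '}'
)

parsed = json.loads(body)  # raises ValueError if the body is not valid JSON
print(url)
print(parsed["Type"])
```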

You can see in the picture above that we use a managed identity (MSI) to authenticate to the Azure Analysis Services REST API. 

Now let’s test it. Include the correct values for the parameters. 

The output tells you whether the pipeline was able to connect and trigger the refresh. Remember, this is an asynchronous execution, so it won't report the status of the refresh itself. 
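Because the trigger is fire-and-forget, checking the outcome is a separate call. A minimal Python sketch (illustrative; a GET on the same /refreshes endpoint returns recent refresh operations per the Analysis Services REST API, assumed here to be ordered most-recent-first):

```python
import json
import urllib.request

def fetch_refreshes(region: str, server: str, model: str, token: str) -> list:
    """GET the /refreshes collection for a model."""
    url = (f"https://{region}.asazure.windows.net/servers/{server}"
           f"/models/{model}/refreshes")
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def latest_status(refreshes: list):
    """Status of the most recent refresh, or None if none have run."""
    return refreshes[0]["status"] if refreshes else None
```

In an ADF-only solution, the equivalent would be a second Web activity doing the GET, which is the kind of workaround mentioned earlier.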

Summary 

You have created a reusable Azure Data Factory pipeline that you can use to refresh Azure Analysis Services models asynchronously. It's easy to extend it with new features. 

What’s Next? 

In upcoming blog posts, we’ll continue to explore Azure Data Services features.  

Please follow Tech Talk Corner on Twitter for blog updates, virtual presentations, and more!

