Process SQL Analysis Services Cubes via the Azure Data Factory

If you work with Analysis Services on Azure, you need a different way to process a cube. On premises you would mostly use the SQL Server Agent, but this option is not available on Azure. There you can use Azure Automation, PowerShell, or Azure Data Factory (ADF). This post is dedicated to ADF, because usually you need to process a table, a partition, or the whole database right after a data intake.

First we need to grant the ADF permissions on the SSAS cube. For this we use the managed identity of the Data Factory. To get the needed values, open the properties of the ADF in the Azure Portal.

With the tenant ID and the ADF application ID, we can build the credential:
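As a reference, a service principal, and therefore the managed identity of the ADF, is typically addressed in Azure Analysis Services in the form app:&lt;applicationid&gt;@&lt;tenantid&gt;, where the placeholders are the values taken from the ADF properties.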

Next, connect to the AS instance via SQL Server Management Studio, go to the server administrators, and add this user. This is not possible via the Azure portal, but afterwards you can view the result there.
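If you prefer scripting over SSMS, a minimal sketch with the Az.AnalysisServices module could look like the following. The resource group and admin names are assumptions, and note that -Administrator replaces the complete admin list, so include all existing admins as well.

Connect-AzAccount
# Set the full list of server administrators, including the ADF managed identity
Set-AzAnalysisServicesServer -ResourceGroupName "rg-analysis" -Name "blog001" `
    -Administrator "admin@contoso.com,app:<applicationid>@<tenantid>"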

To check whether a refresh has worked, I added a view to my data source and imported its data into the cube.

You should format the column as a date type and use the format string G, so that you also get the time in the output.

Okay, the infrastructure is ready! Now we can add a new pipeline with a Web activity inside our ADF.

https://eastus.asazure.windows.net/servers/blog001/models/blog/refreshes

This is the schema: https://<region>.asazure.windows.net/servers/<servername>/models/<modelname>/refreshes
For the authentication setting of the Web activity, add the following resource URL: https://*.asazure.windows.net

The following request body will initiate a full process:
{
    "Type": "Full",
    "CommitMode": "transactional",
    "MaxParallelism": 2,
    "RetryCount": 2,
    "Objects": []
}
After running the ADF pipeline you should get a full refresh of the cube:
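If you want to verify the same REST call outside of ADF, a minimal PowerShell sketch could look like this. It assumes the Az module is installed and that your own account is a server administrator on the instance.

# Sign in and get a token for the Azure Analysis Services resource
Connect-AzAccount
$token = (Get-AzAccessToken -ResourceUrl "https://*.asazure.windows.net").Token

# Same request body as used in the ADF Web activity
$body = @{
    Type           = "Full"
    CommitMode     = "transactional"
    MaxParallelism = 2
    RetryCount     = 2
    Objects        = @()
} | ConvertTo-Json

# Post the refresh request to the refreshes endpoint
Invoke-RestMethod -Method Post `
    -Uri "https://eastus.asazure.windows.net/servers/blog001/models/blog/refreshes" `
    -ContentType "application/json" `
    -Headers @{ Authorization = "Bearer $token" } `
    -Body $body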



How to make a Power BI report more useful for your users?

In the past I have seen many reports which are not very handy for the users. The reports had a good data model, cool measures, and good visualizations, but mostly they could only be handled by the authors or product owners. The reason was usually that other users get no help on how to use the reports and/or visuals. So, here are some ideas on how to provide help to the report consumers:

1. An introduction page
Build up a page which provides some explanation about the report and its story. You can also build up a page like this for every chapter.

It is also very helpful to provide some screenshots of every page with short explanations and buttons which take the users to that page. It is also a good idea to provide links to your data catalog, which explains the data and KPIs you use.

2. Use tooltips for providing help

Some visualizations need an explanation, for example a chart which has both a hierarchy and a forecast. Such a visual can be quite complex for a business user, so it is a good idea to use a tooltip for it.

3. Use page help with bookmarks

You can implement bookmarks to display help for your whole page. To do this, take a screenshot of your page, put the image into PowerPoint or another image tool, describe the visuals, and bring the annotated picture back into Power BI as a bookmark.

So, this can be very helpful for your report users …

…. to be continued.

Why should you create explicit measures?

Some people ask me why they should create measures in Power BI. Yes, you can take any column which contains a number, put it into a value box in Power BI, and create different calculations on it. My first answer to this question is: if you need a calculation such as Sales Amount without canceled orders, you must filter the canceled orders out on every page or on the whole report. But if you then want to compare the canceled orders to the others, it becomes a bit complicated. If you create one measure with and one without the filter, it is much easier.
Another point is when you want to analyze the dataset with Excel via the XMLA endpoint. There it is not possible to put a column which is not a measure into the values. The reason is that a Power BI dataset is an Analysis Services model behind the scenes.
So, I suggest creating a measure for every calculation, no matter whether it is only a simple SUM() or a complex calculation. If you do so, you will not run into trouble when using tools other than Power BI.

Stream Analytics and Power BI

Azure Stream Analytics can be used to stream data to different targets, including Power BI. In this post, I will show how to stream data from a Blob storage to a Power BI dataset. Normally the data in this scenario would come from IoT devices or something else which pushes data very frequently, but to make this post easier to reproduce, I use resources which are normally available. Okay, let us start to build up the environment.

First, I create a storage account. Please make sure that you create this account in the region where you also want to create the Stream Analytics job. This makes the setup easier, saves money, and minimizes latency.

Next, we need our data intake container where we store the data which will be imported into our Power BI environment.

To this container I upload a simple CSV file:
Category;Date;OrderId;Amount
Trucks;9/5/2021;763;2289

Imagine that your store sends you a file for every sale, and this sale should be immediately available in your Power BI report.

Next, we need our Stream Analytics job. You will find this resource in the Azure marketplace:

Please ensure that you create this job in the same region as the storage account. When creating the job, I leave everything at its defaults. The streaming units default to 3 and can be scaled up as needed after the resource has been created.

After a successful deployment we need to configure the data intake. You can do this by navigating to Inputs, selecting “new”, and choosing Blob storage. For the authentication mode I select “connection string”. If you select this, you may need to provide a key, which you can grab from your storage account under access keys. Please also make sure that you configure the data format as CSV with a semicolon as the delimiter.
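If you do not want to copy the key from the portal, a small PowerShell sketch to read it could look like this (the resource group name is an assumption):

# Returns the first access key of the storage account
(Get-AzStorageAccountKey -ResourceGroupName "rg-streaming" -Name "streamingdata0105")[0].Value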

Now we need our output. For this I simply created a new workspace in my Power BI tenant. You can use an existing one, but to keep a better overview for myself, I created a new one. Under outputs, I created a new output with Power BI.

Going back to the overview, we can edit our query, which selects the data from the input and pushes it to the output.
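The query itself is SQL-like. A minimal sketch, kept here in a PowerShell here-string so it stays with the rest of the scripts, could look like the following; the alias names blob-input and powerbi-output are assumptions and must match your configured input and output:

# The Stream Analytics query: take the CSV columns from the input and push them to Power BI
$asaQuery = @"
SELECT
    Category,
    Date,
    OrderId,
    Amount
INTO [powerbi-output]
FROM [blob-input]
"@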

Now everything is ready, and we can start the job. After that, whenever we provide new data, we get a new dataset in our Power BI workspace with the data from the blob storage. To provide new data, you can simply upload a new file. You can check this on the overview page.

By creating a new report in Power BI based on the new dataset, you can check the data. The dataset additionally contains some information about the data file and the import process. I think it is more fun when you get new data automatically, so for this I created a PowerShell script. When configuring this script, you also need the access key of your blob storage. After that, you can run the script in a loop.

# Storage account settings; the key can be found under "Access keys" of the storage account
$accesskey = "f1vxAsiHHWvdEF8lfAExLq66rghWfqzXZoR3ugq9p5qnqDTnIt4DAUvz4luHEJ5w1Gbcg+yy6jT+EEvHJGaXtA=="
$StorageAccountName = "streamingdata0105"
$containerName = "data"

# Build some sample values; the milliseconds are reused as a pseudo order id
$date = Get-Date
$category = "Trucks"
$amount = $date.Millisecond

# Unique file name and local path for the temporary CSV file
$filename = $category + $date.Hour + $date.Minute + $date.Second + ".csv"
$path = "c:\temp\files\"
$tempfilepath = $path + $filename

# Write the header and one data row in the same semicolon-separated format as the sample file
$output  = @("Category;Date;OrderId;Amount")
$output += @($category + ";" + $date.Day + "/" + $date.Month + "/" + $date.Year + ";" + $amount + ";" + $amount * 3)
$output | Out-File -FilePath $tempfilepath

# Sign in and upload the file to the blob container
Connect-AzAccount
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $accesskey
Set-AzStorageBlobContent -File $tempfilepath -Container $containerName -Blob $filename -Context $StorageContext
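To keep new files coming in, you can wrap the upload in a simple loop, for example like this. The script name Send-StreamData.ps1 is just an assumption for wherever you saved the script above; when looping, it makes sense to sign in once and remove the Connect-AzAccount line from the script itself.

# Sign in once, then upload a new sample file every 30 seconds
Connect-AzAccount
while ($true) {
    .\Send-StreamData.ps1
    Start-Sleep -Seconds 30
}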

Multiple Row Level Security Roles and Object Level Security?

With Object Level Security (OLS) you can secure columns, so users who can modify reports or access the data model can neither see the column nor reach its content via DAX. Report builders mostly just hide such columns, which is okay if you still want to use them in relationships, measures, or calculated columns/tables. But if you really want to protect a column, hiding it is not secure. With Row Level Security (RLS) you can protect the data by applying horizontal filters to a table. Now you can additionally use OLS to protect columns. So, users cannot see data which is protected with RLS, and they cannot see columns which are protected with OLS. https://powerbi.microsoft.com/en-us/blog/object-level-security-ols-is-now-generally-available-in-power-bi-premium-and-pro/

But in the past I have seen many data models which do not use a dynamic RLS (https://www.flip-design.de/?p=539). Instead, they use multiple roles, one for each case. Like this:

But here is the problem: when you want to protect a column with OLS for all roles, this cannot be done, because you cannot mix multiple roles when you are using OLS. Imagine that in this scenario you want to protect the column “Units” and you set up the OLS:

Now you will receive the following error message when you assign multiple roles to a user:

If you need multiple roles because you have security requirements for different use cases, I suggest using one role with dynamic security. Then you only need to assign one role to each user, and you can also use OLS to protect whole columns.

Migrate your Power BI Reports to use the Azure Text Analytics API V3.1

When you want to use API version 3.1 (currently in preview, https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/whats-new) with Power BI and you used version 2 before, you must make some changes. The first change is that you must use another URL with the region inside, e.g.: https://northeurope.api.cognitive.microsoft.com/text/analytics/v3.1-preview.1/
The next change, for identifying the language, is very easy: the returned JSON field is no longer named detectedLanguages in this version, it is now detectedLanguage without the “s”. Okay, easy, right? For detecting the key phrases there are no changes. But for detecting the sentiment there are more changes to make. As you can see in the M code below, the JSON now gives us a confidence score and delivers three columns for positive, neutral, and negative, whereas before you only got one column. Mostly it is enough to analyze only the positive column: when it contains a small number, the sentiment is rather negative, and when it contains 0.5, it is neutral. So, you can expand the columns with M, use only the positive column, and start to analyze the data.
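To see the new response shape outside of Power BI, a minimal PowerShell sketch against the sentiment endpoint could look like this (the endpoint host and the key are placeholders for your own Cognitive Services resource):

$endpoint = "https://northeurope.api.cognitive.microsoft.com/text/analytics/v3.1-preview.1/sentiment"
$headers  = @{ "Ocp-Apim-Subscription-Key" = "<your key>" }
$body     = '{ "documents": [ { "id": "1", "language": "en", "text": "I really like this product." } ] }'

# v3.1 returns confidenceScores with positive, neutral, and negative instead of a single score
$response = Invoke-RestMethod -Method Post -Uri $endpoint -ContentType "application/json" -Headers $headers -Body $body
$response.documents[0].confidenceScores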

Attached you will find an example Power BI report.