Stream Analytics and Power BI

Azure Stream Analytics can be used to stream data to different targets, including Power BI. In this post, I will show how to stream data from Blob storage to a Power BI dataset. In a real scenario, the data would come from IoT devices or another source that pushes data very frequently, but to make this post easier to reproduce, I will build it with resources that are commonly available. Okay, let us start to build up the environment.

First, I create a storage account. Please make sure that you create this account in the region where you also want to create the Stream Analytics job. This makes the setup easier, saves money, and minimizes latency.

Next, we need a data intake container where we store the data that will be imported into our Power BI environment.

To this container I upload a simple CSV file.

Imagine that your store sends you a file for every sale, and each file should be immediately available in your Power BI report.

Next, we need a Stream Analytics job. You will find this resource in the Azure Marketplace:

Please ensure that you create this job in the same region as the storage account. When creating the job, I leave everything at its defaults. The streaming units default to 3 and can be scaled up as needed after the resource has been created.

After a successful deployment, we need to configure the data intake. Navigate to Inputs, select “Add input”, and choose Blob storage. For the authentication mode I select “Connection string”. With this option you may need to provide a key, which you can grab from your storage account under Access keys. Please also make sure that you configure the data format to CSV with a semicolon as the delimiter.

Now we need our output. For this I created a new workspace in my Power BI tenant. You can use an existing one, but to keep a better overview for myself, I created a new one. Under Outputs, I created a new output of type Power BI.

Going back to the Overview, we can edit our query, which selects the data from the input and pushes it to the output.
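As a minimal sketch, the query can simply pass everything through. The alias names below (`blob-input`, `powerbi-output`) are placeholders and must match the input and output aliases you configured:

```sql
-- pass every incoming row from the blob input to the Power BI output
SELECT
    *
INTO
    [powerbi-output]
FROM
    [blob-input]
```

You could also filter or aggregate here, but for this walkthrough a pass-through is enough.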

Now everything is ready, and we can start the job. After that, as soon as we provide new data, we get a new dataset in our Power BI workspace containing the data from the blob storage. To provide new data, you can simply upload a new file. You can check the progress on the overview page.

When you create a new report in Power BI based on the new dataset, you can check the data. The dataset additionally contains some information about the data file and the import process. It is more fun when new data arrives automatically, so I created a PowerShell script for this. To configure the script, you also need the access key from your blob storage. After that, you can run the script in a loop.

$accesskey = "<storage account access key>"
$StorageAccountName = "streamingdata0105"
$containerName = "data"

$date = Get-Date
$category = "Trucks"
$amount = $date.Millisecond

$filename = $category + $date.Hour + $date.Minute + $date.Second + ".csv"
$path = "c:\temp\files\"
$tempfilepath = "c:\temp\files\" + $filename

$output =  @("Category;Date;OrderId;Amount")
$output += @($category + ";" + $date.Day + "/" + $date.Month + "/" + $date.Year + ";" + $amount + ";" + $amount * 3) 
$output | Out-File -FilePath $tempfilepath

$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $accesskey 
Set-AzStorageBlobContent -File $tempfilepath -Container $containerName -Blob $filename -Context $StorageContext 
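To generate a steady stream of files, the script can be wrapped in a simple loop. The script path below is a placeholder for wherever you saved the code above:

```powershell
# upload a new sales file every 10 seconds until the window is closed
while ($true) {
    & "c:\temp\New-SaleFile.ps1"
    Start-Sleep -Seconds 10
}
```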

Multiple Row Level Security Roles and Object Level Security?

With Object Level Security (OLS) you can secure columns, so users who can modify reports or access the data model can neither see the column nor reach its content via DAX. Report builders mostly just hide such columns, which is okay for usability, because hidden columns can still be used in relationships, measures, or calculated columns/tables. But when you really want to protect a column, hiding it is not secure. With Row Level Security (RLS) you can protect data by applying horizontal filters to a table, and with OLS you can protect columns. So, users can neither use data that is protected with RLS nor see columns that are protected with OLS.

But in the past, I have seen so many data models that do not use dynamic RLS. Instead, they use a separate role for each case, like this:

But here lies the problem. When you want to protect a column with OLS across all roles, this cannot be done, because you cannot mix multiple roles when you are using OLS. Imagine that in this scenario you want to protect the column “Units” and you set up the OLS:

Now you will receive the following error message when you assign multiple roles to a user:

If you need multiple roles because you have security requirements for different use cases, I suggest using one role with dynamic security. Then you only need to assign one role to each user, and you can also use OLS to protect whole columns.
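A dynamic role typically uses a single DAX table filter that compares a column in the model to the signed-in user. The table and column names below are placeholders for your own security table:

```dax
-- table filter on a hypothetical 'Security' table
'Security'[UserPrincipalName] = USERPRINCIPALNAME()
```

With this filter, one role serves every user, and the security table controls which rows each user sees.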

Migrate your Power BI Reports to use the Azure Text Analytics API V3.1

When you want to use API version 3.1 (currently in preview) with Power BI and you used version 2 before, you must make some changes. The first change is that you must use another URL with the region inside, e.g.:
The next change, to identify the language, is very easy: the returned JSON field is no longer named detectedLanguages in this version, it is simply detectedLanguage without the “s”. Okay, easy, right? For detecting the key phrases there are no changes. But for detecting the sentiment there are more changes to make. As you can see in the M code below, the JSON now gives us the confidence score and delivers three columns for positive, neutral, and negative, where before you got only one column. Usually it is enough to analyze only the positive column: when it contains a small number, the sentiment is more negative, and when it contains 0.5, it is neutral. So, you can expand the columns in M, use only the positive column, and start analyzing the data.
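As a sketch of what the relevant part of that M code can look like, the snippet below calls the v3.1 sentiment endpoint and reads the positive score. The region in the URL, the API key, and the sample text are assumptions and must be replaced with your own values:

```powerquery
let
    // hypothetical region and key – replace with your own
    url = "https://westeurope.api.cognitive.microsoft.com/text/analytics/v3.1/sentiment",
    apiKey = "<your key>",
    body = Json.FromValue([documents = {[id = "1", text = "I love Power BI"]}]),
    response = Web.Contents(url, [
        Headers = [#"Ocp-Apim-Subscription-Key" = apiKey, #"Content-Type" = "application/json"],
        Content = body
    ]),
    json = Json.Document(response),
    // v3.1 returns confidenceScores with positive, neutral, and negative
    positive = json[documents]{0}[confidenceScores][positive]
in
    positive
```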

Attached you will find an example Power BI report.

Power BI Adoption?

I have read a lot about the adoption of Power BI in the past. Microsoft released a great article on this topic, but I have read very little about the use of the “Get help” information. On most tenants I have worked with in the past, the URLs refer to the standard Microsoft websites, and the business users do not use this information because most of them did not understand these topics. It is also hard to figure out which department is responsible for license assignment. Imagine that you want a Pro license – which person do you ask for it? You can ask your manager, but if he agrees that you get one, what is the right process to get one? The URLs can help you out. In the tenant settings you can customize these links:

There are four options:
1. Training documentation
2. Discussion Forum
3. Licensing requests
4. Help Desk
You find these at the Admin Portal -> Tenant Settings -> Get Help
Here are the endpoints:

If you modify the links, you must take care of the maintenance of the websites and keep the content up to date.

Write Azure Automation Output to Azure Blob Storage

In my last blog posts, I described how you can get data from the Power BI REST API and how you can execute the scripts via Azure Automation. When you use the latter to execute your scripts, it is also nice to save the output 😊 Now I want to describe how you can write the output to Azure Blob Storage. The following script exports the users with an assigned Power BI Pro license – the same story as in my last post. Beforehand, I created a storage account with a container inside.

$connectionName = "AzureRunAsConnection"

try {
    # Get the connection "AzureRunAsConnection"
    $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

    "Logging in to Azure..."
    Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
catch {
    if (!$servicePrincipalConnection) {
        $ErrorMessage = "Connection $connectionName not found."
        throw $ErrorMessage
    } else {
        Write-Error -Message $_.Exception
        throw $_.Exception
    }
}

# the following user writes to the blob storage and needs also access to the AD
$myPassword = '<Password>'
$myUsername = ''

$password = ConvertTo-SecureString $myPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($myUsername, $password)
Connect-AzureAD -Credential $credential

$date = Get-Date -Format "ddMMyyyy"
$RetrieveDate = Get-Date

$PBIProServicePlanID = "70d33638-9c74-4d01-bfd3-562de28bd4ba"
$ProLicenseFile = 'proLicense' + $date  + '.csv'

$ADUsers = Get-AzureADUser -All $true | Select-Object ObjectId, ObjectType, DisplayName, UserPrincipalName, UserType, @{Name="Date Retrieved";Expression={$RetrieveDate}}

$OrgO365Licenses = Get-AzureADSubscribedSku | Select-Object SkuID, SkuPartNumber,CapabilityStatus, ConsumedUnits -ExpandProperty PrepaidUnits | `
    Select-Object SkuID,SkuPartNumber,CapabilityStatus,ConsumedUnits,Enabled,Suspended,Warning, @{Name="Retrieve Date";Expression={$RetrieveDate}} 
$UserLicenseDetail = ForEach ($ADUser in $ADUsers) {
        $UserObjectID = $ADUser.ObjectId
        $UPN = $ADUser.UserPrincipalName
        $UserName = $ADUser.DisplayName
        Get-AzureADUserLicenseDetail -ObjectId $UserObjectID -ErrorAction SilentlyContinue | `
        Select-Object ObjectID, @{Name="User Name";Expression={$UserName}}, @{Name="UserPrincipalName";Expression={$UPN}} `
        -ExpandProperty ServicePlans
}
$ProUsers = $UserLicenseDetail | Where-Object {$_.ServicePlanId -eq $PBIProServicePlanID}

Set-AzureRmCurrentStorageAccount -StorageAccountName savedfiles -ResourceGroupName PBI

# write the output to a local CSV file first
$tempFile = Join-Path $env:TEMP $ProLicenseFile
$ProUsers | Export-Csv $tempFile -NoTypeInformation
# upload the CSV file to the blob storage
Set-AzureStorageBlobContent -Container test -File $tempFile -Blob $ProLicenseFile

After running the script, the output is available in the container and can also be used in Power BI with the Blob Storage connector.

Which user does really need a Power BI Pro license?

In a Power BI Premium license environment, only the report creators need a Power BI Pro license, because they create reports, dashboards, and so on. The other users, who only consume the content, can be assigned a free license. In the past, I saw so many organizations where users have a Pro license because they needed one at some point, but now they do not do anything with it. For an administrator it can be horrible to find out which users really need a license, because without the activity logs he must ask every user who has one assigned. In a previous blog post, I explained how you can get the activities of the last 90 days into CSV files. If you run this automatically and over a long period of time, you get a good time series. With this data you can easily create a filter to check which users create reports and so on:

Now you need the information about which users have a Pro license assigned. To get this information, you can query your Azure Active Directory, write the data into a CSV file, and merge it with the activity data. The following script exports every user who has a license into the defined file.

$myPassword = '<password>'
$myUsername = ''

$password = ConvertTo-SecureString $myPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($myUsername, $password)
Connect-AzureAD -Credential $credential

$RetrieveDate = Get-Date 
$BasePath = "C:\Users\plenz\OneDrive\Power BI WS\"
$AzureADUsersCSV = $BasePath + "Users.csv"
$OrgO365LicensesCSV = $BasePath + "OrgO365Licenses.csv"
$UserPBIProLicensesCSV = $BasePath + "UserPBIProLicenses.csv"

# See MS Licensing Service Plan reference:
$PBIProServicePlanID = "70d33638-9c74-4d01-bfd3-562de28bd4ba"

$ADUsers = Get-AzureADUser -All $true | Select-Object ObjectId, ObjectType, CompanyName, Department, DisplayName, JobTitle, Mail, Mobile, `
            SipProxyAddress, TelephoneNumber, UserPrincipalName, UserType, @{Name="Date Retrieved";Expression={$RetrieveDate}}

$OrgO365Licenses = Get-AzureADSubscribedSku | Select-Object SkuID, SkuPartNumber,CapabilityStatus, ConsumedUnits -ExpandProperty PrepaidUnits | `
    Select-Object SkuID,SkuPartNumber,CapabilityStatus,ConsumedUnits,Enabled,Suspended,Warning, @{Name="Retrieve Date";Expression={$RetrieveDate}} 
$UserLicenseDetail = ForEach ($ADUser in $ADUsers) {
        $UserObjectID = $ADUser.ObjectId
        $UPN = $ADUser.UserPrincipalName
        $UserName = $ADUser.DisplayName
        $UserDept = $ADUser.Department
        Get-AzureADUserLicenseDetail -ObjectId $UserObjectID -ErrorAction SilentlyContinue | `
        Select-Object ObjectID, @{Name="User Name";Expression={$UserName}}, @{Name="UserPrincipalName";Expression={$UPN}}, `
        @{Name="Department";Expression={$UserDept}}, @{Name="Retrieve Date";Expression={$RetrieveDate}} -ExpandProperty ServicePlans
}
$ProUsers = $UserLicenseDetail | Where-Object {$_.ServicePlanId -eq $PBIProServicePlanID}

$ProUsers | Export-Csv $UserPBIProLicensesCSV

After that, you can create a relationship between both tables and check which users really need their Pro license, or reassign it to another user.