Power BI Service Principals

With the Power BI Embedded onboarding tool you can easily embed reports into applications. But the standard mode uses a master user with a password. This user requires a Power BI Pro license and is usually subject to organizational policies, such as a password that must be changed every X days. The biggest issue, however, is that the password sits unencrypted in the web.config file.
These issues can be resolved by using a “service principal”. Microsoft has documented the approach on this page: https://docs.microsoft.com/en-us/power-bi/developer/embedded/embed-service-principal

But some hints are missing to get the principal working, such as the API permissions. This blog post will show you step by step how to configure the account and set up a demo application.

First you must enable the use of “service principals” in the admin portal of your Power BI tenant. (In a production scenario you should allow specific security groups, not the entire organization.)

To configure our embedded application we need the workspace ID and the report ID for the config file. Both can be read from the URL of the opened report (https://app.powerbi.com/groups/{workspace id}/reports/{report id}):

1 = workspace ID
2 = report ID

Next we must create an “App registration” in our Azure tenant:

Then we must create a client secret:

Important: after you leave this page, you cannot retrieve the secret again!
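
If you prefer scripting over the portal, the app registration including the secret can also be created with the Az PowerShell module. This is only a minimal sketch: the display name is a placeholder, and it assumes a current Az.Resources version that returns the generated secret as SecretText.

# Create the app registration plus a service principal with an auto-generated secret
Connect-AzAccount
$sp = New-AzADServicePrincipal -DisplayName "PowerBI-Embedded-Demo"
$sp.AppId                            # application (client) ID, needed later for the web.config
$sp.PasswordCredentials.SecretText   # copy it now, it cannot be retrieved later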

Now we must give the service principal API permissions for Power BI:

Check that you only grant the required permissions!
Also, don’t forget to grant admin consent for your tenant!

After that, you must give the app registration admin access to the workspace:
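
Newer versions of the MicrosoftPowerBIMgmt module can also grant this access by script instead of the workspace UI (a sketch; the IDs are placeholders):

# Grant the service principal admin rights on the workspace
Connect-PowerBIServiceAccount        # sign in as a workspace admin
Add-PowerBIWorkspaceUser -Id "<workspace-id>" -AccessRight Admin `
    -PrincipalType App -Identifier "<service-principal-object-id>"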

Now you must set up the web.config:

  1. Set the authentication type to “ServicePrincipal”
  2. Set the application ID; you get it from the overview page of the created app registration
  3. Set the workspace ID and the report ID (the two IDs from the report URL above)
  4. Paste the secret here
  5. Set your tenant ID; you will also find it on the overview page of the app registration
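
Before running the app you can verify the configured values with a quick sign-in test via the MicrosoftPowerBIMgmt module; replace the placeholders with the IDs and the secret from above:

# Authenticate as the service principal and list the visible workspaces
$secure = ConvertTo-SecureString "<client-secret>" -AsPlainText -Force
$cred   = New-Object System.Management.Automation.PSCredential("<application-id>", $secure)
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -Tenant "<tenant-id>"
Get-PowerBIWorkspace    # should return the workspace you granted access to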

Finally, we can run our app using the service principal:

That’s it! Enjoy!

Connect your on-premises AS Cube with Power BI Embedded and re-use your existing security

What if you have deployed your SSAS cubes on premises and you don’t want to move them into the cloud, but you want Power BI? One option is Power BI Report Server, but it doesn’t offer all the capabilities of the Power BI service: you can only embed the reports with an iFrame and you don’t get the latest features.
With Power BI Embedded you can integrate your reports, dashboards and also tiles into your own application and design the app to your needs. However, Power BI Embedded is only available in the cloud, so there is no option to run this technology on your own on-premises infrastructure.
When you also have some constraints from your management, like:

  • GDPR
  • Compliance
  • Row Level Security on your Cubes
  • save money; don’t give every user in your organisation a Power BI Pro licence or buy a Premium capacity

… it can be very complicated to use Power BI Embedded.

For the first two items you can implement a hybrid scenario: you keep your own infrastructure, install a Power BI data gateway and connect your Power BI tenant to your local environment. But this does not solve the other two problems. If you run a Power BI Embedded application in “Embed for your customers” mode, you pay for the capacity and for one Power BI Pro license, but the application has only one effective user, so you get into trouble with the Row Level Security. So must you buy a license for every user, or a capacity for the app? No! You can set up the gateway with an impersonation mode. With this setting you can pass the actual user names through to the cubes and re-use the Row Level Security.

So, GDPR and compliance can be handled by the gateway, because you only have runtime data in the cloud. The Row Level Security can be re-used via the impersonation, and you can also save money by using Power BI Embedded in “Embed for your customers” mode, while still designing your own application with this technology.

Now I want to demonstrate how you can achieve this:

To create an Embedded app you need:

  1. a service user
  2. a Power BI Pro license
  3. setup of a data gateway
  4. Analysis Services on our Infrastructure-as-a-Service environment
  5. setup of the gateway
  6. invocation of the gateway for the impersonation
  7. a Power BI Embedded app

1. Service User

The user is only assigned to a group, and this group is synchronized to our local server.

2. Power BI Pro

To assign a Pro trial license to this user very easily, log in to Power BI with this account and create a new workspace; no other Office 365 licenses are needed!

3. Setup a Data Gateway

Now we must install a data gateway in our on-premises environment. (This is also a reason why we need a Pro license.) The data gateway connects our AS cube to the cloud.

4. Analysis Services on our Infrastructure as Service environment

First we must add the gateway user as an SSAS administrator. This permission is needed to change the user context via the EffectiveUserName property.
In the cube we have a role named “RLS” which filters the data. The members are AD accounts which don’t have a Power BI or O365 license.

5. Setup the Gateway

I’ve created a Power BI report which uses a live connection to the cube. The report displays a sales amount (to test the RLS) and the current username of whoever views the report.
After the report is published to the service, we must set up the gateway connection.
The dataset must be bound to the gateway connection. After that you can view the report with the data from the on-premises cube.

6. Invoke the Gateway for the Impersonation

Here comes the magic. We must configure the gateway data source to impersonate the users. It’s really no magic, but it’s very cool :) The setting cannot be done via the portal; it must be done via the REST API.

The magic is documented on the msdn website (Power BI REST API documentation). There you can find the hint how to set up the impersonation: add the gateway service user to the data source with the access right “ReadOverrideEffectiveIdentity”:


"emailAddress": "HybridService001
@plenz.onmicrosoft.com",
"datasourceAccessRight": "ReadOverrideEffectiveIdentity"
}

You can set up the gateway without Postman or any other tool: the “Try it” button on the documentation page will set the properties. You need the data source and the gateway IDs; you can get both from the gateway configuration website.

1: Gateway ID, 2: Connection ID
After a successful run you should get an HTTP 200 response.
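
If you prefer a script over the “Try it” button, the same call can also be made with the MicrosoftPowerBIMgmt module (a sketch; the IDs are placeholders):

# Add the service user to the data source with the impersonation access right
Connect-PowerBIServiceAccount
$gatewayId    = "<gateway-id>"
$datasourceId = "<datasource-id>"
$body = '{ "emailAddress": "HybridService001@plenz.onmicrosoft.com", "datasourceAccessRight": "ReadOverrideEffectiveIdentity" }'
Invoke-PowerBIRestMethod -Url "gateways/$gatewayId/datasources/$datasourceId/users" -Method Post -Body $body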

7. Power BI Embedded APP
You can register an app very easily with the onboarding tool.
Or you can download the code and manually configure the APP: https://github.com/microsoft/PowerBI-Developer-Samples/tree/master/PowerShell%20Scripts

After completing the wizard you can download a preconfigured Visual Studio solution.
Now we must edit the report ID in the web.config to point to our live report.

Now we can test the report by providing another username that is a member of the role.

Conclusion

With the impersonation you can connect Power BI to your cubes that use Row Level Security. Hybrid scenarios are not very new, but when you combine Power BI Embedded, a data gateway and this setting, you can provide Power BI solutions very easily and very cheaply, because you can use the A-SKU for embedding. But be aware that you must code an application which handles the authentication.


Azure Runbooks and pausing Power BI A-SKUs

If you have A-SKUs deployed on Azure to run Power BI Embedded resources for your developers, it is good to save money during non-working times by pausing your capacity. The same applies when you have an application which is only used during business hours.

So you can use Azure Automation to trigger a start and a stop event at the appropriate times.

Okay, this is the resource:

When we want to implement an Azure Automation runbook to automate starting/stopping the resource, we need an Azure Automation account in our tenant:

Now we can create the runbooks (type: PowerShell):

The Script documentation can be found here:
https://docs.microsoft.com/en-us/rest/api/power-bi-embedded/capacities

If you have multiple subscriptions in your tenant, you must set the default subscription in your runbook with the command shown after the login code below. Otherwise you will receive the following error:
Select-AzureSubscription : The subscription name MVP doesn’t exist.

Code for logging in and selecting the default subscription:

$connectionName = "AzureRunAsConnection"
try
{
    # Get the Run As connection "AzureRunAsConnection"
    $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

    "Logging in to Azure..."
    Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
catch
{
    if (!$servicePrincipalConnection)
    {
        $ErrorMessage = "Connection $connectionName not found."
        throw $ErrorMessage
    }
    else
    {
        Write-Error -Message $_.Exception
        throw $_.Exception
    }
}
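
After a successful login you can select the subscription (the subscription name below is a placeholder):

Select-AzureRmSubscription -SubscriptionName "<your-subscription-name>"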

Okay, this command is for starting the capacity:
Resume-AzureRmPowerBIEmbeddedCapacity -Name "pbiem001" -ResourceGroupName "PBI" -PassThru

This one suspends (pauses) the resource:
Suspend-AzureRmPowerBIEmbeddedCapacity -Name "pbiem001" -ResourceGroupName "PBI" -PassThru

Now you can add schedules to each runbook to trigger the code at the appropriate times to start or stop the resource, for example as sketched below.
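
A sketch with placeholder names, using the AzureRM.Automation cmdlets, which creates a daily 19:00 schedule and links it to the stop runbook:

# Create a daily schedule starting tomorrow at 19:00
New-AzureRmAutomationSchedule -ResourceGroupName "PBI" -AutomationAccountName "<automation-account>" `
    -Name "StopEvening" -StartTime (Get-Date "19:00").AddDays(1) -DayInterval 1
# Attach the schedule to the stop runbook
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "PBI" -AutomationAccountName "<automation-account>" `
    -RunbookName "<your-stop-runbook>" -ScheduleName "StopEvening"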

That’s it, now you can save money 😊

Generate a Power BI Embed Token

Last Saturday a friend of mine called me and asked how to generate a Power BI embed token. He wanted to test the embedding on the Power BI Playground (https://microsoft.github.io/PowerBI-JavaScript/demo/v2-demo/index.html).

I said, that’s easy, just follow the instructions of “Guy in a Cube”; he explains it in a YouTube video: https://www.youtube.com/watch?v=4KuyPNtVijo&t=319s

After 30 minutes he called me again; he was struggling with the PowerShell execution …

Okay, long story short. To run the script successfully, you must do these steps:

  1. Run the PowerShell ISE as an admin
  2. Execute the following command to install the Power BI cmdlets:
    Install-Module -Name PowerBIPS
  3. Edit the URL from the PowerShell script sample

1. Navigate to your report on PowerBI.com

1: Your workspace ID

2: Your Report ID

Go to your PowerShell script. In the video it seems so easy, only replace the workspace ID … it’s not ;-) Edit the URL:

  • Don’t use the URL from the Playground, use the one from the PowerShell script. Replace the workspace and the report ID.
  2. Run the script, then paste the embed token, the report ID and the embed URL into the Playground. Replace the report ID and the workspace (group) ID.
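
By the way, the token call itself is just a REST call (Reports - GenerateToken). If the PowerBIPS route keeps fighting you, this sketch using the MicrosoftPowerBIMgmt module does the same job; the IDs are placeholders:

# Generate an embed token with access level "View" for a single report
Connect-PowerBIServiceAccount
$groupId  = "<workspace-id>"
$reportId = "<report-id>"
Invoke-PowerBIRestMethod -Method Post `
    -Url "groups/$groupId/reports/$reportId/GenerateToken" `
    -Body '{ "accessLevel": "View" }'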

Done. Weekend!


Hints when you struggle with deploying the registration tool of the Azure Data Catalog

In this blog post I want to share my troubleshooting experience with the Azure Data Catalog. First, there were some errors reaching the Azure endpoints, because my IT department had blocked some URLs. But I only received an unknown error in the frontend.

You can take a look into this log file to get more detailed information:

%appdata%\AzureDataCatalogErrorEventLogProd.log

On another machine the tool had some issues with the login. The error was resolved via Microsoft support: the TLS authentication failed because of some missing updates for the .NET Framework: https://docs.microsoft.com/en-us/mem/configmgr/core/plan-design/security/enable-tls-1-2-client#bkmk_winhttp

I hope this helps when others struggle with deploying the Azure Data Catalog.

Text Analysis with Power BI in different languages

Following the msdn article “How to integrate Text Analysis into Power BI”, I needed to detect the language of a comment and run a sentiment analysis on it. The reason to parameterize the language instead of hard-coding the language key is simple: in Germany I mostly get comments in English, German, Spanish etc.

So, after I completed the how-to above, I created a new M function named “Language” with this code:

// Returns the two-letter language code (for example, 'en' for English) of the text
(text) => let
    apikey      = "YOUR_API_KEY_HERE",
    endpoint    = "https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics" & "/v2.1/languages",
    jsontext    = Text.FromBinary(Json.FromValue(Text.Start(Text.Trim(text), 5000))),
    jsonbody    = "{ documents: [ { id: ""0"", text: " & jsontext & " } ] }",
    bytesbody   = Text.ToBinary(jsonbody),
    headers     = [#"Ocp-Apim-Subscription-Key" = apikey],
    bytesresp   = Web.Contents(endpoint, [Headers=headers, Content=bytesbody]),
    jsonresp    = Json.Document(bytesresp),
    language    = jsonresp[documents]{0}[detectedLanguages]{0}[iso6391Name]
in  language

The code above is taken 1:1 from the msdn page; it returns the two-letter language code of the merged subject and body column.

This step is similar to the others, but it must be the first function invoked. The next step is to get the key phrases, but now depending on the language column created before:
// Returns key phrases from the text in a comma-separated list
(text,lang as text) => let
    apikey      = "YOUR_API_KEY_HERE",
    endpoint    = "https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics" & "/v2.1/keyPhrases",
    jsontext    = Text.FromBinary(Json.FromValue(Text.Start(Text.Trim(text), 5000))),
    jsonbody    = "{ documents: [ { language: """ & lang & """, id: ""0"", text: " & jsontext & " } ] }",
    bytesbody   = Text.ToBinary(jsonbody),
    headers     = [#"Ocp-Apim-Subscription-Key" = apikey],
    bytesresp   = Web.Contents(endpoint, [Headers=headers, Content=bytesbody]),
    jsonresp    = Json.Document(bytesresp),
    keyphrases  = Text.Lower(Text.Combine(jsonresp[documents]{0}[keyPhrases], ", "))
in  keyphrases

The JSON body no longer contains the hard-coded “en”; it uses the new parameter lang which is passed via the function header. So you must edit the “Invoke Custom Function” call to also pass our new Language column as the lang argument:

Now we have our key phrases depending on the detected language, and we can also get the sentiment score depending on the language with this code:
// Returns the sentiment score of the text, from 0.0 (least favorable) to 1.0 (most favorable)
(text,lang as text) => let
    apikey      = "YOUR_API_KEY_HERE",
    endpoint    = "https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics" & "/v2.1/sentiment",
    jsontext    = Text.FromBinary(Json.FromValue(Text.Start(Text.Trim(text), 5000))),
    jsonbody    = "{ documents: [ { language: """ & lang & """, id: ""0"", text: " & jsontext & " } ] }",
    bytesbody   = Text.ToBinary(jsonbody),
    headers     = [#"Ocp-Apim-Subscription-Key" = apikey],
    bytesresp   = Web.Contents(endpoint, [Headers=headers, Content=bytesbody]),
    jsonresp    = Json.Document(bytesresp),
    sentiment   = jsonresp[documents]{0}[score]
in  sentiment

That’s it! After we make another Invoke Custom Function call, we get the sentiment score based on the detected language. The resulting step order in the M Query Editor: