Need to translate the Adventure Works Tabular Model into German?

Very often I need a demonstration model for presentations or workshops, and I use the Adventure Works SSAS model, mostly for customers who are based in Germany, so I'm not happy when I have to use a model in English. With PowerBI.com (Premium) or Analysis Services (Tabular) you can use translations. So, with Tabular Editor and the SSAS Tabular Translator, I created a JSON file for the Adventure Works Power BI model which provides a translation into German: https://github.com/philipplenz/AdventureWorksASTranslation

Some more links on this topic:
https://www.flip-design.de/?p=810
https://tabulareditor.com/2019/10/23/TabularEditorTricks3.html
https://github.com/PowerBi-Projects

Save money, time and performance! Or use Power Automate, Sentiment Analysis and Power BI with incremental refresh.

Okay, the title sounds like marketing stuff! At my last session for the Power BI user group for the Netherlands, an attendee asked me if it's possible to use Power BI, the Azure Sentiment Analysis API and the incremental refresh functionality together. I said yes, but to be honest, I wasn't sure. Now I have tested it, and I can say: YES!
The story for this session is quite simple: you receive mails, put them into a storage, read the data into a Power BI report and use the Azure Sentiment Analysis API to get the language and the sentiment, so you can analyze the data. This blog post shows this step by step, including incremental refresh of a Power BI dataset.

The Mailbox:
For this approach I used an Office 365 mailbox for my received mails:

Next, I created an Excel file in my OneDrive.

Then I created a Power Automate flow which connects to the mailbox and to the Excel file:

It's a good idea to convert the body to text, because when you receive HTML mails, you can get confusing results from the text analysis. You can test it: when you send an email to this mailbox, the trigger should write the data to the file.
Next, I created a Text Analytics service in Azure. For how to do this, please take a look at this post.
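If you prefer scripting over the portal, the Text Analytics resource can also be created with PowerShell. This is just a minimal sketch using the Az module; the resource group, account name, region and the free F0 SKU are placeholders you may want to change:

# Hypothetical names - adjust resource group, account name and region to your environment
New-AzResourceGroup -Name "rg-textanalytics" -Location "southcentralus"
New-AzCognitiveServicesAccount -ResourceGroupName "rg-textanalytics" `
    -Name "myTextAnalytics" -Type TextAnalytics -SkuName F0 -Location "southcentralus"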

Now we need the Power BI report. Incremental refresh is now generally available and also included in the Pro license: https://powerbi.microsoft.com/de-de/blog/incremental-refresh-is-generally-available/
If you want to learn more about it, please take a look at these blog posts:
https://radacad.com/all-you-need-to-know-about-the-incremental-refresh-in-power-bi-load-changes-only
https://blog.crossjoin.co.uk/2020/04/13/keep-the-existing-data-in-your-power-bi-dataset-and-add-new-data-to-it-using-incremental-refresh/

To make the test more meaningful, I also created a custom column with Power Query (M) that stores the import date/time (e.g. with DateTime.LocalNow()):

If you created this column with DAX, it would be recalculated on every refresh. The Power Query column is only refreshed when the row is imported or updated.

After I set up the incremental refresh (with the RangeStart/RangeEnd parameters), published the report to PowerBI.com and configured the refresh, it was time to wait for new data from the mailbox and for a new timeframe …
After this you can view your data:

When you check the metrics for the sentiment service in Azure, you only see the calls for newly received mails.
The Azure API is available in different pricing plans, including a free plan without cost, but that one is limited. When you receive many emails, you reach the limit very quickly, because every mail needs three calls (Language, Key Phrases, Sentiment). With incremental refresh you only send the new rows to the API, so you can save money. The refresh is also very quick, because not all the data has to be refreshed.
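To illustrate the three calls per mail, here is a minimal PowerShell sketch (the report itself does this via Power Query functions); the subscription key and the region are placeholders, and the v3.0 endpoints are assumed:

# Placeholders - use your own key and the region of your Text Analytics resource
$headers = @{ "Ocp-Apim-Subscription-Key" = "<your-key>"; "Content-Type" = "application/json" }
$body    = '{"documents":[{"id":"1","text":"The service was great, thank you!"}]}'

# Three calls per mail: language detection, key phrases and sentiment
foreach ($operation in "languages", "keyPhrases", "sentiment") {
    Invoke-RestMethod -Method Post -Headers $headers -Body $body `
        -Uri "https://southcentralus.api.cognitive.microsoft.com/text/analytics/v3.0/$operation"
}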

I think this can be a good solution for companies to analyze the sentiment of customers who send them emails about their services, products, etc. You only need a Power BI Pro license, a free Azure subscription and an Office 365 account.

Connection to Text Analytics fails from Power BI?

Last weekend I was a little bit confused. I deployed a report which uses the Azure Text Analytics sentiment service to the Power BI service and wanted to configure a scheduled refresh. But I couldn't set up the credentials, because the wizard told me that the URL was not found:

Failed to update data source credentials: Web.Contents failed to get contents from 'https://southcentralus.api.cognitive.microsoft.com/text/analytics/v3.1-preview.1/languages' (404): Resource Not Found

I tested the connection with Postman, and it worked … The reason was very simple: I needed to deactivate the connection test ("Skip test connection"), and after that, the refresh worked:
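By the way, the same test also works from PowerShell instead of Postman. A small sketch against the preview endpoint from the error message above; the subscription key is a placeholder:

# Placeholder - use the key of your Text Analytics resource
$headers = @{ "Ocp-Apim-Subscription-Key" = "<your-key>"; "Content-Type" = "application/json" }
$body    = '{"documents":[{"id":"1","text":"Hallo Welt"}]}'
Invoke-RestMethod -Method Post -Headers $headers -Body $body `
    -Uri "https://southcentralus.api.cognitive.microsoft.com/text/analytics/v3.1-preview.1/languages"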

If you want to test it with the newest API version, you can go to the Quick Start menu and select your region there. The region must be entered into the URL:

https://southcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-Preview-1/operations/Languages/console

Here is some more information about this feature:
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/tutorials/tutorial-power-bi-key-phrases
https://www.flip-design.de/?p=677
https://azure.microsoft.com/en-us/services/cognitive-services/

Why should you be aware of the connection details in a Power BI file?

Yesterday a colleague of mine had a problem setting up a scheduled refresh for a Power BI file on a Power BI Report Server. The refresh failed every time with an error message saying that the credentials for the data source were missing, although the credentials were set up for the data source. At first it looked like we had run into a bug from the October release (https://community.powerbi.com/t5/Report-Server/Power-BI-Report-Server-October-2020-update-has-been-released/td-p/1503433), but that bug was fixed, and we had installed the updated release.
The actual fix was easy: when you create connections to the same data source, but with a different notation (for example "server01" vs. "SERVER01"), you get two different connections:

data source settings

For the Power BI service this is not an issue, but the Report Server thinks there are two connections, and you will only see one of them in the data source wizard. So, when you harmonize the notation and publish the report to the report server again, the scheduled refresh works correctly.

DirectQuery Realtime Datasets and the composite model

I really like the real-time datasets in Power BI, especially because it is really easy to create one 😊 In the past you created one and used it in the service, or used it in the Desktop via the datasets connector. But there were not many possibilities to do more with the data, e.g. to merge it with other sources. So you built "complex" processes via Azure, pushed the real-time data into a SQL DB, and then you were able to merge the data.
With the DirectQuery mode for datasets it is much easier: you can connect to the real-time dataset, import other data and analyze everything together.

A small example:

Create a streaming dataset in Power BI

With the generated examples for PowerShell and so on, it is very easy to add data to this dataset; a small sketch follows below.
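A minimal sketch of such a push, assuming the push URL (including tenant ID, dataset ID and API key) is copied from the dataset's API info page and the streaming dataset has a hypothetical schema with a timestamp and a value column:

# Placeholder - copy the complete push URL from the streaming dataset's API info page
$endpoint = "https://api.powerbi.com/beta/<tenant-id>/datasets/<dataset-id>/rows?key=<api-key>"

# One row matching the (assumed) schema of the streaming dataset
$row = @{ timestamp = Get-Date -Format "s"; value = Get-Random -Maximum 100 }
Invoke-RestMethod -Method Post -Uri $endpoint -Body (ConvertTo-Json @($row))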

Now you can connect to this dataset via Power BI Desktop (you must enable the preview feature) and load other data from different data sources into your model:

Then you can create relationships between the tables and analyze the data:

Unfortunately, it is not possible to modify the metadata of the streaming dataset, e.g. to add translations via Tabular Editor; this is only possible for the data from the other sources.

Azure Runbooks and pausing Power BI A-SKUs

If you have A-SKUs deployed in Azure to run Power BI Embedded resources for your developers, it is good to save money by pausing your capacity during non-working times. This also applies when you have an application which is only used during business hours.

So you can use Azure Automation to trigger a start and a stop event at the desired times.

Okay, this is the resource:

When we want to implement an Azure Automation runbook that starts/stops the resource automatically, we need an Azure Automation account in our tenant:

Now we can create the runbooks (type: PowerShell):
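This part can also be scripted instead of using the portal. A minimal sketch with the AzureRM.Automation cmdlets; the Automation account name, the runbook name and the location are placeholders (the resource group "PBI" is reused from the commands further below):

# Hypothetical names - adjust account name, runbook name and location
New-AzureRmAutomationAccount -ResourceGroupName "PBI" -Name "MyAutomation" -Location "West Europe"
New-AzureRmAutomationRunbook -ResourceGroupName "PBI" -AutomationAccountName "MyAutomation" `
    -Name "Start-PBIEmbedded" -Type PowerShell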

The Script documentation can be found here:
https://docs.microsoft.com/en-us/rest/api/power-bi-embedded/capacities

If you have multiple subscriptions in your tenant, you must set the default subscription in your runbook with the command shown below. Otherwise you will receive the following error:
Select-AzureSubscription : The subscription name MVP doesn’t exist.

Code for logging in and selecting the default subscription:

$connectionName = "AzureRunAsConnection"
try
{
    # Get the Run As connection "AzureRunAsConnection"
    $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

    "Logging in to Azure..."
    Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

    # Set the default subscription (assumption: replace "MVP" with the name of your subscription)
    Select-AzureRmSubscription -SubscriptionName "MVP"
}
catch {
    if (!$servicePrincipalConnection)
    {
        $ErrorMessage = "Connection $connectionName not found."
        throw $ErrorMessage
    }
    else
    {
        Write-Error -Message $_.Exception
        throw $_.Exception
    }
}

Okay, this command is for starting:
Resume-AzureRmPowerBIEmbeddedCapacity -Name "pbiem001" -ResourceGroupName "PBI" -PassThru

This one is for stopping the resource:
Suspend-AzureRmPowerBIEmbeddedCapacity -Name "pbiem001" -ResourceGroupName "PBI" -PassThru
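If you want to verify the result, you can query the capacity with the same module (a small sketch using the names from above):

# Shows the capacity object including its current state
Get-AzureRmPowerBIEmbeddedCapacity -Name "pbiem001" -ResourceGroupName "PBI"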

Now you can add a schedule to each runbook to trigger the code at the desired time to start or stop the resource; a small example follows below.
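A minimal sketch for a daily start schedule, again with the AzureRM.Automation cmdlets; the schedule name, the start time and the runbook name "Start-PBIEmbedded" are placeholders:

# Hypothetical names - a daily schedule starting tomorrow at 07:00
New-AzureRmAutomationSchedule -ResourceGroupName "PBI" -AutomationAccountName "MyAutomation" `
    -Name "StartCapacity" -StartTime (Get-Date "07:00").AddDays(1) -DayInterval 1

# Attach the schedule to the runbook that resumes the capacity
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "PBI" -AutomationAccountName "MyAutomation" `
    -RunbookName "Start-PBIEmbedded" -ScheduleName "StartCapacity"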

That's it, now you can save money 😊