Organize and monitor Power BI scheduled refreshes

With Power BI you can set up a scheduled refresh for any dataset that has an accessible data source. But you have to be aware of how many refreshes run at the same time, because each capacity only offers a limited number of parallel refresh slots. On an A1 capacity you can refresh only one dataset at a time, while a P3 or A6 allows up to 24 parallel refreshes. Managing this can be very hard, because it is almost impossible to keep an overview of every workspace, and if your users configure the schedules themselves, it is impossible!

Source: Admin in a Day, Microsoft
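If you are not sure which capacities, and therefore which parallel refresh limits, exist in your tenant, you can query them with the REST API as well. This is only a small sketch, assuming you are already signed in with Connect-PowerBIServiceAccount; the capacities endpoint returns only the capacities you have access to:

# List the capacities you can access, including their SKU (e.g. A1, P3)
$Capacities = Invoke-PowerBIRestMethod -Url "capacities" -Method Get | ConvertFrom-Json
$Capacities.value | Select-Object displayName, sku, state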

But you can create an overview with the Power BI REST API by using PowerShell. In my example I created a workspace named “Test” with two datasets that have a configured scheduled refresh. The following PowerShell script calls the REST API and filters all workspaces and datasets in your tenant for a configured scheduled refresh.

The data from the following script is written into a CSV file. The days and times configured for each dataset are written comma-delimited into one column. They can be split later with Power Query by using “Split into Columns/Rows”. I think this is easier than writing one row for each day.

# If you are not authenticated, uncomment the following two lines
#$myCred = Get-Credential
#Connect-PowerBIServiceAccount -Credential $myCred

$Workspaces = Get-PowerBIWorkspace
$ExportFile = 'c:\temp\ex.csv'

foreach ($workspace in $Workspaces)
{
    # Only look at datasets that can be refreshed at all
    $DataSets = Get-PowerBIDataset -WorkspaceId $workspace.Id | Where-Object { $_.IsRefreshable -eq $true }
    foreach ($dataset in $DataSets)
    {
        $URI = "groups/" + $workspace.Id + "/datasets/" + $dataset.Id + "/refreshSchedule"
        $Results = Invoke-PowerBIRestMethod -Url $URI -Method Get | ConvertFrom-Json

        if ($Results.enabled -eq $true)
        {
            # Days and times are written comma-delimited into one column
            $days = $Results.days -join ','
            $time = $Results.times -join ','

            $row = New-Object psobject
            $row | Add-Member -Name "Workspace" -Value $workspace.Name -MemberType NoteProperty
            $row | Add-Member -Name "Dataset" -Value $dataset.Name -MemberType NoteProperty
            $row | Add-Member -Name "Days" -Value $days -MemberType NoteProperty
            $row | Add-Member -Name "Time" -Value $time -MemberType NoteProperty
            $row | Export-Csv -Path $ExportFile -Append -Delimiter ';' -NoTypeInformation
        }
    }
}

After importing the CSV file into a dataset with Power BI Desktop, you can analyze the configured refreshes, check how many run at the same time, and reconfigure some datasets if needed.

You can automate this with an Azure Function App and write the file into Blob Storage to automatically get Power BI reports with up-to-date data about your tenant. It would also be a good idea to add the refresh duration of each dataset to this report to organize things a little better.
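The refresh duration is not part of the schedule, but you can read it from the refresh history of each dataset. The following lines are only a sketch that reuses the $workspace and $dataset loop variables from the script above; the refreshes endpoint returns the start and end time of the last runs, from which the duration can be calculated:

# Read the refresh history of a dataset and calculate the duration of each run
$URI = "groups/" + $workspace.Id + "/datasets/" + $dataset.Id + "/refreshes"
$History = Invoke-PowerBIRestMethod -Url $URI -Method Get | ConvertFrom-Json

$History.value | ForEach-Object {
    # endTime is empty while a refresh is still running
    $duration = $null
    if ($_.endTime) { $duration = ([datetime]$_.endTime - [datetime]$_.startTime).TotalMinutes }

    [pscustomobject]@{
        Dataset         = $dataset.Name
        Status          = $_.status
        StartTime       = $_.startTime
        DurationMinutes = $duration
    }
}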

Analyze the Power BI Activity Logs

To get insights into the Power BI activities of your users, you have multiple options:
– The Usage Metrics
– The Audit Logs
– … and the Activity Logs

For me, the last one fits best, because you do not need an Exchange license (which you need for the Audit Logs), and the Usage Metrics app is too graphical and, above all, does not let you export the data. To export the data, I have written this PowerShell script:

Login-PowerBI

# The date of the export run, written into every row as RetrieveDate
$RetrieveDate = Get-Date

# 90 days backwards
90..1 | foreach {
    $Date = (((Get-Date).Date).AddDays(-$_))
    "$Date"
    $StartDate = (Get-Date -Date ($Date) -Format yyyy-MM-ddTHH:mm:ss)
    $EndDate = (Get-Date -Date ((($Date).AddDays(1)).AddMilliseconds(-1)) -Format yyyy-MM-ddTHH:mm:ss)

    $ActivityLogs = Get-PowerBIActivityEvent -StartDateTime $StartDate -EndDateTime $EndDate | ConvertFrom-Json

    $ActivityLogSchema = $ActivityLogs | `
        Select-Object Id,RecordType,CreationTime,Operation,OrganizationId,UserType,UserKey,Workload, `
            UserId,ClientIP,UserAgent,Activity,ItemName,WorkspaceName,DatasetName,ReportName, `
            WorkspaceId,CapacityId,CapacityName,AppName,ObjectId,DatasetId,ReportId,IsSuccess, `
            ReportType,RequestId,ActivityId,AppReportId,DistributionMethod,ConsumptionMethod, `
            @{Name="RetrieveDate";Expression={$RetrieveDate}}

    # Write a file per day
    $ActivityLogSchema | Export-Csv c:\temp\audit\PowerBI_AuditLogs-$((($Date).ToUniversalTime()).ToString("yyyyMMddTHHmmssZ")).csv
}

This script exports the data of the last 90 days (Power BI only keeps the last 90 days). If you have already exported a single day, the script overwrites the existing file for that day. In my experience, the export sometimes ran into a timeout for certain days. So I think it is not a good idea to export only once after this timeframe; it is better to export every day or every week.
After you have exported the logs, you can import the folder into Power BI Desktop and analyze the data like this:

You can also implement an Azure Function and write the files into Blob Storage. With this solution you can automate the export and set up an automatic refresh of your Power BI report in the service.
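If you do not want to build the Azure Function right away, you can already upload the exported CSV files into Blob Storage directly from PowerShell with the Az.Storage module. This is only a rough sketch; the storage account name, key and container name are placeholders you have to replace with your own values:

# Upload all exported CSV files into a blob container (requires the Az.Storage module)
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<storage account key>"

Get-ChildItem 'c:\temp\audit\*.csv' | ForEach-Object {
    Set-AzStorageBlobContent -File $_.FullName -Container "powerbi-auditlogs" -Blob $_.Name -Context $ctx -Force
}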

Translation of your Power BI Embedded Report (App owns data)

If you use a Power BI embedded application (App owns data), the app runs in an English environment by default. But if your users need another language, you have to enable the translation.
Okay, the following report is deployed to the Power BI tenant:

In your embedded application you can set the language your app runs in and you can set the format locale. This information is sent to the Power BI tenant. If you have also defined some metadata translations, this metadata is translated as well.

When you run your application with these settings, your metadata and the report are translated. The good thing is that even with an A1 Power BI Embedded plan you get the metadata translation; this option is normally only available with a P1 or higher SKU.

Incomplete data in a self-service BI project with Power BI?

In self-service BI projects the business users mostly import data from unproven sources, like Excel. With a project and/or source like this, the data can have problems, because in a CSV or Excel file the data format can be anything.
Take a look at the following screenshot. There are two tables. One holds the facts (sales per date and branch). The second table provides the branches, and here I’ve added a row with the ID “-2”; this serves as a key which doesn’t exist in most sources. This key can be used to map error rows.

So, if you import the data, everything is fine and you can gain insights from it:

But after some time it can happen that users insert some data which cannot be used by the data model:

Then you get the following error when you refresh the data, the row is not loaded, and the end users think everything is fine.

To avoid this problem you can edit your Power Query ETL process and replace this error with the “-2” key.

Now you get the row and it is mapped to the key from the dimension.

To notify the users or to build a QA dashboard, you can write some DAX code to count the errors and show a notice to the users:

Missings =
    VAR missings = CALCULATE(COUNTA(Facts[BranchId]), FILTER(Branch, Branch[BranchId] = -2))
    VAR message = IF(missings > 0, "Some branches have incomplete data", "")
    RETURN message

Need to translate the Adventure Works Tabular Model into German?

Very often I need a demonstration model for presentations or workshops, and I use the Adventure Works SSAS model, mostly for customers who are based in Germany, and I’m not so happy when I have to use a model in English. With PowerBI.com (Premium) or Analysis Services (Tabular) you can use translations. So, with Tabular Editor and the SSAS Tabular Translator I’ve created a JSON file for the Adventure Works Power BI model which provides a translation into German: https://github.com/philipplenz/AdventureWorksASTranslation

Some more links regarding this topic
https://www.flip-design.de/?p=810
https://tabulareditor.com/2019/10/23/TabularEditorTricks3.html
https://github.com/PowerBi-Projects

Save money, time and performance! Or: use Power Automate, Sentiment Analysis and Power BI with incremental refresh.

Okay, the title sounds like marketing stuff! At my last session for the Power BI user group for the Netherlands, an attendee asked me whether it’s possible to use Power BI, the Azure Sentiment Analysis and the incremental refresh functionality together. I said yes, but to be honest, I was not sure. Now I have tested it, and I can say YES!
The story for this session was very simple: you get mails, put them into storage, read the data into a Power BI report, use the Azure Sentiment Analysis API to get the language and the sentiment of each mail, and analyze the data. This blog post shows this step by step, including the incremental refresh of a Power BI dataset.

The mailbox:
For this approach I used an Office 365 mailbox for my received mails:

Next, I’ve created an Excel file in my OneDrive.

Then I created a Power Automate flow which connects to the mailbox and to the Excel file:

It’s a good idea to convert the body to text, because when you receive HTML mails you may get confusing results from the text analysis. You can test it: when you send an email to this mailbox, the trigger should write the data to the file.
Next, I’ve created a Text Analytics service in Azure. For how to do this, please take a look at this post.
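If you prefer scripting over the Azure portal, the Text Analytics resource can also be created with the Az.CognitiveServices module. This is just a sketch with placeholder names; the free F0 tier is enough for a first test:

# Create a resource group and a Text Analytics resource, then read the access keys
New-AzResourceGroup -Name "rg-sentiment-demo" -Location "westeurope"

New-AzCognitiveServicesAccount -ResourceGroupName "rg-sentiment-demo" `
    -Name "ta-sentiment-demo" -Type "TextAnalytics" -SkuName "F0" -Location "westeurope"

# The key and endpoint are what you use later when calling the API from Power Query
Get-AzCognitiveServicesAccountKey -ResourceGroupName "rg-sentiment-demo" -Name "ta-sentiment-demo"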

Now we need the Power BI report. Incremental refresh is now GA and also available with a Pro license: https://powerbi.microsoft.com/de-de/blog/incremental-refresh-is-generally-available/
If you want to know more about this, please take a look at these blogs:
https://radacad.com/all-you-need-to-know-about-the-incremental-refresh-in-power-bi-load-changes-only
https://blog.crossjoin.co.uk/2020/04/13/keep-the-existing-data-in-your-power-bi-dataset-and-add-new-data-to-it-using-incremental-refresh/

For a better test, I’ve also created a calculated column with the import date/time in Power Query (M):

If you created this column with DAX, it would be recalculated on every refresh. The Power Query column is only refreshed when the row is imported or gets an update.

After I set up the incremental refresh, published the report to PowerBI.com and configured the scheduled refresh, it’s time to wait for new data from the mailbox and for a new timeframe …
After this you can view your data:

When you check the metrics in Azure for the sentiment service, you only see the calls for newly received mails.
The Azure API is available in different pricing plans, including a free plan without cost, but it is limited. When you receive many emails, you reach the limit very quickly, because every mail needs three calls (Language, Key Phrases, Sentiment). With incremental refresh you only send new rows to the API, so you can save money. The refresh is also very quick, because not all of the data has to be refreshed.
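You can also read these numbers with PowerShell instead of the portal. A small sketch with the Az.Monitor module; the resource names are the placeholders from above and I’m assuming the TotalCalls metric of the Cognitive Services resource here:

# Read the number of API calls of the last 7 days for the Text Analytics resource
$res = Get-AzCognitiveServicesAccount -ResourceGroupName "rg-sentiment-demo" -Name "ta-sentiment-demo"

Get-AzMetric -ResourceId $res.Id -MetricName "TotalCalls" -AggregationType Total `
    -StartTime (Get-Date).AddDays(-7) -EndTime (Get-Date) -TimeGrain "1.00:00:00" |
    Select-Object -ExpandProperty Data |
    Select-Object TimeStamp, Total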

I think this can be a good solution for companies to analyze the sentiment of customers who send them emails about their services, products, etc. You only need a Power BI Pro license, a free Azure subscription and an Office 365 account.