Which users really need a Power BI Pro license?

In a Power BI Premium environment, only the report creators need a Power BI Pro license, because they create reports, dashboards, and so on. The other users, who only consume the content, can be assigned a Free license. In the past, I have seen many organizations where users hold a Pro license because they needed it at some point, but no longer do anything with it. For an administrator it can be painful to find out which users really need a license: without the activity logs, he has to ask every user who has one assigned. In a previous blog post, I explained how you can get the activities of the last 90 days into CSV files (https://www.flip-design.de/?p=916). If you run this automatically over a long period of time, you get a good time series. With this data you can easily create a filter to check which users create reports and similar content:
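For example, with the exported CSVs loaded you can filter for activities that only a creator performs. A minimal sketch with in-memory sample rows standing in for the real export; the Activity values are assumptions, so check which names actually appear in your own logs:

```powershell
# Sample rows standing in for the exported activity-log CSVs
# (in practice: Get-ChildItem 'C:\temp\audit\*.csv' | Import-Csv).
$activities = @'
UserId,Activity
alice@contoso.com,CreateReport
bob@contoso.com,ViewReport
alice@contoso.com,ViewDashboard
carol@contoso.com,CreateDashboard
'@ | ConvertFrom-Csv

# Activity names that indicate content creation - adjust them to the
# Activity values that really appear in your own logs.
$creatorActivities = 'CreateReport', 'CreateDashboard', 'CreateDataset'

$creators = $activities |
    Where-Object { $_.Activity -in $creatorActivities } |
    Select-Object -ExpandProperty UserId -Unique

$creators   # -> alice@contoso.com, carol@contoso.com
```

The list in `$creators` is exactly the set of users you would expect to need a Pro license.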

Now you need the information which users have a Pro license assigned. To get it, you can query your Azure Active Directory, export the result into a CSV file, and merge it with the activity data. The following script exports every user who has a Power BI Pro license into the defined file.

$myPassword = '<password>'
$myUsername = 'philipp@plenz.onmicrosoft.com'

$password = ConvertTo-SecureString $myPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($myUsername, $password)
Connect-AzureAD -Credential $credential

$RetrieveDate = Get-Date 
$BasePath = "C:\Users\plenz\OneDrive\Power BI WS\"
$AzureADUsersCSV = $BasePath + "Users.csv"
$OrgO365LicensesCSV = $BasePath + "OrgO365Licenses.csv"
$UserPBIProLicensesCSV = $BasePath + "UserPBIProLicenses.csv"

<#
See MS Licensing Service Plan reference: 
https://docs.microsoft.com/en-us/azure/active-directory/users-groups-roles/licensing-service-plan-reference
#>
$PBIProServicePlanID = "70d33638-9c74-4d01-bfd3-562de28bd4ba"


$ADUsers = Get-AzureADUser -All $true | Select-Object ObjectId, ObjectType, CompanyName, Department, DisplayName, JobTitle, Mail, Mobile, `
            SipProxyAddress, TelephoneNumber, UserPrincipalName, UserType, @{Name="Date Retrieved";Expression={$RetrieveDate}}

$OrgO365Licenses = Get-AzureADSubscribedSku | Select-Object SkuID, SkuPartNumber,CapabilityStatus, ConsumedUnits -ExpandProperty PrepaidUnits | `
    Select-Object SkuID,SkuPartNumber,CapabilityStatus,ConsumedUnits,Enabled,Suspended,Warning, @{Name="Retrieve Date";Expression={$RetrieveDate}} 
   
$UserLicenseDetail = ForEach ($ADUser in $ADUsers)
    {
        $UserObjectID = $ADUser.ObjectId
        $UPN = $ADUser.UserPrincipalName
        $UserName = $ADUser.DisplayName
        $UserDept = $ADUser.Department
        Get-AzureADUserLicenseDetail -ObjectId $UserObjectID -ErrorAction SilentlyContinue | `
        Select-Object ObjectID, @{Name="User Name";Expression={$UserName}},@{Name="UserPrincipalName";Expression={$UPN}}, `
        @{Name="Department";Expression={$UserDept}},@{Name="Retrieve Date";Expression={$RetrieveDate}} -ExpandProperty ServicePlans
    }
$ProUsers = $UserLicenseDetail | Where-Object {$_.ServicePlanId -eq $PBIProServicePlanID}

$ADUsers | Export-Csv $AzureADUsersCSV -NoTypeInformation
$OrgO365Licenses | Export-Csv $OrgO365LicensesCSV -NoTypeInformation
$ProUsers | Export-Csv $UserPBIProLicensesCSV -NoTypeInformation

After that, you can create a relationship between both tables and check which users really need their Pro license, and which licenses you can reassign to other users.
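If you prefer, you can also sketch the comparison directly in PowerShell instead of a Power BI relationship. The user lists below are hypothetical; in practice they come from the two CSV files:

```powershell
# Hypothetical sample data: users holding a Pro license (from the AD
# export) and users seen creating content (from the activity logs).
$proUsers = 'alice@contoso.com', 'bob@contoso.com', 'carol@contoso.com'
$creators = 'alice@contoso.com', 'carol@contoso.com'

# Pro licenses with no creator activity - candidates for reassignment
$unusedLicenses = $proUsers | Where-Object { $_ -notin $creators }

$unusedLicenses   # -> bob@contoso.com
```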

Run an Azure Automation runbook with PowerShell to get data from the Power BI REST API

Unfortunately, it is not possible to log in to Power BI with the cmdlets (Login-PowerBIServiceAccount) from Azure Automation. So you cannot use the Power BI PowerShell commands to get the requested information the easy way. But if you want to automate your script, for example to export some data into your Blob Storage automatically, you can still use Azure Automation; you just need a different authentication and a different way of calling the API. You must create an authentication token, like you would for Power BI Embedded, and then invoke a REST call. Sounds easy? Here is the script:

Developing Azure Automation with the PowerShell ISE
$connectionName = "AzureRunAsConnection"

try {
    # Get the connection "AzureRunAsConnection "
    $servicePrincipalConnection=Get-AutomationConnection -Name $connectionName

     "Logging in to Azure..."
    Add-AzureRmAccount `
    -ServicePrincipal `
    -TenantId $servicePrincipalConnection.TenantId `
    -ApplicationId $servicePrincipalConnection.ApplicationId `
    -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
catch {
    if (!$servicePrincipalConnection)
    {
        $ErrorMessage = "Connection $connectionName not found."
        throw $ErrorMessage
    } else{
        Write-Error -Message $_.Exception
        throw $_.Exception
    }
}

# Application id from https://app.powerbi.com/apps
$clientId = "<Id>"

# Credentials of the user
$myPassword = '<Password>'
$myUsername = '<Username>'

# Function to create an app token with user credentials (ADAL)
function GetAuthToken {
    $redirectUri = "urn:ietf:wg:oauth:2.0:oob"
    $resourceAppIdURI = "https://analysis.windows.net/powerbi/api"
    $authority = "https://login.microsoftonline.com/common/oauth2/authorize"
    $authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authority
    $PWord = ConvertTo-SecureString -String $myPassword -AsPlainText -Force
    $credentials = New-Object System.Management.Automation.PSCredential $myUsername,$PWord
    $AADcredential = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential" -ArgumentList $credentials.UserName,$credentials.Password
    $authResult = $authContext.AcquireToken($resourceAppIdURI, $clientId, $AADcredential)
    return $authResult
}

# Save the API token
$token = GetAuthToken

# Create an authentication header
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/json")
$headers.Add("Authorization", $token.CreateAuthorizationHeader())

# Get all workspaces ...
$workspacesResponse = Invoke-RestMethod -Uri "https://api.powerbi.com/v1.0/myorg/groups" -Method Get -Headers $headers
foreach ($workspace in $workspacesResponse.value) {
    $workspace
}

Orchestrate scheduled refreshes in Power BI

In my last blog post I showed you how to get every scheduled refresh of your Power BI tenant, to organize them better and to avoid evictions when too many datasets refresh at the same time: https://www.flip-design.de/?p=921
To find a better timeslot, it also helps to know how long a particular dataset needs to refresh. For this task you can use the Power BI REST API as well. The following PowerShell script exports both the refresh history and the planned refreshes:


$myPassword = '<Password>'
$myUsername = '<Username>'

$password = ConvertTo-SecureString $myPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($myUsername, $password)

Login-PowerBIServiceAccount -Credential $credential

$Workspaces = Get-PowerBIWorkspace

$date = Get-Date -Format "ddMMyyyy"

$HistoryFile = '<Path>\history' + $date  + '.csv'
$RefreshFile = '<Path>\Refresh' + $date + '.csv'

foreach($workspace in $Workspaces)
{
    $DataSets = Get-PowerBIDataset -WorkspaceId $workspace.Id | where {$_.isRefreshable -eq $true}    
    foreach($dataset in $DataSets)
    {
        $RefreshURI = "groups/" + $workspace.Id + "/datasets/" + $dataset.id + "/refreshes"
        $ScheduleURI = "groups/" + $workspace.Id + "/datasets/" + $dataset.id + "/refreshSchedule"

        $Results = Invoke-PowerBIRestMethod -Url $RefreshURI -Method Get | ConvertFrom-Json
        $ResultsSchedule = Invoke-PowerBIRestMethod -Url $ScheduleURI -Method Get | ConvertFrom-Json

        # Only datasets with an enabled scheduled refresh
        if($ResultsSchedule.enabled -eq $true) {

            foreach($content in $Results.value) {           
                $row = New-Object psobject
                $row | Add-Member -Name "Workspace" -Value $workspace.Name -MemberType NoteProperty 
                $row | Add-Member -Name "WorkspaceId" -Value $workspace.Id -MemberType NoteProperty    
                $row | Add-Member -Name "Dataset" -Value $Dataset.Name -MemberType NoteProperty   
                $row | Add-Member -Name "DatasetId" -Value $Dataset.Id -MemberType NoteProperty  
                $row | Add-Member -Name "refreshType" -Value $content.refreshType -MemberType NoteProperty 
                $row | Add-Member -Name "Status" -Value $content.status -MemberType NoteProperty
                $row | Add-Member -Name "StartTime" -Value $content.startTime -MemberType NoteProperty  
                $row | Add-Member -Name "EndTime" -Value $content.endTime -MemberType NoteProperty           
                $row | Export-Csv -Path $HistoryFile -Append -Delimiter ';' -NoTypeInformation  
            }
        }

        # Only datasets with an enabled scheduled refresh
        if($ResultsSchedule.enabled -eq $true) {
            $days = $ResultsSchedule.days -join ','
            $time = $ResultsSchedule.times -join ','
            $row = New-Object psobject
            $row | Add-Member -Name "Workspace" -Value $workspace.Name -MemberType NoteProperty 
            $row | Add-Member -Name "WorkspaceId" -Value $workspace.Id -MemberType NoteProperty    
            $row | Add-Member -Name "Dataset" -Value $Dataset.Name -MemberType NoteProperty   
            $row | Add-Member -Name "DatasetId" -Value $Dataset.Id -MemberType NoteProperty    
            $row | Add-Member -Name "Days" -Value $days -MemberType NoteProperty 
            $row | Add-Member -Name "Time" -Value $time -MemberType NoteProperty         
            $row | Export-Csv -Path $RefreshFile -Append -Delimiter ';' -NoTypeInformation  
        }
    }
}

After you imported both files into Power BI Desktop, you can analyze them better to find a good timeslot:

Organize and monitor the Power BI scheduled refreshes.

With Power BI you can set up a scheduled refresh for any dataset that has an accessible data source. But you must keep an eye on how many refreshes run at the same time. Each capacity has only limited parallel refresh resources: on an A1 capacity you can refresh only one dataset at a time, while a P3 or A6 allows up to 24 parallel refreshes. Managing this can be very hard, because it is almost impossible to keep an overview of every workspace, and if your users manage the schedules themselves, it is impossible!

Source: Admin in a Day, Microsoft

But you can create such an overview with the Power BI REST API by using PowerShell. In my example I created a workspace named “Test” with two datasets that have a scheduled refresh configured. The following PowerShell script calls the REST API and filters all workspaces and datasets in your tenant for a configured scheduled refresh.

The data from the following script is written into a CSV file. The days and times configured for each dataset are written comma-delimited into one column. They can be split again with Power Query by using “Split into Columns/Rows”. I think this is easier than writing one row per day.

# if you are not authenticated, uncomment following two lines

#$myCred = Get-Credential
#Connect-PowerBIServiceAccount -Credential $myCred

$Workspaces = Get-PowerBIWorkspace

$ExportFile = 'c:\temp\ex.csv'

foreach($workspace in $Workspaces)
{
    $DataSets = Get-PowerBIDataset -WorkspaceId $workspace.Id | where {$_.isRefreshable -eq $true}
    foreach($dataset in $DataSets)
    {
        $URI = "groups/" + $workspace.Id + "/datasets/" + $dataset.id + "/refreshSchedule"
       
        $Results = Invoke-PowerBIRestMethod -Url $URI -Method Get | ConvertFrom-Json
        if($Results.enabled -eq $true) {
            $days = $Results.days -join ','
            $time = $Results.times -join ','
            $row = New-Object psobject
            $row | Add-Member -Name "Workspace" -Value $workspace.Name -MemberType NoteProperty    
            $row | Add-Member -Name "Dataset" -Value $Dataset.Name -MemberType NoteProperty   
            $row | Add-Member -Name "Days" -Value $days -MemberType NoteProperty 
            $row | Add-Member -Name "Time" -Value $time -MemberType NoteProperty         
            $row | Export-Csv -Path $ExportFile -Append -Delimiter ';' -NoTypeInformation  
        }
    }
}

After creating the dataset with Power BI Desktop, you can analyze the configured refreshes, check how many run at the same time, and reconfigure some datasets.
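The comma-joined Days column can be expanded back into one row per day. In the report itself you would use Power Query's Split Column feature; just to illustrate what that split recovers, here is the same operation in PowerShell on a hypothetical row:

```powershell
# A row as it arrives in the exported CSV: days and times joined by commas
$row = [pscustomobject]@{
    Dataset = 'Sales'
    Days    = 'Monday,Wednesday,Friday'
    Time    = '06:00,18:00'
}

# One output row per configured day, like "Split into Rows" in Power Query
$perDay = $row.Days -split ',' | ForEach-Object {
    [pscustomobject]@{ Dataset = $row.Dataset; Day = $_; Time = $row.Time }
}

$perDay   # -> three rows: Monday, Wednesday, Friday
```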

You can automate this with Azure Function Apps and write the file into a Blob Storage to get Power BI reports with up-to-date data about your tenant automatically. It would also be a good idea to add the refresh duration of each dataset to this report to organize the schedules a little bit better.

Analyze the Power BI Activity Logs

To get insights into your users' Power BI activities, you have multiple options:
– The Usage Metrics
– The Audit Logs
– … and the Activity Logs

For me, the last one fits best, because you do not need an Exchange license (which you need for the Audit Logs), and the Metrics app is too graphical; on top of that, you cannot export its data. To export the data, I have written this PowerShell script:

Login-PowerBI

$RetrieveDate = Get-Date

# 90 days backwards
90..1 | foreach {
    $Date = (((Get-Date).Date).AddDays(-$_))
    "$Date"

    $StartDate = (Get-Date -Date ($Date) -Format yyyy-MM-ddTHH:mm:ss)
    $EndDate = (Get-Date -Date ((($Date).AddDays(1)).AddMilliseconds(-1)) -Format yyyy-MM-ddTHH:mm:ss)

    $ActivityLogs = Get-PowerBIActivityEvent -StartDateTime $StartDate -EndDateTime $EndDate | ConvertFrom-Json
    $ActivityLogSchema = $ActivityLogs | `
        Select-Object Id,RecordType,CreationTime,Operation,OrganizationId,UserType,UserKey,Workload, `
            UserId,ClientIP,UserAgent,Activity,ItemName,WorkspaceName,DatasetName,ReportName, `
            WorkspaceId,CapacityId,CapacityName,AppName,ObjectId,DatasetId,ReportId,IsSuccess, `
            ReportType,RequestId,ActivityId,AppReportId,DistributionMethod,ConsumptionMethod, `
            @{Name="RetrieveDate";Expression={$RetrieveDate}}

    # Write a file per day
    $ActivityLogSchema | Export-Csv c:\temp\audit\PowerBI_AuditLogs-$((($Date).ToUniversalTime()).ToString("yyyyMMddThhmmssZ")).csv
}

This script exports the data of the last 90 days (Power BI retains only the last 90 days). If a day has already been exported, the script overwrites the existing file. In my experience, the export sometimes ran into a timeout for some days. So it is not a good idea to export only once after this timeframe; it is better to export every day or every week.
After you have exported the logs, you can import the folder into Power BI Desktop and analyze the data like this:
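To avoid overwriting days you have already exported, and to make re-runs after a timeout cheap, you can guard the export with a file-existence check. A minimal sketch; the folder layout and file name pattern are examples, and the actual export call is only indicated by a comment:

```powershell
# Guard: only export a day whose file does not exist yet.
function Export-DayIfMissing {
    param([datetime]$Date, [string]$Folder)

    $file = Join-Path $Folder ("PowerBI_ActivityLogs-" + $Date.ToString('yyyyMMdd') + ".csv")
    if (Test-Path $file) { return $false }   # already exported, keep the old file

    # ... here you would call Get-PowerBIActivityEvent and Export-Csv $file ...
    New-Item -ItemType File -Path $file -Force | Out-Null   # stand-in for Export-Csv
    return $true
}

# Demo against a fresh temp folder
$tmp = Join-Path ([IO.Path]::GetTempPath()) 'pbi-audit-demo'
Remove-Item $tmp -Recurse -Force -ErrorAction SilentlyContinue
New-Item -ItemType Directory -Path $tmp -Force | Out-Null

Export-DayIfMissing -Date (Get-Date) -Folder $tmp   # -> True  (first export)
Export-DayIfMissing -Date (Get-Date) -Folder $tmp   # -> False (already there)
```

With this guard, a re-run after a timeout only fetches the days that are still missing.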

You can also implement an Azure Function and write the files into a Blob storage. With this solution you can automate the export and create an automatic refresh of your Power BI report at the service.

Translation of your Power BI Embedded Report (App owns data)

If you use a Power BI Embedded application (App owns data), the app runs in an English environment by default. But if your users need another language, you must enable the translation.
Okay, the following report is deployed on the Power BI tenant:

In your embedded application you can set the language your app runs in, and you can set the format locale. This information is sent to the Power BI tenant. If you have also defined metadata translations, this data is translated as well.

When you run your application with these settings, your metadata and the report are translated. The good thing is: even with an A1 Power BI Embedded plan, you get the metadata translation. This option is normally only available with a P1 or higher SKU.
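These language and format settings are passed in the embed configuration's settings.localeSettings of the powerbi-client JavaScript API. A sketch with placeholder IDs and token:

```javascript
// Sketch of an embed configuration that requests German language and
// formatting; report id, embed URL and token are placeholders.
const embedConfig = {
    type: 'report',
    id: '<report id>',
    embedUrl: 'https://app.powerbi.com/reportEmbed',
    accessToken: '<embed token>',
    tokenType: 1, // models.TokenType.Embed
    settings: {
        localeSettings: {
            language: 'de-DE',     // UI and metadata translation language
            formatLocale: 'de-DE'  // number and date formatting
        }
    }
};

// With the powerbi-client library loaded on the page:
// const report = powerbi.embed(containerElement, embedConfig);
console.log(embedConfig.settings.localeSettings.language); // prints "de-DE"
```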