How to Get Users Assigned to Power BI Row-Level Security Roles (Report Server and Power BI Online)

If you have created a Power BI report with Row-Level Security (RLS), it is a good idea to monitor which users are assigned to the different roles in order to meet monitoring, audit, and security requirements. This information can then be logged to a database or a similar store.

Report Server

With Power BI Report Server (on premises) this is easy. You have two options: you can call the REST API with PowerShell (https://docs.microsoft.com/de-de/power-bi/report-server/rest-api) or query the Report Server database with T-SQL. The latter is, in my opinion, easier and more useful. So, if you have assigned users to roles, you can use the following query:

SELECT
       Catalog.Path
       , Catalog.Name
       , DataModelRole.ModelRoleName
       , Users.UserName
  FROM [ReportServer].[dbo].[UserDataModelRole]
  INNER JOIN dbo.DataModelRole ON DataModelRole.DataModelRoleID = UserDataModelRole.DataModelRoleID
  INNER JOIN dbo.Users ON Users.UserID = UserDataModelRole.UserID
  INNER JOIN dbo.Catalog ON Catalog.ItemID = DataModelRole.ItemID

You get all users who are assigned to the different roles of each report that uses RLS.

Power BI Online

With Power BI online it is more difficult, because the users are managed by the portal while the role assignments are stored inside the underlying OLAP cube (behind the scenes, every dataset is an OLAP cube). If you want to query the users, the workspace needs a Premium or PPU capacity assigned. With that license, you can connect to the dataset (the cube) with tools such as DAX Studio and use the following query to get the role members of the cube. There is no way to query all datasets at once, and no PowerShell cmdlet or REST API call is available for that; if you need it, you have to iterate over your workspaces (see the sketch after the query):

SELECT * FROM $System.TMSCHEMA_ROLE_MEMBERSHIPS
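If you want to automate that iteration, here is a minimal PowerShell sketch. It assumes the MicrosoftPowerBIMgmt module for enumerating workspaces and datasets and the SqlServer module's Invoke-ASCmd for running the DMV against each workspace's XMLA endpoint (Premium/PPU only); module choice and filters are my assumptions, not part of the original post:

# Assumption: MicrosoftPowerBIMgmt and SqlServer modules are installed
Login-PowerBIServiceAccount

# Only Premium/PPU workspaces expose an XMLA endpoint
# (with tenant admin rights you could add -Scope Organization -All)
$workspaces = Get-PowerBIWorkspace | Where-Object { $_.IsOnDedicatedCapacity }

foreach ($workspace in $workspaces) {
    $xmlaEndpoint = "powerbi://api.powerbi.com/v1.0/myorg/$($workspace.Name)"

    foreach ($dataset in Get-PowerBIDataset -WorkspaceId $workspace.Id) {
        # Run the DMV from above against every dataset (= database) of the workspace
        Invoke-ASCmd -Server $xmlaEndpoint -Database $dataset.Name -Query "SELECT * FROM `$System.TMSCHEMA_ROLE_MEMBERSHIPS"
    }
}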

To determine which datasets have RLS configured, you can check it with a REST API call.
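As a sketch (my assumption of a suitable call, not necessarily the exact one; <workspaceId> is a placeholder), the datasets endpoint returns an isEffectiveIdentityRolesRequired flag that indicates whether RLS roles are defined:

# Sketch: list the datasets of a workspace and check which ones define RLS roles
Login-PowerBIServiceAccount

$datasets = (Invoke-PowerBIRestMethod -Url "groups/<workspaceId>/datasets" -Method Get | ConvertFrom-Json).value

$datasets | Select-Object name, isEffectiveIdentityRolesRequired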

But if you download the PBIX file and upload it again under another name, the role assignments are lost. The assigned users are not stored inside the PBIX file! A little confusing, but the reason is simple: the Power BI file doesn't store the assignments, the OLAP cube does. So, if you want to migrate a report, use the REST API to get the assignments and re-assign the users afterwards.

Continuous Deployment of Azure Analysis Services Cubes – Part 2: SQL Authentication

In my last blog post, I described how you can deploy SSAS cubes to different stages with Tabular Editor, using OAuth authentication for the data source. This post describes a cube that uses a SQL-authenticated connection. First, I created a cube with a SQL Auth connection. Visual Studio does not save the password of the connection inside the BIM file, but you can add it yourself with an additional credential property in the BIM file.
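As a sketch (the property names follow the credential block used in the TMSL script further below; server, database, user, and password are placeholders), the data source definition in the BIM file could be extended like this:

"credential": {
  "AuthenticationKind": "UsernamePassword",
  "kind": "SQL",
  "path": "sqlpl001.database.windows.net;sql01",
  "Username": "ok",
  "Password": "XXXXX",
  "EncryptConnection": true
}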

But be aware that the password is stored as plain text inside your code; there is no way to encrypt it. After deploying the cube, I created a pipeline inside Azure DevOps like the one from Part 1.

The difference from the other pipeline is that we have an additional step to edit the data source.

This PowerShell script will add the password to the connection string:

$AlterDatasourceCommand = @"
{
  "createOrReplace": {
    "object": {
      "database": "Cube",
      "dataSource": "SQL/sqlpl001 database windows net;sql01"
    },
    "dataSource": {
      "type": "structured",
      "name": "SQL/sqlpl001 database windows net;sql01",
      "connectionDetails": {
        "protocol": "tds",
        "address": {
          "server": "sqlpl002.database.windows.net",
          "database": "sql01"
        },
        "authentication": null,
        "query": null
      },
      "credential": {
        "AuthenticationKind": "UsernamePassword",
        "kind": "SQL",
        "path": "sqlpl002.database.windows.net;sql01",
        "Username": "ok",
		"Password": "XXXXX",
        "EncryptConnection": true
      }
    }
  }
}  
"@    

$myPassword = 'XXXXX'
$myUsername = 'philipp@plenz.onmicrosoft.com'

$password = ConvertTo-SecureString $myPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($myUsername, $password)

Invoke-ASCmd -Server asazure://eastus.asazure.windows.net/aspltest -Database Cube -Query $AlterDatasourceCommand -Credential $credential

If you need to script your connection, you can right-click the connection and choose the Replace option. But there you must also add the password property.

Continuous Deployment of Azure Analysis Services Cubes – Part 1

With Tabular Editor by Daniel Otykier, you can easily deploy tabular models to different stages. This blog post covers how to do that and how to manage the connection so that it points to the correct stage. The second part will cover how to do the same with a SQL-authenticated user.
This blog post follows this series: https://tabulareditor.github.io/2019/10/08/DevOps3.html

First, you need a pipeline with three steps: downloading the portable version of Tabular Editor, preparing the release that will be deployed, and the final deployment. The cube uses an OAuth connection, which also must be maintained in the last step.

The first step downloads the Tabular Editor and extracts the content to the environment:

# Download URL for Tabular Editor portable:
$TabularEditorUrl = "https://github.com/otykier/TabularEditor/releases/download/2.12.4/TabularEditor.Portable.zip" 

# Download destination (root of PowerShell script execution path):
$DownloadDestination = join-path (get-location) "TabularEditor.zip"

# Download from GitHub:
Invoke-WebRequest -Uri $TabularEditorUrl -OutFile $DownloadDestination

# Unzip Tabular Editor portable, and then delete the zip file:
Expand-Archive -Path $DownloadDestination -DestinationPath (get-location).Path
Remove-Item $DownloadDestination

The second step copies the BIM file and replaces the connection. Normally you would work with different branches, but in this case a simple replacement is easier. The connection in the BIM file points to the DEV SQL database; the replacement points the cube to the testing database.

Copy-Item "_cube/cube/cube/Model.bim" -Destination "_cube/cube/cube/Test.bim" -Force

(Get-Content "_cube/cube/cube/Test.bim") -replace '"server": "sqlpl001.database.windows.net",', '"server": "sqlpl002.database.windows.net",' | Set-Content "_cube/cube/cube/Test.bim"

The last step deploys the cube to the testing environment. The password of the deployment user is stored inside a variable and can be retrieved from Key Vault (https://www.flip-design.de/?p=1100).

TabularEditor.exe _cube\cube\cube\Test.bim  -D asazure://eastus.asazure.windows.net/aspltest Cube -L philipp@plenz.onmicrosoft.com $(password) -O

But when you want to refresh the cube to get new data, you get an error. The reason is that your OAuth connection doesn't have valid credential information:

You can fix this by refreshing the credentials, but be aware that these credentials will expire again.

After that, when you refresh the cube right away, the lease time is updated.

Use Azure KeyVault with DevOps to secure your credentials

Most DevOps pipelines use variables to store credentials and other values. DevOps gives you the possibility to hide/encrypt this data, but in an enterprise environment a Key Vault is usually used to keep sensitive data in a central store and provide better management. This post shows how you can use a Key Vault and consume its passwords in your pipeline. It refers to my last blog post and uses that pipeline.

To start, I created an Azure Key Vault and stored a password in it which, in combination with the username, can start a Power BI deployment pipeline.
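If you prefer scripting over the portal, a minimal sketch with the Az PowerShell module could look like this (resource group, vault name, secret name, and location are placeholders I made up for illustration):

# Assumption: Az.KeyVault module and an existing resource group
New-AzKeyVault -Name "kv-powerbi-demo" -ResourceGroupName "rg-powerbi" -Location "East US"

# Store the deployment password as a secret named "password"
$secretValue = ConvertTo-SecureString 'XXXXX' -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName "kv-powerbi-demo" -Name "password" -SecretValue $secretValue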

Next, you need to create a service connection to your Azure tenant:

Select Azure Resource Manager

Select service principal

Select your subscription and the Resource Group where the KeyVault is stored

After creating your Service Principal, you can get its name by clicking the Manage link

After clicking on the Service Principal, you will see its name, and you can add the Access Policy:

Select Get and List so the secrets and their content can be read

Add your Service Principal which was created by DevOps

Don't forget to click Save.
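The same access policy can also be granted with Az PowerShell, for example (a sketch; the vault name and the service principal's application ID are placeholders):

# Grant the DevOps service principal read access to the secrets
Set-AzKeyVaultAccessPolicy -VaultName "kv-powerbi-demo" -ServicePrincipalName "<appId-of-the-devops-service-principal>" -PermissionsToSecrets get,list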

Next, go back to Azure DevOps and open the Library. Here you can add a new variable group that uses your Key Vault. You need to click Authorize to enable the connection. Then go to your pipeline variables and add the newly created variable group.

Inside your tasks you can now use these variables. If your passwords contain special characters, put them in single quotes.
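For example, in the Tabular Editor deployment step from Part 1, the Key Vault variable would be consumed like this (assuming the secret is named password):

TabularEditor.exe _cube\cube\cube\Test.bim -D asazure://eastus.asazure.windows.net/aspltest Cube -L philipp@plenz.onmicrosoft.com '$(password)' -O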

By viewing the results, you can see that the Key Vault is being used.

Continuous Deployment of Power BI Reports with PowerShell and Azure DevOps

For Azure DevOps you can find several add-ons to deploy Power BI content to a workspace and add-ons to trigger a pipeline. But when you need more flexibility in triggering a Power BI deployment pipeline and deciding which content to deploy, you can use the Power BI REST API and trigger it with PowerShell. For this purpose, I've created a script which can start a pipeline.

The script contains four parameters:

The UPN / username of the user who can start the pipeline
The password of that user (if the password contains special characters, you should put it into single quotes)
The stage: 0=from DEV to test, 1=from test to prod
The pipeline name

pipeline.ps1 username 'password' 1 DeployPipeline

So, you can call the script for different pipelines and for the different stages.
The script:

[CmdletBinding()]
param (
    $ParamUser,
    $ParamPassword,
    $ParamstageOrder, # The order of the source stage. Development (0), Test (1).
    $ParamPipelinename # The name of the pipeline
)

$myPassword = $ParamPassword
$myUsername = $ParamUser
$stageOrder = $ParamstageOrder   
$pipelineName = $ParamPipelinename  

$moduleName = "MicrosoftPowerBIMgmt.Profile"
$module = Get-Module $moduleName -ListAvailable -ErrorAction SilentlyContinue

if (!$module) 
{
	Install-Module -Name $moduleName -Force -Scope CurrentUser -SkipPublisherCheck 
}


$password = ConvertTo-SecureString $myPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($myUsername, $password)

Login-PowerBIServiceAccount -Credential $credential
                  

# End Parameters =======================================

try { 
    # Get pipelines
    $pipelines = (Invoke-PowerBIRestMethod -Url "pipelines"  -Method Get | ConvertFrom-Json).value

    # Try to find the pipeline by display name
    $pipeline = $pipelines | Where-Object {$_.DisplayName -eq $pipelineName}

    if(!$pipeline) {
        Write-Host "A pipeline with the requested name was not found"
        return
    }

    # Construct the request url and body
    $url = "pipelines/{0}/DeployAll" -f $pipeline.Id

    $body = @{ 
        sourceStageOrder = $stageOrder

        options = @{
            # Allows creating new artifact if needed on the Test stage workspace
            allowCreateArtifact = $TRUE

            # Allows overwriting existing artifact if needed on the Test stage workspace
            allowOverwriteArtifact = $TRUE
        }
    } | ConvertTo-Json

    # Send the request
    $deployResult = Invoke-PowerBIRestMethod -Url $url  -Method Post -Body $body | ConvertFrom-Json

    "Operation ID: {0}" -f $deployResult.id
} catch {
    $errmsg = Resolve-PowerBIError -Last
    $errmsg.Message
}

I moved this script to a DevOps repo:

After that, I created a release pipeline:

The content of each stage:

After running the pipeline, the content of each stage is synchronized:

Add comments to Power BI Visuals and the data by using Analysis Services Cubes

If you use Power BI reports, users can add comments to a report and to a visual. But usually business users want to add comments to the data, e.g. to say "This data is not complete" or something like that. So the use case for this blog post is to add comments to a report that uses an OLAP cube as its data source.
First, we need a simple data source for the cube. I've created two tables: the first one contains the data for the cube, and the second one is used to store the comments:


CREATE TABLE dbo.factData (
	id int not null identity(1,1) primary key
	, category nvarchar(255) not null
	, project nvarchar(255) not null
	, amount int not null
);
GO
INSERT INTO dbo.factData (category, project, amount)
VALUES 
	('Cat A', 'P 1', 1000)
	, ('Cat A', 'P 2', 2000)
	, ('Cat B', 'P 3', 1100)

GO

CREATE TABLE dbo.comments (
	id int not null identity(1,1) primary key
	, project nvarchar(255) not null
	, comment  nvarchar(1000) not null
	, creationdate datetime default GETDATE()
);
GO

INSERT INTO dbo.factData (category, project, amount)
VALUES 
	('Cat A', 'P 4', 1000)
GO

INSERT INTO dbo.comments (project, comment)
VALUES ('P 4', 'a comment ...')

After creating the tables, I added some sample data.
Next, I created an SSAS model like this:

After deploying the cube, I created a Power BI report to display the data:

To show the comments, I added a tooltip page that displays the comments for each project of the first page:

Now we need a simple way to add comments. For this purpose, I created a Power App with a SQL connection to the source database:

After creating the app, which points to the comments table, I embedded the app in the Power BI report. Now a user of the report can add a comment to each project.

Now we need a technique to refresh the cube so it shows the comments. If your cube contains a lot of data, it is not a good idea to refresh all of it. But with SSAS you can refresh a single table, and this can be done with Azure Data Factory every X minutes. This means that when a user adds a comment, they must wait a minute or so to see it.

{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "Comments",
        "table": "comments"
      }
    ]
  }
}
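
To run this refresh on a schedule (from Azure Data Factory, an Automation runbook, or any other scheduler), the TMSL command above can simply be executed against the server. A minimal PowerShell sketch, assuming the SqlServer module; the server name is reused from the earlier posts and the user and password are placeholders:

# TMSL command that refreshes only the comments table
$refreshCommand = @"
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "Comments", "table": "comments" }
    ]
  }
}
"@

$password = ConvertTo-SecureString 'XXXXX' -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ('philipp@plenz.onmicrosoft.com', $password)

# Execute the partial refresh against the Azure Analysis Services server
Invoke-ASCmd -Server asazure://eastus.asazure.windows.net/aspltest -Database Comments -Query $refreshCommand -Credential $credential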