Trigger an Azure Data Factory pipeline as an external user

A very simple and common use case: you have a pipeline that fills your data warehouse, processes your cube, and delivers this information to your business users. The data warehouse uses, as part of its sources, Excel files which are created by the business. The pipeline runs once or twice a day, but now there is a requirement that the process owner of these Excel files wants to control this process by triggering the pipeline himself. The reason is very simple: one of the files contained errors and had to be replaced so that management does not get misleading information.
If you want the user to be able to trigger an ADF pipeline, you have different options: you can give him permissions on the data factory, or you can trigger the ADF pipeline by an event. I think the latter is more interesting, because the user needs no permissions on the factory.

So, I created an Azure storage account. This account is only there to hold a trigger file which the user creates in a container. Next, I created a simple pipeline which should be triggered by the business.
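
If you want to script this setup, a minimal sketch with the Az PowerShell module could look like this (the resource group, account, and container names are placeholders I made up):

# A sketch of the setup, assuming the Az.Resources and Az.Storage modules are installed;
# all names below are placeholders.
New-AzResourceGroup -Name "rg-adf-trigger" -Location "westeurope"

$storage = New-AzStorageAccount `
    -ResourceGroupName "rg-adf-trigger" `
    -Name "stadftriggerdemo" `
    -Location "westeurope" `
    -SkuName Standard_LRS

# The container that will receive the trigger file
New-AzStorageContainer -Name "trigger" -Context $storage.Context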

After the run, the pipeline deletes the trigger file from the storage account. I use an Azure Blob event trigger with these settings:
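
In JSON, such a Blob event trigger definition looks roughly like this (a sketch; the trigger name, pipeline name, and scope are assumptions):

{
    "name": "TR_BlobTriggerFile",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/trigger/blobs/trigger",
            "ignoreEmptyBlobs": false,
            "scope": "/subscriptions/<subscription-id>/resourceGroups/rg-adf-trigger/providers/Microsoft.Storage/storageAccounts/stadftriggerdemo",
            "events": [ "Microsoft.Storage.BlobCreated" ]
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "PL_Fill_DWH",
                    "type": "PipelineReference"
                }
            }
        ]
    }
}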

So, if I place a file whose name starts with the word “trigger” into the container, ADF will recognize it and run the pipeline.
After uploading a file named “triggeron.txt”, the pipeline runs immediately.

Now you only need an easy way for the user to create this file. You can provide the user with a PowerShell script and an access key, or you can use a Power Automate flow. I think the flow is the better option, because you can share it with the user, and when it runs, it creates the file inside the container. You can also trigger this flow when an email arrives in a particular mailbox, like this flow:
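
The PowerShell variant could be a small script like this (a sketch, assuming the Az.Storage module and the placeholder names from above; the access key must be filled in):

# A sketch of the "script plus access key" option; account and container names are placeholders.
$ctx = New-AzStorageContext `
    -StorageAccountName "stadftriggerdemo" `
    -StorageAccountKey "<access key>"

# Create a local trigger file and upload it to the container
"run" | Out-File -FilePath ".\triggeron.txt"
Set-AzStorageBlobContent `
    -File ".\triggeron.txt" `
    -Container "trigger" `
    -Blob "triggeron.txt" `
    -Context $ctx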

But be aware that if you use Power Automate, you need a Premium connector.

How to add users to roles of an Azure Analysis Services cube with PowerShell in Azure DevOps

To add users to an Azure Analysis Services cube after a deployment, you can do this manually with the management tools or with Visual Studio. But with Continuous Deployment (CI/CD) and different stages, you mostly need to add different users/groups to the roles, so you need to do this dynamically with scripts. I found an add-in in the marketplace to do this with DevOps by using a pipeline. But unfortunately, this add-in does not work, and it can only add one user/group per step. So I decided to do this with PowerShell to add more than one user in one step.
I added my deployment user as an SSAS admin via the portal (you can also do this with Management Studio):
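
If you prefer to script this step too, the Az.AnalysisServices module offers a way to set the admins (a sketch; the resource group, server, and user names are assumptions):

# A sketch, assuming the Az.AnalysisServices module; all names are placeholders.
# -Administrator takes a comma-separated list and replaces the existing admins.
Set-AzAnalysisServicesServer `
    -ResourceGroupName "rg-ssas" `
    -Name "ssas001" `
    -Administrator "deploy@plenz.onmicrosoft.com"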

After that, I created this PowerShell script, added it to my repo, and published it to DevOps to use it inside a pipeline. The script connects to my Analysis Services server and adds a user to a role:


[CmdletBinding()]
param (
    [string] $ParamUser,
    [string] $ParamPassword
)

# Build a credential object from the parameters passed in by the pipeline
$password   = ConvertTo-SecureString $ParamPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($ParamUser, $password)

# Sign in to Azure and load the SqlServer module, which contains Add-RoleMember
Login-AzAccount -Credential $credential
Import-Module SqlServer -DisableNameChecking

# Add the user to the role of the tabular database
Add-RoleMember `
    -Server asazure://westus.asazure.windows.net/ssas001 `
    -MemberName "asadmin@plenz.onmicrosoft.com" `
    -Database "adventureworks" `
    -RoleName "Internet Sales Manager" `
    -Credential $credential
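
Since the point was to add more than one user in one step, the last call can be wrapped in a loop over a list of members (a sketch; the member names are placeholders):

# Placeholder members; in a real pipeline these could come in as a parameter as well
$members = @("user1@plenz.onmicrosoft.com", "group-sales@plenz.onmicrosoft.com")

foreach ($member in $members) {
    Add-RoleMember `
        -Server asazure://westus.asazure.windows.net/ssas001 `
        -MemberName $member `
        -Database "adventureworks" `
        -RoleName "Internet Sales Manager" `
        -Credential $credential
}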

Here you can see the script inside my repo at DevOps:

Next, I created a pipeline and added the credentials needed to connect to the SSAS server as variables.

After doing this, I created an Azure PowerShell task inside my pipeline. This task uses the PowerShell script from the repo and passes the variables as parameters to the script.
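
If the pipeline is defined in YAML, the task might look like this (a sketch; the service connection name, script path, and variable names are assumptions):

- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'my-service-connection'   # name of the ARM service connection
    ScriptType: 'FilePath'
    ScriptPath: '$(Build.SourcesDirectory)/AddRoleMember.ps1'
    ScriptArguments: '-ParamUser $(SsasUser) -ParamPassword $(SsasPassword)'
    azurePowerShellVersion: 'LatestVersion'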

After running the pipeline, the user from the script is added to the role.