Power BI - Databricks Connection using Service Principal PAT Refresh

js54123875
New Contributor III

What is the best practice for automatically refreshing a service principal PAT in Power BI for a connection to a Databricks dataset? Ideally, when the PAT is updated it will automatically be stored in Azure Key Vault — is there a way for Power BI to pick it up from there automatically?

Quick Details of Setup:

  • Power BI dashboard published to Power BI Service
  • Databricks data source
  • Service Principal
  • Personal Access Token (PAT) expires every 90 days, so it will be rotated every 70 days
  • Ideally the PAT credential is stored in Azure Key Vault for Power BI to access
  • How to automatically refresh PAT in Power BI?
1 ACCEPTED SOLUTION


Kaniz
Community Manager

Hi @js54123875, certainly! Refreshing a Power BI dataset with a Service Principal and managing PATs can be achieved through a combination of best practices.

 

 Let’s explore some approaches:

 

Service Principal and Azure Key Vault:

  • Create a Service Principal: Set up a service principal (app registration) in Azure Active Directory (AAD) specifically for Power BI–Databricks connections. This identity will act as the intermediary between Power BI and Databricks.
  • Azure Key Vault Integration: Store your PAT securely in Azure Key Vault. Configure the Key Vault access policies to grant the service principal the "get" secret permission so it can retrieve the PAT.
  • Scheduled Refresh: In your Power BI dataset, configure the data source credentials to point to the PAT stored in Azure Key Vault. Set up a scheduled refresh based on your desired frequency (e.g., daily, weekly).
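To make the Key Vault step concrete, here is a minimal Python sketch (stdlib only) of how a rotation script could authenticate as the service principal and read the PAT secret over Key Vault's REST API. The vault name `kv-powerbi` and secret name `databricks-pat` are hypothetical placeholders, not anything from the thread.

```python
# Sketch: fetch a PAT stored as a Key Vault secret, authenticating as a
# service principal via the AAD client-credentials flow. Names are
# hypothetical; adapt tenant/vault/secret to your environment.
import json
import urllib.parse
import urllib.request


def aad_token_request(tenant_id: str, client_id: str, client_secret: str,
                      scope: str) -> urllib.request.Request:
    """Build the AAD v2.0 client-credentials token request."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,  # e.g. "https://vault.azure.net/.default"
    }).encode()
    return urllib.request.Request(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        data=body, method="POST")


def secret_url(vault_name: str, secret_name: str) -> str:
    """URL of the latest version of a Key Vault secret (REST API 7.4)."""
    return (f"https://{vault_name}.vault.azure.net/secrets/"
            f"{secret_name}?api-version=7.4")


def get_pat(vault_name: str, secret_name: str, aad_token: str) -> str:
    """Retrieve the PAT value; needs the 'get' secret permission."""
    req = urllib.request.Request(secret_url(vault_name, secret_name))
    req.add_header("Authorization", f"Bearer {aad_token}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]
```

In practice you would use the `azure-identity` and `azure-keyvault-secrets` SDK packages instead of raw HTTP; the sketch just makes the two round trips (token, then secret) explicit.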

Automated Refresh Using PowerShell:

  • Create a PowerShell script that:
    • Retrieves the updated PAT from Azure Key Vault.
    • Updates the Power BI dataset connection string with the new PAT.
    • Triggers a dataset refresh using the Power BI REST API.
  • Schedule this script to run periodically (e.g., every 70 days) using a task scheduler or Azure Logic Apps.
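As a rough equivalent of that PowerShell script, this Python sketch builds the two Power BI REST API calls involved: updating the datasource credentials with the rotated PAT, then triggering an on-demand dataset refresh. The gateway, datasource, and dataset IDs are hypothetical, and the exact `credentialDetails` shape should be checked against the Gateways – Update Datasource documentation for your datasource type.

```python
# Sketch of the scripted rotation: PATCH the new PAT into the Power BI
# datasource credentials, then POST a dataset refresh. IDs are
# placeholders; an AAD bearer token header must be added before sending.
import json
import urllib.request

PBI_API = "https://api.powerbi.com/v1.0/myorg"


def update_datasource_request(gateway_id: str, datasource_id: str,
                              new_pat: str) -> urllib.request.Request:
    """Gateways - UpdateDatasource: set the rotated PAT as a Key credential."""
    payload = {
        "credentialDetails": {
            "credentialType": "Key",
            # The credentials field is itself a serialized JSON string.
            "credentials": json.dumps(
                {"credentialData": [{"name": "key", "value": new_pat}]}),
            "encryptedConnection": "Encrypted",
            "encryptionAlgorithm": "None",
            "privacyLevel": "Organizational",
        }
    }
    req = urllib.request.Request(
        f"{PBI_API}/gateways/{gateway_id}/datasources/{datasource_id}",
        data=json.dumps(payload).encode(), method="PATCH")
    req.add_header("Content-Type", "application/json")
    return req


def refresh_request(dataset_id: str) -> urllib.request.Request:
    """Datasets - Refresh Dataset: queue an on-demand refresh."""
    return urllib.request.Request(
        f"{PBI_API}/datasets/{dataset_id}/refreshes",
        data=b"{}", method="POST")
```

The same two calls map directly onto the `Invoke-PowerBIRestMethod` cmdlet if you prefer to stay in PowerShell.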

OAuth Tokens Instead of PATs:

  • As a security best practice, consider using OAuth tokens instead of PATs.
  • Configure your Databricks workspace to issue OAuth tokens for service principals.
  • Update your Power BI dataset connection to use OAuth tokens directly. OAuth tokens have longer lifetimes and can be automatically refreshed.
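For the OAuth route, the service principal can request a short-lived Azure AD access token for the Azure Databricks resource (well-known application ID `2ff814a6-3304-4ab8-85cb-cd0e6f879c1d`) instead of holding a long-lived PAT. A minimal sketch of building that token request:

```python
# Sketch: acquire an AAD access token scoped to Azure Databricks via the
# client-credentials flow, as an alternative to a stored PAT. The token
# is short-lived and re-requested on each run, so nothing expires at 90 days.
import urllib.parse
import urllib.request

# Well-known application ID of the Azure Databricks resource in AAD.
DATABRICKS_SCOPE = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default"


def databricks_token_request(tenant_id: str, client_id: str,
                             client_secret: str) -> urllib.request.Request:
    """Build the token request for the Azure Databricks resource."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": DATABRICKS_SCOPE,
    }).encode()
    return urllib.request.Request(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        data=body, method="POST")
```

The returned `access_token` is presented to Databricks as a bearer token; because it is minted fresh on demand, there is no stored credential to rotate.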

Custom Solutions:

  • If your security requirements are stringent, consider custom solutions:
    • Develop a custom application that manages PATs and refreshes them automatically.
    • Use Azure Functions or Logic Apps to trigger PAT updates based on a schedule.
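The timer-triggered rotation piece of such a custom solution can be sketched as two requests: mint a fresh 90-day PAT from the Databricks Token API, then write it back to Key Vault as a new secret version. The workspace URL and vault/secret names below are hypothetical placeholders.

```python
# Sketch of the scheduled rotation step (e.g. run from an Azure Functions
# timer trigger every 70 days): create a new Databricks PAT and store it
# in Key Vault. Both requests still need a bearer token header added.
import json
import urllib.request


def create_pat_request(workspace_url: str,
                       days: int = 90) -> urllib.request.Request:
    """Databricks Token API: POST /api/2.0/token/create."""
    body = json.dumps({"lifetime_seconds": days * 86400,
                       "comment": "powerbi-rotation"}).encode()
    return urllib.request.Request(
        f"{workspace_url}/api/2.0/token/create", data=body, method="POST")


def store_secret_request(vault_name: str, secret_name: str,
                         value: str) -> urllib.request.Request:
    """Key Vault: PUT a new secret version holding the rotated PAT."""
    return urllib.request.Request(
        f"https://{vault_name}.vault.azure.net/secrets/{secret_name}"
        "?api-version=7.4",
        data=json.dumps({"value": value}).encode(), method="PUT")
```

Note that `/api/2.0/token/create` mints a token for the calling identity, so the rotation job must itself authenticate as the service principal whose PAT is being replaced.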

Remember to balance security, automation, and ease of maintenance when choosing your approach. Each organization’s requirements may vary, so evaluate the best fit for your specific environment.

 

Additionally, consult the official Power BI and Azure Databricks documentation for detailed implementation guidance.


3 REPLIES

SSundaram
Contributor

Maybe not the recommended way, but you could try creating a token that does not expire for such use cases. Ideally you would need something like a custom PowerShell solution to automate it completely.

js54123875
New Contributor III

Thank you for the reply. Unfortunately, this does not meet our security requirements.

