Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Chris_Shehu
by Valued Contributor III
  • 3809 Views
  • 7 replies
  • 4 kudos

On-Behalf of tokens disabled for Azure Environments?

While trying to set up a Power BI connection to the Azure Delta Lake, we ran into several issues around Service Principals. 1) The API listed on the learn.microsoft site (link 1 below) indicates that there is an API you can use to create SP tokens. Wh...

Latest Reply
meetskorun
New Contributor II
  • 4 kudos

Hello, I am new here from India, here to share some thoughts with you all.

6 More Replies
User16270906190
by New Contributor III
  • 4733 Views
  • 8 replies
  • 5 kudos

Customer is trying to generate a Databricks token for a service principal (SP). They've created the SP in Azure AD and have used the Databricks REST ...

Customer is trying to generate a Databricks token for a service principal (SP). They've created the SP in Azure AD and have used the Databricks REST API to add it as an admin. When using the Databricks REST API /api/2.0/token-management/on-behalf-of...

Latest Reply
Buxert
New Contributor II
  • 5 kudos

Having the same problem here. @Nitisha Nigam, did you solve it?

7 More Replies
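For reference, the endpoint mentioned in this thread belongs to the workspace Token Management API, which a workspace admin can call to mint a token on behalf of a service principal. A minimal sketch in Python, with placeholder host, admin token, and application ID (verify the exact request fields against the current API docs):

import requests

DATABRICKS_HOST = "https://<workspace-url>"      # placeholder
ADMIN_TOKEN = "<workspace-admin-token>"          # caller must be a workspace admin

# Create a token on behalf of the service principal, identified by its
# Azure AD application ID.
resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/token-management/on-behalf-of/tokens",
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
    json={
        "application_id": "<service-principal-application-id>",
        "lifetime_seconds": 3600,
        "comment": "token for automation",
    },
)
resp.raise_for_status()
print(resp.json()["token_value"])   # the generated token; store it securely

Note that this assumes the service principal already exists in the workspace and that the workspace tier has token management enabled.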
venkat-bodempud
by New Contributor III
  • 4202 Views
  • 4 replies
  • 7 kudos

Power BI - Databricks Integration using Service Principal

Hello Community, we are able to connect to Databricks (using a personal access token) from Power BI Desktop, and we are able to schedule a Databricks notebook with Data Factory every 10 minutes (as per our requirement). We want to avoid using the pe...

Latest Reply
Prabakar
Esteemed Contributor III
  • 7 kudos

You can generate a token for the service principal and use that. As a security best practice, when authenticating with automated tools, systems, scripts, and apps, Databricks recommends you use access tokens belonging to service principals inste...

3 More Replies
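On Azure, the service principal token the reply refers to is usually obtained by requesting an Azure AD token for the Azure Databricks resource via the client-credentials flow. A hedged sketch in Python; tenant, client ID, and secret are placeholders (the GUID in the scope is the well-known application ID of the Azure Databricks resource):

import requests

TENANT_ID = "<azure-tenant-id>"                  # placeholders
CLIENT_ID = "<service-principal-client-id>"
CLIENT_SECRET = "<service-principal-secret>"

# Client-credentials flow against Azure AD for the Azure Databricks resource.
resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
    },
)
resp.raise_for_status()
aad_token = resp.json()["access_token"]

# aad_token can now be sent as a Bearer token to the Databricks REST API, or
# exchanged for a workspace token via the Token Management API, so scheduled
# jobs do not depend on anyone's personal access token.

Power BI itself has its own authentication options for the Databricks connector; the sketch above is aimed at the Data Factory / automation side of the question.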
powerus
by New Contributor III
  • 4662 Views
  • 1 reply
  • 0 kudos

Resolved! "Failure to initialize configurationInvalid configuration value detected for fs.azure.account.key" using com.databricks:spark-xml_2.12:0.12.0

Hi community, I'm trying to read XML data from Azure Data Lake Gen2 using com.databricks:spark-xml_2.12:0.12.0: spark.read.format('XML').load('abfss://[CONTAINER]@[storageaccount].dfs.core.windows.net/PATH/TO/FILE.xml') The code above gives the followin...

Latest Reply
powerus
New Contributor III
  • 0 kudos

The issue was also raised here: https://github.com/databricks/spark-xml/issues/591 A fix is to use the "spark.hadoop" prefix in front of the fs.azure Spark config keys: spark.hadoop.fs.azure.account.oauth2.client.id.nubulosdpdlsdev01.dfs.core.windows.n...

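The reply's workaround, sketched out with placeholder values: the fs.azure OAuth keys need the spark.hadoop. prefix when set in the cluster's Spark config so that they reach the Hadoop configuration used by spark-xml.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

storage_account = "<storageaccount>"     # placeholder account name

# Each pair below is meant for the cluster's Spark config
# (Compute > Advanced options > Spark); printing them just produces
# ready-to-paste "key value" lines.
cluster_spark_conf = {
    f"spark.hadoop.fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net": "OAuth",
    f"spark.hadoop.fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    f"spark.hadoop.fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net": "<client-id>",
    f"spark.hadoop.fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net": "<client-secret>",
    f"spark.hadoop.fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}
for key, value in cluster_spark_conf.items():
    print(key, value)

# With the cluster configured, the original read should succeed (spark-xml installed):
df = (spark.read.format("xml")
      .option("rowTag", "row")             # assumed row tag; adjust for the file
      .load(f"abfss://<container>@{storage_account}.dfs.core.windows.net/PATH/TO/FILE.xml"))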
Orianh
by Valued Contributor II
  • 3497 Views
  • 3 replies
  • 1 kudos

Resolved! Attach instance profile to service principal.

Hey guys, I'm having some permission issues using a service principal and an instance profile, and I hope you can help me. I created a service principal and attached an instance profile (databricks-my-profile) to it. I have an S3 bucket with a policy that all...

Latest Reply
Orianh
Valued Contributor II
  • 1 kudos

Hey @Kaniz Fatma, @Debayan Mukherjee, thanks for your answers. Actually, Databricks does not support using the DBFS API with a service principal and attached instance profile on a mounted S3 bucket. I'm not sure if this is in the docs (I might have missed it), but thi...

2 More Replies
sfalquier
by New Contributor II
  • 1988 Views
  • 3 replies
  • 0 kudos

HTTP 403 on git-credentials API

Hi, I am trying to set Git credentials for my service principal. I followed the process described here, but I get a 403 error when making the POST request to ${DATABRICKS_HOST}/api/2.0/git-credentials with the service principal's token. By the way, I also canno...

Latest Reply
Vivian_Wilfred
Honored Contributor
  • 0 kudos

Hi @Sébastien FALQUIER, it works for me, there are no restrictions. Maybe the PAT you generated for the service principal has expired. Can you generate a new token and try the GET /git-credentials API? How are you creating the PAT for the service prin...

2 More Replies
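A quick way to test what the reply suggests is to call the Git Credentials API directly with the service principal's token. A small Python sketch with placeholder host and token; a 200 on the GET means the token is valid and the API is reachable for that principal:

import requests

DATABRICKS_HOST = "https://<workspace-url>"      # placeholder
SP_TOKEN = "<service-principal-token>"           # freshly generated token for the SP

# List Git credentials for the calling principal (GET /git-credentials).
resp = requests.get(
    f"{DATABRICKS_HOST}/api/2.0/git-credentials",
    headers={"Authorization": f"Bearer {SP_TOKEN}"},
)
print(resp.status_code)
print(resp.text)   # a 403 here usually points at the token or the principal's permissions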
jefft
by New Contributor III
  • 2204 Views
  • 3 replies
  • 1 kudos

Databricks integration with DocumentDB

Hi everyone, I was wondering if anyone here has any experience or tips for reading data from AWS DocumentDB. I am working on this using the MongoDB connector. For DocumentDB we also need to work with the required credentials issued as a .pem file by AWS. Th...

Latest Reply
jefft
New Contributor III
  • 1 kudos

Hi @Kaniz Fatma, thank you so much for your response. Your suggestions were helpful. As per the AWS documentation, DocumentDB is MongoDB-compatible: "With Amazon DocumentDB, you can run the same application code and use the same drivers and tools th...

2 More Replies
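For anyone following along, a hedged sketch of the read path discussed in this thread, assuming the MongoDB Spark connector (v10+) is installed as a cluster library and the AWS-issued .pem CA bundle has already been imported into the cluster's JVM truststore (for example via an init script), which DocumentDB requires for TLS connections. Endpoint, credentials, and names are placeholders:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# DocumentDB is MongoDB-compatible, so the MongoDB connector options apply.
# retryWrites is not supported by DocumentDB and should be disabled.
uri = (
    "mongodb://<user>:<password>@<docdb-cluster-endpoint>:27017/"
    "?tls=true&replicaSet=rs0&readPreference=secondaryPreferred&retryWrites=false"
)

df = (spark.read.format("mongodb")               # connector v10+ short name
      .option("connection.uri", uri)
      .option("database", "<database>")
      .option("collection", "<collection>")
      .load())

df.printSchema()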
noimeta
by Contributor II
  • 1327 Views
  • 0 replies
  • 0 kudos

How to use Terraform to add Git provider credentials to a workspace in order to use a service principal for CI/CD

Hi, I'm very new to Terraform. Currently, I'm trying to automate the service principal setup process using Terraform. Following this example, I successfully created a service principal and an access token. However, when I tried adding databricks_git_cr...

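The databricks_git_credential Terraform resource roughly corresponds to a POST against the Git Credentials REST API, so one way to isolate Terraform issues is to make that call directly with the service principal's token. A hedged Python sketch with placeholder values (the Terraform side itself is not shown here):

import requests

DATABRICKS_HOST = "https://<workspace-url>"      # placeholders
SP_TOKEN = "<service-principal-access-token>"

# Register a Git credential for the service principal so it can work with Repos.
resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/git-credentials",
    headers={"Authorization": f"Bearer {SP_TOKEN}"},
    json={
        "git_provider": "gitHub",                # assumed provider; adjust as needed
        "git_username": "<git-username>",
        "personal_access_token": "<git-provider-pat>",
    },
)
print(resp.status_code, resp.text)

If this direct call succeeds, the Terraform configuration (provider authentication and the token passed to databricks_git_credential) is the next place to look.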
vk217
by Contributor
  • 2366 Views
  • 3 replies
  • 0 kudos

Resolved! Token management is not enabled for this feature tier

I want to create a personal access token for a service principal so that I can use it in the databricks-connect configure command in an automated build. I followed the instructions from here: https://docs.data...

Latest Reply
Atanu
Esteemed Contributor
  • 0 kudos

@Vikas B please see https://docs.databricks.com/dev-tools/api/latest/scim/scim-sp.html#scim-api-20-serviceprincipals and let me know if this helps.

2 More Replies
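The SCIM endpoint linked in the reply is what registers (or inspects) a service principal at the workspace level. A hedged sketch of the create call in Python, with placeholder host, admin token, and application ID; note that the "Token management is not enabled for this feature tier" error in the question points at the workspace plan, which this SCIM call alone will not change:

import requests

DATABRICKS_HOST = "https://<workspace-url>"      # placeholders
ADMIN_TOKEN = "<workspace-admin-token>"

# Create the service principal in the workspace via the SCIM 2.0 API.
resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/preview/scim/v2/ServicePrincipals",
    headers={
        "Authorization": f"Bearer {ADMIN_TOKEN}",
        "Content-Type": "application/scim+json",
    },
    json={
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"],
        "applicationId": "<service-principal-application-id>",
        "displayName": "ci-build-sp",            # assumed display name
        "entitlements": [{"value": "allow-cluster-create"}],
        "active": True,
    },
)
print(resp.status_code, resp.text)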
Mohit_m
by Valued Contributor II
  • 9431 Views
  • 1 reply
  • 3 kudos

Resolved! How to install Python packages from your own Artifactory

We have created our own Artifactory and we use it to install Python dependencies and libraries. We would like to know how we can make use of our own Artifactory to install dependencies or libraries on Databricks clusters.

Latest Reply
Mohit_m
Valued Contributor II
  • 3 kudos

For private repos, you can find some good examples here: https://kb.databricks.com/clusters/install-private-pypi-repo.html and https://towardsdatascience.com/install-custom-python-libraries-from-private-pypi-on-databricks-6a7669f6e6fd

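In practice the linked examples come down to pointing pip at the private index. A hedged sketch in Python with a placeholder Artifactory URL and package name; in a notebook the one-line equivalent is a cell containing only the %pip install command, and the linked KB article covers the cluster-wide setup:

import subprocess
import sys

# Placeholder private index; keep the credential in a secret scope rather than
# hard-coding it, and prefer --extra-index-url if public PyPI should still be reachable.
index_url = "https://<user>:<token>@<company>.jfrog.io/artifactory/api/pypi/<repo>/simple"

subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "--index-url", index_url,
    "<internal-package>",                        # placeholder package name
])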
amichel
by New Contributor III
  • 5135 Views
  • 3 replies
  • 2 kudos

Resolved! Is there a way to refresh tokens issued on behalf of a service principal?

I want to be able to refresh tokens generated on behalf of a service principal via the Token Management API, just like with any other service where OAuth is used and a refresh token endpoint is available. Allowing indefinite or very long expiration for acc...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

A refresh option would be useful. In Azure you could use Azure Automation to build a "refresh" script: delete the token if it still exists, create a new one via "databricks tokens create", and put it in Azure Key Vault with an expiration date. A rough sketch of this flow is below.

2 More Replies
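A rough Python sketch of the rotation flow described in the reply: drop any existing token for the service principal, mint a new one via the Token Management API, and store it in Azure Key Vault with a matching expiration date. Everything below is a placeholder-heavy assumption (host, admin token, application ID, vault name, and the created_by_username filter should all be verified against the current docs); it assumes the azure-identity and azure-keyvault-secrets packages:

from datetime import datetime, timedelta, timezone

import requests
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

DATABRICKS_HOST = "https://<workspace-url>"          # placeholders throughout
ADMIN_TOKEN = "<workspace-admin-token>"
SP_APPLICATION_ID = "<service-principal-application-id>"
KEY_VAULT_URL = "https://<vault-name>.vault.azure.net"
SECRET_NAME = "databricks-sp-token"
LIFETIME = timedelta(hours=24)

headers = {"Authorization": f"Bearer {ADMIN_TOKEN}"}

# 1. Delete tokens previously issued for this service principal, if any still exist.
tokens = requests.get(
    f"{DATABRICKS_HOST}/api/2.0/token-management/tokens",
    headers=headers,
    params={"created_by_username": SP_APPLICATION_ID},   # assumed filter
).json().get("token_infos", [])
for info in tokens:
    requests.delete(
        f"{DATABRICKS_HOST}/api/2.0/token-management/tokens/{info['token_id']}",
        headers=headers,
    )

# 2. Create a fresh token on behalf of the service principal.
new_token = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/token-management/on-behalf-of/tokens",
    headers=headers,
    json={
        "application_id": SP_APPLICATION_ID,
        "lifetime_seconds": int(LIFETIME.total_seconds()),
        "comment": "rotated by automation",
    },
).json()["token_value"]

# 3. Store it in Key Vault with an expiration date that matches the token lifetime.
secrets = SecretClient(vault_url=KEY_VAULT_URL, credential=DefaultAzureCredential())
secrets.set_secret(SECRET_NAME, new_token, expires_on=datetime.now(timezone.utc) + LIFETIME)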