Administration & Architecture

databricks OAuth is not supported for this host

bradleyjamrozik
New Contributor III

I'm trying to deploy using Databricks Asset Bundles via an Azure DevOps pipeline. I keep getting this error when trying to use OAuth:

Error: default auth: oauth-m2m: oidc: databricks OAuth is not supported for this host. Config: host=https://<workspace>.azuredatabricks.net, client_id=***, client_secret=***. Env: DATABRICKS_HOST, DATABRICKS_CLIENT_ID, DATABRICKS_CLIENT_SECRET
##[error]Bash exited with code '1'.

The pipeline is nothing complicated:

trigger:
- development

pool:
  vmImage: ubuntu-latest

variables:
- group: "DevOps Service Credentials Databricks"

stages:
- stage: "Development_Deploy"
  jobs:
  - job: "Deploy_Bundle"
    steps:
    - script: |
        curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh
      displayName: 'Install Databricks CLI'
    - script: |
        databricks bundle deploy
      displayName: 'Deploy Databricks Asset Bundle'
      env:
        DATABRICKS_HOST: https://<workspace>.azuredatabricks.net
        DATABRICKS_CLIENT_ID: $(databricksdevopsdeployment-clientid)
        DATABRICKS_CLIENT_SECRET: $(databricksdevopsdeployment-clientsecret)

Any ideas? What am I doing wrong?


5 REPLIES

Kaniz
Community Manager

Hi @bradleyjamrozik, the error message suggests that Databricks OAuth is unsupported for the host you're trying to connect to. The issue may be due to incorrect configuration values. Ensure that the DATABRICKS_HOST, DATABRICKS_CLIENT_ID, and DATABRICKS_CLIENT_SECRET environment variables are set correctly in your pipeline configuration.

Here are the steps you can take to troubleshoot and resolve the issue.

1. Check the DATABRICKS_HOST value: Ensure it's the correct URL for your Databricks workspace.
2. Validate DATABRICKS_CLIENT_ID and DATABRICKS_CLIENT_SECRET: These should be correct and tied to an app registration in Azure AD with the correct permissions.
3. Ensure that the app registration in Azure AD has been granted the necessary API permissions and that these permissions have been granted admin consent in your Azure AD tenant.
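To check steps 2 and 3 outside the pipeline, one option (an illustrative sketch, not an official Databricks tool) is to request an Azure AD token for the Azure Databricks resource directly with the same service principal; all placeholder values below are assumptions you must replace:

```shell
# Sketch: verify the service principal credentials by requesting an
# Azure AD token for the Azure Databricks resource directly.
# Placeholder values -- substitute your own tenant and app registration.
TENANT_ID="<tenant-id>"
CLIENT_ID="<client-id>"
CLIENT_SECRET="<client-secret>"
# Well-known application ID of the Azure Databricks resource.
RESOURCE_ID="2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

TOKEN_URL="https://login.microsoftonline.com/${TENANT_ID}/oauth2/token"

# A response containing "access_token" means the app registration and
# secret are valid for the Databricks resource; an AADSTS error code
# points at the misconfigured piece.
curl -s -X POST "$TOKEN_URL" \
  -d "grant_type=client_credentials" \
  -d "client_id=${CLIENT_ID}" \
  -d "client_secret=${CLIENT_SECRET}" \
  -d "resource=${RESOURCE_ID}"
```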

If you still face issues after performing these steps, there may be a compatibility issue or a bug. In that case, I would recommend filing a Databricks support ticket for further assistance.

Sources:
- [Docs: use-bundles-with-jobs](https://docs.databricks.com/workflows/jobs/how-to/use-bundles-with-jobs.html)
- [Docs: index](https://docs.databricks.com/dev-tools/bundles/index.html)
- [Docs: work-tasks](https://docs.databricks.com/dev-tools/bundles/work-tasks.html)

bradleyjamrozik
New Contributor III

I can confirm that the host, client ID, and client secret are correct: I can use the same variables with az login and it connects to the workspace successfully. I just get a different error, "cannot create default credentials", for a job after it successfully uploads the bundle files. Is there a setting I need to check to enable OAuth for a given workspace?

saadansari-db
New Contributor III

Hi @bradleyjamrozik, thank you for posting your question. You will need to use the ARM_ environment variables to make it work.

Specifically:

ARM_CLIENT_ID
ARM_TENANT_ID
ARM_CLIENT_SECRET

See https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth#environment-3 for reference.

Hope that helps!
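Applied to the pipeline in the question, the deploy step's env block would look roughly like this (a sketch; the tenant-ID variable name is an assumption patterned on the existing variable-group entries):

```yaml
    - script: |
        databricks bundle deploy
      displayName: 'Deploy Databricks Asset Bundle'
      env:
        DATABRICKS_HOST: https://<workspace>.azuredatabricks.net
        ARM_TENANT_ID: $(databricksdevopsdeployment-tenantid)   # assumed variable name
        ARM_CLIENT_ID: $(databricksdevopsdeployment-clientid)
        ARM_CLIENT_SECRET: $(databricksdevopsdeployment-clientsecret)
```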

 

That did it, thank you!

tariq
New Contributor III

So in this case, Azure managed identities authentication needs to be used?