Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Hi all, I'm just reaching out to see if anyone has information or can point me in a useful direction. I need to connect to Snowflake from Azure Databricks using the connector: https://learn.microsoft.com/en-us/azure/databricks/external-data/snowflakeT...
We ended up using device flow OAuth because, as noted above, it is not possible to launch a browser on the Databricks cluster from a notebook, so you cannot use the "externalBrowser" flow. It gives you a URL and a code, and you open the URL in a new tab an...
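For anyone landing here later, here is a minimal sketch of what the read side can look like once a device-flow OAuth access token is in hand; the secret scope, key, and connection values below are assumptions, not from the original thread:

```python
# Minimal sketch, assuming a device-flow OAuth access token has already been
# obtained and stored in a Databricks secret scope (scope/key names are made up).
options = {
    "sfUrl": "myorg.snowflakecomputing.com",
    "sfUser": "my_user@example.com",
    "sfDatabase": "MY_DB",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "MY_WH",
    "sfAuthenticator": "oauth",     # tell the connector to use the token below
    "sfToken": dbutils.secrets.get("my-scope", "snowflake-oauth-token"),
}

df = (spark.read
      .format("snowflake")
      .options(**options)
      .option("dbtable", "MY_TABLE")
      .load())
display(df)
```

Keep in mind device-flow access tokens are short-lived, so the stored token has to be refreshed before long-running jobs pick it up.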
What are the minimum permissions required to search and view objects in Data Explorer? For example, does a user have to have `USE [SCHEMA|CATALOG]` to search or browse in Data Explorer? Or can anyone with workspace access browse objects and, ...
Circling back to this. With one of the recent releases you can now GRANT BROWSE at the catalog level! Hopefully they will be rolling this feature out at every object level (schemas and tables specifically).
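For reference, a quick sketch of what that grant looks like from a notebook; the catalog and group names here are placeholders:

```python
# Hypothetical example: grant BROWSE on a catalog so a group can see object
# metadata in Data Explorer without getting data access (names are placeholders).
spark.sql("GRANT BROWSE ON CATALOG main TO `data-explorers`")

# Verify what privileges now exist on the catalog.
display(spark.sql("SHOW GRANTS ON CATALOG main"))
```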
Currently, I am both an account administrator and a workspace administrator in Databricks. When I try to enable the entitlements "Workspace access" and "Databricks SQL access" for account groups, I am receiving the error "Failed to enable entitlem...
Hi @Arnold_Souza,
The error "Failed to enable entitlement.: Group not found" that you're experiencing when trying to enable the entitlements “Workspace access” and “Databricks SQL access” for account groups is likely due to the fact that Identity Fed...
I am trying to add a non-SSO admin user to my account (not to a workspace). I have SSO delegated to Google for the majority of users. I can create the account OK, then go in and reset the password to something, but when I try to log in I get the err...
I have an Azure service principal that is used for our CI/CD pipelines. We do not have access to the Databricks UI via user logins. Our GitHub repos also require SSO PATs. How can I configure the Git integration for the service principal so that I ca...
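One way this can be wired up is via the Git Credentials REST API, authenticating as the service principal; the host, tokens, and username below are placeholders, not from the original post:

```python
# Hypothetical sketch: register a Git credential for a service principal via
# the Databricks Git Credentials API. The AAD token would come from e.g. the
# client-credentials flow; all concrete values here are placeholders.
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"
sp_token = "<AAD-token-for-the-service-principal>"

resp = requests.post(
    f"{host}/api/2.0/git-credentials",
    headers={"Authorization": f"Bearer {sp_token}"},
    json={
        "git_provider": "gitHub",
        "git_username": "my-ci-bot",
        "personal_access_token": "<github-pat-authorized-for-sso>",
    },
)
resp.raise_for_status()
print(resp.json())
```

Because the call is made with the service principal's token, the Git credential is attached to the service principal rather than to any human user.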
Hi all, my organization has changed our domain emails and now all Databricks users can't log in. We can only log in to the Azure portal with our new domain email. The message is the following (using the new domain): [screenshot not included]. I wonder if there is a way to upload all us...
Hi @Prabakar Ammeappin, thanks for your response. I wanted to know whether the domain name change is transparent within the same workspace. We don't need to migrate data, only replace the old domain with the new domain. Do you think this is possible?
For accounts added after we turned on SSO, I can't restrict their cluster creation abilities. How can I undo this so I can prevent business users from writing to ETLed data?
Do you support SSO with any IdP which supports SAML 2.0 (e.g. Auth0) or is it limited to https://docs.databricks.com/administration-guide/users-groups/single-sign-on/index.html#supported-identity-providers?
I currently have a few applications (say App1, App2), along with Databricks, all integrated with Auth0. What I want to achieve is that when we log in to, say, Databricks and then access another app's URL in another tab, it should not ask for login in...
After enabling SSO on my account, I now don't have any way to run my Terraform for provisioning AWS workspaces, because username/password authentication is disabled. Is there a workaround for this?
Never mind, the account owner creds do work, but I had to add the account owner to all of the workspaces. Terraform didn't give me an informative error; it just hung forever when applying.
I want to be able to refresh tokens generated on behalf of a service principal via the Token Management API, just like with any other service where OAuth is used and a refresh-token endpoint is available. Allowing indefinite or very long expiration for acc...
A refresh option would be useful. In Azure you could use Azure Automation to make a "refresh" script that:
- deletes the token if it still exists
- creates a new token via `databricks tokens create`
- puts it in Azure Key Vault with an expiration date
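A rough sketch of that rotation idea in Python, using the Token Management API's on-behalf-of endpoint instead of the CLI; the host, application id, and vault names are placeholders, so treat this as an assumption-laden outline rather than a tested script:

```python
# Hypothetical rotation sketch: mint a short-lived token on behalf of a service
# principal and stash it in Key Vault (all concrete values are placeholders).
from datetime import datetime, timedelta, timezone

import requests
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

host = "https://adb-1234567890123456.7.azuredatabricks.net"
admin_token = "<workspace-admin-or-sp-token>"
lifetime = 86400  # rotate daily instead of keeping long-lived tokens around

resp = requests.post(
    f"{host}/api/2.0/token-management/on-behalf-of/tokens",
    headers={"Authorization": f"Bearer {admin_token}"},
    json={
        "application_id": "<service-principal-application-id>",
        "lifetime_seconds": lifetime,
        "comment": "rotated-by-automation",
    },
)
resp.raise_for_status()
new_token = resp.json()["token_value"]

# Store the fresh token where downstream jobs can pick it up, with a matching
# expiration date on the secret itself.
vault = SecretClient("https://my-vault.vault.azure.net", DefaultAzureCredential())
vault.set_secret(
    "databricks-sp-token",
    new_token,
    expires_on=datetime.now(timezone.utc) + timedelta(seconds=lifetime),
)
```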
Should we create a Databricks user for Airflow and generate a personal access token for it? We also have G Suite SSO enabled; does that mean I need to create a G Suite account for the user as well?
I would recommend having the 'user' the Databricks jobs are triggered by be a dedicated user. This is what I would consider a 'Service Account', and I'll drop a definition for that type of user below. Seeing that you have SSO enabled, I might create th...
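To make that concrete, here is a minimal sketch of an Airflow DAG that triggers a Databricks job through a connection backed by the dedicated user's PAT; the connection id, job id, and schedule are assumptions:

```python
# Minimal sketch, assuming an Airflow connection named
# "databricks_service_account" holds the workspace host plus the dedicated
# user's PAT, and that job 12345 exists (both are placeholders).
import pendulum
from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="run_databricks_job",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule="@daily",
    catchup=False,
) as dag:
    run_job = DatabricksRunNowOperator(
        task_id="run_job",
        databricks_conn_id="databricks_service_account",
        job_id=12345,
    )
```

Keeping the PAT on a dedicated service account means job runs don't break when an individual employee leaves or rotates their own credentials.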