- 863 Views
- 2 replies
- 0 kudos
When I create a new user in Databricks, the new user does not receive their onboarding email. It is not in their junk mail, deleted items, or inbox. However, when I reset that user's password, they do receive the password reset link, and are a...
Latest Reply
Hi @billraper ,
I'm sorry to hear about the trouble. Would you mind sharing more about whether this is happening with a community portal profile or with a product profile? Please share the link to the profile with which you're experiencing this issue...
- 653 Views
- 1 replies
- 0 kudos
Hi, I am not able to log in to account management (https://accounts.cloud.databricks.com). It somehow enforces SSO, and I cannot log in with username and password.
- 1378 Views
- 1 replies
- 1 kudos
Are there any perspectives in Databricks' roadmap for enabling coarse-grained access management for jobs? Currently, access to jobs has to be managed on a job-by-job basis: https://docs.databricks.com/en/security/auth-authz/access-control/index.html#j...
Latest Reply
Hi @Retired_mod, thanks for your reply. A more mature access management concept in Databricks would definitely be terrific. I understand it's not entirely along the AI lines that Databricks is pushing hard currently, but it would greatly improve the pla...
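To illustrate the per-job model the OP describes: today each job's ACL is set individually, for example via the Permissions API (`/api/2.0/permissions/jobs/{job_id}`). A minimal sketch, assuming hypothetical job IDs and a hypothetical group name:

```python
# Sketch: build the per-job ACL payload for the Databricks Permissions API
# (PUT/PATCH /api/2.0/permissions/jobs/{job_id}).
# Job IDs and the group name below are hypothetical placeholders.

def job_acl_request(job_id: int, group: str, level: str) -> tuple[str, dict]:
    """Return the endpoint path and body granting one group a permission on one job."""
    path = f"/api/2.0/permissions/jobs/{job_id}"
    body = {"access_control_list": [{"group_name": group, "permission_level": level}]}
    return path, body

# Because there is no coarse-grained grouping, every job needs its own call:
requests = [job_acl_request(jid, "data-engineers", "CAN_MANAGE_RUN")
            for jid in (101, 102, 103)]
```

A folder- or tag-level grant, as the OP asks for, would collapse this loop into a single call; today the loop is unavoidable.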
- 19701 Views
- 8 replies
- 3 kudos
Hi, I want to run a dbt workflow task and would like to use the git integration for that. Using my personal user I am able to do so, but I am running my workflows using a service principal. I added git credentials and the repository using Terraform. I a...
Latest Reply
I created that link using the "Share" button in the post, but it's broken, sorry. Here's a working link to the discussion: https://community.databricks.com/t5/data-engineering/git-credentials-for-service-principals-running-jobs/td-p/73802
- 4778 Views
- 3 replies
- 3 kudos
As far as I know, currently (as of 03-25-2024) Databricks doesn't have any workspace admin settings option to restrict users from creating a workflow/job or Delta pipelines. Here is the use case for it. Example: you have a 3-tier landscape: Dev, QA and Prod. It is ...
Latest Reply
I notice that a separate discussion overlaps with the OP's issue: https://community.databricks.com/t5/data-engineering/restricting-workflow-creation-and-implementing-approval/td-p/4336 @Retired_mod do you have a mechanism for clustering discussions fo...
- 625 Views
- 0 replies
- 0 kudos
As Data Platform Admins, do you follow a standard process or self-service approach towards rolling out new features in your workspaces? Is the process automated? How is the testing done? Please share your thoughts. Given UC is introducing new feat...
- 5680 Views
- 2 replies
- 2 kudos
Hi everyone, is it possible with Terraform, the Azure CLI, or any other non-manual method to get the value of the Azure Databricks Account ID, instead of the manual method described here - https://learn.microsoft.com/en-us/azure/databricks/administrati...
Latest Reply
I am also looking for a solution to this. The path suggested by @Retired_mod does not work! When I run the proposed az CLI command, this is what I get:
```
$> az databricks workspace show --resource-group $my_rg --name $my_ws --query 'id'
"/subscriptions/64e..<...
```
by adb-rm • New Contributor II
- 2251 Views
- 4 replies
- 0 kudos
Hi, I am looking for Databricks DR (disaster recovery) site creation. My primary setup is in West US, and my primary Databricks workspace is created in the primary site with Unity Catalog enabled. With the same metadata, I want to bring up Databricks in my secondary sid...
Latest Reply
As of now, UC does not have a DR setting; this is on our prioritized roadmap, but no ETA is currently available.
Some customers host an external metastore and replicate it across regions to achieve this.
- 563 Views
- 0 replies
- 0 kudos
We have many teams using Maven libraries and are in the process of UC migration. These Maven coordinates need to be added to the "allow list" before they can be used in clusters. What is the standard process followed by admin teams for this feature? Do...
by Jcea • New Contributor
- 792 Views
- 0 replies
- 0 kudos
What is the best practice for running parallel tests for many caches on the same ETL?
by mh_db • New Contributor III
- 588 Views
- 0 replies
- 0 kudos
Does anyone know how to link Databricks to AWS GovCloud? I subscribed to Databricks and created a Databricks account from the link in Marketplace. When I go to the Databricks account console, the URL doesn't take me to the URL referenced in the documentatio...
by mh_db • New Contributor III
- 685 Views
- 0 replies
- 0 kudos
I'm going through the documentation in Databricks for using existing VPC requirements. For the section that covers ACLs, what's the need to have this inbound rule: "ALLOW ALL from Source 0.0.0.0/0. This rule must be prioritized." https://docs.databri...
- 1930 Views
- 3 replies
- 0 kudos
Latest Reply
Hello, the UC catalog container is sitting in Sub B. Basically, we have a hub-and-spoke configuration for each subscription. Each subscription can access any resource inside its own subscription with some private endpoints (PE). But to access another subscription's ressou...
- 1025 Views
- 1 replies
- 0 kudos
We have our CI/CD pipelines set up in Google Cloud using Cloud Build, and we are publishing our artifacts to a private repository in Google Cloud's Artifact Registry. I want to use these JAR files to create and run jobs in Databricks. However, when I ...
Latest Reply
To integrate your CI/CD pipeline with Databricks, you have a couple of options:
1. Using Artifact Registry as an intermediary: Currently, Databricks does not directly support Google Artifact Registry URLs. However, you can use an intermediate storage ...
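The intermediary approach in option 1 can be sketched as follows: a Cloud Build step copies the JAR from Artifact Registry to a GCS bucket, and the Databricks job then installs it from its gs:// path. A minimal sketch of the job payload, where the bucket, JAR path, and main class are hypothetical placeholders:

```python
# Sketch of option 1: stage the JAR in GCS, then point the Databricks job at it.
# The bucket, JAR path, and main class are hypothetical; the copy into GCS itself
# would be a Cloud Build step (e.g. gcloud storage cp).

def jar_job_spec(job_name: str, gcs_jar: str, main_class: str) -> dict:
    """Build a Jobs API 2.1 create payload that installs a JAR library from GCS."""
    return {
        "name": job_name,
        "tasks": [
            {
                "task_key": "run_jar",
                "spark_jar_task": {"main_class_name": main_class},
                "libraries": [{"jar": gcs_jar}],  # gs:// path staged by CI
            }
        ],
    }

spec = jar_job_spec("nightly-etl", "gs://my-artifacts/jars/app-1.0.jar",
                    "com.example.Main")
```

The payload would be sent to `/api/2.1/jobs/create`; the cluster's service account needs read access to the staging bucket.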
- 2216 Views
- 1 replies
- 1 kudos
Hello, I have a question regarding git authentication in the context of a Databricks job. I know that a GitHub Personal Access Token can be generated for a user using the GitHub App authorization flow. My team is currently configuring Terraform templates ...
Latest Reply
@tonypiazza good day!
To authorize access to all private repositories within a GitHub organization at the account level, without configuring git credentials on a per-user basis, follow these steps:
1. Use a GitHub machine user: Create a GitHub machi...
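The machine-user approach above ultimately means registering that user's PAT as the service principal's git credential, for example via the Git Credentials API (`POST /api/2.0/git-credentials`) while authenticated as the service principal. A minimal sketch of the request; the username and token are hypothetical placeholders:

```python
# Sketch: build the Git Credentials API call that stores the machine user's PAT
# for the service principal. Username and token are hypothetical placeholders.

def git_credential_request(username: str, token: str) -> tuple[str, dict]:
    """Return the endpoint path and body for POST /api/2.0/git-credentials."""
    path = "/api/2.0/git-credentials"
    body = {
        "git_provider": "gitHub",        # provider identifier expected by the API
        "git_username": username,        # the GitHub machine user's login
        "personal_access_token": token,  # PAT generated for the machine user
    }
    return path, body

path, body = git_credential_request("ci-machine-user", "ghp_example_token")
```

Adding the machine user to the GitHub organization then grants the service principal access to all repositories that user can read, without per-user credential setup.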