Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

RamlaSuhra
by New Contributor II
  • 665 Views
  • 0 replies
  • 0 kudos

Can you all share your experiences rolling out new features in the workspaces you manage as Admins?

As Data Platform Admins, do you follow a standard process or a self-service approach for rolling out new features in your workspaces? Is the process automated? How is the testing done? Please share your thoughts. Given UC is introducing new feat...

RozaZaharieva
by New Contributor
  • 6076 Views
  • 2 replies
  • 2 kudos

Get Azure Databricks Account ID

Hi everyone, is it possible to get the value of the Azure Databricks Account ID with Terraform, the Azure CLI, or any other non-manual method, rather than the manual method described here - https://learn.microsoft.com/en-us/azure/databricks/administrati...

Administration & Architecture
azuredatabricks
iac
Terraform
Latest Reply
SHeisterkamp
New Contributor II
  • 2 kudos

I am also looking for a solution to this. The path suggested by @Retired_mod does not work! When I run the proposed az cli, this is what I get: `$> az databricks workspace show --resource-group $my_rg --name $my_ws --query 'id'`, which returns "/subscriptions/64e..<...

1 More Replies
adb-rm
by New Contributor II
  • 2482 Views
  • 4 replies
  • 0 kudos

Unity Catalog and metadata to backup site

Hi, I am looking into creating a Databricks DR (disaster recovery) site. My primary setup is in West US, and my primary Databricks workspace was created in the primary site with Unity Catalog enabled. Using the same metadata, I want to bring up Databricks in my secondary sid...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

As of now, UC does not have a DR setting; it is on our prioritized roadmap, but no ETA is currently available. Some customers host an external metastore and replicate it across regions to achieve this.

3 More Replies
RamlaSuhra
by New Contributor II
  • 593 Views
  • 0 replies
  • 0 kudos

How are platform admin teams managing the "allow list" for libraries feature in UC?

We have many teams using Maven libraries and are in the process of UC migration. These Maven coordinates need to be added to the "allow list" before they can be used on clusters. What is the standard process followed by admin teams for this feature? Do...
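One way admin teams automate this, rather than adding coordinates by hand in Catalog Explorer, is to keep the approved list in version control and push it through the Unity Catalog artifact-allowlists REST API. Below is a minimal sketch, not from the post: it assumes the `/api/2.1/unity-catalog/artifact-allowlists/LIBRARY_MAVEN` endpoint, a token allowed to manage the allowlist, and illustrative coordinates.

```python
# Hypothetical sketch: sync approved Maven coordinates into the UC allowlist.
# Host, token, and coordinates are placeholders, not values from the post.
import requests

HOST = "https://<workspace-url>"
TOKEN = "<token-with-allowlist-manage-permission>"
URL = f"{HOST}/api/2.1/unity-catalog/artifact-allowlists/LIBRARY_MAVEN"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Coordinates approved through the team's review process (illustrative).
approved = [
    "com.crealytics:spark-excel_2.12",
    "org.mongodb.spark:mongo-spark-connector_2.12",
]

# The PUT replaces the whole allowlist, so read the current list, merge the
# new coordinates in, and write the combined list back.
current = requests.get(URL, headers=HEADERS).json().get("artifact_matchers", [])
known = {m["artifact"] for m in current}
current += [{"artifact": c, "match_type": "PREFIX_MATCH"} for c in approved if c not in known]

resp = requests.put(URL, headers=HEADERS, json={"artifact_matchers": current})
resp.raise_for_status()
print(f"Allowlist now has {len(current)} Maven entries")
```

Keeping the approved list in a repo means additions go through a pull-request review before the sync job runs.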

mh_db
by New Contributor III
  • 643 Views
  • 0 replies
  • 0 kudos

Link Databricks to AWS GovCloud

Does anyone know how to link Databricks to AWS GovCloud? I subscribed to Databricks and created a Databricks account from the link in the Marketplace. When I go to the Databricks account console, the URL doesn't take me to the URL referenced in the documentatio...

mh_db
by New Contributor III
  • 732 Views
  • 0 replies
  • 0 kudos

Databricks AWS ACLs requirement

I'm going through the Databricks documentation on the requirements for using an existing VPC. For the section that covers ACLs, why is this inbound rule needed: "ALLOW ALL from Source 0.0.0.0/0. This rule must be prioritized." https://docs.databri...

Administration & Architecture
Architecture
AWS
VPC
Etyr
by Contributor
  • 2131 Views
  • 3 replies
  • 0 kudos

Unity Catalog read issue

Hello, our company is POCing Unity Catalog with Azure as the provider. We have 2 subscriptions, each containing 1 Databricks workspace and 1 ADLS Gen2 account. Initially we have the default `hive_metastore` connected to the ADLS Gen2. I've created a secret scope a...

(screenshots attached)
Latest Reply
Etyr
Contributor
  • 0 kudos

Hello, the UC catalog container is sitting in Sub B. Basically we have a hub-and-spoke configuration for each subscription. Each subscription can access any resource inside its own subscription through some private endpoints (PE). But to access another subscription's resou...

2 More Replies
aghiya
by New Contributor
  • 1111 Views
  • 1 reply
  • 0 kudos

Using JARs from Google Cloud Artifact Registry in Databricks for Job Execution

We have our CI/CD pipelines set up in Google Cloud using Cloud Build, and we are publishing our artifacts to a private repository in Google Cloud's Artifact Registry. I want to use these JAR files to create and run jobs in Databricks. However, when I ...

Latest Reply
brijeshgoud1
New Contributor II
  • 0 kudos

To integrate your CI/CD pipeline with Databricks, you have a couple of options: 1. Using Artifact Registry as an intermediary: currently, Databricks does not directly support Google Artifact Registry URLs. However, you can use an intermediate storage ...
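Picking up the intermediate-storage idea from the reply above, here is a minimal sketch of the Databricks side only: it assumes CI has already copied the JAR from Artifact Registry into a GCS bucket the job cluster can read, and it uses the Jobs 2.1 API; the bucket, class name, and cluster spec are placeholders rather than details from the thread.

```python
# Hypothetical sketch: create a job whose JAR library lives in a GCS bucket
# populated by CI. Workspace URL, token, bucket, and cluster spec are placeholders.
import requests

HOST = "https://<workspace-url>"
TOKEN = "<databricks-token>"

job_spec = {
    "name": "run-my-app",
    "tasks": [
        {
            "task_key": "main",
            "spark_jar_task": {"main_class_name": "com.example.Main"},
            # JAR staged by CI at a path the job cluster can access.
            "libraries": [{"jar": "gs://my-artifacts-bucket/releases/my-app-1.0.0.jar"}],
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",
                "node_type_id": "n2-standard-4",
                "num_workers": 1,
            },
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print("Created job", resp.json()["job_id"])
```

The service account attached to the job cluster needs read access to that bucket.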

tonypiazza
by New Contributor II
  • 2296 Views
  • 1 reply
  • 1 kudos

Resolved! Enable git credentials for the entire Databricks account

Hello, I have a question regarding git authentication in the context of a Databricks job. I know that a GitHub Personal Access Token can be generated for a user using the GitHub App authorization flow. My team is currently configuring Terraform templates ...

Latest Reply
Yeshwanth
Databricks Employee
  • 1 kudos

@tonypiazza good day! To authorize access to all private repositories within a GitHub organization at the account level, without configuring git credentials on a per-user basis, follow these steps: 1. Use a GitHub machine user - create a GitHub machi...
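To make the machine-user approach work for automation identities rather than individual users, one pattern is to register the machine user's PAT as the Git credential of the service principal that owns the jobs. Below is a minimal sketch against the Git Credentials API (`/api/2.0/git-credentials`), assuming you can obtain a token as that service principal; all values are placeholders.

```python
# Hypothetical sketch: store a GitHub machine user's PAT as the Git credential
# of the automation identity that runs the jobs. All values are placeholders.
import requests

HOST = "https://<workspace-url>"
SP_TOKEN = "<token-obtained-as-the-service-principal>"

resp = requests.post(
    f"{HOST}/api/2.0/git-credentials",
    headers={"Authorization": f"Bearer {SP_TOKEN}"},
    json={
        "git_provider": "gitHub",
        "git_username": "my-machine-user",
        "personal_access_token": "<machine-user-pat>",
    },
)
resp.raise_for_status()
print("Credential id:", resp.json()["credential_id"])
```

Since the question mentions Terraform, the equivalent there is the `databricks_git_credential` resource applied with the provider authenticated as that service principal.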

Conlyn
by New Contributor III
  • 3482 Views
  • 2 replies
  • 0 kudos

Resolved! Terraform databricks_grants errors on external_location

We are using Terraform to set up Unity Catalog external locations, and when using databricks_grants to set permissions on the external locations it throws the following error: Error: cannot create grants: permissions for external_location-test_locatio...

Latest Reply
Conlyn
New Contributor III
  • 0 kudos

I figured out my issue... The principal name is case-sensitive, and if the input value doesn't match the case of the email address or group name in the workspace/account, it throws that ambiguous error.

1 More Replies
Francis
by New Contributor II
  • 5849 Views
  • 3 replies
  • 0 kudos

Set up alerts to monitor cost in Databricks

Hi Tech Team, recently we had a huge bill in Databricks and we want to have an alerting system in place to monitor cost. Can someone please help me understand how to set up email notifications for cost alerts in Databricks? We need to set up an em...

Latest Reply
TravBricks
Databricks Employee
  • 0 kudos

This is an old thread, but for posterity... @Francis, what you were looking for is not available in the workspace settings but in the account console. The unified way of tracking costs within Databricks is through System Tables, specifically billable us...
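As a concrete starting point for the system-tables route, here is a minimal sketch that could run from a notebook or a scheduled job and aggregate the last week of usage from `system.billing.usage`; the threshold and the notification step are illustrative assumptions, not part of the reply.

```python
# Hypothetical sketch: summarize the last 7 days of DBU usage per workspace
# from the billing system table and flag workspaces above a chosen threshold.
daily = spark.sql("""
    SELECT workspace_id,
           usage_date,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 7)
    GROUP BY workspace_id, usage_date
    ORDER BY usage_date, dbus DESC
""")

DBU_ALERT_THRESHOLD = 500  # illustrative daily limit per workspace

for row in daily.collect():
    if row.dbus > DBU_ALERT_THRESHOLD:
        # Hook in the notification of your choice here (email, Slack, ticket, ...).
        print(f"{row.usage_date} workspace {row.workspace_id}: {row.dbus:.0f} DBUs")
```

A Databricks SQL alert pointed at a query like this one is a simple way to get the email notification the original post asks about.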

2 More Replies
ksenija
by Contributor
  • 1298 Views
  • 3 replies
  • 0 kudos

Connecting Databricks and Azure DevOps

Hi everyone, when I tried to create a new Databricks job that uses a notebook from a repo, it asked me to set up Azure DevOps Services (personal access token) in Linked Accounts under my username. And now every time I want to create a new branch or ...

(screenshots attached)
Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Hi @ksenija, it looks like the Git credentials for the job use a different account and are missing. The job is configured to use {a particular user}, but this account has credentials for {another configured user}. So you need to update the Git details...
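If you would rather fix the mismatch outside the UI, the stored credential can be inspected and replaced through the Git Credentials REST API; a minimal sketch, assuming the `/api/2.0/git-credentials` endpoints and a fresh Azure DevOps PAT, with placeholders throughout.

```python
# Hypothetical sketch: replace the stored Git credential with a new Azure
# DevOps PAT for the account the job actually runs as. All values are placeholders.
import requests

HOST = "https://<workspace-url>"
TOKEN = "<databricks-token-for-the-job-owner>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

payload = {
    "git_provider": "azureDevOpsServices",
    "git_username": "user@yourorg.com",
    "personal_access_token": "<new-azure-devops-pat>",
}

# Look up any existing credential; update the first one if present, else create one.
creds = requests.get(f"{HOST}/api/2.0/git-credentials", headers=HEADERS).json()
existing = (creds.get("credentials") or [None])[0]

if existing:
    url = f"{HOST}/api/2.0/git-credentials/{existing['credential_id']}"
    requests.patch(url, headers=HEADERS, json=payload).raise_for_status()
else:
    requests.post(f"{HOST}/api/2.0/git-credentials", headers=HEADERS, json=payload).raise_for_status()
```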

2 More Replies
746837
by New Contributor II
  • 4965 Views
  • 3 replies
  • 0 kudos

Databricks and SMTP

Using Databricks as an AWS partner, I'm trying to run a Python script to validate email addresses. Whenever it gets to the SMTP portion it times out. I am able to telnet from Python to the POP servers and get a response, and I can ping domains and get replies, b...
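For context on the timeout described above: outbound connections on port 25 are commonly blocked or throttled from cloud provider networks, which produces exactly this symptom even when ping and POP/telnet work. A minimal sketch of the usual workaround, assuming an authenticated mail relay reachable on port 587; the relay host, credentials, and addresses are placeholders.

```python
# Hypothetical sketch: send through an authenticated relay on port 587 with
# STARTTLS instead of connecting to remote mail servers on port 25.
# Relay host, credentials, and addresses are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "noreply@example.com"
msg["To"] = "user@example.com"
msg["Subject"] = "Databricks SMTP test"
msg.set_content("Connectivity check from a Databricks cluster.")

with smtplib.SMTP("smtp.example.com", 587, timeout=30) as server:
    server.starttls()                      # upgrade the connection before authenticating
    server.login("smtp-user", "smtp-password")
    server.send_message(msg)
```

If the goal is address validation rather than sending, direct port-25 callbacks to arbitrary mail servers will generally need an egress path that allows them (for example a NAT/firewall exception) or a third-party validation service.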

Latest Reply
Babu_Krishnan
Contributor
  • 0 kudos

@746837, did you resolve this issue?

2 More Replies
Kutbuddin
by New Contributor II
  • 4364 Views
  • 5 replies
  • 1 kudos

Resolved! Stream query termination using availableNow trigger and toTable

We are running a streaming job in Databricks with custom streaming logic that consumes a CDC stream from Mongo and appends to a Delta table. At the end of the streaming job we have internal checkpointing logic that creates an entry in a table w...

Latest Reply
Kutbuddin
New Contributor II
  • 1 kudos

I was expecting spark.sql(f"insert into table {internal_tab_name} values({dt})") to execute at the end, after the streaming query had written to the table. What I observed: the Spark SQL query spark.sql(f"insert into table {internal_tab_name} values({d...
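For readers hitting the same ordering issue: `toTable` starts the stream and returns a StreamingQuery immediately, so code after it runs while the availableNow batch is still in flight. A common fix is to block on the returned query before the follow-up SQL. The sketch below mirrors the names from the post (`internal_tab_name`, `dt`) and uses placeholder source, target, and checkpoint values.

```python
# Hypothetical sketch: wait for the availableNow stream to finish before
# recording the run in the internal checkpoint table.
cdc_df = spark.readStream.table("main.bronze.raw_cdc")   # stand-in for the Mongo CDC source
internal_tab_name = "main.ops.stream_runs"               # mirrors the post's variable
dt = "2024-06-01"                                        # placeholder run marker

query = (
    cdc_df.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/cdc_job")
    .trigger(availableNow=True)
    .toTable("main.bronze.cdc_events")    # starts the stream and returns a StreamingQuery
)

# Block until the availableNow batch has fully committed to the Delta table.
query.awaitTermination()

# Only now write the internal checkpoint row.
spark.sql(f"insert into table {internal_tab_name} values('{dt}')")
```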

4 More Replies