Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

by mbanxp (New Contributor III)
  • 2163 Views
  • 5 replies
  • 4 kudos

Resolved! Metastore deletion issues

Good afternoon, I have an issue with my metastore in North Europe. All my workspaces got detached. If I go to the Databricks console, I can see the metastore in North Europe that I created. However, when I select the metastore in North Europe, I get the followin...

[screenshots attached]
Latest Reply
mbanxp
New Contributor III
  • 4 kudos

I solved the issue by deleting all the assignments before deleting the metastore:
1. Access the Databricks CLI and authenticate
2. List metastores >> databricks account metastores list
3. List workspaces and check assignments >> databricks account worksp...
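
In case anyone prefers scripting this over the raw CLI, here is a minimal sketch of the same clean-up using the Databricks Python SDK (untested; the account host, account ID, and exact method names should be verified against the current databricks-sdk docs):

# pip install databricks-sdk
from databricks.sdk import AccountClient
from databricks.sdk.errors import NotFound

# Account-level client; assumes account admin credentials are configured
a = AccountClient(host="https://accounts.azuredatabricks.net",
                  account_id="<your-account-id>")

# 1. Find the metastore to remove
for m in a.metastores.list():
    print(m.metastore_id, m.name, m.region)

metastore_id = "<metastore-id-from-above>"

# 2. Delete every workspace assignment first...
for ws in a.workspaces.list():
    try:
        a.metastore_assignments.delete(workspace_id=ws.workspace_id,
                                       metastore_id=metastore_id)
    except NotFound:
        pass  # this workspace was not assigned to the metastore

# 3. ...then delete the metastore itself
a.metastores.delete(metastore_id=metastore_id, force=True)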

4 More Replies
by EjB (New Contributor)
  • 1333 Views
  • 1 reply
  • 1 kudos

Drop schema or catalog using cascade function

Hello, in Databricks (non-Unity Catalog) I have two schemas (schema_a and schema_b) that both use the same root location in DBFS or external storage like ADLS. Example:
abfss://container@storage_account.dfs.core.windows.net/data/project/schema_a
abfss:/...

Latest Reply
ilir_nuredini
Honored Contributor
  • 1 kudos

Hello @EjB, for the given example, here is the response. Will DROP SCHEMA schema_a CASCADE remove or affect tables in schema_b? No, unless:
1. The tables in schema_a are managed tables, AND
2. Tables in schema_b store their data physically inside /schema_...
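
Before running the drop, it is worth verifying which case you are in by printing each table's physical location. A minimal notebook sketch (assumes Delta tables, since DESCRIBE DETAIL is Delta-specific; schema names are from the question):

# Compare physical locations across both schemas before DROP ... CASCADE
for schema in ("schema_a", "schema_b"):
    for t in spark.catalog.listTables(schema):
        location = spark.sql(
            f"DESCRIBE DETAIL {schema}.{t.name}").collect()[0]["location"]
        print(f"{schema}.{t.name} -> {location}")

# Only drop once you have confirmed no schema_b table lives under schema_a's path:
# spark.sql("DROP SCHEMA schema_a CASCADE")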

by Kutbuddin (New Contributor III)
  • 2064 Views
  • 3 replies
  • 0 kudos

[INTERNAL_ERROR] Query could not be scheduled: HTTP Response code: 503. Please try again later

We have a Databricks job configured to run a dbt project. The dbt CLI compute cluster being used is serverless, with a serverless SQL warehouse. We encountered this error during a run. SQLSTATE: XX000. Any idea why this occurred?

Latest Reply
Amine8089
New Contributor II
  • 0 kudos

Hi, we are experiencing the same recurring HTTP errors throughout the day when executing queries on Databricks. The specific error message we receive is: "[INTERNAL_ERROR] Query could not be scheduled: HTTP Response code: 503. Please try again later. SQLS...

2 More Replies
by antonionuzzo (New Contributor III)
  • 2011 Views
  • 2 replies
  • 3 kudos

Resolved! System tables performance optimization

Hi all, are there any Databricks Labs projects or GitHub repositories that leverage system tables to provide dashboards or code for monitoring and, more importantly, for optimizing workflows and clusters based on usage?

Latest Reply
Sharanya13
Contributor III
  • 3 kudos

+1 to @szymon_dybczak. I would also add the dashboards for DB SQL Warehouse monitoring.
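
For anyone who wants a starting point before adopting one of those projects, usage can be pulled straight from the billing system table. A minimal sketch (assumes system tables are enabled in your account and that the columns match the current system.billing.usage schema):

# Top 20 clusters by DBU consumption over the last 30 days
df = spark.sql("""
    SELECT usage_metadata.cluster_id AS cluster_id,
           sku_name,
           SUM(usage_quantity)       AS dbus
    FROM system.billing.usage
    WHERE usage_date >= current_date() - INTERVAL 30 DAYS
      AND usage_metadata.cluster_id IS NOT NULL
    GROUP BY 1, 2
    ORDER BY dbus DESC
    LIMIT 20
""")
display(df)  # display() assumes a Databricks notebook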

1 More Replies
by MBV3 (Contributor)
  • 3186 Views
  • 6 replies
  • 0 kudos

Unable to see sample data in Hive Metastore after moving to GCE

Hi, we have recently moved from GKE to GCE, and it is taking forever to load the sample data in the managed Delta tables. Even simple SELECT SQL statements take forever. Totally clueless here; any help will be appreciated. Thanks

Latest Reply
MBV3
Contributor
  • 0 kudos

Hi all, strangely, after struggling for 2 days we figured out that we can't run the cluster in scalable mode; after selecting single-node mode we are able to execute queries and jobs. It seems there is a bug in Databricks' GKE-to-GCE migration. Won...

5 More Replies
by Chinu (New Contributor III)
  • 1228 Views
  • 1 reply
  • 0 kudos

Resolved! Best Approach to Retrieve Policy IDs Across Multiple Workspaces

Hi, I’m aware of the API endpoint api/2.0/policies/clusters/list to fetch a list of policy IDs and names. However, we have 50 different workspaces, and I need to retrieve the specific policy ID and name for each one. Could you advise on the most effic...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Chinu, Databricks does not provide a global API to query all workspaces in a single call. I guess your only option for now is to use a scripting approach.
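
A minimal sketch of that scripting approach, looping over the workspaces and calling the endpoint mentioned above (the URLs and token are placeholders; assumes the token is valid for every workspace):

import requests

workspaces = [
    "https://adb-1111111111111111.1.azuredatabricks.net",
    "https://adb-2222222222222222.2.azuredatabricks.net",
    # ... the rest of the 50 workspace URLs
]
token = "<token-valid-for-each-workspace>"

for host in workspaces:
    resp = requests.get(f"{host}/api/2.0/policies/clusters/list",
                        headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    for p in resp.json().get("policies", []):
        print(host, p["policy_id"], p["name"])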

by quad_t (New Contributor III)
  • 5379 Views
  • 2 replies
  • 1 kudos

Resolved! [Azure Databricks]: Use managed identity to access mlflow models and artifacts

Hello! I am new to Azure Databricks and have a question. In my current setup, I am running some containerized Python code within an Azure Functions app. In this code, I need to download some models and artifacts stored via MLflow in our Azure Databri...

Latest Reply
ali_daei
New Contributor II
  • 1 kudos

Hi @quad_t, were you able to find a solution to this problem? I'm having similar issues when trying to use MSI to connect to MLflow.
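
Not a confirmed answer, but the pattern usually suggested for this is to exchange the managed identity for an AAD token scoped to Azure Databricks and hand it to MLflow. An untested sketch (the resource ID below is the well-known first-party Azure Databricks application ID; the workspace URL and model URI are placeholders):

import os
import mlflow
from azure.identity import DefaultAzureCredential

# Well-known Azure AD application ID for Azure Databricks
DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

# Picks up the Function App's managed identity when running in Azure
credential = DefaultAzureCredential()
token = credential.get_token(f"{DATABRICKS_RESOURCE_ID}/.default").token

os.environ["DATABRICKS_HOST"] = "https://adb-<workspace-id>.<n>.azuredatabricks.net"
os.environ["DATABRICKS_TOKEN"] = token

mlflow.set_tracking_uri("databricks")
local_path = mlflow.artifacts.download_artifacts(
    artifact_uri="models:/<model-name>/<version>")
print(local_path)

Note that the managed identity typically also has to be added to the workspace as a service principal for the token to be accepted.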

1 More Replies
by raffael (New Contributor III)
  • 1433 Views
  • 1 reply
  • 1 kudos

Resolved! ADLS Gen2 with Unity Catalog on Azure Databricks / is Workspace Admin permissions sufficient?

Hello, I want to use an ADLS Gen2 storage account for managed Delta tables on Azure Databricks. The mounting/connection should be managed by Unity Catalog. There is only going to be a single workspace (for now). Does this require Account Admin permissi...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @raffael, it depends on where you want to configure the managed storage location. If you want to do this at the metastore level, then you have to be an account admin, and you need to do this during metastore creation. You can also configure a storage location ...
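
For the catalog-level variant, the statement looks roughly like this (a sketch; assumes an external location covering the path already exists in Unity Catalog, and all names are placeholders):

# Catalog with its own managed storage location; needs CREATE CATALOG
# privilege on the metastore rather than account admin rights
spark.sql("""
    CREATE CATALOG IF NOT EXISTS my_catalog
    MANAGED LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/managed'
""")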

by prodrick (New Contributor II)
  • 1970 Views
  • 2 replies
  • 0 kudos

Resolved! Webhook Authentication

If I want to send notifications via webhook to Splunk, Datadog, or LogicMonitor, how might I configure Databricks to authenticate using the destination platform's bearer token?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @prodrick, it looks like webhook destinations support only basic authentication using a username and password. But you can try pasting the bearer token into the password field. Some webhook endpoints accept Bearer tokens in the password field while leavin...
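
To make the workaround concrete, this is the difference on the wire (an illustrative sketch with a placeholder endpoint; whether the token-in-password trick works depends entirely on what the receiving platform accepts):

import requests

url = "https://example.com/webhook"  # placeholder destination
token = "<bearer-token>"

# What basic auth with a blank username and the token as password sends:
# an "Authorization: Basic base64(:token)" header
requests.post(url, json={"msg": "test"}, auth=("", token))

# What many platforms actually expect: an "Authorization: Bearer <token>" header
requests.post(url, json={"msg": "test"},
              headers={"Authorization": f"Bearer {token}"})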

1 More Replies
by mathiaskvist (New Contributor III)
  • 9031 Views
  • 11 replies
  • 3 kudos

Resolved! Valid Workspace Conf keys

Hi, I'm trying to automate the configuration of the Admin Settings of our Databricks workspace using Terraform. However, identifying the correct config keys is very difficult. Databricks exposes a Workspace Conf API (Enable/disable features | Workspace Conf ...

Latest Reply
TMD
Contributor
  • 3 kudos

Something that may be of interest, though not a substitute for the official documentation: https://github.com/databricks/terraform-provider-databricks/issues/3365
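
One way to probe keys by hand before encoding them in Terraform is the Workspace Conf API itself. A minimal sketch (host and token are placeholders; enableIpAccessLists is just one example key):

import requests

host = "https://adb-<workspace-id>.<n>.azuredatabricks.net"
headers = {"Authorization": "Bearer <token>"}

# Read the current value of one or more keys (comma-separated)
r = requests.get(f"{host}/api/2.0/workspace-conf",
                 params={"keys": "enableIpAccessLists"},
                 headers=headers)
print(r.json())

# Set a key; PATCH takes a flat JSON map of key -> string value
requests.patch(f"{host}/api/2.0/workspace-conf",
               json={"enableIpAccessLists": "true"},
               headers=headers)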

10 More Replies
by Lalit_asky (New Contributor)
  • 2542 Views
  • 1 reply
  • 0 kudos

Serverless and private connectivity - unable to create managed table

Hi, I am trying to set up private connectivity from my serverless compute to my managed storage, which is at the catalog level. I created the NCC, endpoints, external location, and credentials with the required access. My storage account public network acc...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 0 kudos

Try an nslookup to your storage account from a notebook.

# From your Databricks cluster, test DNS resolution
%sh nslookup yourstorageaccount.blob.core.windows.net
# Should resolve to a private IP, not a public IP

# Check if storage account allows traffic f...

by nayan_wylde (Esteemed Contributor)
  • 2761 Views
  • 3 replies
  • 1 kudos

Resolved! Monitoring pool usage

Hi all, I am working on creating a dashboard for Databricks instance pools. I am capturing maximum usage and have scheduled a job to capture the info every 15 minutes, and I aggregate to see whether at any point the max usage is greater than 85% of capacity. Is there ...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

The ai_forecast function did the magic: it produced a forecast of pool usage for the next 30 days.
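
For anyone looking for the same trick, the call is roughly as below (a sketch; ai_forecast has been a preview feature, so check availability and the exact signature in your workspace; the snapshot table and column names are placeholders for the 15-minute capture job described above):

# Forecast instance pool usage 30 days ahead from the 15-minute snapshots
forecast = spark.sql("""
    SELECT *
    FROM ai_forecast(
        TABLE(SELECT snapshot_ts, used_count FROM main.ops.pool_usage_snapshots),
        horizon   => date_add(current_date(), 30),
        time_col  => 'snapshot_ts',
        value_col => 'used_count'
    )
""")
display(forecast)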

2 More Replies
by MariusE (New Contributor II)
  • 3020 Views
  • 3 replies
  • 2 kudos

Support for Unity Catalog External Data in Fabric OneLake

Hi community! We have set up a Fabric Link with our Dynamics, and we want to attach the data in Unity Catalog using the External Data connector. But it doesn't look like Databricks supports anything other than the default DFS endpoints on Azure. Is there any wa...

[screenshot attached]
Latest Reply
MariusE
New Contributor II
  • 2 kudos

Thanks @szymon_dybczak, do you know if support is on the roadmap? The currently supported way of doing this, credential passthrough on the compute, is deprecated. Regards, Marius

2 More Replies
by Corwyn (New Contributor III)
  • 3043 Views
  • 2 replies
  • 3 kudos

Resolved! Can't create a new table from uploaded file.

I've just started using the Community Edition through the AWS Marketplace and I'm trying to set up tables to share with a customer. I've managed to create 3 of the tables, but when uploading a small version of the fourth file, I'm having problems. T...

[screenshots attached]
Latest Reply
Corwyn
New Contributor III
  • 3 kudos

Thank you, Lou. By loading manually, I found the error that wasn't being displayed in the UI. Once I took care of this, everything loaded just fine.
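
For anyone hitting the same silent failure: "loading manually" just means reading the file in a notebook so the real exception surfaces. A minimal sketch (the path is a placeholder; adjust the reader options to your file type):

# Read the uploaded file directly; parse errors the upload UI swallows
# will show up as exceptions (FAILFAST) or corrupt-record rows (PERMISSIVE)
df = (spark.read
      .option("header", True)
      .option("mode", "FAILFAST")  # raise on the first malformed row
      .csv("/Volumes/main/default/uploads/fourth_file.csv"))
df.show(5)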

1 More Replies
by noorbasha534 (Valued Contributor II)
  • 1201 Views
  • 3 replies
  • 0 kudos

Usage of Databricks apps or UI driven approach to create & maintain Databricks infrastructure

Hi all, the CI/CD-based process to create and maintain Databricks infrastructure (UC securables, metastore securables, workspace securables) results in a high time-to-market in our case. So we are planning to make it UI-driven, as in: create a Da...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 0 kudos

A UI-driven approach is definitely a bad idea for deployment. I have seen most organisations using Terraform or Bicep for deployment. Why UI-driven infrastructure is wrong:
1. No version control or audit trail
2. Configuration drift & inconsistency
3. No Dis...

2 More Replies