Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

alexacas
by New Contributor II
  • 2302 Views
  • 3 replies
  • 0 kudos

Resolved! Help with Databricks SQL Queries

Hi everyone, I’m relatively new to Databricks and trying to optimize some SQL queries for better performance. I’ve noticed that certain queries take longer to run than expected. Does anyone have tips or best practices for writing efficient SQL in Data...

Latest Reply
lowedirect
New Contributor II
  • 0 kudos

When working with large datasets in Databricks SQL, here are some practical tips to boost performance: Leverage partitioning: partition large Delta tables on low-cardinality columns that are filtered on frequently (like date or region). It helps Databric...
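
As a sketch of the partitioning tip above (the table and column names here are made up for illustration):

```sql
-- Create a Delta table partitioned on a low-cardinality column (event_date).
CREATE TABLE sales_events (
  event_id   BIGINT,
  region     STRING,
  amount     DOUBLE,
  event_date DATE
)
USING DELTA
PARTITIONED BY (event_date);

-- Filtering on the partition column lets Databricks skip whole partitions
-- instead of scanning the full table.
SELECT region, SUM(amount) AS total_amount
FROM sales_events
WHERE event_date >= '2024-01-01'
GROUP BY region;
```

Running `OPTIMIZE sales_events` periodically also helps by compacting small files.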

2 More Replies
raffael
by New Contributor III
  • 1986 Views
  • 3 replies
  • 1 kudos

Resolved! How does reported billing in Azure relate to Databricks?

Hi, I'm confused by how costs in Azure relate to costs in Databricks. I'm currently on Azure Pay-as-you-Go and a Databricks Trial. There's nothing going on in my Azure account apart from Databricks. This is the costs bar chart on Azure (€): This is the co...

Latest Reply
raffael
New Contributor III
  • 1 kudos

Thanks. I can't find documentation on how DBUs translate to $/€. The pricing calculator only works for AWS/GCP. Where would I find that info?

2 More Replies
rdadhichi
by New Contributor II
  • 1274 Views
  • 3 replies
  • 0 kudos

Disable 'Allow trusted Microsoft services to bypass this firewall' for Azure Key Vault

Currently, even when using a VNet-injected Databricks workspace, we are unable to fetch secrets from AKV if 'Allow trusted Microsoft services to bypass this firewall' is disabled. The secret is used as an AKV-backed secret scope and the key vault is ...

Latest Reply
bauerbrett1
New Contributor II
  • 0 kudos

Any update on this? Is it possible to disable the 'Allow trusted services...' rule if you are using a private endpoint or whitelisting certain IPs? Or is it required no matter what?

2 More Replies
priya-g
by New Contributor II
  • 1395 Views
  • 5 replies
  • 0 kudos

Issue creating a workspace with Databricks Partner Account & AWS Sandbox Environment

Hello, My name is Priya, and my organization is a Databricks partner. We've been given access to the Partner Academy, and my company has set up an AWS sandbox environment to support our certification preparation. I'm currently trying to set up ...

Latest Reply
priya-g
New Contributor II
  • 0 kudos

I do not see anything created under network configuration. Private access settings and VPC endpoints both show a 403 status when clicking on them, and there is a default policy under network policies. I will reach out to the account executive.

4 More Replies
MariuszK
by Valued Contributor III
  • 921 Views
  • 1 reply
  • 0 kudos

Metastore consolidated-northeuropec2-prod-metastore-2.mysql.database.azure.com

Hi, one of the network requirements on the Databricks site is to allow connections to this public address: consolidated-northeuropec2-prod-metastore-2.mysql.database.azure.com. Do you know if this address is for the legacy Hive metastore or it's used by...

Latest Reply
MariuszK
Valued Contributor III
  • 0 kudos

The answer is that this is used for the legacy Hive metastore; if you disable it, you don't need to create an exception in your firewall.

chandru44
by New Contributor
  • 693 Views
  • 1 reply
  • 0 kudos

Why is Databricks Using Private IP Instead of NAT Gateway's Public IP to Connect with Source System

I have a publicly accessible SQL database that is protected by a firewall. I am trying to connect this SQL database to Databricks, but I'm encountering an authentication error. I have double-checked the credentials, port, and host, and they are all c...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

The issue occurs because the Databricks cluster's outbound traffic isn't routed through the NAT Gateway due to misconfigured network settings or conflicting outbound connectivity configurations. This should be mostly resolved by your Network team but...

bhanu_dp
by New Contributor III
  • 1587 Views
  • 4 replies
  • 2 kudos

How to delete or clean Legacy Hive Metastore after successful completion of UC migration

Say we have completed the migration of tables from Hive Metastore to UC. All the users, jobs, and clusters are switched to UC. There is no more activity on the legacy Hive Metastore. What is the best recommendation on deleting or cleaning the Hive Metastor...

Latest Reply
Rjdudley
Honored Contributor
  • 2 kudos

Sometime in the last couple of days, this setting was pushed to my account; it looks like what you want. To see if you've been added, go to your Account Console and look under Previews.

3 More Replies
noorbasha534
by Valued Contributor II
  • 1803 Views
  • 4 replies
  • 2 kudos

Resolved! Disable ability to choose PHOTON

Dear all, as an administrator, I want to restrict developers from choosing the 'photon' option in job clusters. I see this in the job definition when they choose it: "runtime_engine": "PHOTON". How can I pass this as input in the policy and restrict develop...

Latest Reply
mnorland
Valued Contributor
  • 2 kudos

You also need to make sure the policy permissions are set up properly. You can/should fix preexisting compute affected by the policy with the wizard in the policy edit screen.
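
A minimal sketch of the policy rule in question, using the standard cluster-policy JSON format (other policy rules omitted):

```json
{
  "runtime_engine": {
    "type": "fixed",
    "value": "STANDARD",
    "hidden": true
  }
}
```

With `runtime_engine` fixed to `STANDARD`, clusters created under this policy cannot select Photon, and `hidden` removes the toggle from the UI entirely.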

3 More Replies
Behwar
by New Contributor III
  • 5043 Views
  • 5 replies
  • 1 kudos

Databricks App in Azure Databricks with private link cluster (no Public IP)

Hello, I've deployed Azure Databricks with a standard Private Link setup (no public IP). Everything works as expected: I can log in via the private/internal network, create clusters, and manage workloads without any issues. When I create a Databricks Ap...

Latest Reply
rugger-bricks
Databricks Employee
  • 1 kudos

Behwar: you should create a specific private DNS zone for azure.databricksapps.com. If you do an nslookup on your app's URL, you will see that it points to your workspace. In Azure, using Azure (recursive) DNS, you can see an important behavio...

4 More Replies
owly
by New Contributor
  • 432 Views
  • 1 reply
  • 0 kudos

remove s3 buckets

Hi, my Databricks deployment is based on AWS S3. I deleted my buckets, and now Databricks is not working. How do I delete my Databricks workspace? Regards

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @owly! To delete Databricks after AWS S3 bucket deletion:
  • Terminate all clusters and instance pools.
  • Clean up associated resources, like IAM roles, S3 storage configurations, and VPCs.
  • Delete the workspace from the Databricks Account Consol...

noorbasha534
by Valued Contributor II
  • 3378 Views
  • 1 reply
  • 0 kudos

Databricks delta sharing design

Dears, I wanted to get a mindshare around Delta Sharing: how do you decide how many shares to create and share with other departments if you are maintaining an enterprise-wide data warehouse/lakehouse using Azure Databricks? I see from the docum...

Latest Reply
Isi
Honored Contributor III
  • 0 kudos

Hi @noorbasha534, let me share a bit about our use case and how we’re handling Delta Sharing. Delta Sharing is indeed a simple and lightweight solution, and one of its main advantages is that it’s free to use. However, it still has several limitations...

noorbasha534
by Valued Contributor II
  • 1185 Views
  • 4 replies
  • 0 kudos

get permissions assignment done from the workspaces UI

Hi all, I am looking to capture events of permissions assigned on catalogs/schemas/tables/views from the workspace UI; for example, someone gave another user the USE CATALOG permission from the UI. Is it possible to capture all such events? Appreciate the minds...

Latest Reply
noorbasha534
Valued Contributor II
  • 0 kudos

@Advika, can you kindly let me know the action name that I should filter upon...
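
For anyone landing here later, a hedged sketch of the kind of system-table query that can surface Unity Catalog permission changes; the exact `action_name` values (e.g. `updatePermissions` shown here) should be verified against your own audit logs:

```sql
-- Recent Unity Catalog permission-change events from the audit system table.
SELECT event_time,
       user_identity.email AS granted_by,
       action_name,
       request_params
FROM system.access.audit
WHERE service_name = 'unityCatalog'
  AND action_name = 'updatePermissions'
ORDER BY event_time DESC;
```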

3 More Replies
alonisser
by Contributor II
  • 1722 Views
  • 3 replies
  • 1 kudos

misbehavior of spots with fallback to on demand on job clusters

In the last few days, I've encountered in Azure (and before that also in AWS, but a bit differently) this message about failing to start a cluster: "run failed with error message Cluster '0410-173007-1pjmdgi1' was terminated. Reason: INVALID_ARGUMENT (CL...

Latest Reply
alonisser
Contributor II
  • 1 kudos

I see "Fleet instances do not support GPU instances" so in this case it's a no-op 

2 More Replies
RicardoAntunes
by New Contributor II
  • 666 Views
  • 2 replies
  • 0 kudos

Access locked out with SSO

We were locked out of our account (expired secret for login via Azure Entra ID, and password-based login disabled). How can I add a new secret in Databricks if I'm only able to log in with SSO and SSO is broken?

Latest Reply
RicardoAntunes
New Contributor II
  • 0 kudos

It’s a company account 

1 More Replies
DeepankarB
by New Contributor III
  • 1499 Views
  • 1 reply
  • 0 kudos

Implementing Governance on DLT pipelines using compute policy

I am implementing governance over compute creation in the workspaces by implementing custom compute policies for all-purpose, job and dlt pipelines. I was successfully able to create compute policies for all-purpose and jobs where I could restrict th...

Administration & Architecture
administration
Delta Live Table
Latest Reply
Renu_
Valued Contributor II
  • 0 kudos

Hi @DeepankarB, To enforce compute policies for DLT pipelines, make sure your policy JSON includes policy_family_id: dlt and set apply_policy_default_values: true in the pipeline cluster settings. This helps apply the instance restrictions correctly ...
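
A hedged sketch of what such a policy might look like, assuming the `cluster_type` virtual attribute is used to scope it to DLT and an allowlist restricts instance types (the Azure node types here are placeholders; verify attribute names against the cluster-policy docs):

```json
{
  "cluster_type": {
    "type": "fixed",
    "value": "dlt"
  },
  "node_type_id": {
    "type": "allowlist",
    "values": ["Standard_DS3_v2", "Standard_DS4_v2"]
  }
}
```

The pipeline then opts in by referencing the policy ID in its cluster settings and setting `apply_policy_default_values` to true, as mentioned above.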
