- 149 Views
- 1 replies
- 0 kudos
We have configured a couple of webhooks in Teams channels and added the URLs to Databricks under Settings > Notifications. But our jobs do not post anything into the Teams channels. This used to work but is now not doing anything.
Latest Reply
Teams changed something in their webhooks a while ago. We got warnings saying we needed to use another (new) method, so we created a new webhook. Perhaps you are still running on the old ones?
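For anyone rebuilding on the newer webhooks: a minimal sketch of posting from Python, assuming the replacement is a Workflows-based Teams webhook (the newer flows generally expect an Adaptive Card envelope rather than the plain `{"text": ...}` body the old O365 connectors took; the URL shape below is hypothetical).

```python
import json
import urllib.request

def build_teams_payload(text: str) -> dict:
    """Wrap a message in the Adaptive Card envelope that
    Workflows-based Teams webhooks typically expect."""
    return {
        "type": "message",
        "attachments": [{
            "contentType": "application/vnd.microsoft.card.adaptive",
            "content": {
                "type": "AdaptiveCard",
                "version": "1.4",
                "body": [{"type": "TextBlock", "text": text, "wrap": True}],
            },
        }],
    }

def post_to_teams(webhook_url: str, text: str) -> int:
    """POST the payload to the webhook; returns the HTTP status code."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(build_teams_payload(text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Call `post_to_teams(...)` with the URL generated by your "Post to a channel when a webhook request is received" flow to check the new webhook works before re-adding it under Databricks notifications.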
- 290 Views
- 5 replies
- 0 kudos
Subject: Hello, my name is Priya, and my organization is a Databricks partner. We've been given access to the Partner Academy, and my company has set up an AWS sandbox environment to support our certification preparation. I'm currently trying to set up ...
Latest Reply
I do not see anything created under network configuration. Private access settings and VPC endpoints both show a 403 status when clicking on them, and there is a default policy under network policies. I will reach out to the account executive.
4 More Replies
- 191 Views
- 1 replies
- 0 kudos
Hi, one of the network requirements on the Databricks site is to allow a connection to one of these public addresses: consolidated-northeuropec2-prod-metastore-2.mysql.database.azure.com. Do you know if this address is for the legacy Hive metastore or it's used by...
Latest Reply
The answer is that this is used for the legacy Hive metastore; if you disable it, you don't need to create an exception in your firewall.
- 106 Views
- 1 replies
- 0 kudos
I have a publicly accessible SQL database that is protected by a firewall. I am trying to connect this SQL database to Databricks, but I'm encountering an authentication error. I have double-checked the credentials, port, and host, and they are all c...
Latest Reply
The issue occurs because the Databricks cluster's outbound traffic isn't routed through the NAT Gateway, due to misconfigured network settings or conflicting outbound connectivity configurations. This should mostly be resolved by your network team, but...
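A quick way to check this from a notebook is to ask an external echo service which public IP your outbound traffic arrives from, and compare it to the NAT Gateway IP(s) whitelisted on the SQL server firewall. A minimal sketch (the echo service URL is just one commonly used option, not the only one):

```python
import urllib.request

def current_egress_ip(echo_service: str = "https://checkip.amazonaws.com") -> str:
    """Return the public IP that outbound traffic from this
    cluster appears to come from, per an external echo service."""
    with urllib.request.urlopen(echo_service, timeout=10) as resp:
        return resp.read().decode("utf-8").strip()

def egress_matches_nat(observed_ip: str, nat_ips: list[str]) -> bool:
    """True if the observed egress IP is one of the NAT Gateway's
    public IPs, i.e. the ones allowed through the SQL firewall."""
    return observed_ip in nat_ips
```

If `current_egress_ip()` returns an address that is not in your NAT Gateway's IP list, the cluster's egress is bypassing the NAT and the firewall will reject it even though the credentials are correct.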
- 579 Views
- 4 replies
- 2 kudos
Say we have completed the migration of tables from the Hive Metastore to UC. All users, jobs, and clusters are switched to UC. There is no more activity on the legacy Hive Metastore. What is the best recommendation on deleting or cleaning the Hive Metastor...
Latest Reply
Sometime in the last couple of days, this setting was pushed to my account; it looks like what you want. To see if you've been added, go to your Account Console and look under Previews.
3 More Replies
- 585 Views
- 4 replies
- 2 kudos
Dear all, as an administrator I want to restrict developers from choosing the 'photon' option in job clusters. I see this in the job definition when they choose it: "runtime_engine": "PHOTON". How can I pass this as input in the policy and restrict develop...
Latest Reply
You also need to make sure the policy permissions are set up properly. You can/should fix preexisting compute affected by the policy with the wizard in the policy edit screen.
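For reference, a policy fragment like the following is one way to pin the runtime engine (a sketch following the cluster policy definition format; the `hidden` flag is optional and just removes the Photon toggle from the UI):

```json
{
  "runtime_engine": {
    "type": "fixed",
    "value": "STANDARD",
    "hidden": true
  }
}
```

Any compute created under this policy then cannot set `"runtime_engine": "PHOTON"`.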
3 More Replies
by Behwar • New Contributor III
- 1563 Views
- 5 replies
- 1 kudos
Hello, I've deployed Azure Databricks with a standard Private Link setup (no public IP). Everything works as expected: I can log in via the private/internal network, create clusters, and manage workloads without any issues. When I create a Databricks Ap...
Latest Reply
@Behwar: you need to create a specific private DNS zone for azure.databricksapps.com. If you do an nslookup on your app's URL, you will see that it points to your workspace. In Azure, using Azure (recursive) DNS, you can see an important behavio...
4 More Replies
by owly • New Contributor
- 90 Views
- 1 replies
- 0 kudos
Hi, my Databricks is based on AWS S3. I deleted my buckets, and now Databricks is not working. How do I delete my Databricks? Regards
Latest Reply
Hello @owly!
To delete Databricks after AWS S3 bucket deletion:
- Terminate all clusters and instance pools.
- Clean up associated resources, like IAM roles, S3 storage configurations, and VPCs.
- Delete the workspace from the Databricks Account Consol...
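Besides the Account Console UI, the workspace can also be removed via the Accounts API. A minimal sketch, assuming an AWS account on the standard accounts endpoint (the IDs below are placeholders, and you would authenticate as an account admin):

```python
import urllib.request

ACCOUNTS_HOST = "https://accounts.cloud.databricks.com"

def workspace_delete_url(account_id: str, workspace_id: str) -> str:
    """Build the Accounts API URL for deleting a workspace.
    Both IDs are placeholders you'd replace with your own."""
    return f"{ACCOUNTS_HOST}/api/2.0/accounts/{account_id}/workspaces/{workspace_id}"

def delete_workspace(account_id: str, workspace_id: str, token: str) -> int:
    """Send the DELETE request; returns the HTTP status code."""
    req = urllib.request.Request(
        workspace_delete_url(account_id, workspace_id),
        method="DELETE",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Note this only deletes the workspace; the associated IAM roles and VPC still need to be cleaned up on the AWS side, as listed above.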
- 382 Views
- 1 replies
- 0 kudos
Dears, I wanted to have a mindshare around Delta Sharing: how do you decide how many shares to be created and shared with other departments if you are maintaining an enterprise-wide data warehouse/lakehouse using Azure Databricks? I see from the docum...
Latest Reply
Hi @noorbasha534, let me share a bit about our use case and how we’re handling Delta Sharing. Delta Sharing is indeed a simple and lightweight solution, and one of its main advantages is that it’s free to use. However, it still has several limitations...
- 538 Views
- 4 replies
- 0 kudos
Hi all, I am looking to capture events of permissions assigned on catalogs/schemas/tables/views from the workspace UI; for example, someone gave another user the USE CATALOG permission from the UI. Is it possible to capture all such events? Appreciate the minds...
Latest Reply
@Advika, can you kindly let me know the action name that I should filter upon...
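For others looking for the same thing: a sketch of the kind of query this would be, assuming audit system tables are enabled in your account. `updatePermissions` under the `unityCatalog` service is the action name commonly associated with UC grant changes, but verify it against your own logs before relying on it:

```sql
-- Permission changes made via the UI or SQL both land in the audit log.
SELECT event_time, user_identity.email, request_params
FROM system.access.audit
WHERE service_name = 'unityCatalog'
  AND action_name = 'updatePermissions'
ORDER BY event_time DESC;
```

The `request_params` column should show which securable was changed and what privileges were granted or revoked.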
3 More Replies
- 270 Views
- 3 replies
- 1 kudos
In the last few days, I've encountered in Azure (and before that also in AWS, but a bit different) this message about failing to start a cluster: "run failed with error message Cluster '0410-173007-1pjmdgi1' was terminated. Reason: INVALID_ARGUMENT (CL...
Latest Reply
I see "Fleet instances do not support GPU instances", so in this case it's a no-op.
2 More Replies
- 303 Views
- 2 replies
- 0 kudos
We were locked out of our account (expired secret for login via Azure Entra ID, and password-based login disabled). How can I add a new secret in Databricks if I'm only able to log in with SSO and this is broken?
- 486 Views
- 0 replies
- 0 kudos
Hi Team, accidentally we removed one of the NCC private endpoints from our storage account that was created using Terraform. When I tried to destroy and recreate it, I encountered the following error. According to some articles, the private endpoint w...
- 579 Views
- 1 replies
- 0 kudos
I am implementing governance over compute creation in the workspaces by implementing custom compute policies for all-purpose, job and dlt pipelines. I was successfully able to create compute policies for all-purpose and jobs where I could restrict th...
Latest Reply
Hi @DeepankarB, To enforce compute policies for DLT pipelines, make sure your policy JSON includes policy_family_id: dlt and set apply_policy_default_values: true in the pipeline cluster settings. This helps apply the instance restrictions correctly ...
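Building on that reply, a hedged sketch of what the policy fragment might look like: `cluster_type` fixed to `dlt` scopes the policy to pipeline clusters, and the node type allowlist is where the instance restrictions go (the node types below are just example values).

```json
{
  "cluster_type": {
    "type": "fixed",
    "value": "dlt"
  },
  "node_type_id": {
    "type": "allowlist",
    "values": ["Standard_DS3_v2", "Standard_DS4_v2"]
  }
}
```

The pipeline then references this policy in its cluster settings, with `apply_policy_default_values` set to true as mentioned above.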
- 846 Views
- 4 replies
- 0 kudos
If we want to enable Databricks Predictive Optimization, is it also mandatory to enable serverless job/notebook compute in our account? We already have a serverless SQL warehouse available in our workspaces.
Latest Reply
The documentation states this: "Predictive optimization identifies tables that would benefit from ANALYZE, OPTIMIZE, and VACUUM operations and queues them to run using serverless compute for jobs." If I don't have serverless workloads enabled, how does pr...
3 More Replies