- 21583 Views
- 2 replies
- 1 kudos
How to know DBU consumption in Azure Databricks?
In the Azure portal under Billing we can see the cost, but how can we find out how many DBUs were consumed?
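One way to get actual DBU counts (a sketch, assuming Unity Catalog system tables are enabled in your account; the table and column names come from the system-tables billing schema and may differ on older releases) is to query system.billing.usage:

```python
# Sketch: total DBUs per SKU over a date range via Databricks system tables.
# Assumes system.billing.usage is enabled; verify column names against the
# system-tables documentation for your release.

def dbu_usage_query(start_date: str, end_date: str) -> str:
    """Build a SQL query that totals DBU usage per SKU for a date range."""
    return f"""
        SELECT sku_name, usage_unit, SUM(usage_quantity) AS total_dbus
        FROM system.billing.usage
        WHERE usage_date BETWEEN DATE'{start_date}' AND DATE'{end_date}'
        GROUP BY sku_name, usage_unit
        ORDER BY total_dbus DESC
    """
```

In a notebook you would run `display(spark.sql(dbu_usage_query("2024-01-01", "2024-01-31")))`; the Azure portal only shows cost, while `usage_quantity` here is denominated in DBUs.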
There was a promotional discount on serverless compute in early 2024, which was later extended and was larger depending on your region.
- 1473 Views
- 2 replies
- 0 kudos
Configuration of NCC for Serverless to access SQL Server running in an Azure VM
Hi Team, I am following this link to configure an NCC for serverless compute to access a SQL Server running in an Azure VM: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/ This references adding privat...
I'm also interested in doing this. I have federated queries from a classic Databricks cluster pointing to SQL Server, but I can't find documentation for connecting the serverless plane to SQL Server on a VM.
- 1773 Views
- 5 replies
- 0 kudos
Cannot list clusters using the REST API
I am trying to run the following REST API command: curl -H "Authorization: Bearer <PAT Code>" -X GET "http://<databricks_workspace>.azuredatabricks.net/api/2.0/clusters/list" When I run the comm...
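For reference, a minimal stdlib-Python sketch of the same call; note the scheme must be https (the command above uses http and has a doubled `curl` prefix). The workspace host and token are placeholders:

```python
import json
import urllib.request

def clusters_list_request(workspace: str, token: str) -> urllib.request.Request:
    # Clusters API 2.0 list endpoint; the scheme must be https, not http.
    return urllib.request.Request(
        f"https://{workspace}.azuredatabricks.net/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"},
    )

def list_clusters(workspace: str, token: str) -> list:
    with urllib.request.urlopen(clusters_list_request(workspace, token),
                                timeout=30) as resp:
        return json.loads(resp.read()).get("clusters", [])
```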
Hi, I definitely think it is facing network issues. It's just very difficult to identify, since I am able to successfully ping the instance from the server originating the request. It is something JDBC related, I'm just not sure what.
- 732 Views
- 1 replies
- 0 kudos
JDBC Connect Timeout
Anyone know why I would get the JDBC connect error below? java.sql.SQLException: [Databricks][JDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: com.databricks.client.jdbc42.internal.apache.http.conn.ConnectTimeoutExc...
Hi @Lawro, that normally happens when there is a network issue or a firewall blocking the request. Is it failing consistently, and have you tested connectivity to your SQL instance using the nc -vz command from a notebook?
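If `nc` isn't available on the cluster image, a stdlib-Python equivalent of `nc -vz` can be run from a notebook cell (host and port below are placeholders):

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Equivalent of `nc -vz host port`: can we open a TCP connection?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. tcp_reachable("my-sql-host.internal", 1433)  # 1433 = default SQL Server port
```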
- 2332 Views
- 3 replies
- 3 kudos
Databricks Apps: Issue with ACLs for apps are disabled or not available in this tier
Hello, I've created a dummy app (using the template) and deployed it in an Azure Databricks premium workspace. It is working fine but is only available to those users with access to the Databricks resource. I would like to change the permissions to "...
Hi, any help? In the meantime I've settled for an Azure Web App, but it is a pity that I cannot use this just for a configuration step. Any help is welcome!
- 1483 Views
- 2 replies
- 7 kudos
Databricks Unity Catalog Bug - Reset of Network Connectivity Configuration not possible
The following use case is strange regarding the Network Connectivity Configuration (NCC):
- I create a workspace (the NCC is empty)
- I create an NCC
- I attach the NCC to the workspace
- I want to remove the NCC from the workspace -> not possible
Therefore, I can...
This is the documented behavior in the REST API: https://docs.databricks.com/api/account/workspaces/update
You cannot remove a network connectivity configuration from the workspace once attached; you can only switch to another one.
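In practice the workaround is therefore to PATCH the workspace with a different NCC ID. A stdlib sketch of that request (host, path, and field name follow the account workspaces update API linked above, but verify them for your cloud; account/workspace/NCC IDs and the token are placeholders):

```python
import json
import urllib.request

def switch_ncc_request(account_id: str, workspace_id: str,
                       ncc_id: str, token: str) -> urllib.request.Request:
    # You cannot detach an NCC, but you can point the workspace at another one.
    body = json.dumps({"network_connectivity_config_id": ncc_id}).encode()
    return urllib.request.Request(
        f"https://accounts.azuredatabricks.net/api/2.0/accounts/{account_id}"
        f"/workspaces/{workspace_id}",
        data=body,
        method="PATCH",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
```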
- 6062 Views
- 9 replies
- 1 kudos
Resolved! OAuth Secrets Rotation for Service Principal through Databricks CLI
I am currently using a specific service principal in my DevOps steps to run the Databricks CLI. It's using OAuth tokens with M2M authentication (Authenticate access to Azure Databricks with a service principal using OAuth (OAuth M2M) - Az...
After filing a Microsoft support ticket through my client, they provided me with the solution to the inquiry. There seems to be an undocumented API call that you can use to create this SP OAuth client secret, and it works perfectly: curl -X POST --header...
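The truncated curl above likely targets the service-principal secrets endpoint of the Account API. A hedged stdlib sketch of such a request is below; the host, path, and the use of the numeric internal SP ID are assumptions on my part, so verify them against the current Account API documentation before relying on this:

```python
import urllib.request

def create_sp_secret_request(account_id: str, sp_internal_id: str,
                             token: str) -> urllib.request.Request:
    # Assumed endpoint: POST with an empty JSON body creates a new OAuth
    # secret for the SP. sp_internal_id is presumed to be the numeric
    # Databricks-internal ID, not the application/client ID.
    return urllib.request.Request(
        f"https://accounts.azuredatabricks.net/api/2.0/accounts/{account_id}"
        f"/servicePrincipals/{sp_internal_id}/credentials/secrets",
        data=b"{}",
        method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
```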
- 1067 Views
- 1 replies
- 0 kudos
Resolved! AWS Security Hub - The S3 bucket is shared with an external AWS account
Currently we observe a HIGH risk warning in AWS Security Hub for the account where we have deployed a PrivateLink-enabled Databricks workspace. This warning is related to the permissions associated with the root S3 bucket we use; here is an example: { "Version": "...
Hi @ambigus9 - regarding the external AWS account (414351767826): this is actually a Databricks-owned AWS account, not a random external account. It is essential for the Databricks service to function properly. This account is used by Databricks to man...
- 2179 Views
- 3 replies
- 2 kudos
Feature request: Ability to delete local branches in git folders
According to the documentation https://learn.microsoft.com/en-us/azure/databricks/repos/git-operations-with-repos, "Local branches in Databricks cannot be deleted, so if you must remove them, you must also delete and reclone the repository." Creating a...
Also, rather than switching between dev branches, you can create another Git folder for the other branches. Users can create a Git folder for each dev branch they work on; those Git folders can be deleted after the branches are merged.
- 3525 Views
- 3 replies
- 1 kudos
Resolved! How to check a specific table for its VACUUM retention period
I'm looking for a way to query the VACUUM retention period for a specific table. It does not show up with DESCRIBE DETAIL <table_name>;
Hi @WWoman, the default retention period is 7 days, and as per the documentation it is governed by the 'delta.deletedFileRetentionDuration' table property. If there is no delta.deletedFileRetentionDuration table property, it means the table uses the default, so 7 ...
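A small sketch of resolving the effective retention from `SHOW TBLPROPERTIES` output, falling back to the documented 7-day default when the property is unset (the table name in the usage comment is a placeholder):

```python
DEFAULT_RETENTION = "interval 7 days"  # Delta's documented default

def effective_retention(properties: dict) -> str:
    """properties: {key: value} pairs from SHOW TBLPROPERTIES <table>."""
    return properties.get("delta.deletedFileRetentionDuration", DEFAULT_RETENTION)

# In a notebook:
# rows = spark.sql("SHOW TBLPROPERTIES my_catalog.my_schema.my_table").collect()
# print(effective_retention({r.key: r.value for r in rows}))
```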
- 4493 Views
- 2 replies
- 1 kudos
Removal of account admin
Hi, I'm having issues with removing an account admin (probably the first one, the one the Databricks account was originally tied to). Under user management, when I hit the delete user button, it prompts: Either missing permissions to delete <user_email> or deleting...
This error typically occurs when attempting to remove the 'account owner.' The account owner is the user who originally set up the Databricks account. This entitlement is attached to a user so that there is always at least one account admin who can a...
- 1512 Views
- 2 replies
- 0 kudos
Resolved! How to optimize Workflow job startup time
We want to create a workflow pipeline in which we trigger a Databricks workflow job from AWS. However, the startup time of Databricks workflow jobs on job compute is over 10 minutes, which is causing issues. We would like to either avoid this startup ...
Hi @Rahul14Gupta, One option is to use serverless job clusters, which can significantly reduce the startup time. Serverless clusters are designed to start quickly and can be a good fit for workloads that require fast initialization. But you can actua...
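If serverless isn't an option, a pre-warmed instance pool is the usual way to cut classic job-compute startup. A sketch of the relevant `new_cluster` fragment of a Jobs API payload (pool ID, runtime version, and worker count are placeholders):

```python
def job_cluster_spec(pool_id: str, spark_version: str = "15.4.x-scala2.12") -> dict:
    # Drawing the driver and workers from a pool skips VM provisioning time.
    return {
        "new_cluster": {
            "spark_version": spark_version,
            "instance_pool_id": pool_id,
            "num_workers": 2,
        }
    }
```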
- 3947 Views
- 1 replies
- 0 kudos
Databricks Bundles - Terraform state management
Hello, I had a look at DABs today and it seems they use Terraform under the hood. The state is stored in the Databricks workspace, in the bundle deployment directory. Is it possible to use just the state management functionality that DABs must have ...
While it may be possible to use the state management functionality provided by DABs using Terraform, it would require additional effort to synchronize your code and manage state consistently. The choice would depend on the use case and we should keep...
- 1385 Views
- 3 replies
- 3 kudos
How to tag a Databricks workspace to keep track of AWS resources
How do I add tags to a Databricks workspace so that the tags propagate to all cloud resources created for or by the workspace, to keep track of their costs?
Hello! To add tags to a Databricks workspace and ensure they propagate to all cloud resources created for or by the workspace, log in to your Databricks workspace and navigate to the workspace settings or administration console. Use the tag interface ...
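What reliably propagates to the underlying AWS resources (EC2 instances and volumes) is the cluster-level `custom_tags` field, and a cluster policy can force tags onto every cluster it governs. A sketch of such a policy fragment (the tag names are placeholders; the `custom_tags.<Tag>` fixed-value form follows the cluster-policy definition format):

```python
def cost_tag_policy(team: str, env: str) -> dict:
    # Cluster-policy fragment: every cluster created under this policy
    # carries these tags, which propagate to the cloud provider resources.
    return {
        "custom_tags.Team": {"type": "fixed", "value": team},
        "custom_tags.Environment": {"type": "fixed", "value": env},
    }
```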
- 6052 Views
- 11 replies
- 3 kudos
"Azure Container Does Not Exist" when cloning repositories in Azure Databricks
Good morning, I need some help with the following issue: I created a new Azure Databricks resource using the VNet injection procedure (here). I then proceeded to link my Azure DevOps account using a personal access token. If I try to clone a reposito...
Yes, the container is indeed the container in the storage account deployed by the Databricks instance into the managed resource group. This storage account is part of the managed resource group associated with your Azure Databricks workspace. If the ...