- 2809 Views
- 3 replies
- 5 kudos
Is it possible to sync an account group/user to a workspace without the need to do it manually?
I am using Databricks SCIM for my Databricks account, so when I add a user or group in the SCIM connector, the user or group is created in the Databricks account. After this, I need to manually assign the user/group to the workspaces. My boss wants to o...
Hi, I agree with @Rjdudley, EntraID groups are better.
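If automatic identity management is not an option, the manual assignment step can at least be scripted. Below is a command sketch using the account-level workspace assignment REST API; the account ID, workspace ID, and principal ID are hypothetical placeholders, and the endpoint shape should be verified against the current Databricks API reference:

```shell
# Assign an account-level group to a workspace (Azure account endpoint).
# All IDs below are hypothetical placeholders; $TOKEN is an account-level token.
ACCOUNT_ID="aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
WORKSPACE_ID="1234567890123456"
PRINCIPAL_ID="9876543210"   # numeric ID of the group in the account

curl -X PUT \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"permissions": ["USER"]}' \
  "https://accounts.azuredatabricks.net/api/2.0/accounts/${ACCOUNT_ID}/workspaces/${WORKSPACE_ID}/permissionassignments/principals/${PRINCIPAL_ID}"
```

Looping this over the workspaces in the account would replicate what the UI assignment does by hand.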
- 1033 Views
- 2 replies
- 1 kudos
Resolved! Table size information display
Hi, I have a problem with displaying information about the size of my tables. This information is visible several times, but after a while it disappears again. I need to understand what is happening, and why this information is not available all the time on...
In general, you need a cluster or SQL warehouse to be active for those details to be displayed.
- 1316 Views
- 1 reply
- 0 kudos
Driver: how much memory is actually available?
I have a cluster where the driver type is Standard_DS3_v2 (14GB memory and 4 cores). When I use the free -h command in the web terminal (see attached screenshot), I get the response that I only have 8.9GB of memory available on my driver - why is that? FYI, spark.dri...
Hi @dbuserng, the free -h command in the web terminal shows only 8.9GB of available memory on your driver, which is a Standard_DS3_v2 instance with 14GB of memory, because Databricks has services running on each node. This means the maximum allowabl...
- 853 Views
- 1 reply
- 0 kudos
JVM Heap Memory Graph - more memory used than available
I'm analyzing the memory usage of my Spark application and I see something strange when checking the JVM heap memory graph (see screenshot below). Each line on the graph represents one executor. Why does the memory usage sometimes reach over 10GB, when ...
Hi @dbuserng, the memory usage in your Spark application can exceed the spark.executor.memory setting of 7GB for several reasons: • Off-heap memory usage: Spark allows for off-heap memory allocation, which is not managed by the JVM garbage collector...
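For context, the off-heap region mentioned in the reply is controlled by Spark configuration. A sketch of the relevant cluster Spark config lines follows; the 2g value is purely illustrative, not a recommendation:

```
spark.memory.offHeap.enabled true
spark.memory.offHeap.size 2g
```

With these set, each executor process roughly occupies heap (spark.executor.memory) plus off-heap (spark.memory.offHeap.size) plus memory overhead, which is one reason observed usage can exceed the heap setting alone.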
- 1474 Views
- 4 replies
- 0 kudos
Is Photon being used by a job or not?
We have lots of customers using many job clusters as well as interactive clusters with Photon enabled, which is drastically increasing the cost. We would like to know if there is any system table or any details that we can get through an API that lists if the ...
Hi @sruthianki, if you want to check whether the job is really using Photon, you can check the SQL query plan in the Spark UI for its stages; the Photon metrics will be highlighted in yellow.
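Besides the Spark UI, the cluster definition itself records whether Photon is enabled. Here is a command sketch against the Clusters API; the workspace URL is a placeholder, jq is assumed to be available, and the runtime_engine field name should be confirmed in the API reference:

```shell
# List clusters and print each cluster's name and runtime engine.
# "PHOTON" marks Photon-enabled clusters; workspace URL is a placeholder.
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://<databricks_workspace>.azuredatabricks.net/api/2.0/clusters/list" \
  | jq -r '.clusters[] | "\(.cluster_name)\t\(.runtime_engine // "STANDARD")"'
```

The same field appears in the job cluster specs returned by the Jobs API, so the check can be extended to job definitions as well.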
- 22270 Views
- 2 replies
- 2 kudos
How to know DBU consumption in Azure Databricks?
In the Azure portal under Billing we can get the cost, but how do we know how much DBU is consumed?
There was a promo on serverless in early 2024 which at some point got extended, and was bigger depending on where you were.
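Coming back to the original question: if Unity Catalog system tables are enabled in the account, DBU consumption can be queried directly. Below is a command sketch using the SQL Statement Execution API; the warehouse ID and workspace URL are placeholders, and the system.billing.usage table and column names should be verified against the system-tables documentation:

```shell
# Sum DBUs per SKU over the last 30 days from the billing system table.
# Warehouse ID and workspace URL are hypothetical placeholders.
curl -s -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  "https://<databricks_workspace>.azuredatabricks.net/api/2.0/sql/statements" \
  -d '{
    "warehouse_id": "abc123def456",
    "statement": "SELECT sku_name, SUM(usage_quantity) AS dbus FROM system.billing.usage WHERE usage_date >= current_date() - INTERVAL 30 DAYS GROUP BY sku_name"
  }'
```

The same SELECT can of course be run straight from a notebook or SQL editor; the API wrapper is only needed for automation.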
- 1867 Views
- 2 replies
- 0 kudos
Configuration of NCC for serverless to access SQL Server running in an Azure VM
Hi Team, I am following this link to configure an NCC for serverless compute to access a SQL Server running in an Azure VM: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/ This references adding privat...
I am also interested in doing this. I have federated queries from a classic Databricks cluster pointing to SQL Server, but can't find documentation for the serverless plane connecting to SQL Server on a VM.
- 2095 Views
- 5 replies
- 0 kudos
Cannot list Clusters using Rest API
I am trying to run the following REST API command: curl -H "Authorization: Bearer <PAT Code>" -X GET "http://<databricks_workspace>.azuredatabricks.net/api/2.0/clusters/list" When I run the comm...
Hi, I definitely think it is facing network issues. It's just very difficult to identify, since I am able to successfully ping the instance from the server originating the request. It is something JDBC related, just not sure what it is.
- 943 Views
- 1 reply
- 0 kudos
JDBC Connect Timeout
Anyone know why I would get the JDBC connect error below: java.sql.SQLException: [Databricks][JDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: com.databricks.client.jdbc42.internal.apache.http.conn.ConnectTimeoutExc...
Hi @Lawro, that normally happens when there is a network issue or a firewall blocking the request. Is it failing consistently, and have you tested connectivity to your SQL instance using the nc -vz command from a notebook?
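The connectivity check the reply suggests can be run from a %sh cell in a Databricks notebook. A sketch follows; the host name is a placeholder and 443 is assumed because the Databricks JDBC driver connects over HTTPS:

```shell
# Run in a Databricks notebook %sh cell.
# Host is a placeholder; -w 5 gives a 5-second timeout.
# A "succeeded"/"open" message means the port is reachable from the cluster.
nc -vz -w 5 my-databricks-host.example.com 443
```

If this fails from the cluster but ping succeeds from elsewhere, the block is usually a firewall or NSG rule on the specific port rather than general name resolution.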
- 2843 Views
- 3 replies
- 3 kudos
Databricks Apps: issue with "ACLs for apps are disabled or not available in this tier"
Hello, I've created a dummy app (using the template) and deployed it in an Azure Databricks premium workspace. It is working fine but is only available to those users with access to the Databricks resource. I would like to change the permissions to "...
Hi, any help? I've settled in the meantime for an Azure Web App, but it is a pity that I cannot use this just for a configuration step. Any help is welcome!
- 1736 Views
- 2 replies
- 7 kudos
Databricks Unity Catalog Bug - Reset of Network Connectivity Configuration not possible
The following use case is strange regarding the Network Connectivity Configuration (NCC): I create a workspace (the NCC is empty). I create an NCC. I attach the NCC to the workspace. I want to remove the NCC from the workspace -> not possible. Therefore, I can...
This is the documented behavior in the REST API: https://docs.databricks.com/api/account/workspaces/update. You cannot remove a network connectivity configuration from the workspace once attached; you can only switch to another one.
- 8016 Views
- 9 replies
- 1 kudos
Resolved! OAUTH Secrets Rotation for Service Principal through Databricks CLI
I am currently utilizing a specific Service Principal in my DevOps steps to utilize the Databricks CLI. It's using the OAuth tokens with M2M authentication (Authenticate access to Azure Databricks with a service principal using OAuth (OAuth M2M) - Az...
After filing a Microsoft support ticket through my client, they provided me with the solution to the inquiry. There seems to be an undocumented API call that you can use to create this SP OAuth client secret, and it works perfectly: curl -X POST --header...
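For readers hitting the same problem: the account-level service principal secrets endpoint offers a way to create OAuth client secrets. Whether it matches the truncated call in the reply above is unverified; the sketch below uses hypothetical placeholder IDs, and the endpoint path should be checked against the current Databricks Account API reference:

```shell
# Create an OAuth client secret for an account service principal.
# ACCOUNT_ID and SP_ID are hypothetical placeholders; SP_ID is the
# numeric internal ID, not the application (client) ID.
ACCOUNT_ID="aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
SP_ID="1234567890"

curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  "https://accounts.azuredatabricks.net/api/2.0/accounts/${ACCOUNT_ID}/servicePrincipals/${SP_ID}/credentials/secrets"
```

The response, if this endpoint applies, should contain the new secret value, which is shown only once and must be stored immediately.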
- 1307 Views
- 1 reply
- 0 kudos
Resolved! AWS Security Hub - The S3 bucket is shared with an external AWS account
Currently we observe a HIGH risk warning in Security Hub for the AWS account where we have deployed a Private Link Databricks workspace. This warning is related to the permissions associated with the root S3 bucket we use; here is an example: { "Version": "...
Hi @ambigus9 - regarding the external AWS account (414351767826): this is actually a Databricks-owned AWS account, not a random external account. It's essential for Databricks' service to function properly. This account is used by Databricks to man...
- 2714 Views
- 3 replies
- 2 kudos
Feature request: Ability to delete local branches in git folders
According to the documentation (https://learn.microsoft.com/en-us/azure/databricks/repos/git-operations-with-repos), "Local branches in Databricks cannot be deleted, so if you must remove them, you must also delete and reclone the repository." Creating a...
Also, rather than switching between dev branches, you can create another Git folder for the other branches. Users can create a Git folder for each dev branch they work on. Those Git folders can be deleted after the branches are merged.
- 4146 Views
- 3 replies
- 1 kudos
Resolved! How to check a specific table for its VACUUM retention period
I'm looking for a way to query the VACUUM retention period for a specific table. This does not show up with DESCRIBE DETAIL <table_name>;
Hi @WWoman, the default retention period is 7 days and, per the documentation, it is controlled by the 'delta.deletedFileRetentionDuration' table property (you can check it with SHOW TBLPROPERTIES <table_name>). If there is no delta.deletedFileRetentionDuration table property, it means the table uses the default, so 7 ...