Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

satniks_o
by New Contributor III
  • 1003 Views
  • 1 reply
  • 0 kudos

Can we provide a custom DNS name for a Databricks app?

Hi All, I want to access my Databricks app https://myapp.aws.databricksapps.com/ using https://myapp.mycompany.com. Is this possible? We tried DNS mapping, but it is not working.

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @satniks_o, it should be possible to set up custom DNS, but it requires SSL and other settings. How are you setting it up?

marc88
by New Contributor II
  • 657 Views
  • 1 reply
  • 0 kudos

Why doesn't Databricks allow setting executor metrics?

I have an all-purpose compute cluster that processes different data sets for various jobs. I am struggling to optimize executor settings like the one below: spark.executor.memory 4g. Is it allowed to override default executor settings and specify such configurati...

Administration & Architecture
clusterConfiguration
executorMetrics
fatExecutorProblem
performanceOptimization
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @marc88, as you mentioned, you can set it in the Spark config under Advanced options in the cluster settings; once the cluster boots up, it will be applied at runtime. Alternatively, you can draft a cluster policy and apply it across job computes when creating your workflow.

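The cluster-policy route the reply mentions can be sketched like this: a policy definition that pins spark.executor.memory with a "fixed" rule, so every job compute created under the policy inherits it. The policy body follows the documented cluster-policy syntax, but the memory value is only illustrative.

```python
# Sketch of a cluster policy that pins executor memory for job clusters.
# The memory value below is illustrative, not taken from the thread.
import json

def executor_memory_policy(memory: str) -> str:
    """Build a cluster-policy definition that fixes spark.executor.memory."""
    definition = {
        "spark_conf.spark.executor.memory": {
            "type": "fixed",   # users cannot override a "fixed" attribute
            "value": memory,
        }
    }
    return json.dumps(definition)

# The resulting JSON is what you would paste into the policy definition.
print(executor_memory_policy("4g"))
```

Applying this policy to job computes makes the setting consistent across workflows instead of relying on each cluster's Advanced options.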
jx1226
by New Contributor III
  • 753 Views
  • 2 replies
  • 0 kudos

Cross-tenant networking between two orgs on Azure

Hi Databricks Community, for one client we are looking for best practices and steps to establish cross-tenant networking and landing zones between two organizations. The current situation is that the source systems are in one organization, while the Azu...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Cross posting https://community.databricks.com/t5/administration-architecture/databricks-azure-cross-tenant-connection-to-storage-account/td-p/82491#:~:text=I%20tested%20the%20connection%20to,when%20tested%20from%20this%20VM. 

1 More Reply
MarcoRezende
by New Contributor III
  • 2687 Views
  • 3 replies
  • 5 kudos

Is it possible to sync account groups/users to workspaces without doing it manually?

I am using Databricks SCIM for my Databricks account, so when I add a user or group in the SCIM connector, the user or group is created in the Databricks account. After this, I need to manually assign the user/group to the workspaces. My boss wants to o...

Latest Reply
saurabh18cs
Honored Contributor II
  • 5 kudos

Hi, I agree with @Rjdudley, Entra ID groups are better.

2 More Replies
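The manual step the poster wants to automate can be scripted against the account-level Workspace Assignment API, which assigns an account principal (user or group) to a workspace. This is a minimal sketch; the account ID, workspace ID, principal ID, and token are placeholders, and the endpoint shape should be checked against the current API docs before use.

```python
# Sketch: assign an account-level principal to a workspace programmatically,
# so SCIM-synced groups don't need manual workspace assignment.
# All IDs and the token are placeholders.
import json
import urllib.request

def assignment_request(account_id: str, workspace_id: int,
                       principal_id: int, token: str) -> urllib.request.Request:
    """Build a PUT request for the Workspace Assignment API."""
    url = (f"https://accounts.azuredatabricks.net/api/2.0/accounts/{account_id}"
           f"/workspaces/{workspace_id}/permissionassignments/principals/{principal_id}")
    body = json.dumps({"permissions": ["USER"]}).encode()
    return urllib.request.Request(
        url, data=body, method="PUT",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})

req = assignment_request("aaaa-bbbb", 123, 456, "<token>")
```

Looping this over the groups returned by the SCIM connector would replicate what the UI does per workspace.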
Mihai_LBC
by New Contributor II
  • 934 Views
  • 2 replies
  • 1 kudos

Resolved! Table size information display

Hi, I have a problem with displaying information about the size of my tables. This information is visible several times, but after a while it disappears again. I need to understand what is happening, and why this information is not available all the time on...

(Attachments: Information available.png, Missing information.png)
Latest Reply
JakubSkibicki
Contributor
  • 1 kudos

In general, you need a cluster or warehouse to be active for those details to be presented.

1 More Reply
dbuserng
by New Contributor II
  • 1076 Views
  • 1 reply
  • 0 kudos

Driver: how much memory is actually available?

I have a cluster where the driver type is Standard_DS3_v2 (14GB memory and 4 cores). When I use the free -h command in the web terminal (see attached screenshot), I get the response that I only have 8.9GB of memory available on my driver. Why is that? FYI, spark.dri...

Latest Reply
Sidhant07
Databricks Employee
  • 0 kudos

Hi @dbuserng , The free -h command in the web terminal shows only 8.9GB of available memory on your driver, which is a Standard_DS3_v2 instance with 14GB of memory, because Databricks has services running on each node. This means the maximum allowabl...

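The arithmetic behind the reply can be illustrated with a tiny helper: part of the VM's memory is reserved for the OS and Databricks node services, and only the remainder shows up in free -h. The reservation figure below is an illustrative assumption chosen to match the numbers in the post, not an official value.

```python
# Rough illustration of why `free -h` shows less than the VM size:
# a slice of node memory is reserved for the OS and Databricks services.
# The reserved_gb default is an assumption fitted to this post's numbers.

def usable_memory_gb(vm_memory_gb: float, reserved_gb: float = 4.8) -> float:
    """Memory left for user processes after OS/service reservations."""
    return round(vm_memory_gb - reserved_gb, 1)

# Standard_DS3_v2 has 14 GB; with ~4.8 GB reserved, roughly 9.2 GB remain,
# in the same ballpark as the 8.9 GB the poster sees in `free -h`.
print(usable_memory_gb(14.0))
```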
dbuserng
by New Contributor II
  • 690 Views
  • 1 reply
  • 0 kudos

JVM Heap Memory Graph - more memory used than available

I'm analyzing the memory usage of my Spark application and I see something strange when checking the JVM Heap Memory graph (see screenshot below). Each line on the graph represents one executor. Why does the memory usage sometimes reach over 10GB, when ...

Latest Reply
Sidhant07
Databricks Employee
  • 0 kudos

Hi @dbuserng , The memory usage in your Spark application can exceed the spark.executor.memory setting of 7GB for several reasons: • Off-Heap Memory Usage: Spark allows for off-heap memory allocation, which is not managed by the JVM garbage collector...

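The reply's point can be made concrete with Spark's documented memory model: the container footprint is the JVM heap (spark.executor.memory) plus memory overhead, whose default is max(384 MiB, 0.10 × heap), plus any off-heap allocation. The off-heap figure in the example is an assumption for illustration.

```python
# Why observed executor memory can exceed spark.executor.memory:
# the JVM heap is only one component of the container's footprint.
# Uses Spark's default overhead rule: max(384 MiB, 0.10 * heap).

def total_executor_memory_mb(heap_mb: int, offheap_mb: int = 0,
                             overhead_factor: float = 0.10) -> int:
    """Heap + memory overhead + off-heap, all in MiB."""
    overhead_mb = max(384, int(heap_mb * overhead_factor))
    return heap_mb + overhead_mb + offheap_mb

# A 7 GB heap plus default overhead and an assumed 2 GB off-heap
# already approaches 10 GB of real usage.
print(total_executor_memory_mb(7 * 1024, offheap_mb=2 * 1024))
```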
sruthianki
by New Contributor II
  • 1260 Views
  • 4 replies
  • 0 kudos

How to tell whether Photon is being used by a job

We have lots of customers using many job clusters as well as interactive clusters with Photon enabled, which is drastically increasing the cost. We would like to know if there is any system table, or any details we can get through an API, that lists if the ...

Latest Reply
Sidhant07
Databricks Employee
  • 0 kudos

Hi @sruthianki, if you want to check whether the job is really using Photon, you can check the SQL query plan in the Spark UI for its stages; the Photon metrics will be highlighted in yellow.

3 More Replies
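For the API side of the question, the Clusters API's list response includes a runtime_engine field (PHOTON vs STANDARD) that can be filtered programmatically. A minimal sketch, with placeholder host and token; the field name should be verified against the current API reference.

```python
# Sketch: flag which clusters run Photon via the Clusters API's
# `runtime_engine` field. Host and token are placeholders.
import json
import urllib.request

def photon_clusters(clusters: list[dict]) -> list[str]:
    """Return names of clusters whose runtime engine is Photon."""
    return [c["cluster_name"] for c in clusters
            if c.get("runtime_engine") == "PHOTON"]

def list_clusters(host: str, token: str) -> list[dict]:
    """Fetch the cluster list from a workspace (not run here)."""
    req = urllib.request.Request(
        f"https://{host}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("clusters", [])

sample = [{"cluster_name": "etl", "runtime_engine": "PHOTON"},
          {"cluster_name": "dev", "runtime_engine": "STANDARD"}]
print(photon_clusters(sample))
```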
Kroy
by Contributor
  • 21841 Views
  • 2 replies
  • 2 kudos

How to know DBU consumption in Azure Databricks?

In the Azure portal, under Billing, we can get the cost, but how do we know how many DBUs are consumed?

Latest Reply
odoll
New Contributor II
  • 2 kudos

There was a promo on serverless in early 2024, which at some point got extended and was bigger depending on where you were.

1 More Reply
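To answer the original question directly: DBU quantities (as opposed to Azure cost) can be read from the system.billing.usage system table, where usage_quantity is in DBUs. A sketch of a query plus a local aggregation helper; the query shape is illustrative and the sample rows are made up.

```python
# Sketch: DBU consumption from the `system.billing.usage` system table
# (usage_quantity is in DBUs). Query and sample rows are illustrative.

DBU_QUERY = """
SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
FROM system.billing.usage
GROUP BY usage_date, sku_name
ORDER BY usage_date
"""

def total_dbus(rows: list[dict]) -> float:
    """Sum DBU quantities from usage rows returned by the query."""
    return sum(r["dbus"] for r in rows)

sample = [{"usage_date": "2024-01-01", "dbus": 12.5},
          {"usage_date": "2024-01-02", "dbus": 7.5}]
print(total_dbus(sample))  # 20.0
```

Joining against system.billing.list_prices (where available) is the usual next step to reconcile DBUs with portal cost.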
Veilraj
by New Contributor III
  • 1649 Views
  • 2 replies
  • 0 kudos

Configuring NCC for serverless compute to access a SQL Server running in an Azure VM

Hi Team, I am following this link to configure NCC for serverless compute to access a SQL Server running in an Azure VM: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/ This references adding privat...

Latest Reply
shhhhhh
New Contributor III
  • 0 kudos

Also interested in doing this. I have federated queries for a classic Databricks cluster pointing to SQL Server, but can't find documentation for the serverless plane connecting to SQL Server on a VM.

1 More Reply
Lawro
by New Contributor II
  • 1912 Views
  • 5 replies
  • 0 kudos

Cannot list Clusters using Rest API

I am trying to run the following REST API command: curl -H "Authorization: Bearer <PAT Code>" -X GET "http://<databricks_workspace>.azuredatabricks.net/api/2.0/clusters/list"  When I run the comm...

Latest Reply
Lawro
New Contributor II
  • 0 kudos

Hi, I definitely think it is facing network issues. It's just very difficult to identify, given that I am able to successfully ping the instance from the server originating the request. It is something JDBC related, just not sure what it is.

4 More Replies
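Two fixable problems stand out in the command as posted: the curl invocation is nested inside itself, and the URL uses http:// where the workspace endpoint is served over https://. A corrected sketch of the same call; the workspace hostname is a placeholder.

```python
# Sketch of the clusters/list call with the posted command's issues fixed:
# no nested curl, and https:// instead of http://. Host and PAT are placeholders.
import urllib.request

def clusters_list_request(workspace: str, pat: str) -> urllib.request.Request:
    """Build a GET request for /api/2.0/clusters/list on an Azure workspace."""
    url = f"https://{workspace}.azuredatabricks.net/api/2.0/clusters/list"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {pat}"})

req = clusters_list_request("adb-1234567890123456.7", "<PAT>")
print(req.full_url)
```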
Lawro
by New Contributor II
  • 800 Views
  • 1 reply
  • 0 kudos

JDBC Connect Time out

Anyone know why I would get the JDBC connect error below: java.sql.SQLException: [Databricks][JDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: com.databricks.client.jdbc42.internal.apache.http.conn.ConnectTimeoutExc...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @Lawro, that normally happens whenever there is a network issue or a firewall blocking the request. Is it failing consistently, and have you tested connectivity to your SQL instance using the nc -vz command from a notebook?

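The nc -vz probe the reply suggests can also be done from a notebook with Python's socket module, which is handy where nc isn't available. The SQL Server hostname and port below are placeholders.

```python
# Python equivalent of `nc -vz host port`: try a TCP connection and
# report success/failure. Runs fine inside a notebook cell.
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe a SQL Server endpoint (hostname and port are placeholders).
print(can_connect("sql-vm.example.com", 1433, timeout=2.0))
```

If this returns False from the compute that runs the JDBC query, the timeout is a routing/firewall problem rather than a driver problem.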
javiomotero
by New Contributor III
  • 2543 Views
  • 3 replies
  • 3 kudos

Databricks Apps: Issue with ACLs for apps are disabled or not available in this tier

Hello, I've created a dummy app (using the template) and deployed it in an Azure Databricks premium workspace. It is working fine, but it is only available to those users with access to the Databricks resource. I would like to change the permissions to "...

(Attachments: javiomotero_1-1737889451177.png, javiomotero_0-1737889408832.png)
Latest Reply
javiomotero
New Contributor III
  • 3 kudos

Hi, any help? In the meantime I've settled for an Azure Web App, but it is a pity that I cannot use this just for a configuration step. Any help is welcome!

2 More Replies
mr_elastic
by New Contributor III
  • 1579 Views
  • 2 replies
  • 7 kudos

Databricks Unity Catalog Bug - Reset of Network Connectivity Configuration not possible

The following use case is strange regarding the Network Connectivity Configuration (NCC):
  • I create a workspace (the NCC is empty)
  • I create an NCC
  • I attach the NCC to the workspace
  • I want to remove the NCC from the workspace -> not possible
Therefore, I can...

Latest Reply
loic
Contributor
  • 7 kudos

This is the documented behavior in the REST API: https://docs.databricks.com/api/account/workspaces/update
You cannot remove a network connectivity configuration from the workspace once attached; you can only switch to another one.

1 More Reply
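Per the linked docs, the only supported operation is switching the workspace to a different NCC via the account-level workspaces update endpoint. A minimal sketch of that PATCH call; the account ID, workspace ID, NCC ID, and token are placeholders.

```python
# Sketch: switch a workspace to another NCC (detaching is not supported)
# via the account-level workspaces update API. All IDs are placeholders.
import json
import urllib.request

def switch_ncc_request(account_id: str, workspace_id: int,
                       ncc_id: str, token: str) -> urllib.request.Request:
    """Build a PATCH request that points the workspace at a new NCC."""
    url = (f"https://accounts.azuredatabricks.net/api/2.0/accounts/"
           f"{account_id}/workspaces/{workspace_id}")
    body = json.dumps({"network_connectivity_config_id": ncc_id}).encode()
    return urllib.request.Request(
        url, data=body, method="PATCH",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})

req = switch_ncc_request("aaaa-bbbb", 123, "ncc-111", "<token>")
```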
Rob_Lemmens
by New Contributor III
  • 6734 Views
  • 9 replies
  • 1 kudos

Resolved! OAUTH Secrets Rotation for Service Principal through Databricks CLI

I am currently using a specific service principal in my DevOps steps to run the Databricks CLI. It's using OAuth tokens with M2M authentication (Authenticate access to Azure Databricks with a service principal using OAuth (OAuth M2M) - Az...

Latest Reply
Rob_Lemmens
New Contributor III
  • 1 kudos

After filing a Microsoft support ticket through my client, they provided the solution to the inquiry. There seems to be an undocumented API call you can make to create this SP OAuth client secret, and it works perfectly: curl -X POST --header...

8 More Replies
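The curl in the accepted reply is truncated, so the exact endpoint it hit is not recoverable here. One known candidate that matches its description is the account-level service-principal secrets endpoint; the sketch below assumes that endpoint, and the account ID, service-principal numeric ID, and token are placeholders. Verify against current account API docs before relying on it.

```python
# Sketch of an SP OAuth secret creation call matching the reply's
# description (POST to the account-level credentials/secrets endpoint).
# The endpoint choice is an assumption; IDs and token are placeholders.
import urllib.request

def create_sp_secret_request(account_id: str, sp_numeric_id: int,
                             token: str) -> urllib.request.Request:
    """Build a POST request that mints a new OAuth secret for an SP."""
    url = (f"https://accounts.azuredatabricks.net/api/2.0/accounts/"
           f"{account_id}/servicePrincipals/{sp_numeric_id}/credentials/secrets")
    return urllib.request.Request(
        url, data=b"{}", method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})

req = create_sp_secret_request("aaaa-bbbb", 12345, "<token>")
```

Pairing this with a delete of the old secret gives a simple rotation step for a DevOps pipeline.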