Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

chenda
by New Contributor II
  • 7166 Views
  • 1 replies
  • 3 kudos

Databricks Cluster Failed to Start - ADD_NODES_FAILED (Solution)

Recently we encountered an issue where our classic compute clusters could not start. With the help of the Databricks team we troubleshot it, found the cause, and got it fixed. I'm writing it up here in case it helps other people who encounter the same ...

  • 7166 Views
  • 1 replies
  • 3 kudos
Latest Reply
jem
New Contributor III
  • 3 kudos

Thanks for sharing. We had the same problem. I had missed adding private endpoints to the workspace storage account in the managed resource group. I will also add NCC rules in the Databricks account; then you don't need the subnets in the firewall.

  • 3 kudos
markbaas
by New Contributor III
  • 1625 Views
  • 3 replies
  • 3 kudos

Private endpoint to non-storage Azure resource

I'm trying to set up an NCC and private endpoint for a container app environment in Azure. However, I get the following error: Error occurred when creating private endpoint rule: : BAD_REQUEST: Can not create Private Link Endpoint with name databricks-x...

  • 1625 Views
  • 3 replies
  • 3 kudos
Latest Reply
markbaas
New Contributor III
  • 3 kudos

All the Azure subscriptions have this registered. Could this not be an Azure subscription within the Databricks tenant?

  • 3 kudos
2 More Replies
jyunnko
by New Contributor
  • 689 Views
  • 1 replies
  • 1 kudos

How to find the billing of each cell in a notebook?

Suppose I have run ten different statements/tasks/cells in a notebook, and I want to know how many DBUs each of these ten tasks used. Is this possible?

  • 689 Views
  • 1 replies
  • 1 kudos
Latest Reply
Isi
Honored Contributor III
  • 1 kudos

Hey, I really don't think it's possible to directly determine the cost of a single cell in Databricks. However, you can approach this in two ways, depending on the type of cluster you're using, as different cluster types have different pricing model...
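One rough approach, offered only as a sketch: time each cell yourself and apportion the cluster's DBUs from the system.billing.usage system table by each cell's share of wall-clock time. The `timed_cell` helper below is hypothetical, the apportioning is an approximation (cluster cost is driven by uptime, not by what a cell does), and billing data lands in the table with some delay.

```python
import time

# Record each cell's wall-clock duration (hypothetical helper you call per cell).
cell_durations = {}

def timed_cell(name, fn):
    start = time.time()
    result = fn()
    cell_durations[name] = time.time() - start
    return result

# After the run, pull today's DBUs for this cluster from the billing system table
# and split them proportionally by measured cell duration.
cluster_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")
usage = spark.sql(f"""
    SELECT SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_metadata.cluster_id = '{cluster_id}'
      AND usage_date = current_date()
""").first()

total_dbus = usage.dbus or 0.0
total_time = sum(cell_durations.values()) or 1.0
for cell, secs in cell_durations.items():
    print(cell, round(total_dbus * secs / total_time, 4), "DBUs (approx.)")
```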

  • 1 kudos
jaytimbadia
by New Contributor II
  • 662 Views
  • 3 replies
  • 0 kudos

GPU accelerator memory not matching the expected amount

Hello, we have opted for Standard_NC8as_T4_v3, which claims to have 56 GB of memory. But when I run nvidia-smi in the notebook, it shows only ~16 GB. Why? Please let me know what is happening here. Jay

  • 662 Views
  • 3 replies
  • 0 kudos
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Please refer to: https://learn.microsoft.com/en-us/azure/databricks/compute/gpu
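In short: the 56 GB advertised for Standard_NC8as_T4_v3 is the VM's host RAM, while nvidia-smi reports the memory of the attached NVIDIA T4 GPU, which is 16 GB, so both numbers are expected. A small sketch to see both figures from a notebook (assumes psutil is installed, as it typically is on ML runtimes, and that nvidia-smi is on the PATH):

```python
import subprocess
import psutil

# Host (VM) RAM -- this is the 56 GB advertised for the VM size.
print(f"Host RAM: {psutil.virtual_memory().total / 1024**3:.1f} GiB")

# GPU memory -- the T4 attached to this VM size has 16 GB of its own memory.
gpu = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print("GPU:", gpu.stdout.strip())
```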

  • 0 kudos
2 More Replies
Hari-dbw
by New Contributor
  • 457 Views
  • 0 replies
  • 0 kudos

Error "Gateway authentication failed for 'Microsoft.Network'" While Creating Azure Databricks

Hi all, I'm encountering an issue while trying to create a Databricks service in Azure. During the setup process, I get the following error: "Gateway authentication failed for 'Microsoft.Network'". I've checked the basic configurations, but I'm not sure ...

  • 457 Views
  • 0 replies
  • 0 kudos
erigaud
by Honored Contributor
  • 2004 Views
  • 5 replies
  • 5 kudos

Resolved! Databricks cluster pool deployed through Terraform does not have UC enabled

Hello everyone, we have a workspace with UC enabled. We already have a couple of catalogs attached, and when using our personal compute we are able to read/write tables in those catalogs. However, for our jobs we deployed a cluster pool using Terraform b...

  • 2004 Views
  • 5 replies
  • 5 kudos
Latest Reply
erigaud
Honored Contributor
  • 5 kudos

Confirmed that this works! Thank you.

  • 5 kudos
4 More Replies
karthiknuvepro
by New Contributor II
  • 739 Views
  • 1 replies
  • 0 kudos

GCP Databricks | Workspace Creation Error: Storage Credentials Limit Reached

Hi Team, we are encountering an issue while trying to create a Databricks Workspace in the GCP region us-central1. Below is the error message: Workspace Status: Failed. Details: Workspace failed to launch. Error: BAD REQUEST: Cannot create 1...

  • 739 Views
  • 1 replies
  • 0 kudos
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @karthiknuvepro, do you have an active support plan? Through a support ticket with us, we can request an increase to this limit.

  • 0 kudos
AnkitShah
by New Contributor II
  • 1758 Views
  • 4 replies
  • 0 kudos

How do we get the list of users who accessed/downloaded a specific model in Unity Catalog over the last 6 months?

How do we get the list of users who accessed/downloaded a specific model in Unity Catalog over the last 6 months?

  • 1758 Views
  • 4 replies
  • 0 kudos
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @AnkitShah, I just tried on my end and found these 2 tables that might be useful. They do not show exactly who downloaded a model artifact, but they do show who interacted with it: https://docs.databricks.com/en/ai-gateway/configure-ai-gateway-endpoints.html#usag...
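As a rough sketch of the audit-log route: the query below pulls Unity Catalog model-related events for the last 6 months from system.access.audit. The action names in the filter are assumptions; list the distinct action_name values in your own audit log, adjust the filter, and narrow further on request_params for the specific model of interest.

```python
# Sketch: users who performed Unity Catalog actions against registered models
# in the last 6 months. Verify the action names against your audit log before
# relying on this; then filter request_params for the specific model.
df = spark.sql("""
    SELECT event_time,
           user_identity.email AS user,
           action_name,
           request_params
    FROM system.access.audit
    WHERE service_name = 'unityCatalog'
      AND event_time >= current_timestamp() - INTERVAL 6 MONTHS
      AND action_name IN ('getRegisteredModel', 'getModelVersion')
""")
display(df)
```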

  • 0 kudos
3 More Replies
RicksDB
by Contributor III
  • 1927 Views
  • 6 replies
  • 1 kudos

Governance to restrict compute creation

Hi, cluster policies used to be an easy way to handle governance on compute. However, more and more, there seems to be no way to control many new compute features within the platform. We currently have this issue for model serving endpoints and vector...

  • 1927 Views
  • 6 replies
  • 1 kudos
Latest Reply
nskiran
New Contributor III
  • 1 kudos

If you are looking to restrict end users to creating only certain cluster configurations, you can do so by using the Databricks APIs. Through Python and the Databricks API, you can specify what kinds of cluster configurations are allowed and also restrict users ...
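For the classic-compute piece specifically, here is a minimal sketch of creating a cluster policy through the Databricks Python SDK. The policy rules shown are illustrative only, and the snippet assumes SDK authentication is already configured (environment variables or a config profile).

```python
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # assumes auth is already configured

# Illustrative policy: pin allowed node types, cap workers, force auto-termination.
definition = {
    "node_type_id": {"type": "allowlist",
                     "values": ["Standard_DS3_v2", "Standard_DS4_v2"]},
    "num_workers": {"type": "range", "maxValue": 8},
    "autotermination_minutes": {"type": "range", "maxValue": 60,
                                "defaultValue": 30},
}

policy = w.cluster_policies.create(
    name="restricted-team-clusters",
    definition=json.dumps(definition),
)
print("Created policy:", policy.policy_id)
```

Users who are granted only this policy (and not unrestricted cluster creation) can then create clusters only within these bounds.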

  • 1 kudos
5 More Replies
dbuserng
by New Contributor II
  • 3208 Views
  • 1 replies
  • 4 kudos

High memory usage on Databricks cluster

In my team we see very high memory usage even when the cluster has just been started and nothing has been run yet. Additionally, memory usage never drops to lower levels; total used memory always fluctuates around 14 GB. Where is this memory usage ...

  • 3208 Views
  • 1 replies
  • 4 kudos
Latest Reply
-werners-
Esteemed Contributor III
  • 4 kudos

This is not necessarily an issue. Linux uses a lot of RAM for caching, but this does not mean it cannot be released for processes (dynamic memory management). Basically, the philosophy is that RAM that is not used (so actually 'free') is useless. Here is a re...
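To see how much of that "used" figure is really reclaimable cache, compare MemTotal with MemAvailable rather than MemFree. A small sketch that reads /proc/meminfo on the driver node:

```python
# Compare total, "free", and "available" memory on the driver node.
# MemAvailable is the realistic figure: it includes cache the kernel can reclaim.
def meminfo_gib(key: str) -> float:
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(key + ":"):
                return int(line.split()[1]) / 1024**2  # kB -> GiB
    raise KeyError(key)

for key in ("MemTotal", "MemFree", "MemAvailable"):
    print(f"{key:13s} {meminfo_gib(key):6.1f} GiB")
```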

  • 4 kudos
jimbender
by New Contributor II
  • 1073 Views
  • 1 replies
  • 0 kudos

Newbie DAB question regarding wheels

I am trying to build a wheel using a DAB. It errors, saying I don't have permission to install my wheel onto the cluster I have been given. Is it possible to just upload the wheel to a subdirectory of the /Shared directory and use it from there instead of ...

  • 1073 Views
  • 1 replies
  • 0 kudos
Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

May I know the exact error you are getting on the cluster? You can use the following code to use a wheel in a shared folder:

resources:
  jobs:
    my-job:
      name: my-job
      tasks:
        - task_key: my-task
          new_cluster: ...

  • 0 kudos
sparrap
by New Contributor
  • 1505 Views
  • 2 replies
  • 0 kudos

Error when Connecting Databricks Cluster to RStudio Desktop App

Hi! I am trying to connect RStudio to my Databricks cluster. I already changed the permissions to CAN MANAGE and CAN ATTACH TO on the cluster. I have also verified that I have the correct Python version and Databricks version on my computer. This is the error...

  • 1505 Views
  • 2 replies
  • 0 kudos
Latest Reply
mikvaar
New Contributor III
  • 0 kudos

This seems to solve the problem: https://github.com/sparklyr/sparklyr/issues/3449. Apparently sparklyr requires that Unity Catalog is enabled on the cluster in order to get the connection working correctly.

  • 0 kudos
1 More Replies
sparkplug
by New Contributor III
  • 966 Views
  • 1 replies
  • 1 kudos

Resolved! How do I track notebooks on all-purpose compute?

I am trying to map out costs for a shared cluster used in our organization. Since Databricks does not store the sessions on all-purpose compute or who accessed the cluster, what are some possible options for tracking which notebooks were attached...

  • 966 Views
  • 1 replies
  • 1 kudos
Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @sparkplug, you can use the audit logs and the billing usage table: https://docs.databricks.com/en/admin/account-settings/audit-logs.html
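A sketch of what that could look like against the audit system table. The action name and request_params keys here are assumptions to verify against your own audit log schema, and the cluster ID is a hypothetical placeholder.

```python
# Which notebooks were attached to a given all-purpose cluster, and by whom.
# Confirm the exact action_name and request_params keys in system.access.audit.
cluster_id = "0101-123456-abcdefgh"  # hypothetical cluster ID

df = spark.sql(f"""
    SELECT event_time,
           user_identity.email    AS user,
           request_params['path'] AS notebook_path
    FROM system.access.audit
    WHERE service_name = 'notebook'
      AND action_name  = 'attachNotebook'
      AND request_params['clusterId'] = '{cluster_id}'
    ORDER BY event_time DESC
""")
display(df)
```

Joining the same cluster_id against system.billing.usage then lets you put a DBU figure next to that activity.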

  • 1 kudos
navi_bricks
by New Contributor II
  • 5334 Views
  • 9 replies
  • 1 kudos

Need to move files from one Volume to another

We recently enabled Unity Catalog on our workspace. As part of certain transformations (custom clustered data pipelines in Python), we need to move files from one volume to another. As the job itself runs on a service principal that has access to exte...

  • 5334 Views
  • 9 replies
  • 1 kudos
Latest Reply
Dnirmania
Contributor
  • 1 kudos

Not all job clusters work well with Volumes. I used the following type of cluster to access files from a Volume.
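For reference, on a UC-enabled cluster the move itself can be as simple as the sketch below. The volume paths are hypothetical placeholders, and the service principal needs READ VOLUME on the source and WRITE VOLUME on the destination volume.

```python
# Move a file from one UC volume to another using dbutils.
# Both volume paths are hypothetical placeholders.
src = "/Volumes/my_catalog/my_schema/landing/data.csv"
dst = "/Volumes/my_catalog/my_schema/processed/data.csv"

dbutils.fs.mv(src, dst)

# Plain Python file APIs also work on /Volumes paths on UC-enabled clusters:
# import shutil; shutil.move(src, dst)
```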

  • 1 kudos
8 More Replies