Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

markbaas
by New Contributor II
  • 156 Views
  • 3 replies
  • 0 kudos

private endpoint to non-storage azure resource

I'm trying to set up an NCC and private endpoint for a Container App environment in Azure. However, I get the following error: Error occurred when creating private endpoint rule: : BAD_REQUEST: Can not create Private Link Endpoint with name databricks-x...

Latest Reply
markbaas
New Contributor II
  • 0 kudos

All the Azure subscriptions have this registered. Could this not be an Azure subscription within the Databricks tenant?

2 More Replies
jyunnko
by Visitor
  • 21 Views
  • 1 reply
  • 1 kudos

How to find the billing of each cell in a notebook?

Suppose I have run ten different statements/tasks/cells in a notebook, and I want to know how many DBUs each of these ten tasks used. Is this possible?

Latest Reply
Isi
New Contributor
  • 1 kudos

Hey, I really think it's not possible to directly determine the cost of a single cell in Databricks. However, you can approach this in two ways, depending on the type of cluster you're using, as different cluster types have different pricing model...
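As a rough illustration of the system-tables approach (a sketch, assuming the system.billing schema is enabled in your account and that you run it in a Databricks notebook where spark and display are available), this aggregates DBUs per job and cluster, which is the closest granularity you can get:

```python
# Minimal sketch: approximate DBU usage per job/cluster from system tables.
# Assumes the system.billing schema is enabled for your account and that this
# runs in a Databricks notebook (spark and display are provided there).
usage = spark.sql("""
    SELECT
        usage_metadata.job_id     AS job_id,
        usage_metadata.cluster_id AS cluster_id,
        sku_name,
        SUM(usage_quantity)       AS dbus
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 30)
    GROUP BY 1, 2, 3
    ORDER BY dbus DESC
""")
display(usage)
```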

karthiknuvepro
by New Contributor
  • 47 Views
  • 2 replies
  • 0 kudos

Databricks Workspace Access and Permissions

Hi Team, the GCP Databricks account URL https://accounts.gcp.databricks.com/ is linked to the GCP Billing Account. We have two clients with separate GCP Organizations: client1.example.com and client2.example.com. Both GCP Organizations share the sam...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @karthiknuvepro, to isolate resources you can follow these steps: Create separate GCP projects for each client: create a separate GCP project for each client within their respective GCP Organizations. This ensures that each client has isolated ...

1 More Replies
jaytimbadia
by New Contributor
  • 27 Views
  • 3 replies
  • 0 kudos

GPU accelerator not matching with desired memory.

Hello, we have opted for Standard_NC8as_T4_v3, which claims to have 56 GB of memory. But when I run nvidia-smi in the notebook, it shows only ~16 GB. Why? Please let me know what is happening here. Jay

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Please refer to: https://learn.microsoft.com/en-us/azure/databricks/compute/gpu
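For context: Standard_NC8as_T4_v3 pairs 56 GB of host RAM with a single NVIDIA T4 GPU that has 16 GB of its own memory, and nvidia-smi reports the GPU memory, not the VM's RAM. A quick way to see both figures side by side from a notebook cell (a sketch, assuming a Linux driver node with the NVIDIA driver installed):

```python
# Minimal sketch: nvidia-smi reports the T4's GPU memory (~16 GB), while the
# VM's 56 GB is host RAM, visible in /proc/meminfo.
import subprocess

gpu_mem = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()

with open("/proc/meminfo") as f:
    host_mem_kb = int(f.readline().split()[1])  # first line is MemTotal, in kB

print(f"GPU memory (nvidia-smi): {gpu_mem}")
print(f"Host RAM (MemTotal):     {host_mem_kb / 1024 / 1024:.1f} GB")
```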

2 More Replies
erigaud
by Honored Contributor
  • 159 Views
  • 5 replies
  • 5 kudos

Resolved! Databricks cluster pool deployed through Terraform does not have UC enabled

Hello everyone, we have a workspace with UC enabled. We already have a couple of catalogs attached, and when using our personal compute we are able to read/write tables in those catalogs. However, for our jobs we deployed a cluster pool using Terraform b...

(screenshot attachment: erigaud_1-1736874136257.png)
Latest Reply
erigaud
Honored Contributor
  • 5 kudos

Confirmed that this works! Thank you.
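The accepted fix isn't visible in the truncated preview above, but a common gotcha here is that pools themselves are not "UC-enabled": the Unity Catalog access mode has to be set on the cluster (or job cluster) spec that draws from the pool. A minimal sketch with the Databricks SDK for Python, using placeholder pool id and DBR version, and not necessarily the exact resolution of this thread:

```python
# Minimal sketch: create a cluster from an existing pool with a Unity Catalog
# access mode explicitly set. The pool id and Spark version are placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute

w = WorkspaceClient()  # credentials from environment / CLI profile

cluster = w.clusters.create(
    cluster_name="uc-job-cluster-from-pool",
    spark_version="15.4.x-scala2.12",      # placeholder DBR version
    instance_pool_id="pool-xxxxxxxx",      # placeholder pool id
    num_workers=2,
    data_security_mode=compute.DataSecurityMode.USER_ISOLATION,  # UC shared access mode
).result()
print(cluster.cluster_id)
```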

4 More Replies
karthiknuvepro
by New Contributor
  • 37 Views
  • 1 reply
  • 0 kudos

GCP Databricks | Workspace Creation Error: Storage Credentials Limit Reached

Hi Team, we are encountering an issue while trying to create a Databricks Workspace in the GCP region us-central1. Below is the error message: Workspace Status: Failed. Details: Workspace failed to launch. Error: BAD REQUEST: Cannot create 1...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @karthiknuvepro, do you have an active support plan? Through a support ticket with us we can request an increase of this limit.

rdadhichi
by New Contributor II
  • 130 Views
  • 2 replies
  • 0 kudos

Disable 'Allow trusted Microsoft services to bypass this firewall' for Azure Key Vault

Currently, even when using a VNet-injected Databricks workspace, we are unable to fetch secrets from AKV if 'Allow trusted Microsoft services to bypass this firewall' is disabled. The secret is used in an AKV-backed secret scope, and the key vault is ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @rdadhichi, have you set "Allow access from" to "Private endpoint and selected networks" on the firewall?

1 More Replies
AnkitShah
by New Contributor II
  • 103 Views
  • 4 replies
  • 0 kudos

How do we get a list of users who accessed/downloaded a specific model in Unity Catalog in the last 6 months

How do we get a list of users who accessed/downloaded a specific model in Unity Catalog in the last 6 months?

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @AnkitShah, I just tried on my end and found these 2 tables that might be useful. They do not exactly show who downloaded a model artifact, but they do show who interacted with it: https://docs.databricks.com/en/ai-gateway/configure-ai-gateway-endpoints.html#usag...
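As a sketch of how those system tables can be queried for the last six months (assuming system tables are enabled in your account; the action names and the request_params key below are assumptions to adapt to what your audit log actually emits, and the model name is a placeholder):

```python
# Minimal sketch: who interacted with a specific UC registered model over the
# last 6 months, based on system.access.audit. The action names, the
# request_params key, and the model name are illustrative -- inspect your own
# audit log for the exact values it records.
events = spark.sql("""
    SELECT
        event_time,
        user_identity.email AS user_email,
        action_name,
        request_params
    FROM system.access.audit
    WHERE service_name = 'unityCatalog'
      AND action_name IN ('getRegisteredModel', 'getModelVersion')  -- assumed action names
      AND event_date >= add_months(current_date(), -6)
      AND request_params['full_name_arg'] = 'main.default.my_model'  -- hypothetical model
    ORDER BY event_time DESC
""")
display(events)
```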

3 More Replies
ptco
by New Contributor II
  • 3089 Views
  • 10 replies
  • 2 kudos

"Azure Container Does Not Exist" when cloning repositories in Azure Databricks

Good morning, I need some help with the following issue: I created a new Azure Databricks resource using the VNet-injection procedure (here). I then proceeded to link my Azure DevOps account using a personal access token. If I try to clone a reposito...

Latest Reply
ihenadb
New Contributor II
  • 2 kudos

Hi, also having problems with this after some IaC testing, deleting and recreating the workspace with the same name. We are working in Azure. Is the "container" referred to the container in the storage account deployed by the Databricks instance into the ma...

9 More Replies
RicksDB
by Contributor III
  • 171 Views
  • 6 replies
  • 0 kudos

Governance to restrict compute creation

Hi, cluster policies used to be an easy way to handle governance on compute. However, more and more, there seems to be no way to control many new compute features within the platform. We currently have this issue for model serving endpoints and vector...

Latest Reply
nskiran
New Contributor II
  • 0 kudos

If you are looking to restrict end users to creating only certain cluster configurations, you can do so by using Databricks APIs. Through Python and the Databricks API, you can specify what kinds of cluster configurations are allowed and also restrict users ...
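For example, a cluster policy created through the Databricks SDK for Python can pin down node types, autoscaling and auto-termination; the rule names and values below are illustrative only:

```python
# Minimal sketch: a cluster policy that limits node types and enforces
# auto-termination, created via the Databricks SDK for Python. The allowed
# VM sizes and limits are placeholders for your own governance rules.
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # credentials from environment / CLI profile

policy_definition = {
    "node_type_id": {
        "type": "allowlist",
        "values": ["Standard_DS3_v2", "Standard_DS4_v2"],  # hypothetical allowed sizes
    },
    "autotermination_minutes": {
        "type": "range",
        "maxValue": 60,
        "defaultValue": 30,
    },
    "autoscale.max_workers": {
        "type": "range",
        "maxValue": 8,
    },
}

policy = w.cluster_policies.create(
    name="restricted-team-clusters",
    definition=json.dumps(policy_definition),
)
print(f"Created policy {policy.policy_id}")
```

Users who are granted only CAN_USE on such a policy (and no unrestricted cluster-creation entitlement) can then spin up clusters solely within these bounds.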

5 More Replies
dbuserng
by New Contributor
  • 152 Views
  • 1 reply
  • 3 kudos

High memory usage on Databricks cluster

In my team we see very high memory usage even when the cluster has just been started and nothing has been run yet. Additionally, memory usage never drops to lower levels; total used memory always fluctuates around 14 GB. Where is this memory usage ...

Latest Reply
-werners-
Esteemed Contributor III
  • 3 kudos

This is not necessarily an issue. Linux uses a lot of RAM for caching, but this does not mean it cannot be released for processes (dynamic memory management). Basically, the philosophy is that RAM that is not used (so actually 'free') is useless. Here is a re...
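To see the distinction on a cluster node, a small sketch that reads /proc/meminfo from a notebook cell and compares free, cached and available memory:

```python
# Minimal sketch: distinguish truly "used" memory from reclaimable cache by
# reading /proc/meminfo on a Linux cluster node. MemAvailable includes page
# cache the kernel can hand back to processes, which is why "used" memory on
# an idle cluster looks high without being a problem.
def meminfo_gb():
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            fields[key] = int(value.strip().split()[0]) / 1024 / 1024  # kB -> GB
    return fields

m = meminfo_gb()
print(f"Total:     {m['MemTotal']:.1f} GB")
print(f"Free:      {m['MemFree']:.1f} GB  (untouched pages only)")
print(f"Cached:    {m['Cached']:.1f} GB  (page cache, reclaimable)")
print(f"Available: {m['MemAvailable']:.1f} GB  (what new processes can actually get)")
```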

