Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

NadithK
by Contributor
  • 3936 Views
  • 2 replies
  • 2 kudos

Pre-loading docker images to cluster pool instances still requires docker URL at cluster creation

I am trying to pre-load a Docker image to a Databricks cluster pool instance. As per this article, I used the REST API to create the cluster pool and defined a custom Azure container registry as the source for the Docker images. https://learn.microsoft....

Latest Reply
krupakar1329
New Contributor II
  • 2 kudos

@NadithK Pre-loading Docker images to cluster pool instances is for performance optimization (faster cluster startup), but you still must specify the Docker image in your cluster configuration. The pre-loading doesn't eliminate the requirement to dec...

1 More Replies
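A minimal sketch of what the reply describes, assuming placeholder registry and node-type names: the pool pre-loads the image for faster startup, but the cluster spec must still name the same image explicitly.

```python
# Sketch: pre-loading a Docker image on a pool does not remove the need to
# declare the image on the cluster itself. Both payloads below carry it.
# (myregistry.azurecr.io, the node type, and the pool id are placeholders.)

POOL_PAYLOAD = {
    "instance_pool_name": "docker-pool",
    "node_type_id": "Standard_DS3_v2",
    # Image is pre-pulled onto idle pool instances to speed up cluster startup.
    "preloaded_docker_images": [
        {"url": "myregistry.azurecr.io/my-image:latest"}
    ],
}

CLUSTER_PAYLOAD = {
    "cluster_name": "docker-cluster",
    "spark_version": "14.3.x-scala2.12",
    "instance_pool_id": "<pool-id-returned-by-create>",
    # Still required: pre-loading only caches the pull; it does not select
    # the image for the cluster.
    "docker_image": {"url": "myregistry.azurecr.io/my-image:latest"},
}

# The pre-loaded and requested image URLs must match for the cache to help.
assert (
    POOL_PAYLOAD["preloaded_docker_images"][0]["url"]
    == CLUSTER_PAYLOAD["docker_image"]["url"]
)
```

These dictionaries would be POSTed to the instance-pools and clusters create endpoints respectively; the point is only that `docker_image` appears in both places.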
tariq
by New Contributor III
  • 1127 Views
  • 4 replies
  • 3 kudos

Resolved! Databricks Default package repositories

I have added an extra-index-url in the default package repository in Databricks which points to a repository in Azure Artifacts. The libraries from it are getting installed on the job cluster but are not working on the all-purpose cluster. Below is the rel...

Latest Reply
Advika_
Databricks Employee
  • 3 kudos

Hello @tariq! Did the suggestions shared above help address your issue? If so, please consider marking one of the responses as the accepted solution. If you found a different approach that worked for you, sharing it with the community would be really...

3 More Replies
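For context on the question, here is a minimal sketch (not the poster's actual config) of what an extra-index-url entry means to pip: packages resolve from the default index first, then the extra index. The Azure Artifacts feed URL is a placeholder; on an all-purpose cluster the same setting must be visible to the library installer, e.g. written to /etc/pip.conf by a cluster init script.

```python
# Hypothetical Azure Artifacts feed; <org> and <feed> are placeholders.
extra_index = "https://pkgs.dev.azure.com/<org>/_packaging/<feed>/pypi/simple/"

# The pip.conf content an init script could write to /etc/pip.conf so every
# pip invocation on the cluster (jobs and all-purpose alike) picks it up.
pip_conf = f"[global]\nextra-index-url = {extra_index}\n"

print(pip_conf)
```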
4Twannie
by New Contributor II
  • 1151 Views
  • 1 reply
  • 3 kudos

Resolved! Databricks Usage Dashboard - Tagging Networking Costs

Problem Overview: Our team has successfully integrated Azure Databricks Usage Dashboards to monitor platform-related costs. This addition has delivered valuable insights into our spending patterns. However, we've encountered a tagging issue that's prov...

Latest Reply
mark_ott
Databricks Employee
  • 3 kudos

There is no direct way to tag certain Azure networking resources (such as network interfaces, public IPs, or managed disks) so that their costs inherit custom tags like "projectRole" in cost reports, because many core networking resources either do n...

PabloCSD
by Valued Contributor II
  • 384 Views
  • 1 reply
  • 0 kudos

Why MLFlow version blocks library installation?

I have a Databricks Asset Bundle project running, and when the MLflow library released version 3.4.0, this error occurred. When restricting the MLflow version to < 3.4.0 it works. The libraries got stuck installing and the process finished in erro...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @PabloCSD! Which Databricks Runtime version are you using?

bdanielatl
by New Contributor II
  • 1672 Views
  • 3 replies
  • 3 kudos

Resolved! Markdown Cells Do Not Render Consistently

When I am creating a notebook in the UI editor on Databricks, markdown cells do not always render after I run them. They still appear in 'editing mode'. See the screenshot below; it should have rendered an H1. Again, this behavior is not consistent. So...

Latest Reply
wilson_ejulu
New Contributor II
  • 3 kudos

same issue

2 More Replies
howardgagan
by New Contributor
  • 550 Views
  • 2 replies
  • 0 kudos

When setting up unity catalog a storage account was created with security risk

When I set up Databricks Unity Catalog, I think it automatically set up a storage account. I'm getting recommendations from Azure that this storage account has high risk associated with it. The problem is this resource has a deny assignment on preven...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @howardgagan, Each Azure Databricks workspace has an associated Azure storage account in a managed resource group, known as the workspace storage account. This storage account includes workspace system data (job output, system settings, and logs), D...

1 More Replies
noorbasha534
by Valued Contributor II
  • 783 Views
  • 2 replies
  • 3 kudos

Any sample code snippet that contains end-to-end OIDC flow

Hello, Is there a sample code snippet that depicts an end-to-end OIDC flow? Imagine there exists a service principal and an interactive user who connect to a SQL warehouse, get authenticated, and run some SQL queries as part of a Python script (JDBC/ODBC) f...

Latest Reply
Advika
Databricks Employee
  • 3 kudos

Hello @noorbasha534, did the code snippet shared above help resolve your concern? If it did, please consider marking it as the accepted solution. If you found another approach, please share it with the community so others can benefit as well.

1 More Replies
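As background for the question, a hedged sketch of the service-principal (M2M) leg of such a flow: exchange OAuth client credentials for a workspace access token, which can then be passed to a SQL warehouse connection (e.g. via databricks-sql-connector or a JDBC/ODBC driver). The workspace URL and IDs are placeholders; the `/oidc/v1/token` endpoint and `client_credentials` grant follow Databricks' documented OAuth M2M mechanism.

```python
from urllib.parse import urlencode

WORKSPACE = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder

def token_request(client_id: str, client_secret: str):
    """Build the form-encoded OAuth token request for a service principal."""
    url = f"{WORKSPACE}/oidc/v1/token"
    body = {
        "grant_type": "client_credentials",
        "scope": "all-apis",
        "client_id": client_id,
        "client_secret": client_secret,
    }
    return url, urlencode(body)

url, encoded_body = token_request("<sp-application-id>", "<sp-oauth-secret>")
# A real call would POST encoded_body to url with Content-Type
# application/x-www-form-urlencoded and read "access_token" from the JSON
# response; that token is then supplied to the SQL warehouse connection.
```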
SravanThotakura
by New Contributor II
  • 1264 Views
  • 4 replies
  • 1 kudos

Resolved! Spark Job fails with No plan for OptimizedForeachBatchFastpath

Hi Team, I am trying to run a job on a Databricks 14.3 LTS cluster which streams data from Parquet to a custom sink. I am facing the error below. The same code used to work a month back; however, I am facing this issue recently. org.apache.spark.SparkException: ...

Latest Reply
Malthe
Contributor II
  • 1 kudos

Disabling the setting worked for me on 16.4 (serverless): spark.databricks.streaming.forEachBatch.optimized.fastPath.enabled=false. But you have to wonder if this is something Databricks are aware of and are looking to fix.

3 More Replies
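A small sketch of where the workaround from the reply can be applied; the flag name is taken verbatim from the reply, everything else is a placeholder.

```python
# Flag quoted from the reply above.
FASTPATH_FLAG = "spark.databricks.streaming.forEachBatch.optimized.fastPath.enabled"

def apply_workaround(spark_session) -> None:
    """Disable the foreachBatch fast path before starting the stream."""
    spark_session.conf.set(FASTPATH_FLAG, "false")

# Equivalent cluster-level setting (Advanced options -> Spark config):
#   spark.databricks.streaming.forEachBatch.optimized.fastPath.enabled false
```

Setting it per-session keeps the workaround scoped to the affected streaming job rather than the whole cluster.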
swee
by New Contributor
  • 633 Views
  • 1 reply
  • 1 kudos

Resolved! Establish Cross cloud connectivity between Azure Databricks and AWS s3

Hello. We have the cross-cloud configuration set as below: AWS - VPC, Transit Gateways, AWS Direct Connect; On-Premise Data Center; Azure - VNet, Transit VNet, Express Route. We are trying to create a Databricks storage credential as below. The AWS IAM obj...

Latest Reply
Sai_Ponugoti
Databricks Employee
  • 1 kudos

Hello @swee, Thank you for your query. If your storage account is private, you would need to establish a route to that storage account so you can read data. This is because if your storage is private, your storage account will block access to the publi...

asharkman
by New Contributor III
  • 3818 Views
  • 8 replies
  • 2 kudos

Resolved! Reporting serverless costs to azure costs

So, we've just recently applied serverless budget policies to some of our vector searches and apps. At the moment they're all going to Azure under one general tag that we created. However, we needed more definition. So I added the serverless budget pol...

Latest Reply
mrsimon0007
New Contributor II
  • 2 kudos

Billing or set up explicit export pipelines. Check whether your serverless budget policy tags are under a different namespace in Azure, as sometimes they show up nested.

7 More Replies
ricelso
by New Contributor II
  • 1276 Views
  • 3 replies
  • 3 kudos

Resolved! AWS-Databricks' workspace attached to a NCC doesn't generate Egress Stable IPs

I am facing an issue when configuring a Databricks workspace on AWS with a Network Connectivity Configuration (NCC). Even after attaching the NCC, the workspace does not generate Egress Stable IPs as expected. In the workspace configuration tab, under ...

Latest Reply
Sai_Ponugoti
Databricks Employee
  • 3 kudos

Hi @ricelso, Sorry to hear you are still facing this issue. This behaviour isn't expected; I would suggest you kindly raise this with your Databricks Account Executive, and they can raise a support request to get this investigated further. Please let ...

2 More Replies
akmukherjee
by New Contributor III
  • 7138 Views
  • 24 replies
  • 5 kudos

Resolved! Unable to enable Serverless Notebooks

Hello there, I have a Databricks Premium subscription but am not able to enable Serverless Notebooks (as that option does not seem to exist). I have gone through the Databricks documentation and have Unity Catalog enabled. I even opened a ticket (00591635) but it...

Latest Reply
Fellnerse
New Contributor II
  • 5 kudos

We have the exact same thing happening. The /config endpoint shows it is enabled, but we cannot select it. The account is several months old. @Walter_C, could you help out here as well?

23 More Replies
jp_allard1
by New Contributor
  • 1380 Views
  • 2 replies
  • 2 kudos

Resolved! Databricks One

Hello, I cannot find where to enable Databricks One in my workspace. Can someone help me understand where this is located or who can grant me access to this feature? I checked the "Previews" in my account and it is not there. Thanks in advance. Best, J...

Latest Reply
koji_kawamura
Databricks Employee
  • 2 kudos

Hi @jp_allard1, Databricks One is now in Public Preview. It is a workspace-level feature, so a user who has the Workspace Admin role should be able to enable it from the workspace Previews settings page, as shown in this screenshot.

1 More Replies
yumnus
by New Contributor III
  • 1871 Views
  • 7 replies
  • 0 kudos

Not able to connect to GCP Secret Manager except when using "No isolation shared" Cluster

Hey everyone, We're trying to access secrets stored in GCP Secret Manager using its Python package from Databricks on GCP. However, we can only reach the Secret Manager when using "No Isolation Shared" clusters, which is not an option for us. Currentl...

Latest Reply
blemgorfell
New Contributor II
  • 0 kudos

This is a huge issue. We are seeing the same thing. Google auth is broken for Databricks on GCP? Only with no-isolation enabled is it able to access the metadata service and get credentials. Why is the metadata service not reachable? I would be shocke...

6 More Replies
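For reference, a hedged sketch of the access pattern the thread describes, using the google-cloud-secret-manager package. The project and secret names are placeholders. On a standard (isolated) cluster the call fails when the GCE metadata service is unreachable; an explicit service-account key (or credentials stored in a Databricks secret scope) sidesteps the metadata dependency.

```python
def secret_version_name(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Fully qualified resource name expected by access_secret_version."""
    return f"projects/{project_id}/secrets/{secret_id}/versions/{version}"

name = secret_version_name("my-gcp-project", "db-password")

# With credentials available (metadata service reachable, or an explicit
# service-account key file), the lookup itself would be:
#   from google.cloud import secretmanager
#   client = secretmanager.SecretManagerServiceClient()
#   payload = client.access_secret_version(request={"name": name}).payload.data
```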
APJESK
by New Contributor III
  • 457 Views
  • 1 reply
  • 2 kudos

Resolved! Serverless Workspace Observability

I'm setting up observability for a Databricks serverless workspace on AWS and need some guidance. I know we can configure audit logs for S3 delivery, but I'm unsure if that alone is sufficient. For a complete observability setup, especially when integra...

Latest Reply
sarahbhord
Databricks Employee
  • 2 kudos

Hey @APJESK - thanks for reaching out! For comprehensive observability in a Databricks serverless workspace on AWS, particularly when integrating with tools like CloudWatch, Splunk, or Kibana, enabling audit log delivery to S3 is a crucial first st...

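A hedged sketch of the audit-log-to-S3 piece the reply mentions, shaped as the account-level log-delivery configuration body; all IDs are placeholders, and the field names follow the documented Databricks log delivery API.

```python
# Hypothetical log delivery configuration; credentials_id and
# storage_configuration_id come from previously registered account-level
# objects (an IAM role credential and an S3 bucket configuration).
AUDIT_DELIVERY = {
    "log_delivery_configuration": {
        "config_name": "audit-to-s3",
        "log_type": "AUDIT_LOGS",
        "output_format": "JSON",
        "credentials_id": "<credentials-id>",
        "storage_configuration_id": "<storage-config-id>",
        "delivery_path_prefix": "audit-logs",
    }
}
# POSTing this body to the account log-delivery endpoint lands JSON audit
# logs in S3, where CloudWatch/Splunk/Kibana forwarders can pick them up.
```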