Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Kayla
by Valued Contributor II
  • 653 Views
  • 1 reply
  • 0 kudos

GKE Cluster Shows "Starting" Even After It's Turned On

Curious if anyone else has run into this. After changing to GKE-based clusters, they all turn on but don't show as turned on - a cluster will show as "Starting" even while that same cluster is already active in the dropdown. "Changing" to t...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Yes, others have reported encountering this exact issue with Databricks clusters on Google Kubernetes Engine (GKE): after transitioning to GKE-based clusters, the UI may show clusters as "Starting" even though the cluster is already up and usable in ...
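
As a quick check, the backend state can be read directly with the Databricks Python SDK; if the API reports RUNNING while the UI says "Starting", the problem is purely cosmetic. A minimal sketch (the cluster ID is a placeholder):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # auth from env vars or ~/.databrickscfg

# Placeholder cluster ID; take it from the cluster URL or a clusters list call.
info = w.clusters.get(cluster_id="0613-abcdef-example123")
print(info.state)          # e.g. RUNNING even while the UI shows "Starting"
print(info.state_message)
```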

zaicnupagadi
by New Contributor
  • 177 Views
  • 1 reply
  • 1 kudos

Reaching out to Azure Storage with IP from Private VNET pool

Hey All, Is there a way for Databricks to reach out to Azure Storage using a private endpoint? We would like to omit enabling access by "all trusted services". All resources are in the same VNET, however when Databricks tries to reach out to Storage, instead...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

Yeah, it’s definitely possible for Databricks to hit Azure Storage through a private endpoint without turning on “allow trusted services.” The key is making sure everything’s using the private network path. Right now, that 10.0.35.x IP you’re seeing i...
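
One way to confirm the private path is to resolve the storage hostname from a notebook on the cluster; with the private endpoint and private DNS zone wired up correctly, it should come back with a VNET address rather than a public one. A minimal sketch (the account name is a placeholder):

```python
import socket

# Placeholder storage account; use your own *.blob or *.dfs hostname.
host = "mystorageaccount.dfs.core.windows.net"

# With the private endpoint + private DNS zone linked to the VNET, this
# should print an address from your VNET range, not a public Azure IP.
for *_, sockaddr in socket.getaddrinfo(host, 443):
    print(sockaddr[0])
```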

biubiurx
by New Contributor
  • 176 Views
  • 1 reply
  • 1 kudos

Power Automate Azure Databricks connector cannot get output result of a run

Hi everybody, I'm using the Azure Databricks connector in Power Automate and trying to trigger a job run + get the result of that single run. My job in Databricks runs a notebook that contains a single block of SQL code, and that's the only tas...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

Even though your Databricks job only has one task, Power Automate might still treat it as a multi-task job under the hood. That’s why you're getting the error when trying to fetch the output directly from the job run. Here’s a simple workaround you c...
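
The usual pattern is to fetch the parent run, pull out the task-level run_id, and call get-output on that; the notebook also needs to return its result via dbutils.notebook.exit. A minimal sketch against the Jobs 2.1 REST API (host, token, and run ID are placeholders):

```python
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"    # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder
run_id = 123456  # the job run ID Power Automate gets back from the trigger

# get-output only works on the task-level run, not the parent job run.
run = requests.get(f"{HOST}/api/2.1/jobs/runs/get",
                   headers=HEADERS, params={"run_id": run_id}).json()
task_run_id = run["tasks"][0]["run_id"]

out = requests.get(f"{HOST}/api/2.1/jobs/runs/get-output",
                   headers=HEADERS, params={"run_id": task_run_id}).json()
print(out.get("notebook_output"))  # whatever dbutils.notebook.exit returned
```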

Marco37
by Contributor II
  • 479 Views
  • 5 replies
  • 3 kudos

Resolved! Need help with setting up a connection from Databricks to an Azure SQL Database with the REST API

Good day, I need some help with automating a connection from Databricks to an Azure SQL Database. I'm able to configure the connection with the UI (Catalog Explorer), but I also want to configure it with the REST API (or a SQL script), so that I can inte...

Latest Reply
Marco37
Contributor II
  • 3 kudos

Hi Bianca, Thanks for your help. If I understand correctly, the "authorization_code" and "pkce_verifier" are normally generated by the "Sign in with Azure Entra ID" button when I configure a connection through the Catalog Explorer. My organization is ne...
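
For anyone scripting this end to end: the pkce_verifier / code-challenge pair does not have to come from the UI button; it can be generated locally. A minimal sketch of the standard RFC 7636 construction:

```python
import base64
import hashlib
import os

# Random verifier, then a SHA-256 challenge, both base64url without padding.
verifier = base64.urlsafe_b64encode(os.urandom(32)).rstrip(b"=").decode()
challenge = base64.urlsafe_b64encode(
    hashlib.sha256(verifier.encode()).digest()
).rstrip(b"=").decode()

print("pkce_verifier: ", verifier)    # send along with the authorization code
print("code_challenge:", challenge)   # embed in the Entra ID authorize URL
```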

4 More Replies
renancy
by New Contributor III
  • 3850 Views
  • 1 reply
  • 0 kudos

Using Databricks CLI for generating Notebooks not supported or not implemented

Hi, I'm a data engineer and recently developed a general-purpose notebook analytics template that I would like to become the standard at my company. Next, I created another notebook with a text widget that uses the user input to map the folder ...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The issue you’re facing is common among Databricks users who try to automate notebook cloning via shell commands or %sh magic, only to encounter format loss: exporting via %sh databricks workspace export or related commands typically results in .dbc,...
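
A minimal sketch of a format-preserving clone with the Databricks Python SDK, exporting in SOURCE format instead of .dbc (paths and language are placeholders):

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ExportFormat, ImportFormat, Language

w = WorkspaceClient()

# SOURCE keeps the notebook as plain source instead of a .dbc archive.
src = w.workspace.export("/Users/me@example.com/template",
                         format=ExportFormat.SOURCE)

w.workspace.import_(
    path="/Users/me@example.com/clones/new_notebook",  # placeholder target
    content=src.content,       # already base64-encoded by the export call
    format=ImportFormat.SOURCE,
    language=Language.PYTHON,
    overwrite=True,
)
```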

camilo_s
by Contributor
  • 4393 Views
  • 1 reply
  • 0 kudos

Bug when re-creating force deleted external location

When re-creating an external location that was previously force-deleted (because it had a soft-deleted managed table), the newly re-created external location preserves the reference to the soft-deleted managed table from the previous external locatio...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Databricks Unity Catalog currently maintains references to soft-deleted managed tables even after the associated external location is force-deleted and re-created with the same name and physical location, causing persistent deletion failures due to l...
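
Until the bug is fixed, one workaround (assuming the table is still inside the UNDROP retention window) is to restore the soft-deleted table, drop it cleanly, and only then drop the external location. A sketch with placeholder names, run from a Databricks notebook where spark is predefined:

```python
# Placeholder three-part names; UNDROP only works within the retention window.
spark.sql("UNDROP TABLE main.my_schema.my_table")
spark.sql("DROP TABLE main.my_schema.my_table")     # clean delete this time
spark.sql("DROP EXTERNAL LOCATION `my_location`")   # no lingering reference now
```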

thibault
by Contributor III
  • 3782 Views
  • 1 reply
  • 0 kudos

Streaming job update

Hi! Using bundles, I want to update a running streaming job. All is good until the new job gets deployed, but then the job has to be stopped and restarted manually so that the new assets are used. This might lead to the job running...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

To handle updates to streaming jobs automatically and ensure that new code or assets are picked up without requiring manual stops and restarts, you typically use one of the following approaches depending on your streaming framework and deployment env...
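
If no framework-level option fits, the restart can at least be scripted as a post-deploy step, e.g. with the Databricks Python SDK right after `databricks bundle deploy` (the job ID is a placeholder):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
job_id = 123456789  # placeholder: look it up from the deployed bundle

# Stop the run still executing the old assets, then start a fresh run
# that picks up the newly deployed code.
w.jobs.cancel_all_runs(job_id=job_id)
w.jobs.run_now(job_id=job_id)
```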

cyborg-de
by New Contributor
  • 4263 Views
  • 1 reply
  • 0 kudos

DNS resolution across VNETs

Hi, I have created a new Databricks workspace in Azure with back-end Private Link. Settings: Required NSG rules - No Azure Databricks Rule. NSG rules for AAD and azfrontdoor were added as per the documentation. Private endpoint with subresource databric...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Based on your description, the error when creating a Databricks compute cluster in Azure with Private Link is likely due to DNS resolution issues between the workspace VNET and the separate VNET hosting your private DNS zone. Even with VNET peering a...
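
A quick test from a VM inside the workspace VNET can confirm whether the private DNS zone is actually being consulted; a working setup resolves the workspace FQDN to the private endpoint IP. A minimal sketch (the hostname is a placeholder):

```python
import ipaddress
import socket

# Placeholder; use your adb-*.azuredatabricks.net workspace hostname.
host = "adb-1234567890123456.7.azuredatabricks.net"

for *_, sockaddr in socket.getaddrinfo(host, 443):
    ip = ipaddress.ip_address(sockaddr[0])
    # PUBLIC here usually means the private DNS zone isn't linked to the
    # VNET doing the resolving.
    print(ip, "private" if ip.is_private else "PUBLIC")
```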

Junda
by New Contributor III
  • 3956 Views
  • 1 reply
  • 0 kudos

How to install private repository as package dependency in Databricks Workflow

I am a member of the development team at our company, and we use Databricks as a sort of ETL tool. We utilize Git integration for our program and run Workflows on a daily basis. Recently, we created another company-internal private Git repository and wan...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

You can install and use private repository packages in Databricks workflows in a scalable and secure way, but there are trade-offs and best practices to consider for robust, team-friendly automation. Here's a direct answer and a breakdown of solution...
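
One common pattern is to keep a repo token in a secret scope and install the private package at the start of the job; a minimal notebook sketch (scope, key, org, and package names are placeholders):

```python
import subprocess
import sys

# dbutils is available in Databricks notebooks; scope/key are placeholders.
token = dbutils.secrets.get(scope="ci", key="github-token")

# Install straight from the private repo without persisting the token
# anywhere in code or cluster config.
subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    f"git+https://{token}@github.com/my-org/my-private-pkg.git@v1.2.0",
])
```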

songhan89
by New Contributor
  • 3972 Views
  • 1 reply
  • 0 kudos

Why is writing direct to Unity Catalog Volume slower than to Azure Blob Storage (xarray -> zarr)

Hi, I have some workloads where I need to export an xarray object to a Zarr store. My UC Volume is backed by ADLS. I ran a simple benchmark and found that the UC Volume is considerably slower. a) Using an fsspec ADLS store pointing to the same containe...

Administration & Architecture
Unity Catalog
Volume
Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Writing directly to a Unity Catalog (UC) Volume in Databricks is often slower than writing to Azure Blob Storage (ADLS) using an fsspec-based store, especially for workloads exporting xarray objects to Zarr. This performance gap has been noted and di...
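
The gap is straightforward to reproduce; a minimal benchmark sketch writing the same dataset both ways (account, container, and Volume names are placeholders; the abfs path assumes adlfs is installed and credentials are configured):

```python
import time
import fsspec
import numpy as np
import xarray as xr

ds = xr.Dataset({"x": (("a", "b"), np.random.rand(2000, 2000))})

# (a) fsspec straight to ADLS -- placeholder account/container, plus your
# adlfs credentials (e.g. account_key=... or a default Azure credential).
store = fsspec.get_mapper(
    "abfs://mycontainer@myaccount.dfs.core.windows.net/bench.zarr",
    account_name="myaccount",
)
t0 = time.time()
ds.to_zarr(store, mode="w")
print("fsspec/ADLS:", time.time() - t0)

# (b) the same write through the UC Volume FUSE mount.
t0 = time.time()
ds.to_zarr("/Volumes/main/my_schema/my_vol/bench.zarr", mode="w")
print("UC Volume: ", time.time() - t0)
```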

gyorgyjelinek
by New Contributor II
  • 3712 Views
  • 1 reply
  • 0 kudos

How to calculate accurate usage cost for a longer contractual period?

Hi Experts! I'm working on providing an accurate total cost (in both DBU and USD) calculation for my team for the whole ongoing contractual period. I've checked the following four options: Account console: Manage account - Usage - Consumption (Legacy): t...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Based on your description, the REST API for billable usage logs (Option 4) is likely the most comprehensive and reliable method for retrieving usage and cost data for the full contractual period, including potentially the missing first two months. Th...
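
A minimal sketch of pulling the full-period billable usage through the account-level REST API (account ID, host, token, and months are placeholders; the accounts host differs per cloud):

```python
import requests

ACCOUNT_ID = "00000000-0000-0000-0000-000000000000"          # placeholder
HOST = "https://accounts.azuredatabricks.net"                 # varies by cloud
HEADERS = {"Authorization": "Bearer <account-level-token>"}   # placeholder

resp = requests.get(
    f"{HOST}/api/2.0/accounts/{ACCOUNT_ID}/usage/download",
    headers=HEADERS,
    params={"start_month": "2024-01",   # contract start (placeholder)
            "end_month": "2024-12",
            "personal_data": "false"},
)
resp.raise_for_status()
open("billable_usage.csv", "wb").write(resp.content)  # CSV per workspace/SKU
```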

carlos_tasayco
by Contributor
  • 3375 Views
  • 1 reply
  • 1 kudos

Get managedResourceGroup from serverless

Hello, in my job I have a task where I need to modify a notebook to dynamically get the environment. This is how we get it: dic = {"D":"dev", "Q":"qa", "P":"prod"}; managedResourceGroup = spark.conf.get("spark.databricks.xxxxx"); xxxxx_Index = m...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

To dynamically detect your Databricks environment (dev, qa, prod) in a serverless notebook, without relying on manual REST API calls, you typically need a reliable way to extract context directly inside the notebook. However, serverless notebooks oft...
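
A heavily hedged sketch of one such fallback: derive the environment from the workspace URL instead of the managed resource group, assuming spark.databricks.workspaceUrl is readable in your serverless environment (it may not be) and using placeholder workspace prefixes:

```python
# Map per-workspace URL prefixes to environments (placeholders).
envs = {"adb-111": "dev", "adb-222": "qa", "adb-333": "prod"}

# workspaceUrl is commonly readable on classic compute; on serverless it
# may be restricted, in which case a notebook-context lookup is needed.
url = spark.conf.get("spark.databricks.workspaceUrl")
env = next((v for k, v in envs.items() if k in url), "unknown")
print(env)
```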

AxelM
by New Contributor
  • 3563 Views
  • 1 reply
  • 0 kudos

Only absolute paths are currently supported. Paths must begin with '/'

I am facing the above issue when using the Python Databricks SDK. I retrieve the job definition with "client.jobs.get()" and then try to create it on another workspace with "client.jobs.create()". Therefore the job definition is correct and working fine o...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The error "Only absolute paths are currently supported. Paths must begin with '/'" in the context of the Databricks Python SDK means that when creating a job, the notebook path you provide must be an absolute workspace path (like /Users/username/note...
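
A minimal sketch of copying a job between workspaces while normalizing any relative notebook paths first (CLI profiles and job ID are placeholders; if the job uses a git_source, its paths are relative by design and should be left alone):

```python
from databricks.sdk import WorkspaceClient

src = WorkspaceClient(profile="source")   # placeholder CLI profiles
dst = WorkspaceClient(profile="target")

settings = src.jobs.get(job_id=123456789).settings  # placeholder job ID

# jobs.create rejects relative paths, which jobs.get can return for jobs
# whose notebooks live in a Git folder / bundle deployment.
for task in settings.tasks or []:
    nt = task.notebook_task
    if nt and not nt.notebook_path.startswith("/"):
        nt.notebook_path = "/" + nt.notebook_path

new_job = dst.jobs.create(name=settings.name, tasks=settings.tasks)
print(new_job.job_id)
```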

AiswaryaS
by New Contributor II
  • 3099 Views
  • 1 reply
  • 0 kudos

Query has been timed out due to inactivity while connecting from Tableau Prep

Hi, we are experiencing a "Query timed out" error while running Tableau flows with connections to Databricks. The query history for the Serverless SQL warehouse initially shows the query as finished in Databricks, but later the status changes to "Query has been ...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The "Query has been timed out due to inactivity" error with Tableau flows connected to Databricks Serverless SQL Warehouse is a known and intermittent issue impacting several users, even when the SQL warehouse does not auto-terminate during the proce...

Saurabh_kanoje
by New Contributor
  • 178 Views
  • 2 replies
  • 2 kudos

Resolved! Learning Databricks

Hi All, I am new to Databricks and trying to learn my way around. I have experience in platform administration and platform integration and management roles. Can someone please suggest a learning path around platform administration, and is t...

Latest Reply
bianca_unifeye
New Contributor III
  • 2 kudos

Hi @Saurabh_kanoje, welcome to the Databricks community! In the Databricks Academy, there’s a free course called Databricks Platform Administration Fundamentals, which is a great starting point. I’d also recommend exploring the Azure, AWS, and GCP Data...

1 More Reply