Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

IvanPopov
by New Contributor
  • 105 Views
  • 1 reply
  • 0 kudos

Does Databricks support HNS in GCP?

Hello, I need to set up some buckets in GCP which will be used as an analytics and production data lake. I am getting diverging feedback on whether hierarchical namespaces (HNS) should be enabled for these buckets. On one hand, HNS is advisable for ana...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @IvanPopov, according to the docs, Google Cloud Storage hierarchical namespace (HNS) is not supported with external locations. You must disable hierarchical namespace before creating an external location.

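To make the reply above actionable: HNS is fixed at bucket creation time and cannot be toggled afterwards, so the bucket has to be created without it from the start. A provisioning sketch with the gcloud CLI (bucket name and location are placeholders):

```shell
# HNS is chosen at creation time: create the bucket WITHOUT
# --enable-hierarchical-namespace so it stays a flat-namespace bucket
# that a Databricks external location can point at.
gcloud storage buckets create gs://my-analytics-lake \
    --location=europe-west1 \
    --uniform-bucket-level-access
```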
nk-five1
by New Contributor II
  • 486 Views
  • 7 replies
  • 3 kudos

UC volumes not usable in Apps?

I have to install a custom library in a Python Databricks App. According to the documentation, this should be possible through UC volumes: https://docs.databricks.com/aws/en/dev-tools/databricks-apps/dependencies#install-wheel-files-from-unity-catal...

Administration & Architecture
App
UC volumes
Unity Catalog
Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

 

6 More Replies
skarpeck
by New Contributor III
  • 91 Views
  • 0 replies
  • 0 kudos

Lakebase not accessible in Private Network

We have a VNET-injected workspace in Azure. There are multiple SQL Warehouses which are easily accessible from the private network - both directly from a VM and via VPN on a client's machine. We deployed Lakebase. Inside the workspace, connectivity is working...

AlekseiDiaz
by New Contributor II
  • 207 Views
  • 2 replies
  • 0 kudos

Internet Access from Serverless Databricks - free trial

Hi community. I started to use the Databricks quick-setup free trial and I have been trying to access the internet from a Python notebook, but I haven't been able to do so. Even my UI is different. Is it because I am using the free trial?

Latest Reply
AlekseiDiaz
New Contributor II
  • 0 kudos

I changed the setup and linked it to an AWS workspace. It doesn't raise any error now. But I was using requests

1 More Replies
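For anyone debugging the same symptom, a minimal stdlib-only probe makes it easy to tell blocked network egress apart from a code problem (the URL is just a placeholder target):

```python
import urllib.request
import urllib.error

def probe(url: str, timeout: float = 5.0):
    """Return the HTTP status code, or None if egress is blocked/unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except (urllib.error.URLError, TimeoutError):
        return None

# On compute with open egress this prints a status code such as 200;
# on a network-restricted (e.g. trial/serverless) environment it prints None.
print(probe("https://pypi.org"))
```

If the probe returns None while the same code works on classic compute, the issue is the environment's egress policy, not the `requests` call itself.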
libpekin
by New Contributor
  • 101 Views
  • 0 replies
  • 0 kudos

Databricks Free Edition Account Migration

Hello, I set up a Databricks Free Edition account with the intention of running it on Azure, since my environment is based in the Azure cloud. However, the account was provisioned on AWS instead. Is there a way to migrate it? Please provide the steps ...

rmusti
by New Contributor II
  • 188 Views
  • 3 replies
  • 1 kudos

What is the maximum number of workspaces per account on GCP?

I found this in the docs: "you can create at most 200 workspaces per week in the same Google Cloud project" - but that directly contradicts the 20 limit mentioned in the resource limits docs. But Azure has no limit, and AWS has a limit of 50. So...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @rmusti, this is a bit confusing but not contradictory. Here's the important line in the docs: "For limits where Fixed is No, you can request a limit increase through your Databricks account team." So, below you have a table with resource limits. In...

2 More Replies
AxelM
by New Contributor
  • 3729 Views
  • 3 replies
  • 0 kudos

Only absolute paths are currently supported. Paths must begin with '/'

I am facing the above issue when using the Python Databricks SDK. I retrieve the job definition with "client.jobs.get()" and then try to create it on another workspace with "client.jobs.create()". Therefore the job definition is correct and working fine o...

Latest Reply
iyashk-DB
Databricks Employee
  • 0 kudos

You’re hitting a Jobs validation rule that depends on where the notebook is sourced from. With Git-sourced jobs, notebook paths must be relative; with workspace-sourced jobs, paths must be absolute and start with “/”. If a task’s source is treated as...

2 More Replies
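The rule in the reply above (Git-sourced tasks need relative paths, workspace-sourced tasks need absolute paths starting with "/") can be enforced when copying job definitions between workspaces. A hypothetical helper, not part of the Databricks SDK:

```python
def normalize_notebook_path(path: str, source: str) -> str:
    """Adjust a notebook task path to match its source type.

    Jobs whose task source is GIT expect paths relative to the repo root,
    while WORKSPACE-sourced jobs require absolute paths starting with '/'.
    """
    if source == "GIT":
        return path.lstrip("/")  # Git-sourced: must be relative
    if source == "WORKSPACE":
        # Workspace-sourced: must be absolute
        return path if path.startswith("/") else "/" + path
    raise ValueError(f"unknown task source: {source}")

print(normalize_notebook_path("Repos/etl/main", "WORKSPACE"))  # /Repos/etl/main
print(normalize_notebook_path("/etl/main", "GIT"))             # etl/main
```

Running each task's `notebook_path` through a normalizer like this before `client.jobs.create()` avoids the "Paths must begin with '/'" validation error when the source type changes between workspaces.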
158576
by New Contributor
  • 120 Views
  • 1 reply
  • 0 kudos

Mounting a CIFS volume on all-purpose compute results in permission denied

I have all the networking already set up: nslookup resolves the NAS server IP and connectivity is enabled from the worker nodes to the NAS server. I am able to mount the same NAS drive outside of Databricks, I mean a standalone Linux VM in the same VPC where the worker nodes...

Latest Reply
siva-anantha
Contributor
  • 0 kudos

Hello, could you provide more information about why you want to attach a NAS drive to a Databricks cluster, please? I am no expert in storage. As far as I understand, NAS will suffer from IO and replication bottlenecks when attached to distributed ...

zaicnupagadi
by New Contributor II
  • 269 Views
  • 2 replies
  • 4 kudos

Resolved! Reaching out to Azure Storage with IP from Private VNET pool

Hey all, is there a way for Databricks to reach out to Azure Storage using a private endpoint? We would like to omit enabling access by "all trusted services". All resources are in the same VNET; however, when Databricks tries to reach out to Storage, instead...

Latest Reply
zaicnupagadi
New Contributor II
  • 4 kudos

Sorry for late reply - thank you for your help Nayan!

1 More Replies
maikel
by New Contributor III
  • 460 Views
  • 4 replies
  • 1 kudos

Agent outside databricks communication with databricks MCP server

Hello Community! I have the following use case in my project: User -> AI agent -> MCP Server -> Databricks data from Unity Catalog. - The AI agent is not created in Databricks. - The MCP server is created in Databricks and should expose tools to get data fr...

Latest Reply
maikel
New Contributor III
  • 1 kudos

Hello @mark_ott, thank you very much for this! It gives me a lot of knowledge! I think that since the data is stored in Databricks, we will go with the MCP deployed there as well. I have the next batch of questions - can you please tell me how to deploy my MCP se...

3 More Replies
Dharma25
by New Contributor III
  • 225 Views
  • 2 replies
  • 1 kudos

Task Hanging issue on DBR 15.4

Hello, I am running a structured streaming pipeline with 5 models loaded using pyfunc.spark_udf. Lately we have been noticing a very strange issue of tasks hanging, and batches are taking a very long time to finish execution. CPU utilization is around...

Screenshot 2025-11-27 at 10.24.41 PM.png
Latest Reply
bianca_unifeye
Contributor
  • 1 kudos

On DBR 15.4 the DeadlockDetector: TASK_HANGING message usually just means Spark has noticed some very long-running tasks and is checking for deadlocks. With multiple pyfunc.spark_udf models in a streaming query the tasks often appear “stuck” because ...

1 More Replies
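One common mitigation for the symptom above is making sure each model is loaded at most once per worker process rather than on every batch. A minimal, self-contained sketch of the memoized-loader pattern, where `load_model` is a hypothetical stand-in for an expensive loader such as `mlflow.pyfunc.load_model`:

```python
import functools

calls = []  # instrumentation for the sketch, so we can see how often loading happens

def load_model(uri):
    """Stand-in for an expensive model loader (e.g. mlflow.pyfunc.load_model)."""
    calls.append(uri)
    return f"model:{uri}"

@functools.lru_cache(maxsize=None)
def get_model(model_uri: str):
    """Load each model at most once per process; later calls reuse the cached object."""
    return load_model(model_uri)

m1 = get_model("models:/churn/3")
m2 = get_model("models:/churn/3")
print(len(calls))  # the underlying loader ran exactly once
```

Inside a streaming UDF, calling `get_model(...)` per batch then only pays the load cost once per executor process, which removes one frequent cause of long "hanging" tasks.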
ismaelhenzel
by Contributor III
  • 361 Views
  • 4 replies
  • 3 kudos

Resolved! Asset Bundles vs Terraform

I would like to understand the differences between Terraform and Asset Bundles, especially since in some cases, they can do the same thing. I’m not talking about provisioning storage, networking, or the Databricks workspace itself—I know that is Terr...

Latest Reply
Coffee77
Contributor III
  • 3 kudos

First, DAB uses Terraform in the background. Having said that, my recommendation is to use DAB for whatever components it already supports, and other tools only for IaC not yet supported or non-Databricks-specific (private VNets, external storage, etc.) ...

3 More Replies
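To make the recommendation above concrete, this is roughly what the DAB side looks like: a minimal `databricks.yml` asset bundle declaring one job (bundle name, host, and paths are placeholders):

```yaml
bundle:
  name: my_etl_bundle

targets:
  dev:
    workspace:
      host: https://adb-1234567890.0.azuredatabricks.net

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/etl.ipynb
```

Deploying with `databricks bundle deploy -t dev` then drives Terraform under the hood, which is why mixing DAB (for workspace resources) with plain Terraform (for cloud infrastructure) works cleanly.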
GeraldBriyolan
by New Contributor II
  • 255 Views
  • 3 replies
  • 0 kudos

Databricks Federated Token Exchange Returns HTML Login Page Instead of Access Token(GCP →Databricks)

Hi everyone, I'm trying to implement federated authentication (token exchange) from Google Cloud -> Databricks without using a client ID / client secret, using only a Google-issued service account token. I have also created a federation policy in Databr...

GeraldBriyolan_0-1764050266136.png
Administration & Architecture
Federation Policy
GCP
Token exchange
Latest Reply
WiliamRosa
Contributor III
  • 0 kudos

You might want to check whether the issue is related to your federation policy configuration. Try reviewing the following documentation to confirm that your policy is correctly set up (issuer, audiences, and other expected claims): https://docs.databri...

2 More Replies
rpl
by Contributor
  • 4173 Views
  • 14 replies
  • 2 kudos

Resolved! Which API to use to list groups in which a given user is a member

Is there an API that can be used to list groups in which a given user is a member? Specifically, I’d be interested in account (not workspace) groups.It seems there used to be a workspace-level list-parents API referred to in the answers to this quest...

Latest Reply
noorbasha534
Valued Contributor II
  • 2 kudos

There will be 2 system tables soon - users and groups. Probably that makes life easier. I have already asked the Databricks RAS assigned to my customer to enable these for our control plane

13 More Replies
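Until those system tables land, the usual route is the account-level SCIM Groups API with a membership filter. A sketch that only builds the request URL; the account ID and user ID are placeholders, and whether `members.value` is an accepted filter attribute should be verified against the SCIM docs for your platform:

```python
from urllib.parse import quote

def account_groups_url(account_id: str, user_id: str) -> str:
    """Build the account-level SCIM request URL listing groups that contain a user."""
    base = (
        "https://accounts.cloud.databricks.com"
        f"/api/2.0/accounts/{account_id}/scim/v2/Groups"
    )
    # SCIM filter expression, URL-encoded: members.value eq "<user-id>"
    flt = quote(f'members.value eq "{user_id}"')
    return f"{base}?filter={flt}"

print(account_groups_url("my-account-id", "1234567890"))
```

The resulting URL can then be called with any HTTP client using an account-level token; on Azure the accounts host differs (`accounts.azuredatabricks.net`).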