Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

dermoritz
by Databricks Partner
  • 335 Views
  • 5 replies
  • 3 kudos

Spark config ignored in job run settings

I am talking about this setting (see screenshots). So far I have tried: spark.executor.cores 8 and spark.log.level INFO. Both are documented here: https://spark.apache.org/docs/latest/configuration.html, but I neither see any effect nor see them set when I check the Spark UI -> Environment ta...

Screenshot 2026-03-25 150815.png Screenshot 2026-03-25 151527.png
Latest Reply
emma_s
Databricks Employee
  • 3 kudos

Hi, you're not putting it in the wrong place; it's just that Databricks doesn't allow certain configs, because they are managed by Databricks for you. For example, the core Spark config you've shown above won't be recognised, as this is set by selected c...

4 More Replies
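The reply above notes that some Spark settings are managed by Databricks and silently ignored when supplied in job cluster config. A toy sketch of that idea, assuming a made-up denylist (the real set of managed keys is not published in the thread):

```python
# Hypothetical illustration only: Databricks manages certain Spark settings
# itself, so user-supplied values for them never take effect. The managed-key
# set below is an assumption for the sketch, not the real Databricks list.
MANAGED_KEYS = {"spark.executor.cores", "spark.executor.memory"}  # assumed

def effective_config(requested: dict) -> dict:
    """Return only the settings a user-supplied config could actually change."""
    return {k: v for k, v in requested.items() if k not in MANAGED_KEYS}

applied = effective_config({
    "spark.executor.cores": "8",   # managed by the cluster type -> dropped
    "spark.log.level": "INFO",     # passes the toy filter
})
print(applied)  # {'spark.log.level': 'INFO'}
```

In a real workspace, the reliable check is the one the original poster used: the Spark UI -> Environment tab shows which values actually took effect.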
abhijit007
by Databricks Partner
  • 442 Views
  • 2 replies
  • 1 kudos

Resolved! Running Browser Based Agentic Applications on Databricks

Hi, we are evaluating whether it is possible to host a browser‑based agentic application on Databricks. Our application performs frontend UI automation using the browser-use Python library and also exposes FastAPI endpoints to drive a UI. Application O...

Latest Reply
Lu_Wang_ENB_DBX
Databricks Employee
  • 1 kudos

TLDR: Databricks Apps/serverless won’t support this pattern; classic compute with Databricks Container Services is your only real option on Databricks, and even that has trade‑offs. For serious browser automation, run it off‑platform and integrate wi...

1 More Replies
NikeshPurohit
by Databricks Partner
  • 989 Views
  • 1 reply
  • 1 kudos

Unable to connect Claude connector to Databricks MCP server (Genie)

Hello everyone, I am trying to connect Claude's Custom Connector (Remote MCP) to a Databricks Managed MCP Server using a Genie space, but I keep encountering an authentication connection error. I would appreciate guidance on whether this is a configur...

Latest Reply
emma_s
Databricks Employee
  • 1 kudos

Hi, updating my previous post with the latest information. As of two weeks ago, there is a public preview for this; it needs to be enabled in your workspace. You also need to have the Claude IP added to Databricks if you have any IP workspace rest...

RikL
by Databricks Partner
  • 303 Views
  • 2 replies
  • 2 kudos

Resolved! Automatic Identity Management with Nested Groups and API Access

Hi all, I'm exploring Automatic Identity Management for synchronizing nested groups in Azure Databricks. According to the documentation, this feature supports nested groups: Automatic Identity Management. However, the same article notes that groups sync...

Latest Reply
emma_s
Databricks Employee
  • 2 kudos

Hi, I've just had a look internally and there is some discussion about making this functionality available, but I can't give you a definitive idea of when this might be. In terms of workarounds, the best one I can find is to use Tarracurl to make raw ...

1 More Replies
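The workaround suggested in the reply involves making raw API calls. As a hedged sketch of what such a call might look like, the snippet below only builds (and does not send) a request against the Databricks SCIM Groups endpoint; the hostname and token are placeholders, and whether this endpoint covers the nested-group case is not confirmed in the thread:

```python
# Sketch only: construct (but do not send) a GET request to the Databricks
# SCIM Groups API. Host and token below are placeholders, not real values.
import urllib.request

def scim_groups_request(host: str, token: str) -> urllib.request.Request:
    return urllib.request.Request(
        url=f"https://{host}/api/2.0/preview/scim/v2/Groups",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/scim+json",
        },
        method="GET",
    )

req = scim_groups_request("adb-123.4.azuredatabricks.net", "<PAT>")
print(req.full_url)
```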
margarita_shir
by New Contributor III
  • 330 Views
  • 4 replies
  • 0 kudos

Private Route53 hosted zone for cloud.databricks.com

We have Tailscale deployed on the same VPC as our Databricks workspaces. To enable PrivateLink for some workspaces, we created a private Route53 hosted zone for cloud.databricks.com with CNAME records pointing workspace hostnames to the PrivateLink e...

Latest Reply
margarita_shir
New Contributor III
  • 0 kudos

Thank you for your response. I added a new private hosted zone in Amazon Route 53 for privatelink.cloud.databricks.com, and created an A record for us-east-1.privatelink.cloud.databricks.com pointing to the private endpoint IP. I had a question regard...

3 More Replies
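The follow-up above describes creating an A record in a private Route 53 hosted zone. A minimal sketch of the ChangeBatch payload that boto3's `route53.change_resource_record_sets(...)` expects for such a record; the record name and IP below are placeholders, not the poster's actual values:

```python
# Sketch of the Route 53 ChangeBatch structure for an UPSERT of an A record
# in a (private) hosted zone. Values are placeholders for illustration.
def a_record_change(name: str, ip: str, ttl: int = 300) -> dict:
    return {
        "Changes": [{
            "Action": "UPSERT",  # create the record, or overwrite if present
            "ResourceRecordSet": {
                "Name": name,
                "Type": "A",
                "TTL": ttl,
                "ResourceRecords": [{"Value": ip}],
            },
        }]
    }

batch = a_record_change("us-east-1.privatelink.cloud.databricks.com", "10.0.0.5")
print(batch["Changes"][0]["ResourceRecordSet"]["Name"])
```

With boto3, this dict would be passed as the `ChangeBatch` argument alongside the private hosted zone's `HostedZoneId`.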
PradeepPrabha
by New Contributor III
  • 2373 Views
  • 6 replies
  • 0 kudos

Resolved! Any documentation mentioning connectivity from Azure SQL database to Azure Databricks

Is any documentation available on connecting from an Azure SQL database to an Azure Databricks SQL workspace? We created a SQL warehouse personal access token for a user in a different team who can connect from his on-prem SQL DB to Databricks using the conn...

Latest Reply
PradeepPrabha
New Contributor III
  • 0 kudos

Thank you for the detailed answer

5 More Replies
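The question above mentions connecting to a SQL warehouse with a personal access token and a connection string. As a hedged sketch, the helper below assembles the kind of ODBC connection string typically used with the Simba Spark ODBC driver for Databricks SQL; the driver name and keys follow common Simba conventions and should be verified against your installed driver, and all values shown are placeholders:

```python
# Assumed-convention sketch: build an ODBC connection string for a Databricks
# SQL warehouse using a personal access token (PAT). Keys follow typical
# Simba Spark ODBC driver usage; verify against your driver's documentation.
def databricks_odbc_conn_str(host: str, http_path: str, pat: str) -> str:
    parts = {
        "Driver": "Simba Spark ODBC Driver",
        "Host": host,
        "Port": "443",
        "HTTPPath": http_path,     # the warehouse's HTTP path
        "SSL": "1",
        "ThriftTransport": "2",    # HTTP transport
        "AuthMech": "3",           # user/password auth; UID is literally "token"
        "UID": "token",
        "PWD": pat,
    }
    return ";".join(f"{k}={v}" for k, v in parts.items())

print(databricks_odbc_conn_str(
    "adb-1234567890123456.7.azuredatabricks.net",   # placeholder host
    "/sql/1.0/warehouses/abcdef1234567890",         # placeholder HTTP path
    "<PAT>",
))
```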
abhijit007
by Databricks Partner
  • 772 Views
  • 4 replies
  • 3 kudos

Resolved! Security & Compliance understanding on LLM Usage in Databricks Genie and Agentbricks

Hi everyone, with the increasing focus on security and compliance for AI Agents and LLMs, I wanted to get some clarity on a couple of points related to Databricks Genie and Agentbricks. Could someone help provide detailed information on the following, ...

Latest Reply
Ashwin_DSA
Databricks Employee
  • 3 kudos

Hi @abhijit007, Please take a look at these pages. They answer your queries in detail for Genie. https://docs.databricks.com/genie - Covers architecture and how it works. Also covers security.  https://docs.databricks.com/databricks-ai/databricks-ai-...

3 More Replies
NatJ
by New Contributor III
  • 389 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks One Redirection

Hello, I have an Entra ID group linked to Databricks with the Consumer Access entitlement enabled; other entitlements are unchecked. They also have "use catalog" on a specific catalog. They have "select" and "use schema" on a gold-level schema wi...

Latest Reply
SteveOstrowski
Databricks Employee
  • 0 kudos

Hi @NatJ, You are correct that users with only the Consumer Access entitlement are intended to see the Databricks One interface when they log in. However, the behavior you are observing with direct URLs to the catalog explorer is expected, and here i...

1 More Replies
Valerio
by New Contributor
  • 1914 Views
  • 2 replies
  • 0 kudos

GitHub Actions OIDC with Databricks: wildcard subject for pull_request workflows

Hi, I'm configuring GitHub Actions OIDC authentication with Databricks, following the official documentation: https://docs.databricks.com/aws/en/dev-tools/auth/provider-github. When running a GitHub Actions workflow triggered by pull_request, authenticati...

Latest Reply
SteveOstrowski
Databricks Employee
  • 0 kudos

Hi @Valerio, the challenge you are running into is a common one when setting up OIDC federation for pull_request-triggered workflows. Here is a breakdown of the issue and several approaches to solve it. Understanding the subject claim for pull requests...

1 More Replies
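The thread above turns on how GitHub's OIDC subject claim differs between triggers: pull_request runs carry a subject shaped like `repo:ORG/REPO:pull_request`, while push runs carry `repo:ORG/REPO:ref:refs/heads/BRANCH`. A toy matcher, assuming fnmatch-style wildcards (the actual wildcard semantics a federation policy supports are not confirmed in the truncated excerpt):

```python
# Sketch of the subject-claim matching problem: a single exact-match policy
# cannot cover both pull_request and branch-ref subject shapes, but a
# wildcard pattern can. fnmatch wildcards are an assumption for illustration.
from fnmatch import fnmatch

def subject_allowed(subject: str, policy_patterns: list) -> bool:
    """True if the OIDC subject matches any configured pattern."""
    return any(fnmatch(subject, p) for p in policy_patterns)

patterns = ["repo:my-org/my-repo:*"]  # hypothetical wildcard policy
print(subject_allowed("repo:my-org/my-repo:pull_request", patterns))         # True
print(subject_allowed("repo:my-org/my-repo:ref:refs/heads/main", patterns))  # True
print(subject_allowed("repo:other-org/repo:pull_request", patterns))         # False
```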
vino2000
by New Contributor
  • 253 Views
  • 1 reply
  • 0 kudos

Payment with Netsuite Customer Payment Portal

I was using a trial account and the credit ran out. Now I have finished my first month and my card has been rejected. I received a warning email saying I should pay through a platform called the NetSuite Customer Payment Portal, but I can't log in...

Latest Reply
SteveOstrowski
Databricks Employee
  • 0 kudos

Hi @vino2000, Thanks for reaching out. I understand the frustration -- your trial credits ran out, your card was declined, and now you have received an email pointing you to the NetSuite Customer Payment Portal but you cannot log in or reset your pas...

GeertS
by New Contributor II
  • 704 Views
  • 5 replies
  • 1 kudos

Resolved! Hive Metastore - Disable Legacy Access option not found

Hi, I've just provisioned a new Databricks workspace and I would like to enable the Hive metastore for some testing. The problem is I cannot find the "Disable legacy access" setting. I went several times to Workspace Settings -> Security, but it is jus...

GeertS_0-1771429510264.png
Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Yes — that’s correct. In a new account, legacy features are disabled by design. That includes the Databricks-hosted workspace Hive metastore. You cannot enable or use Hive the way it was historically used inside a workspace. That path is closed. If y...

4 More Replies
ricelso
by New Contributor II
  • 2152 Views
  • 4 replies
  • 3 kudos

Resolved! AWS-Databricks' workspace attached to a NCC doesn't generate Egress Stable IPs

I am facing an issue when configuring a Databricks workspace on AWS with a Network Connectivity Configuration (NCC). Even after attaching the NCC, the workspace does not generate Egress Stable IPs as expected. In the workspace configuration tab, under ...

Latest Reply
majidfn
New Contributor II
  • 3 kudos

Hello @Sai_Ponugoti, I'm facing the exact same issue. The NCC doesn't generate the egress IPs. I'm on the premium plan, but on a trial period. I have added a payment method to the account, but the configuration still shows: "Egress Stable IPs: ." Could ...

3 More Replies
Ale_Armillotta
by Valued Contributor II
  • 618 Views
  • 2 replies
  • 1 kudos

Resolved! Asset Bundles + GitHub Actions: why does bundle deploy re-create UC schema and volume every run?

Hi everyone, I'm working with Databricks Asset Bundles and deploying via GitHub Actions (CI/CD). I'm seeing behavior I don't fully understand. On every pipeline run (fresh git checkout/pull and then databricks bundle deploy to the same target environme...

Latest Reply
stbjelcevic
Databricks Employee
  • 1 kudos

Hi @Ale_Armillotta, to answer each of your questions: Is this expected behavior for Asset Bundles? Yes, deploy is declarative and will attempt "create" whenever the bundle's tracked state doesn't already include that resource (names aren't used to co...

1 More Replies
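The reply above says deploy is declarative: a resource is "created" whenever it is missing from the bundle's tracked state, regardless of whether something with the same name already exists in the workspace. A toy illustration of that plan logic, with an invented state shape (the real Asset Bundles state format is not shown in the thread):

```python
# Toy model of declarative deploy planning: decisions are driven by IDs in
# tracked state, never by matching names against what exists in the
# workspace. The state representation here is invented for illustration.
def plan_actions(desired: dict, tracked_state_ids: set) -> dict:
    """Map each desired resource ID to 'create' or 'no-op'."""
    return {
        res_id: ("no-op" if res_id in tracked_state_ids else "create")
        for res_id in desired
    }

desired = {"schema.analytics": {}, "volume.raw_files": {}}
state = {"schema.analytics"}  # only the schema is tracked from a prior deploy
print(plan_actions(desired, state))
# {'schema.analytics': 'no-op', 'volume.raw_files': 'create'}
```

This is why a fresh checkout that loses or resets the tracked state makes every run look like a re-create, even when the workspace objects already exist.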
tinodj
by Databricks Partner
  • 895 Views
  • 6 replies
  • 1 kudos

Real-time output missing when using “Upload and Run File” from VS Code

I am running Python files on a Databricks cluster using the VS Code Databricks extension, specifically the "Upload and Run File" command. I cannot get real-time output in the Debug Console. I have checked the official docs: https://learn.microsoft.com/...

Latest Reply
tinodj
Databricks Partner
  • 1 kudos

@Dali1 I have even described the problem in https://github.com/databricks/databricks-vscode/issues/1813 and proposed a fix in https://github.com/databricks/databricks-vscode/pull/1814; unfortunately, no one seems to be interested in this.

5 More Replies
Fabi_DYM
by Databricks Partner
  • 599 Views
  • 3 replies
  • 1 kudos

Resolved! Error creating Git folder: Invalid Git provider credential although PAT is valid and cloning works o

Hi everyone, I'm getting this error when trying to create a Git folder in Databricks: "Error creating Git folder: Invalid Git provider credential for repository with URL [Placeholder]. How to fix: Please go to your remote Git provider to ensure that: You hav...

Latest Reply
saurabh18cs
Honored Contributor III
  • 1 kudos

How are you doing it: manually, during job deployment, or during job execution? Under your own identity or a service principal identity?

2 More Replies