Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

LEE_SUKJUN
by New Contributor II
  • 108 Views
  • 1 reply
  • 0 kudos

Inquiring about the location where a metastore resource is created

 I created Databricks on AWS today. Of course, we're planning to switch to a paid plan. However, the metastore was created in the US region. I'm in Korea (APJ); is the metastore only run in the US? Does the metastore have any impact if I query or...

Latest Reply
Coffee77
Contributor III
  • 0 kudos

Hi @LEE_SUKJUN , I think the general principle should be to keep all components (metastore, workspace, and cloud storage) in the same region in order to avoid cross-region latency, data egress costs, and compliance issues. Concerning number of metasto...

DazMunro
by New Contributor
  • 129 Views
  • 2 replies
  • 0 kudos

Using or integrating Silver or Gold Zone data in an Operational API

I am looking to understand what sort of approach we can take to use Silver or Gold zone data in an operational-style API, or even if we should. We have data that makes its way to the Silver and Gold zones in our Medallion Architecture and it kind of...

Latest Reply
Rjdudley
Honored Contributor
  • 0 kudos

Databricks is an analytics system and isn't optimized to perform as an OLTP system.  Additionally, Databricks compute can scale to zero if you set it to do so.  This means if you want to use gold/silver data in a real-time way you need to keep a clus...
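A common workaround the reply hints at is to put a thin read-through cache between the operational API and the warehouse, so every request does not require a warm cluster. A minimal sketch; the loader, table name, and TTL are illustrative assumptions, not anything from the thread (in practice the loader would run a query via databricks-sql-connector):

```python
import time

class GoldCache:
    """Tiny read-through cache so an operational API does not hit the
    warehouse on every request. The loader is injected so the pattern
    is testable offline; on Databricks it would execute something like
    'SELECT * FROM gold.customer_summary' (hypothetical table)."""

    def __init__(self, loader, ttl_seconds=300):
        self._loader = loader          # callable returning the current rows
        self._ttl = ttl_seconds        # how long a snapshot stays fresh
        self._value = None
        self._loaded_at = 0.0

    def get(self):
        # Refresh only when empty or stale; otherwise serve from memory.
        now = time.time()
        if self._value is None or now - self._loaded_at > self._ttl:
            self._value = self._loader()
            self._loaded_at = now
        return self._value
```

The same idea scales up to a real operational store (Redis, Postgres, or Databricks Online Tables) fed on a schedule from the Gold zone.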

1 More Replies
pranav5
by New Contributor II
  • 113 Views
  • 1 reply
  • 1 kudos

Trying to Back Up Dashboards and Queries from our Workspace

We are using a Databricks workspace and our IT team is decommissioning it, as our time with it is ending. I have developed many queries and dashboards. I want to copy these; unfortunately, when I download using zip or .dbc, these queries or dashboar...

Latest Reply
bianca_unifeye
New Contributor III
  • 1 kudos

Notebooks
These are the easiest assets to back up. You can export them individually or in bulk as:
  • .dbc – Databricks archive format (can re-import directly into a new workspace)
  • .source or .py – raw code export (ideal for version control)
To download in b...
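For legacy SQL queries and dashboards, which the poster notes are missing from zip/.dbc exports, one approach is to pull their definitions over the workspace's REST API and persist them to disk. A sketch of the file-writing half; the assumed dict shape with 'id' and 'query' keys is an illustration, and the actual API payload fields may differ:

```python
import json
import pathlib

def save_query_definitions(queries, out_dir):
    """Write each query definition (as fetched from the workspace's SQL
    queries REST API) to its own JSON file so it survives workspace
    decommissioning. `queries` is a list of dicts assumed to carry at
    least an 'id' and a 'query' (SQL text) field."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    for q in queries:
        path = out / f"{q['id']}.json"
        path.write_text(json.dumps(q, indent=2))
        written.append(path)
    return written
```

The saved JSON can then be replayed against a new workspace's API, or the SQL text checked into version control.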

jzu
by New Contributor II
  • 683 Views
  • 5 replies
  • 1 kudos

Problem with Metastore

Hello community. We are facing an issue when deploying and configuring the metastore using Terraform. We are using an Azure DevOps pipeline for deployment. The identity running the pipeline is a managed identity and it's set as account admin in the Account porta...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

@jzu , is this a new error, or is it the same as before? I need more details, please. Louis.

4 More Replies
ambigus9
by Contributor
  • 3985 Views
  • 2 replies
  • 0 kudos

RStudio on Dedicated Cluster: Invalid Access Token

Hello!! Currently I have RStudio installed on a dedicated cluster on Azure Databricks; here are the specs. I must emphasize the Access mode: Manual and Dedicated to a Group. Here, we install RStudio using a notebook with the following...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

You’re seeing two key issues with your RStudio Server on Azure Databricks:
  • RStudio stops working after 1–2 days.
  • You get permission errors using sparklyr and can’t update the Connections pane.
Let’s address each:
1. RStudio Server Stops Working A...

1 More Replies
Carelytix
by New Contributor II
  • 119 Views
  • 1 reply
  • 1 kudos

Resolved! Signing a BAA requires Compliance Security Profile activation

Hello folks, I need help with enabling the compliance security profile for my account. I need this to execute/sign a HIPAA BAA on my account. For this, I need the Enhanced Security & Compliance add-on. I first reached out to help@databricks.com...

Latest Reply
Carelytix
New Contributor II
  • 1 kudos

I was able to turn this feature on by upgrading the plan to "Enterprise". Thanks!

unj1m
by New Contributor III
  • 3576 Views
  • 1 reply
  • 0 kudos

Getting "Data too long for column 'session_data'" creating a CACHE table

Hi, I'm trying to leverage CACHE TABLE to create temporary tables that are cleaned up at the end of the session. In creating one of these, I'm getting "Data too long for column 'session_data'". The query I'm using isn't referencing a session_data colu...

Latest Reply
stbjelcevic
Databricks Employee
  • 0 kudos

Thanks for sharing the details; this is a common point of confusion with caching versus temporary objects in Databricks.
What’s likely happening
The error message “Data too long for column 'session_data'” is emitted by the metastore/metadata persiste...
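One workaround consistent with this explanation is to keep the cached object session-scoped: create a TEMPORARY VIEW (which disappears at session end) and cache it, so the large query plan lives with Spark rather than being persisted by the metastore. A sketch that just assembles the two statements; the view name and query are placeholders, and this is an assumed pattern rather than the poster's exact fix:

```python
def session_temp_cache_sql(view_name, select_sql):
    """Build the statements for a session-scoped cached temp view.
    The TEMPORARY VIEW is dropped automatically when the session ends,
    and CACHE TABLE pins its result in memory; run both via spark.sql()."""
    return [
        f"CREATE OR REPLACE TEMPORARY VIEW {view_name} AS {select_sql}",
        f"CACHE TABLE {view_name}",
    ]

# On a cluster you would then run (sketch, not executed here):
# for stmt in session_temp_cache_sql("tmp_sessions", "SELECT * FROM events WHERE dt = current_date()"):
#     spark.sql(stmt)
```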

sebastiandz
by New Contributor
  • 3723 Views
  • 1 reply
  • 0 kudos

Azure Network Connectivity Configurations API failing

It seems like since yesterday evening (Europe time) there's a platform-side issue with the Network Connectivity Configurations API on Azure Databricks accounts. API calls are being redirected to a login page, causing multiple different tools, such as Terr...

Latest Reply
stbjelcevic
Databricks Employee
  • 0 kudos

I can't speak to this specific incident as it was over a year ago, but for future reference:
Monitoring Databricks service health
Please subscribe to the Azure Databricks Status Page for your region(s) and the specific components you use. You can opt...

axelboursin
by New Contributor II
  • 4549 Views
  • 2 replies
  • 3 kudos

Databricks and AWS CodeArtifact

Hello, I saw multiple topics about it, but I need explanations and a solution. In my context, we have developers developing Python projects, like X. In Databricks, we have a cluster with a library of the main project A that is dependent on X.p...

Latest Reply
stbjelcevic
Databricks Employee
  • 3 kudos

Hi @axelboursin , I think this article will help you out: https://docs.databricks.com/aws/en/admin/workspace-settings/default-python-packages (option 1 below).
Recommended approaches (choose based on your environment):
For broad, consistent behavior...

1 More Replies
jeffreym9
by New Contributor III
  • 3242 Views
  • 1 reply
  • 0 kudos

Databricks Managed MLflow with Different Unity Catalog for Multi-tenant Production Tracing

Does Databricks Managed MLflow only trace LLM traffic through a Serving Endpoint? Does it support manual tracing in my LLM application with the decorator @mlflow.trace? Also, how can Databricks Managed MLflow support multi-tenant cases where traces need ...

Latest Reply
stbjelcevic
Databricks Employee
  • 0 kudos

Hi @jeffreym9 , Managed MLflow Tracing captures traces in notebooks, local IDEs, jobs, and Serving Endpoints, via autologging or manual instrumentation, so it's not limited to Serving Endpoints. Set your tracking URI to databricks and traces are...

Marco37
by Contributor II
  • 428 Views
  • 3 replies
  • 4 kudos

Resolved! Writing data from Azure Databricks to Azure SQL Database

Good day, We have a customer who wanted to query an Azure SQL Database from a Databricks notebook. We have configured a connection and a catalog for him, and he is able to query the Azure SQL Database. Now he has a new request: he also wants to write d...

Latest Reply
Coffee77
Contributor III
  • 4 kudos

You can customize the code below, which makes use of the Spark SQL Server access connector, as per your needs:
def PersistRemoteSQLTableFromDF(
    df: DataFrame,
    databaseName: str,
    tableName: str,
    mode: str = "overwrite",
    schemaName: str ...
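For reference, a self-contained sketch of the same write path. The option names follow the Spark SQL Server connector (df.write.format("sqlserver")), but the host, table, and credential values are placeholders, and the helper name is my own, not the poster's:

```python
def sqlserver_writer_options(host, database, table, user, password, port=1433):
    """Build the option map for Spark's SQL Server connector.
    All values here are placeholders; in practice credentials should
    come from a secret scope, not literals."""
    return {
        "host": host,
        "port": str(port),        # connector expects string options
        "database": database,
        "dbtable": table,         # may include schema, e.g. "dbo.mytable"
        "user": user,
        "password": password,
    }

# On a cluster you would then write (sketch, not executed here):
# (df.write.format("sqlserver")
#    .options(**sqlserver_writer_options("myserver.database.windows.net",
#                                        "mydb", "dbo.mytable", "user", "pwd"))
#    .mode("overwrite")
#    .save())
```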

2 More Replies
JeremySu
by New Contributor III
  • 733 Views
  • 4 replies
  • 4 kudos

Resolved! A question about Databricks Fine-grained Access Control (FGAC) cost on dedicated compute

Hi All, recently, while testing Fine-grained Access Control (FGAC) on dedicated compute, I came across something that seems a bit unusual, and I’d like to ask if anyone else has seen similar behavior. I created a view with only one record, and had anot...

Latest Reply
JeremySu
New Contributor III
  • 4 kudos

Hi @mark_ott , thank you very much for providing such a detailed and insightful explanation. This clearly resolves our confusion as to why an FGAC query that ran for only a few seconds ultimately incurred the DBU consumption shown on the bill, due to th...

3 More Replies
kfadratek
by New Contributor
  • 189 Views
  • 1 reply
  • 0 kudos

Issue Using Private CA Certificates for Databricks Serverless Private Git → On-Prem GitLab Connection

Hi everyone, I’m trying to properly configure Databricks Serverless Private Git to connect to our on-premises GitLab, but I'm running into issues with private CA certificates. Following the latest Databricks recommendations, our connection to GitLab go...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Hello @kfadratek , thanks for the detailed context. Let's take a look at what could be causing the SSL verification to fail with a custom CA in Serverless Private Git and discuss some approaches that might resolve it.
What’s likely going wrong
B...

pablogarcia
by New Contributor II
  • 119 Views
  • 1 reply
  • 0 kudos

AiGatewayConfig backward-compatibility issue from 16.3 to 16.4

We're moving from version 16.3 to version 16.4 LTS, and it looks like there is a backward-compatibility issue. This is the import that I have in my code:
from databricks.sdk.service.serving import (  # type: ignore # noqa
    ServedModelInput,  # type:...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The error indicates that AiGatewayConfig cannot be imported from databricks.sdk.service.serving after upgrading from version 16.3 to 16.4 LTS, signaling a breaking change or removal in the SDK.
Why This Happens
With minor version updates, Databricks ...
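Until the class's new location is confirmed, a defensive pattern is to probe candidate modules at runtime instead of hard-coding one import path. A small sketch; the candidate module list for AiGatewayConfig is a guess to illustrate the probe, not a confirmed location in the SDK:

```python
import importlib

def find_symbol(name, module_paths):
    """Return the first module path that exports `name`, else None.
    Useful for locating where a class moved between SDK releases
    without crashing on the first missing import."""
    for path in module_paths:
        try:
            module = importlib.import_module(path)
        except ImportError:
            # Module absent in this SDK version; try the next candidate.
            continue
        if hasattr(module, name):
            return path
    return None

# Hypothetical usage after the upgrade (candidate paths are guesses):
# found = find_symbol("AiGatewayConfig", [
#     "databricks.sdk.service.serving",
# ])
```

Checking the SDK's release notes between the two runtime versions remains the authoritative way to find the replacement.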

mo_moattar
by New Contributor III
  • 3810 Views
  • 1 reply
  • 0 kudos

Would it be great if the job workflow supported running Docker-based tasks

The current workflow function in Databricks gives a series of options such as DLT, dbt, Python scripts, Python files, JAR, etc. It would be good to add a Docker file to that and simplify the development process a lot, especially on the unit and integ...

Latest Reply
jack_zaldivar
Databricks Employee
  • 0 kudos

Hi @mo_moattar ! Is this still functionality you're interested in? If so, can you explain a bit more about the use case you're thinking of? I'm happy to add this to our feature requests internally, but I know that the Product team will likely request ...
