- 255 Views
- 1 replies
- 1 kudos
Resolved! Trying to Backup Dashboards and Queries from our Workspace.
We are using a Databricks workspace, and our IT team is decommissioning it as our time with it is ending. I have many queries and dashboards developed. I want to copy these out; unfortunately, when I download them using zip or .dbc, these queries or dashboar...
Notebooks: these are the easiest assets to back up. You can export them individually or in bulk as .dbc (the Databricks archive format, which can be re-imported directly into a new workspace) or .source/.py (raw code export, ideal for version control). To download in b...
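The bulk-export step described above can also be scripted against the Workspace API's `export` endpoint. A minimal sketch, assuming a personal access token and a placeholder workspace URL (the host, token, and path below are illustrative, not from the thread):

```python
import base64
import json
import urllib.parse
import urllib.request


def build_export_request(host: str, token: str, path: str,
                         fmt: str = "DBC") -> urllib.request.Request:
    """Build a GET request for /api/2.0/workspace/export.

    fmt is "DBC" for a re-importable archive or "SOURCE" for raw code.
    host and token are placeholders for your workspace URL and a
    personal access token.
    """
    url = (
        f"{host}/api/2.0/workspace/export"
        f"?path={urllib.parse.quote(path)}&format={fmt}"
    )
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})


def decode_export(response_body: str) -> bytes:
    # The endpoint returns JSON with a base64-encoded "content" field;
    # decode it to get the raw archive or source bytes to write to disk.
    return base64.b64decode(json.loads(response_body)["content"])
```

Sending the request with `urllib.request.urlopen(req)` and passing the body to `decode_export` yields the archive bytes to save locally before the workspace is decommissioned.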
- 863 Views
- 5 replies
- 1 kudos
Problem with Metastore
Hello community. We are facing an issue when deploying and configuring the metastore using Terraform. We are using an Azure DevOps pipeline for deployment. The identity running the pipeline is a managed identity, and it is set as account admin in the Account porta...
@jzu, is this a new error, or is it the same as before? I need more details, please. Louis.
- 4144 Views
- 2 replies
- 0 kudos
R-studio on Dedicated Cluster Invalid Access Token
Hello! Currently I have RStudio installed on a dedicated cluster on Azure Databricks; here are the specs. I must emphasize the access mode: Manual and Dedicated to a Group. Here, we install RStudio using a notebook with the following...
You’re seeing two key issues with your RStudio Server on Azure Databricks: (1) RStudio stops working after 1–2 days; (2) you get permission errors using sparklyr and can’t update the Connections pane. Let’s address each. 1. RStudio Server Stops Working A...
- 221 Views
- 1 replies
- 1 kudos
Resolved! Signing up BAA requiring Compliance Security Profile activation
Hello folks, I need help with enabling the compliance security profile for my account. I need this to execute/sign a HIPAA BAA on my account, which requires the enhanced security & compliance add-on. I first reached out to help@databricks.com...
I was able to turn this feature on by upgrading the plan to "Enterprise". Thanks!
- 3719 Views
- 1 replies
- 0 kudos
Getting "Data too long for column 'session_data'" creating a CACHE table
Hi, I'm trying to leverage CACHE TABLE to create temporary tables that are cleaned up at the end of the session. In creating one of these, I'm getting "Data too long for column 'session_data'". The query I'm using isn't referencing a session_data colu...
Thanks for sharing the details; this is a common point of confusion with caching versus temporary objects in Databricks. What's likely happening: the error message "Data too long for column 'session_data'" is emitted by the metastore/metadata persiste...
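For session-scoped objects, the usual alternative to CACHE TABLE is a temporary view, which is dropped automatically when the session ends and stores nothing in the metastore. A minimal sketch (the view name and query are placeholders; the generated DDL would be run via `spark.sql(...)` in a notebook):

```python
def temp_view_ddl(view_name: str, select_sql: str) -> str:
    """DDL for a session-scoped temporary view.

    Unlike CACHE TABLE, a temporary view persists no metadata to the
    metastore, so it cannot trip metastore column-size limits like the
    'session_data' one, and it is cleaned up at session end.
    """
    return f"CREATE OR REPLACE TEMPORARY VIEW {view_name} AS {select_sql}"
```

In a Databricks notebook this would be used as `spark.sql(temp_view_ddl("scratch", "SELECT 1 AS x"))`, assuming the usual `spark` session object.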
- 3847 Views
- 1 replies
- 0 kudos
Azure Network Connectivity Configurations API failing
Since yesterday evening (Europe time) there seems to be a platform-side issue with the Network Connectivity Configurations API on Azure Databricks accounts. API calls are being redirected to a login page, causing multiple different tools, such as Terr...
I can't speak to this specific incident as it was over a year ago, but for future reference on monitoring Databricks service health: please subscribe to the Azure Databricks Status Page for your region(s) and the specific components you use. You can opt...
- 4720 Views
- 2 replies
- 3 kudos
Databricks and AWS CodeArtifact
Hello, I saw multiple topics about this, but I need explanations and a solution. In my context, we have developers who are developing Python projects, such as X. In Databricks, we have a cluster with a library of the main project A that depends on X.p...
Hi @axelboursin, I think this article will help you out: https://docs.databricks.com/aws/en/admin/workspace-settings/default-python-packages (option 1 below). Recommended approaches (choose based on your environment): for broad, consistent behavior...
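Pointing pip at CodeArtifact comes down to constructing the repository's index URL with a short-lived authorization token (normally obtained via `codeartifact:GetAuthorizationToken`, e.g. through boto3). A minimal sketch of the URL shape; all argument values below are placeholders, not from the thread:

```python
def codeartifact_index_url(domain: str, owner: str, region: str,
                           repo: str, token: str) -> str:
    """pip index URL for an AWS CodeArtifact PyPI repository.

    The token is short-lived (hours), so this URL must be regenerated
    periodically, e.g. in a cluster init script or a scheduled job.
    """
    return (
        f"https://aws:{token}@{domain}-{owner}"
        f".d.codeartifact.{region}.amazonaws.com/pypi/{repo}/simple/"
    )
```

The resulting URL can be placed in `pip.conf` as `index-url`, or passed to `%pip install --index-url ...` in a notebook.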
- 3378 Views
- 1 replies
- 0 kudos
Databricks Managed MLFlow with Different Unity Catalog for Multi-tenant Production Tracing
Does Databricks Managed MLflow only trace LLM traffic through Serving Endpoints? Does it support manual tracing in my LLM application with the @mlflow.trace decorator? Also, how can Databricks Managed MLflow support multi-tenant cases where traces need ...
Hi @jeffreym9, Managed MLflow Tracing captures traces in notebooks, local IDEs, jobs, and Serving Endpoints, via autologging or manual instrumentation, so it's not limited to Serving Endpoints only. Set your tracking URI to databricks and traces are...
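A minimal sketch of the setup described above, assuming an environment with mlflow installed and Databricks credentials configured. The per-tenant tag name and experiment path are illustrative conventions, not an official multi-tenancy API:

```python
def tenant_tags(tenant_id: str) -> dict:
    # One convention for multi-tenant tracing: tag every trace with the
    # tenant so traces can later be filtered, or routed to per-tenant
    # experiments. The tag key is an arbitrary choice here.
    return {"tenant_id": tenant_id}


def configure_tracking(experiment_path: str) -> None:
    import mlflow  # assumes an mlflow version with tracing support

    mlflow.set_tracking_uri("databricks")
    mlflow.set_experiment(experiment_path)
    # In application code, manual tracing then works via the decorator:
    # @mlflow.trace
    # def answer(question: str) -> str: ...
```

`configure_tracking("/Shared/llm-traces-tenant-a")` is a hypothetical call showing one way to separate tenants by experiment.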
- 657 Views
- 3 replies
- 4 kudos
Resolved! Writing data from Azure Databrick to Azure SQL Database
Good day, we have a customer who wanted to query an Azure SQL Database from a Databricks notebook. We have configured a connection and a catalog for him, and he is able to query the Azure SQL Database. Now he has a new request: he also wants to write d...
You can customize the code below, which makes use of the Spark SQL Server connector, as per your needs: def PersistRemoteSQLTableFromDF( df: DataFrame, databaseName: str, tableName: str, mode: str = "overwrite", schemaName: str ...
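The truncated helper above boils down to assembling connector options and calling the DataFrame writer. A minimal sketch of the options side, assuming Spark's built-in SQL Server connector; the server, schema, and credential values are placeholders, and in practice the password should come from a secret scope rather than a literal:

```python
def sqlserver_write_options(host: str, database: str, schema: str,
                            table: str, user: str, password: str) -> dict:
    """Option dict for Spark's SQL Server connector.

    All values here are placeholders; fetch the password with
    dbutils.secrets.get(...) in a real notebook.
    """
    return {
        "host": host,
        "database": database,
        "dbtable": f"{schema}.{table}",
        "user": user,
        "password": password,
    }
```

In a notebook the write itself would look roughly like `df.write.format("sqlserver").options(**opts).mode("overwrite").save()`, assuming a runtime where that connector is available.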
- 1029 Views
- 4 replies
- 4 kudos
Resolved! A question about Databricks Fine-grained Access Control (FGAC) cost on dedicated compute
Hi all, recently, while testing Fine-grained Access Control (FGAC) on dedicated compute, I came across something that seems a bit unusual, and I'd like to ask if anyone else has seen similar behavior. I created a view with only one record, and had anot...
Hi @mark_ott, thank you very much for providing such a detailed and insightful explanation. This clearly resolves our confusion as to why an FGAC query that ran for only a few seconds ultimately incurred the DBU consumption shown on the bill, due to th...
- 343 Views
- 1 replies
- 0 kudos
Resolved! Issue Using Private CA Certificates for Databricks Serverless Private Git → On-Prem GitLab Connection
Hi everyone, I'm trying to properly configure Databricks Serverless Private Git to connect to our on-premises GitLab, but I'm running into issues with private CA certificates. Following the latest Databricks recommendations, our connection to GitLab go...
Hello @kfadratek, thanks for the detailed context. Let's take a look at what could be causing the SSL verification to fail with a custom CA in Serverless Private Git and discuss some approaches that might resolve it. What's likely going wrong: B...
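Before suspecting the Databricks side, it can help to confirm that the private CA bundle itself verifies the GitLab endpoint from an ordinary Python environment. A minimal sketch using the standard library; the bundle path is a placeholder:

```python
import ssl
from typing import Optional


def context_with_private_ca(ca_bundle_path: Optional[str] = None) -> ssl.SSLContext:
    """SSL context trusting the CAs in `ca_bundle_path` (a PEM file).

    Passing None falls back to the system trust store. With a context
    built from your private CA bundle, a call like
    urllib.request.urlopen("https://gitlab.internal", context=ctx)
    (hypothetical URL) will fail fast if the bundle is incomplete,
    e.g. missing an intermediate certificate.
    """
    return ssl.create_default_context(cafile=ca_bundle_path)
```

If this check passes locally but Serverless Private Git still fails, the bundle format or upload step is the more likely culprit.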
- 178 Views
- 1 replies
- 0 kudos
AiGatewayConfig backward-compatibility issue from 16.3 to 16.4
We're moving from version 16.3 to version 16.4 LTS, and it looks like there is a backward-compatibility issue. This is the import that I have in my code: from databricks.sdk.service.serving import ( # type: ignore # noqa ServedModelInput, # type:...
The error indicates that AiGatewayConfig cannot be imported from databricks.sdk.service.serving after upgrading from version 16.3 to 16.4 LTS, signaling a breaking change or removal in the SDK. Why this happens: with minor version updates, Databricks ...
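The proper fix is to check the databricks-sdk release notes for where the class moved (or to pin the SDK version), but a defensive import pattern can keep code tolerant of classes moving between versions. A minimal sketch; the module and attribute names passed in are whatever your code needs, not values confirmed by the thread:

```python
import importlib


def optional_import(module: str, name: str):
    """Return attribute `name` from `module`, or None if either is missing.

    Lets calling code detect a moved or removed SDK class and fall back
    (or raise a clearer error) instead of crashing at import time.
    """
    try:
        mod = importlib.import_module(module)
    except ImportError:
        return None
    return getattr(mod, name, None)
```

Usage would look like `AiGatewayConfig = optional_import("databricks.sdk.service.serving", "AiGatewayConfig")`, followed by an explicit check that the result is not None.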
- 3950 Views
- 1 replies
- 0 kudos
Would it be great if the job workflow supported running Docker-based tasks
The current workflow function in Databricks gives a series of options such as DLT, dbt, Python scripts, Python files, JAR, etc. It would be good to add a Dockerfile to that and simplify the development process a lot, especially on the unit and integ...
Hi @mo_moattar! Is this still functionality you're interested in? If so, can you explain a bit more about the use case you have in mind? I'm happy to add this to our feature requests internally, but I know that the Product team will likely request ...
- 4671 Views
- 1 replies
- 0 kudos
Resolved! Databricks Community - Cannot See Reply To My Posts
Databricks Community - cannot see replies to my posts. Am I the only one facing this issue, or are others facing it as well?
@clentin, I know this is a response to an older post, but I'm wondering if you ever got this resolved. I am able to view the responses to your initial post, so I took the liberty of adding them as screenshots for you. Hope this helps!
- 4294 Views
- 2 replies
- 1 kudos
Automating Version Control for Databricks Workflows
I am currently using Databricks Asset Bundles to manage and deploy workflows. While I have successfully automated the version control for notebooks, I am facing challenges with workflows. Specifically, I am looking to automate the process of fetching...
Automating the reverse synchronization of Databricks workflow (Job) changes made in the Databricks UI back to a GitHub repository is a significant challenge, mainly due to the intentional directionality and guardrails imposed by Databricks Asset Bund...
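One building block for any such reverse-sync pipeline is turning a job's settings, as fetched from the Jobs API `get` endpoint, into stable text that produces clean diffs in Git. A minimal sketch; the field names in the example dict are placeholders:

```python
import json


def job_settings_to_file_content(settings: dict) -> str:
    """Serialize a job's settings payload into diff-friendly JSON.

    Sorted keys and fixed indentation keep the output stable across
    runs, so a commit only appears when the job actually changed.
    """
    return json.dumps(settings, indent=2, sort_keys=True) + "\n"
```

A scheduled job could fetch each workflow's settings, write the result to the repo working tree, and commit only when the file content differs, subject to the Asset Bundles guardrails described above.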
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (57)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)