- 4745 Views
- 2 replies
- 0 kudos
RStudio on Dedicated Cluster: Invalid Access Token
Hello! Currently I have RStudio installed on a Dedicated Cluster on Azure Databricks; here are the specs. I must emphasize the Access mode: Manual and Dedicated to a Group. Here, we install RStudio using a notebook with the following...
You’re seeing two key issues with your RStudio Server on Azure Databricks: RStudio stops working after 1–2 days. You get permission errors using sparklyr and can’t update the Connections pane. Let’s address each: 1. RStudio Server Stops Working A...
- 378 Views
- 1 replies
- 1 kudos
Resolved! Signing up BAA requiring Compliance Security Profile activation
Hello folks — I need help with enabling the compliance security profile for my account. I need this to execute/sign a HIPAA BAA on my account. For this, I need the enhanced security & compliance add-on. I first reached out to help@databricks.com...
I was able to turn this feature on by upgrading the plan to "Enterprise". Thanks!
- 4330 Views
- 1 replies
- 0 kudos
Resolved! Getting "Data too long for column 'session_data'" creating a CACHE table
Hi, I'm trying to leverage CACHE TABLE to create temporary tables that are cleaned up at the end of the session. In creating one of these, I'm getting "Data too long for column 'session_data'". The query I'm using isn't referencing a session_data colu...
Thanks for sharing the details—this is a common point of confusion with caching versus temporary objects in Databricks. What’s likely happening The error message “Data too long for column 'session_data'” is emitted by the metastore/metadata persiste...
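The answer above is truncated, but since it points at the metastore's metadata persistence as the source of the error, one session-scoped workaround worth trying (an assumption on my part, not necessarily the answer's recommendation; names are illustrative) is to create a temporary view and cache it, avoiding `CACHE TABLE ... AS SELECT` entirely:

```python
# Sketch: session-scoped caching without CACHE TABLE ... AS SELECT.
# `spark` is the active SparkSession; the temporary view disappears
# when the session ends, so no explicit cleanup is needed.
def cache_session_view(spark, view_name: str, query: str) -> None:
    spark.sql(f"CREATE OR REPLACE TEMPORARY VIEW {view_name} AS {query}")
    spark.sql(f"CACHE TABLE {view_name}")
```

Usage: `cache_session_view(spark, "tmp_orders", "SELECT * FROM sales.orders WHERE ds = current_date()")`, then query `tmp_orders` as usual.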
- 4406 Views
- 1 replies
- 0 kudos
Azure Network Connectivity Configurations API failing
It seems like since yesterday evening (Europe time) there's a platform-side issue with the Network Connectivity Configurations API on Azure Databricks Accounts. API calls are being redirected to a login page, causing multiple different tools, such as Terr...
I can't speak to this specific incident as it was over a year ago, but for future reference: Monitoring Databricks service health Please subscribe to the Azure Databricks Status Page for your region(s) and the specific components you use. You can opt...
- 3940 Views
- 1 replies
- 0 kudos
Databricks Managed MLflow with Different Unity Catalog for Multi-tenant Production Tracing
Does Databricks Managed MLflow only trace LLM traffic through Serving Endpoints? Does it support manual tracing in my LLM application with the decorator @mlflow.trace? Also, how can Databricks Managed MLflow support multi-tenant cases where traces need ...
Hi @jeffreym9, Managed MLflow Tracing captures traces in notebooks, local IDEs, jobs, and Serving Endpoints—via autologging or manual instrumentation—so it’s not limited to Serving Endpoints. Set your tracking URI to databricks and traces are...
- 1421 Views
- 3 replies
- 4 kudos
Resolved! Writing data from Azure Databricks to Azure SQL Database
Good day, We have a customer who wanted to query an Azure SQL Database from a Databricks notebook. We have configured a connection and a catalog for him and he is able to query the Azure SQL Database. Now he has a new request. He also wants to write d...
You can customize the code below, which uses the Spark SQL Server connector, as needed:

```python
def PersistRemoteSQLTableFromDF(
    df: DataFrame,
    databaseName: str,
    tableName: str,
    mode: str = "overwrite",
    schemaName: str ...
```
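The helper above is cut off by the preview; here is a hedged completion. Everything after `schemaName` is an assumption (host, credentials, and defaults are placeholders), and the option names follow the built-in Spark SQL Server connector available on recent Databricks runtimes — verify against your runtime's documentation:

```python
# Hedged sketch of the truncated helper. `df` is a pyspark DataFrame;
# replace the placeholder host/credentials with your own, ideally
# pulling the password from a Databricks secret scope.
def PersistRemoteSQLTableFromDF(
    df,
    databaseName: str,
    tableName: str,
    mode: str = "overwrite",
    schemaName: str = "dbo",
    host: str = "myserver.database.windows.net",  # placeholder
    user: str = "sqladmin",                       # placeholder
    password: str = "<from-secret-scope>",        # placeholder
) -> None:
    (
        df.write.format("sqlserver")   # built-in SQL Server connector
        .mode(mode)                    # "overwrite" or "append"
        .option("host", host)
        .option("database", databaseName)
        .option("dbtable", f"{schemaName}.{tableName}")
        .option("user", user)
        .option("password", password)
        .save()
    )
```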
- 1974 Views
- 4 replies
- 4 kudos
Resolved! A question about Databricks Fine-grained Access Control (FGAC) cost on dedicated compute
Hi All, recently, while testing Fine-grained Access Control (FGAC) on dedicated compute, I came across something that seems a bit unusual, and I’d like to ask if anyone else has seen similar behavior. I created a view with only one record, and had anot...
Hi @mark_ott, Thank you very much for providing such a detailed and insightful explanation. This clearly resolves our confusion as to why an FGAC query that ran for only a few seconds ultimately incurred the DBU consumption shown on the bill, due to th...
- 889 Views
- 1 replies
- 0 kudos
Resolved! Issue Using Private CA Certificates for Databricks Serverless Private Git → On-Prem GitLab Connection
Hi everyone, I’m trying to properly configure Databricks Serverless Private Git to connect to our on-premises GitLab, but I'm running into issues with private CA certificates. Following the latest Databricks recommendations, our connection to GitLab go...
Hello @kfadratek, thanks for the detailed context — let's take a look at what could be causing the SSL verification to fail with a custom CA in Serverless Private Git and discuss some approaches that might resolve it. What’s likely going wrong B...
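The answer is truncated, but a common ingredient when a private CA is in play (an assumption here, not necessarily the answer's approach, and how you ship the bundle to serverless compute is environment-specific) is making the CA bundle visible to the Git and HTTPS clients through their standard environment variables:

```python
import os

# Sketch: point common clients at a private CA bundle.
# The path is a placeholder for wherever your corporate CA chain lives.
CA_BUNDLE = "/etc/ssl/certs/corp-ca.pem"

def trust_private_ca(bundle: str = CA_BUNDLE) -> None:
    os.environ["REQUESTS_CA_BUNDLE"] = bundle  # Python `requests`
    os.environ["GIT_SSL_CAINFO"] = bundle      # git over HTTPS
    os.environ["SSL_CERT_FILE"] = bundle       # OpenSSL-based clients
```

These variables must be set before the client process (or library) first opens a TLS connection.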
- 282 Views
- 1 replies
- 0 kudos
AiGatewayConfig backward-compatibility issue from 16.3 to 16.4
We're moving from version 16.3 to version 16.4 LTS, and it looks like there is a backward-compatibility issue. This is the import that I have in my code: from databricks.sdk.service.serving import ( # type: ignore # noqa ServedModelInput, # type:...
The error indicates that AiGatewayConfig cannot be imported from databricks.sdk.service.serving after upgrading from version 16.3 to 16.4 LTS, signaling a breaking change or removal in the SDK. Why This Happens With minor version updates, Databricks ...
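Until you pin or verify the SDK version (check the installed one with `pip show databricks-sdk`), a defensive import keeps the module loadable on both sides of the change. This is a generic sketch; whether `AiGatewayConfig` moved elsewhere or was removed in your SDK version is something to confirm in the SDK changelog:

```python
# Sketch: tolerate AiGatewayConfig being absent from this SDK version.
# ModuleNotFoundError subclasses ImportError, so this also runs where
# databricks-sdk is not installed at all.
try:
    from databricks.sdk.service.serving import AiGatewayConfig
except ImportError:
    AiGatewayConfig = None  # gate any AI Gateway code on this being set

def supports_ai_gateway() -> bool:
    return AiGatewayConfig is not None
```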
- 4446 Views
- 1 replies
- 0 kudos
It would be great if the job workflow supported running Docker-based tasks
The current workflow function in Databricks gives a series of options such as DLT, dbt, Python scripts, Python files, JAR, etc. It would be good to add a Dockerfile to that and simplify the development process a lot, especially on the unit and integ...
Hi @mo_moattar ! Is this still some functionality you're interested in? If so, can you explain a bit more on the use case you're thinking? I'm happy to add this to our feature requests internally, but I know that the Product team will likely request ...
- 5323 Views
- 1 replies
- 0 kudos
Resolved! Databricks Community - Cannot See Reply To My Posts
Databricks Community - Cannot See Reply To My Posts. Am I the only one facing this issue, or are others facing the same?
@clentin, I know this is a response to an older post, but I'm wondering if you ever got this resolved or not? I am able to view the responses to your initial post, so I took the liberty of adding them as screenshots for you. Hope this helps!
- 5145 Views
- 2 replies
- 1 kudos
Automating Version Control for Databricks Workflows
I am currently using Databricks Asset Bundles to manage and deploy workflows. While I have successfully automated the version control for notebooks, I am facing challenges with workflows. Specifically, I am looking to automate the process of fetching...
Automating the reverse synchronization of Databricks workflow (Job) changes made in the Databricks UI back to a GitHub repository is a significant challenge, mainly due to the intentional directionality and guardrails imposed by Databricks Asset Bund...
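If you do build such a reverse-sync pipeline despite those guardrails, one pragmatic piece of it (a sketch; the fetch itself would use `jobs.get(job_id).settings.as_dict()` from the Databricks Python SDK's `WorkspaceClient`, and the commit step is left to your CI) is serializing the fetched job settings deterministically so that Git diffs stay reviewable:

```python
import json
from pathlib import Path

# Sketch: write fetched job settings as stable, diff-friendly JSON.
# `settings` would come from the Jobs API; here it is just a plain dict.
def export_job_settings(settings: dict, out_dir: str, job_name: str) -> Path:
    path = Path(out_dir) / f"{job_name}.json"
    # sort_keys keeps output stable across runs, so unchanged jobs
    # produce no Git diff.
    path.write_text(json.dumps(settings, indent=2, sort_keys=True) + "\n")
    return path
```

A scheduled job can export every workflow this way and open a pull request when any file changes.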
- 6063 Views
- 8 replies
- 2 kudos
Issue with updating email with SCIM Provisioning
Hi all, For our set-up we have configured SCIM provisioning using Entra ID, group assignment on Azure is dealt with by IdentityIQ Sailpoint, and we have enabled SSO for Databricks. It has been working fine apart from one scenario. The original email assign...
The other option is to raise a ticket with Databricks Accounts team. Our Databricks team worked on the backend and the new email was synced.
- 668 Views
- 3 replies
- 0 kudos
Account level Rest API to list workspaces has suddenly stopped working
We use the Databricks Python SDK in one of our Azure Databricks workspaces to list all the workspaces present in our tenant. The code had been working fine for 6-8 months until yesterday, when it suddenly started failing with the error: Endpoint not found for /...
My bad, there was an issue with how the AccountClient was created. I was able to resolve this. This is still an issue: I also noticed that the Databricks REST API documentation no longer shows list workspaces as an available API for Azure, whereas it show...
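For anyone hitting the same "Endpoint not found" error: account-level calls must go through an `AccountClient` pointed at the accounts console host (for Azure, `https://accounts.azuredatabricks.net`), never at a workspace URL. A minimal sketch, with the client construction left to your own auth setup:

```python
# Sketch: list workspaces at the *account* level.
# `account_client` is an authenticated databricks.sdk.AccountClient,
# e.g. AccountClient(host="https://accounts.azuredatabricks.net",
# account_id=..., ...). Passing a workspace URL as the host is the
# mistake that produces "Endpoint not found for /api/2.0/accounts/...".
def list_workspace_names(account_client) -> list:
    return [w.workspace_name for w in account_client.workspaces.list()]
```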
- 4184 Views
- 1 replies
- 0 kudos
Customer Managed VPC: Databricks IP Address Ranges
Hello, how often does Databricks change its public IP addresses (the ones that must be whitelisted in a customer-managed VPC), and where can I find them? I found this list, but it seems to be incomplete. We moved from a managed VPC to a customer-managed ...
Greetings @tom_1, you’re right to cross-check the published list—here’s how the IPs and ports fit together and where to get the authoritative values. Where to find the current Databricks IPs The official source is the Databricks “IP addresses and d...
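When auditing existing firewall rules against the published list, the standard library's `ipaddress` module makes the check trivial. The CIDRs below are documentation-reserved placeholders, not real Databricks ranges — substitute the current values from the official "IP addresses and domains" page:

```python
import ipaddress

# Sketch: check whether an address falls inside any allow-listed CIDR.
def ip_in_ranges(ip: str, cidrs: list) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(c) for c in cidrs)

# Placeholder ranges (RFC 5737 documentation blocks) — replace with the
# ranges published for your region.
ALLOWED = ["203.0.113.0/24", "198.51.100.0/24"]
```

Example: `ip_in_ranges("203.0.113.7", ALLOWED)` returns `True`, while an address outside both blocks returns `False`.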
Labels:
- Access control (1)
- Apache Spark (1)
- Azure (7)
- Azure Databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- GDPR (1)
- GitHub (1)
- Partner (74)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)