- 4309 Views
- 1 reply
- 0 kudos
Implementing Databricks Persona in
Hi all, I am looking to implement "persona"-based access control across multiple workspaces for multiple user groups in Azure Databricks. Specifically, I have a "DEV" workspace where the developer groups (Data Engineers and ML Engineer...
You can implement persona-based access control for Azure Databricks workspaces using Terraform and the Databricks provider, aligning with the setup you described for DEV and PROD environments. Terraform allows you to codify workspace configuration, u...
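The reply above suggests codifying the persona-to-workspace mapping. A minimal sketch of that idea in Python, building request payloads for the account-level workspace-assignment REST API (the group names, workspace IDs, account ID, and group ID placeholders are all illustrative assumptions, not values from the thread):

```python
# Hedged sketch: map personas to workspace assignments and build the REST
# payloads that would be sent to the Databricks workspace-assignment API.
# Group names, workspace IDs, and placeholders are illustrative assumptions.

PERSONAS = {
    "data_engineers": {"workspaces": ["DEV", "PROD"], "permission": "USER"},
    "ml_engineers": {"workspaces": ["DEV"], "permission": "USER"},
    "platform_admins": {"workspaces": ["DEV", "PROD"], "permission": "ADMIN"},
}

def build_assignments(personas, workspace_ids):
    """Return (group, url_path, payload) triples, one per group/workspace pair."""
    calls = []
    for group, spec in personas.items():
        for ws in spec["workspaces"]:
            path = (
                f"/api/2.0/preview/accounts/<account-id>/workspaces/"
                f"{workspace_ids[ws]}/permissionassignments/principals/<group-id>"
            )
            calls.append((group, path, {"permissions": [spec["permission"]]}))
    return calls

calls = build_assignments(PERSONAS, {"DEV": 111, "PROD": 222})
```

The same mapping translates naturally to Terraform `databricks_mws_permission_assignment` resources if you prefer the provider route the reply describes.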
- 3958 Views
- 1 reply
- 0 kudos
Programmatically setting tags for securables
Unity Catalog securable objects can be tagged with key-value pairs (https://learn.microsoft.com/en-us/azure/databricks/database-objects/tags). Is it possible to tag objects via REST API calls? I initially thought any Unity Catalog resource in the Databricks...
Hello @camilo_s, thanks for sharing the doc link and the details you observed in the UI network calls. Short answer: there isn't a documented, stable, public REST endpoint specifically for "tags on UC securables" today. You should use SQL DDL to man...
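Since the reply points to SQL DDL rather than REST, a small helper that generates the documented `ALTER ... SET TAGS` / `UNSET TAGS` statements (to be executed via `spark.sql(...)` or the SQL statement-execution API) is one way to do this programmatically. The securable names below are illustrative:

```python
# Minimal sketch: generate UC tag DDL programmatically instead of calling a
# (non-existent) public tags REST endpoint. Identifiers are illustrative.

def set_tags_ddl(securable_type: str, fqname: str, tags: dict) -> str:
    """Build an ALTER <SECURABLE> ... SET TAGS statement."""
    pairs = ", ".join(f"'{k}' = '{v}'" for k, v in tags.items())
    return f"ALTER {securable_type.upper()} {fqname} SET TAGS ({pairs})"

def unset_tags_ddl(securable_type: str, fqname: str, keys: list) -> str:
    """Build the matching UNSET TAGS statement."""
    names = ", ".join(f"'{k}'" for k in keys)
    return f"ALTER {securable_type.upper()} {fqname} UNSET TAGS ({names})"

ddl = set_tags_ddl("table", "main.sales.orders", {"owner": "finance"})
# ddl == "ALTER TABLE main.sales.orders SET TAGS ('owner' = 'finance')"
```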
- 3488 Views
- 1 reply
- 0 kudos
Delete Users that are Maintenance Readers
I am an Account Admin at Databricks (Azure) and am trying to delete users that are being offboarded. I have managed to delete most users. However, for a couple, I get the following message (see screenshot): ABORTED: Account <account> is read-only during ...
When trying to delete users in Databricks (Azure) and encountering the message "ABORTED: Account <account> is read-only during maintenance and cannot be updated," this means that your Databricks account is currently in a maintenance mode where no cha...
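Because maintenance windows are temporary, a practical approach is to retry the delete with backoff, retrying only on this specific error. A hedged sketch (the delete callable itself, e.g. a SCIM user-delete request, is assumed and faked here):

```python
# Hedged sketch: retry only on the "read-only during maintenance" ABORTED
# error, with exponential backoff. The real delete call is an assumption.
import time

def retry_on_maintenance(fn, attempts=5, base_delay=1.0, sleep=time.sleep):
    for i in range(attempts):
        try:
            return fn()
        except RuntimeError as e:
            if "read-only during maintenance" not in str(e) or i == attempts - 1:
                raise  # unrelated error, or out of attempts
            sleep(base_delay * (2 ** i))  # back off: 1s, 2s, 4s, ...

# Fake delete that fails twice during "maintenance", then succeeds:
state = {"n": 0}
def fake_delete():
    state["n"] += 1
    if state["n"] < 3:
        raise RuntimeError("ABORTED: Account x is read-only during maintenance")
    return "deleted"

result = retry_on_maintenance(fake_delete, sleep=lambda s: None)
```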
- 3995 Views
- 1 reply
- 0 kudos
Databricks report error: unexpected end of stream, read 0 bytes from 4 (socket was closed by server)
Has anyone encountered this error and knows how to resolve it? "Unexpected end of stream, read 0 bytes from 4 (socket was closed by server)." This occurs in Databricks while generating reports. I've already adjusted the wait_timeout to 28,800, and both ...
Yes, the "Unexpected end of stream, read 0 bytes from 4 (socket was closed by server)" error has been encountered by other Databricks users when generating reports with MySQL. You've already set the major MySQL timeout parameters to their maximums, w...
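When raising server timeouts doesn't help, the drop often comes from an intermediate firewall or load balancer, so validating or reopening the connection per query is a common mitigation (with SQLAlchemy this is `pool_pre_ping=True`). A stdlib-only sketch of the reconnect pattern, with the real driver calls replaced by stand-in callables:

```python
# Hedged sketch: reopen the MySQL connection when the socket was dropped.
# connect_fn/query_fn stand in for real driver calls (e.g. mysql.connector).

def run_with_reconnect(connect_fn, query_fn, sql, retries=2):
    """Run a query, reopening the connection if the server closed the socket."""
    conn = connect_fn()
    for attempt in range(retries + 1):
        try:
            return query_fn(conn, sql)
        except ConnectionError:
            if attempt == retries:
                raise
            conn = connect_fn()  # stale socket: open a fresh connection

# Fake driver: the first connection is "dropped", the second succeeds.
made = []
def connect_fn():
    made.append(object())
    return made[-1]
def query_fn(conn, sql):
    if conn is made[0]:
        raise ConnectionError("socket was closed by server")
    return [("row",)]

rows = run_with_reconnect(connect_fn, query_fn, "SELECT 1")
```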
- 3476 Views
- 1 reply
- 0 kudos
Need to create an Identity Federation between my Databricks workspace/account and my AWS account
Hello, I need to set up identity federation between my Databricks workspace/account and my AWS account, where Databricks is already deployed. The goal is to enable easy authentication without access and secret keys. So I thought that OIDC would be the s...
To set up identification between your Databricks workspace/account and your AWS account without using access or secret keys, you can leverage OIDC (OpenID Connect) federation. Instead of traditional SSO, what you’re looking for is a model where AWS t...
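On the AWS side, the keyless pattern described here usually means an IAM role whose trust policy accepts a web-identity token from an OIDC provider. A hedged sketch that builds such a trust-policy document; the provider ARN, issuer host, and audience are placeholders, and the exact values Databricks issues for your account must come from the federation docs:

```python
# Hedged sketch: IAM trust policy for AssumeRoleWithWebIdentity against an
# OIDC provider. Issuer host and audience are illustrative placeholders.
import json

def trust_policy(oidc_provider_arn: str, issuer_host: str, audience: str) -> str:
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Federated": oidc_provider_arn},
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {"StringEquals": {f"{issuer_host}:aud": audience}},
        }],
    }
    return json.dumps(policy, indent=2)

doc = trust_policy(
    "arn:aws:iam::123456789012:oidc-provider/example-issuer.databricks.com",
    "example-issuer.databricks.com",
    "my-audience",
)
```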
- 3709 Views
- 2 replies
- 0 kudos
OAuth Url and ClientId Validation
Hi, I am trying to set up an OAuth connection with Databricks, so I ask the user to enter their Workspace URL and ClientId. Once the user enters these values, I want to validate whether they are correct, so I ask them to log in by redirecting them ...
If you’re using OAuth with Databricks and want to validate both the Workspace URL and ClientId before proceeding, you’re facing an issue seen by others: when the Workspace URL is correct but the ClientId is wrong, Databricks just displays a generic e...
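One way to split the problem: the workspace URL can be validated cheaply before any redirect by fetching the workspace's OAuth metadata document (a wrong host fails fast), while the ClientId generally can only be checked by attempting the authorize flow. A sketch of building that discovery URL; the `/oidc/.well-known/oauth-authorization-server` path is the one Databricks workspaces commonly expose, but verify it for your deployment:

```python
# Hedged sketch: normalize a user-entered workspace URL and build the OAuth
# metadata endpoint to probe before starting the real OAuth redirect.
from urllib.parse import urlparse

def discovery_url(workspace_url: str) -> str:
    """Return the OAuth authorization-server metadata URL for a workspace."""
    raw = workspace_url if "://" in workspace_url else f"https://{workspace_url}"
    parsed = urlparse(raw)
    if not parsed.hostname:
        raise ValueError(f"not a valid workspace URL: {workspace_url!r}")
    return f"https://{parsed.hostname}/oidc/.well-known/oauth-authorization-server"

url = discovery_url("adb-1234567890123456.7.azuredatabricks.net")
```

An HTTP 200 with JSON from that URL is good evidence the Workspace URL is real; a ClientId mismatch still surfaces only as an error on the authorize endpoint.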
- 3641 Views
- 1 reply
- 0 kudos
Custom Runtime marketplace
Hi! Is it possible to share a solution accelerator on a custom runtime via the Databricks Marketplace?
Greetings @evgenyvainerman, sorry for the delayed response. Your question is not entirely clear, but I will take a swing at providing an answer. Short answer: yes, you can share a Solution Accelerator through Databricks Marketplace, but Marketplace...
- 3954 Views
- 1 reply
- 0 kudos
Unity Catalog Not Enabled on Job Cluster When Creating DLT in GCP Databricks
I am trying to create a Delta Live Table (DLT) pipeline in my GCP Databricks workspace, but I am encountering an issue where Unity Catalog is not enabled on the job cluster. Steps I followed: created a DLT pipeline using the Databricks UI; selected the appropria...
The error “Unity Catalog is not enabled on this job cluster” during Delta Live Table (DLT) pipeline execution in your GCP Databricks workspace is a common issue, especially after Unity Catalog onboarding. Your troubleshooting steps cover most essenti...
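One frequent cause worth checking first: a DLT pipeline only attaches Unity Catalog when the pipeline settings themselves target a UC catalog and schema; configuring the job cluster alone is not enough. A hedged sketch of the relevant pipeline-settings fields (names such as the catalog, schema, and notebook path are illustrative, and field details may vary by API version):

```python
# Hedged sketch: minimal DLT pipeline settings targeting Unity Catalog.
# Values are illustrative; "target" is called "schema" in some newer APIs.

def uc_pipeline_settings(name, catalog, schema, notebook_path):
    return {
        "name": name,
        "catalog": catalog,   # presence of "catalog" makes this a UC pipeline
        "target": schema,     # destination schema inside that catalog
        "libraries": [{"notebook": {"path": notebook_path}}],
        "channel": "CURRENT",
        "continuous": False,
    }

settings = uc_pipeline_settings(
    "my_dlt", "main", "bronze", "/Repos/me/pipelines/ingest"
)
```

If the pipeline was created against the legacy Hive metastore, recreating it with a catalog target (rather than editing the cluster) is usually the fix.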
- 3349 Views
- 1 reply
- 0 kudos
Databricks Managed RG Storage cost is High
Hi Community, how do I calculate the Databricks storage cost, and where can I see the data that is stored and charged in Databricks? I'm trying to understand the storage cost on a managed resource group, and I'm clueless about the data and where it is stored...
To calculate the storage cost for Databricks in Azure and view the data being stored and charged, you need to consider both the Databricks compute (DBUs) and the storage resources (such as Azure Data Lake Storage or Blob Storage) linked to your Datab...
- 3447 Views
- 1 reply
- 0 kudos
Lakehouse Federation - Unable to connect to Snowflake using "PEM Private Key"
Hi, I'm currently using the Lakehouse Federation feature on Databricks to run queries against a Snowflake data warehouse. Today I'm using a service credential to establish the connection (user id & pwd), but I have to change it to use a private key. I tried us...
To assist with your Databricks Lakehouse Federation to Snowflake using a PEM Private Key, let's clarify the underlying issue. You mentioned that: The connection works with a service credential (user id & password) but fails when switching to the "PE...
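For reference, federation connections are created with SQL DDL, and for key-pair auth the private key is usually referenced via a secret rather than pasted inline. A heavily hedged sketch that generates such a statement: the option names (`pem_private_key` in particular) are assumptions to confirm against the current Snowflake connection docs, and a frequent cause of failure when switching from user/password is a key that is not an unencrypted PKCS#8 PEM:

```python
# Heavily hedged sketch: build CREATE CONNECTION DDL for Snowflake key-pair
# auth. Option names are assumptions; verify against current docs.

def snowflake_connection_ddl(name, host, user, secret_scope, secret_key):
    return (
        f"CREATE CONNECTION {name} TYPE snowflake OPTIONS (\n"
        f"  host '{host}',\n"
        f"  user '{user}',\n"
        f"  pem_private_key secret('{secret_scope}', '{secret_key}')\n"
        f")"
    )

ddl = snowflake_connection_ddl(
    "sf_conn", "acme.snowflakecomputing.com", "SVC_DBX", "sf-scope", "pem-key"
)
```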
- 3663 Views
- 1 reply
- 0 kudos
MLFlow Tracking versions
Hi team, we are migrating from a self-hosted MLflow Tracking server to the Databricks-hosted one. However, there are concerns about the unclear process of version changes and releases on the Tracking server side. Is there any public information av...
Hey @ViliamG, thanks for raising this. Here's how versioning and client compatibility work for the Databricks-hosted MLflow Tracking service, and where you can track changes publicly. What's publicly available about versions: the Databricks-hosted M...
- 3699 Views
- 1 reply
- 0 kudos
DLT constantly failing with time out errors
DLT was working but then started getting timeouts frequently: com.databricks.pipelines.common.errors.deployment.DeploymentException: Failed to launch pipeline cluster xxxxxxxxxxxx: Self-bootstrap timed out during launch. Please try again later and con...
Frequent timeouts and bootstrap errors when launching Databricks Delta Live Table (DLT) pipeline clusters on AWS are usually caused by network connectivity issues, VPC misconfigurations, or resource allocation problems between Databricks' control pla...
- 3839 Views
- 2 replies
- 1 kudo
Resolved! Lakeflow Connect: can't change general privilege requirements
I want to set up Lakeflow Connect to ETL data from Azure SQL Server (Microsoft SQL Azure (RTM) - 12.0.2000.8, Feb 9 2025) using change tracking (we don't need the data retention of CDC). In the documentation, there is a list of system tables, views ...
You are hitting a known limitation in Azure SQL Database: it does not allow you to grant or modify permissions directly on most system objects, such as system stored procedures, catalog views, or extended stored procedures, resulting in the error "Ms...
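Since Azure SQL Database refuses grants on most system objects, the practical path is to grant the documented database- and table-level change-tracking permissions to the ingestion user instead. A hedged sketch that generates the standard T-SQL statements (database, table, and user names are illustrative):

```python
# Hedged sketch: generate standard T-SQL to enable change tracking and grant
# the ingestion user the needed permissions, avoiding system-object grants.

def change_tracking_grants(database: str, table: str, user: str) -> list:
    return [
        f"ALTER DATABASE [{database}] SET CHANGE_TRACKING = ON "
        f"(CHANGE_RETENTION = 3 DAYS, AUTO_CLEANUP = ON)",
        f"ALTER TABLE {table} ENABLE CHANGE_TRACKING",
        f"GRANT VIEW CHANGE TRACKING ON OBJECT::{table} TO [{user}]",
        f"GRANT SELECT ON OBJECT::{table} TO [{user}]",
    ]

stmts = change_tracking_grants("salesdb", "dbo.orders", "lakeflow_user")
```

Run these through your normal SQL client against the Azure SQL database; the catalog views the docs list are readable once these permissions are in place, without granting on the system objects themselves.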
- 3779 Views
- 1 reply
- 0 kudos
Unable to query using multi-node clusters but works with serverless warehouse & single-node clusters
We have a schema with 10 tables, and currently all 4 users have ALL access. When I (or any other user) spin up a serverless SQL warehouse, I am able to query one of the tables (million rows) in the SQL Editor and get a response within seconds. `select co...
This behavior suggests a significant difference in configuration or resource access between your Databricks serverless SQL warehouse, single-node cluster, and multi-node Spark cluster. The issue is not with SQL syntax or table access itself, since th...
- 3886 Views
- 1 reply
- 1 kudo
Jobs API 2.2 No Longer Enabled for Azure Government
Hello, my team deploys jobs in the Azure Government environment. We have been using the updated CLI (> .205) to do so. Sometime within the last month and a half, our Azure US Gov environment stopped working with the Jobs API 2.2. It was working before ...
Hey @fpmsi, thanks for raising this. I can clarify what's going on and how to work around it. What's happening: Jobs API 2.2 is not enabled on the Azure Government (Govcloud/FedRAMP/PVC) shards today, by design. In those regions, the service respo...
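If 2.2 is unavailable on your shard, the Jobs API 2.1 endpoints remain callable directly over REST. A small sketch of pinning request paths to 2.1 (the endpoint name and host handling are illustrative; with the stdlib you would send these via `urllib.request` with a bearer token):

```python
# Hedged sketch: build Jobs API paths pinned to version 2.1 instead of 2.2,
# as a workaround where 2.2 is not enabled on the shard.

def jobs_api_path(endpoint: str, version: str = "2.1") -> str:
    """Return the versioned Jobs API path, e.g. /api/2.1/jobs/list."""
    return f"/api/{version}/jobs/{endpoint.lstrip('/')}"

path = jobs_api_path("runs/submit")
# Would then be requested as https://<workspace-host><path> with an
# Authorization: Bearer <token> header.
```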