- 3909 Views
- 2 replies
- 0 kudos
OAuth URL and ClientId Validation
Hi, I am trying to set up an OAuth connection with Databricks, so I ask the user to enter their Workspace URL and ClientId. Once the user enters these values, I want to validate whether they are correct, so I ask them to log in by redirecting them ...
If you’re using OAuth with Databricks and want to validate both the Workspace URL and ClientId before proceeding, you’re facing an issue seen by others: when the Workspace URL is correct but the ClientId is wrong, Databricks just displays a generic e...
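A minimal sketch of pre-validating the Workspace URL before starting the redirect, assuming the workspace serves the standard OAuth discovery document under `/oidc/` (checking the ClientId still needs an actual authorize round-trip, since the server only reports an invalid client during login):

```python
import requests

def validate_workspace_url(workspace_url: str) -> bool:
    """Probe the OAuth discovery document to confirm the URL is a workspace.

    Databricks workspaces publish authorization-server metadata at
    /oidc/.well-known/oauth-authorization-server; a JSON response containing
    an authorization_endpoint is a strong signal the URL is valid.
    """
    url = f"{workspace_url.rstrip('/')}/oidc/.well-known/oauth-authorization-server"
    try:
        resp = requests.get(url, timeout=10)
        return resp.ok and "authorization_endpoint" in resp.json()
    except (requests.RequestException, ValueError):
        return False

# Example workspace URL (placeholder):
print(validate_workspace_url("https://adb-1234567890123456.7.azuredatabricks.net"))
```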
- 3818 Views
- 1 reply
- 0 kudos
Custom Runtime marketplace
Hi! Is it possible to share a solution accelerator built on a custom runtime via the Databricks Marketplace?
Greetings @evgenyvainerman, sorry for the delayed response. Your question is not entirely clear, but I will take a swing at providing an answer. Short answer: Yes, you can share a Solution Accelerator through Databricks Marketplace, but Marketplace...
- 4160 Views
- 1 reply
- 0 kudos
Unity Catalog Not Enabled on Job Cluster When Creating DLT in GCP Databricks
I am trying to create a Delta Live Table (DLT) in my GCP Databricks workspace, but I am encountering an issue where Unity Catalog is not enabled on the job cluster. Steps I followed:
- Created a DLT pipeline using the Databricks UI.
- Selected the appropria...
The error “Unity Catalog is not enabled on this job cluster” during Delta Live Table (DLT) pipeline execution in your GCP Databricks workspace is a common issue, especially after Unity Catalog onboarding. Your troubleshooting steps cover most essenti...
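One thing worth double-checking: DLT provisions a Unity-Catalog-enabled cluster only when the pipeline itself targets a UC catalog. A hedged sketch of creating the pipeline through the Pipelines REST API with an explicit `catalog` (names and the notebook path are placeholders; verify the payload fields against the current Pipelines API reference):

```python
import requests

HOST = "https://<workspace-host>"   # your GCP workspace URL (placeholder)
TOKEN = "<pat-or-oauth-token>"      # placeholder credential

# A pipeline created with an explicit UC `catalog` gets a UC-enabled job
# cluster; pipelines without it fall back to hive_metastore semantics.
payload = {
    "name": "my_uc_pipeline",                     # hypothetical name
    "catalog": "main",                            # UC catalog to write to
    "target": "my_schema",                        # schema inside that catalog
    "libraries": [{"notebook": {"path": "/Repos/me/dlt_notebook"}}],
}

resp = requests.post(
    f"{HOST}/api/2.0/pipelines",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # contains the new pipeline_id
```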
- 3641 Views
- 1 reply
- 0 kudos
Databricks Managed RG Storage cost is High
Hi Community, how do I calculate the Databricks storage cost, and where can I see the data that is stored and charged in Databricks? I'm trying to understand the storage cost on a managed resource group, and I'm clueless about what the data is and where it is stored...
To calculate the storage cost for Databricks in Azure and view the data being stored and charged, you need to consider both the Databricks compute (DBUs) and the storage resources (such as Azure Data Lake Storage or Blob Storage) linked to your Datab...
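One concrete way to see what is occupying space: the storage account inside the managed resource group backs the workspace's DBFS root, which you can walk from a notebook. A rough sketch (the dollar figure itself still comes from Azure Cost Management filtered to the managed resource group, not from Databricks):

```python
# Run in a Databricks notebook, where `dbutils` is predefined.
def dir_size_bytes(path: str) -> int:
    """Recursively sum file sizes under a DBFS path."""
    total = 0
    for entry in dbutils.fs.ls(path):
        if entry.name.endswith("/"):   # directories are listed with a trailing slash
            total += dir_size_bytes(entry.path)
        else:
            total += entry.size
    return total

# Estimate what the DBFS root (backed by the managed resource group's
# storage account) is holding; this can take a while on large roots.
print(f"~{dir_size_bytes('dbfs:/') / 1e9:.1f} GB in the DBFS root")
```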
- 3696 Views
- 1 reply
- 0 kudos
Lakehouse Federation - Unable to connect to Snowflake using "PEM Private Key"
Hi, I'm currently using the Lakehouse Federation feature on Databricks to run queries against a Snowflake data warehouse. Today I'm using a service credential to establish the connection (user ID & password), but I have to change it to use a private key. I tried us...
To assist with your Databricks Lakehouse Federation connection to Snowflake using a PEM private key, let's clarify the underlying issue. You mentioned that the connection works with a service credential (user id & password) but fails when switching to the "PE...
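One frequent culprit with "PEM Private Key" fields is formatting rather than the key itself: stray carriage returns, missing trailing newlines, or pasting only part of the key. A small hedged sketch for normalizing the key before pasting it (whether the field wants the full PEM armor or only the base64 body is an assumption to verify against the connection docs):

```python
def normalize_pem(pem_text: str, strip_armor: bool = False) -> str:
    """Clean up a PEM private key pasted from a file or a secret store.

    Removes Windows line endings and stray blank lines; optionally strips
    the BEGIN/END armor in case the target field expects only the base64 body.
    """
    lines = [ln.strip() for ln in pem_text.replace("\r\n", "\n").split("\n") if ln.strip()]
    if strip_armor:
        lines = [ln for ln in lines if not ln.startswith("-----")]
    return "\n".join(lines) + "\n"

# Hypothetical key file. Note that Snowflake key-pair auth expects a PKCS#8
# key, i.e. a "-----BEGIN PRIVATE KEY-----" header, not "BEGIN RSA PRIVATE KEY".
with open("rsa_key.p8") as f:
    print(normalize_pem(f.read()))
```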
- 3883 Views
- 1 reply
- 0 kudos
MLFlow Tracking versions
Hi team, we are migrating from a self-hosted MLflow Tracking server to the Databricks-hosted one. However, there are concerns about the unclear process of version changes and releases on the Tracking server side. Is there any public information av...
Hey @ViliamG, thanks for raising this. Here's how versioning and client compatibility work for the Databricks-hosted MLflow Tracking service, and where you can track changes publicly.
What's publicly available about versions: The Databricks-hosted M...
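On the client side, a minimal sketch of pointing an existing open-source MLflow client at the Databricks-hosted Tracking service (the experiment path is a placeholder; authentication comes from a Databricks CLI profile or the DATABRICKS_HOST/DATABRICKS_TOKEN environment variables):

```python
import mlflow

# The open-source client talks to the Databricks-hosted tracking service
# through the "databricks" tracking URI scheme.
mlflow.set_tracking_uri("databricks")

print("client version:", mlflow.__version__)  # worth pinning during the migration

mlflow.set_experiment("/Shared/migration-smoke-test")  # hypothetical experiment path
with mlflow.start_run():
    mlflow.log_param("source", "self-hosted")
    mlflow.log_metric("smoke", 1.0)
```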
- 3903 Views
- 1 reply
- 0 kudos
DLT constantly failing with time out errors
DLT was working but then started getting timeouts frequently:
com.databricks.pipelines.common.errors.deployment.DeploymentException: Failed to launch pipeline cluster xxxxxxxxxxxx: Self-bootstrap timed out during launch. Please try again later and con...
Frequent timeouts and bootstrap errors when launching Databricks Delta Live Table (DLT) pipeline clusters on AWS are usually caused by network connectivity issues, VPC misconfigurations, or resource allocation problems between Databricks' control pla...
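Since the usual suspects are network-path problems between the cluster subnet and the control plane, a quick hedged check you can run from any VM in the same VPC/subnet (the hostnames below are placeholders; substitute your region's addresses from the Databricks network requirements docs):

```python
import socket

# Placeholder endpoints: fill in the control-plane, relay, and artifact
# addresses listed for your region in the Databricks networking docs.
ENDPOINTS = [
    ("<region-control-plane-host>", 443),
    ("<region-relay-host>", 443),
]

for host, port in ENDPOINTS:
    try:
        with socket.create_connection((host, port), timeout=5):
            print(f"OK   {host}:{port}")
    except OSError as exc:
        print(f"FAIL {host}:{port} -> {exc}")
```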
- 4240 Views
- 2 replies
- 1 kudos
Resolved! Lakeflow Connect: can't change general privilege requirements
I want to set up Lakeflow Connect to ETL data from Azure SQL Server (Microsoft SQL Azure (RTM) - 12.0.2000.8 Feb 9 2025) using change tracking (we don't need the data retention of CDC). In the documentation, there is a list of system tables, views ...
You are hitting a known limitation in Azure SQL Database: it does not allow you to grant or modify permissions directly on most system objects, such as system stored procedures, catalog views, or extended stored procedures, resulting in the error "Ms...
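In practice, the grants change tracking needs are at the database/schema/table level, not on the system objects themselves. A hedged sketch of the usual setup statements run as a privileged user (server, database, and user names are placeholders; the exact grant list is in the Lakeflow Connect docs):

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<db>;"
    "UID=<admin>;PWD=<password>"  # placeholder connection details
)
conn.autocommit = True  # ALTER DATABASE cannot run inside a transaction
cur = conn.cursor()

# Enable change tracking on the database and on each source table.
cur.execute("ALTER DATABASE CURRENT SET CHANGE_TRACKING = ON "
            "(CHANGE_RETENTION = 3 DAYS, AUTO_CLEANUP = ON)")
cur.execute("ALTER TABLE dbo.my_table ENABLE CHANGE_TRACKING")

# Grant the ingestion user read access plus change-tracking visibility.
cur.execute("GRANT SELECT, VIEW CHANGE TRACKING ON SCHEMA::dbo TO [lakeflow_user]")
```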
- 3974 Views
- 1 reply
- 0 kudos
Unable to query using multi-node clusters but works with serverless warehouse & single-node clusters
We have a schema with 10 tables, and currently all 4 users have ALL access. When I (or any other user) spin up a serverless SQL warehouse, I am able to query one of the tables (a million rows) in the SQL Editor and get a response within seconds. `select co...
This behavior suggests a significant difference in configuration or resource access between your Databricks serverless SQL warehouse, single-node cluster, and multi-node Spark cluster. The issue is not with SQL syntax or table access itself, since th...
- 4151 Views
- 1 reply
- 1 kudos
Jobs API 2.2 No Longer Enabled for Azure Government
Hello, my team deploys jobs in the Azure Government environment. We have been using the updated CLI (> 0.205) to do so. Sometime within the last month and a half, our Azure US Gov environment stopped working with the Jobs API 2.2. It was working before ...
Hey @fpmsi, thanks for raising this; I can clarify what's going on and how to work around it.
What's happening: Jobs API 2.2 is not enabled on the Azure Government (Govcloud/FedRAMP/PVC) shards today, by design. In those regions, the service respo...
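If you need to script around this while the CLI insists on 2.2, calling the Jobs 2.1 endpoints directly still works. A minimal hedged sketch (host and token are placeholders):

```python
import requests

HOST = "https://<workspace-host>"  # placeholder Azure Gov workspace URL
TOKEN = "<token>"

# Jobs API 2.1 remains available; list jobs through it directly.
resp = requests.get(
    f"{HOST}/api/2.1/jobs/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"limit": 25},
    timeout=30,
)
resp.raise_for_status()
for job in resp.json().get("jobs", []):
    print(job["job_id"], job["settings"]["name"])
```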
- 456 Views
- 1 reply
- 0 kudos
Unable to make Community edition cluster for multiple days
Hi, I've been trying to use CE to learn the basics, but I have never been able to create a compute cluster to actually run any workloads/notebooks. I have tried on 3 separate days now, and below I've attached the most recent attempt's error log. Since CE ...
@DeltaScratchpad, thanks for sharing the details and the error payload; I know it's frustrating to hit this repeatedly in Community Edition (CE).
What your error means: The CONTAINER_LAUNCH_FAILURE with “Container setup has timed out” means the d...
- 3760 Views
- 1 reply
- 0 kudos
How do I get rid of the GKE cluster?
Hi! In our organisation we use Databricks, but I do not understand why this GKE cluster keeps getting created. We deploy workspaces and compute clusters through Terraform and use the GCE tag "x-databricks-nextgen-cluster" = "true". From my understanding, ...
Hey @Teo12333, thanks for the clear context. What you're seeing is expected during the current GCP migration from the older GKE-based compute architecture to the newer, VM-only architecture on GCE.
What you're seeing: Databricks historically launched...
- 601 Views
- 2 replies
- 2 kudos
Resolved! Issue with spark version
Hello, I faced an issue with the configuration of IaC using Terraform. Our organization uses IaC as the default method for deploying resources. When I try to specify my Spark version using the Databricks provider (v1.96, the latest version) like this: data...
Hi, thanks a lot, pushing the version directly worked. In the future I will use an API command to check the version instead of the Terraform module.
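For reference, a sketch of the API route mentioned here: the Clusters API exposes the list of valid Spark runtime keys, so you can look one up without the Terraform data source (host and token are placeholders):

```python
import requests

HOST = "https://<workspace-host>"
TOKEN = "<token>"

# GET /api/2.0/clusters/spark-versions returns every runtime key the
# workspace accepts, e.g. "15.4.x-scala2.12"; pass one of these strings
# straight into the cluster resource instead of using the data source.
resp = requests.get(
    f"{HOST}/api/2.0/clusters/spark-versions",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
for v in sorted(resp.json()["versions"], key=lambda v: v["key"]):
    print(v["key"], "-", v["name"])
```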
- 3622 Views
- 1 reply
- 0 kudos
Dataiku connector limitation
Hello, I'm trying to read data from Unity Catalog and insert it into an Oracle Database using an on-premise Dataiku instance. It works well for a small dataset (~600 KB / ~150,000 rows):
[14:51:20] [INFO] [dku.datasets.sql] - Read 2000 records from DB [14:51:20] [I...
Greetings @MaximeGendre, thanks for the detailed context; a few things here are likely at play.
Is a Databricks “staging area” a common behavior? Yes. Many third-party tools and ISV integrations use Unity Catalog (UC) Volumes or cloud object stor...
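If you want to confirm whether the connector is staging through a UC Volume, you can watch the volume path from a notebook while the Dataiku job runs. A small sketch with a hypothetical volume path:

```python
# Run in a Databricks notebook, where `dbutils` is predefined.
# Hypothetical staging volume; substitute the catalog/schema/volume
# your connector is actually configured to use.
STAGING = "/Volumes/main/integration/dataiku_staging"

for f in dbutils.fs.ls(STAGING):
    print(f.name, f.size)
```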
- 3781 Views
- 1 reply
- 0 kudos
Unable to create workspace using API
Hi all, I'm trying to automate the deployment of Databricks into GCP. In order to streamline the process, I created a standalone project to hold the service accounts SA1 and SA2, with the second one then being manually populated into the Databricks ac...
Greetings @Jeff4, thanks for laying out the setup and symptoms so clearly. Short answer: it’s not required that the workspace-creating service account be hosted in the same GCP project as the workspace; cross-project is supported. The failure you’r...
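For reference, a hedged sketch of the Accounts API call involved, with the workspace pointed at a different project than the one hosting the service accounts (account ID, token, and field values are placeholders; re-check the payload shape against the current GCP workspace-creation API reference):

```python
import requests

ACCOUNT_ID = "<databricks-account-id>"
TOKEN = "<sa-oauth-access-token>"  # token minted for the workspace-creator SA

payload = {
    "workspace_name": "prod-workspace",   # placeholder
    "location": "europe-west1",           # region for the workspace
    "cloud_resource_container": {
        # The workspace lands in this project, which may differ from the
        # standalone project that hosts SA1 and SA2.
        "gcp": {"project_id": "<target-project-id>"}
    },
}

resp = requests.post(
    f"https://accounts.gcp.databricks.com/api/2.0/accounts/{ACCOUNT_ID}/workspaces",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["workspace_id"])
```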