- 3731 Views
- 1 replies
- 1 kudos
Jobs API 2.2 No Longer Enabled for Azure Government
Hello, my team deploys jobs in the Azure Government environment. We have been using the updated CLI (> .205) to do so. Sometime within the last month and a half, our Azure US Gov environment stopped working with the Jobs API 2.2. It was working before ...
Hey @fpmsi, thanks for raising this. I can clarify what's going on and how to work around it. What's happening: Jobs API 2.2 is not enabled on the Azure Government (GovCloud/FedRAMP/PVC) shards today, by design. In those regions, the service respo...
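The truncated reply points at Jobs API 2.2 being unavailable on those shards; below is a minimal sketch of pinning calls to the older Jobs API 2.1 path instead. The workspace URL and token are placeholders, not values from the thread.

```python
import urllib.request

# Placeholder Azure Gov workspace URL and token -- substitute your own.
HOST = "https://adb-1234567890.azuredatabricks.us"
TOKEN = "dapi-example-token"

# Pin the request to the Jobs API 2.1 path, since 2.2 is the version
# reported as unavailable on the Azure Government shards.
req = urllib.request.Request(
    f"{HOST}/api/2.1/jobs/list?limit=25",
    headers={"Authorization": f"Bearer {TOKEN}"},
)

# Against a real workspace, urllib.request.urlopen(req) would perform the call;
# here we only build the request to show the explicit version pin.
print(req.full_url)
```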
- 354 Views
- 1 replies
- 0 kudos
Unable to make Community edition cluster for multiple days
Hi, I've been trying to use CE to learn the basics, but I have never been able to get a compute cluster to actually run any workloads / notebooks. I have tried on 3 separate days now, and below I've attached the most recent attempt's error log. Since CE ...
@DeltaScratchpad, thanks for sharing the details and the error payload. I know it's frustrating to hit this repeatedly in Community Edition (CE). What your error means: The CONTAINER_LAUNCH_FAILURE with "Container setup has timed out" means the d...
- 3382 Views
- 1 replies
- 0 kudos
How do I get rid of the GKE cluster?
Hi! In our organisation we use Databricks, but I do not understand why this GKE cluster keeps getting created. We deploy workspaces and compute clusters through Terraform and use the GCE tag "x-databricks-nextgen-cluster" = "true". From my understanding, ...
Hey @Teo12333, thanks for the clear context. What you're seeing is expected during the current GCP migration from the older GKE-based compute architecture to the newer, VM-only architecture on GCE. What you're seeing: Databricks historically launched...
- 99 Views
- 2 replies
- 2 kudos
Resolved! Issue with spark version
Hello, I faced an issue with the IaC configuration using Terraform. Our organization uses IaC as the default method for deploying resources. When I try to specify my Spark version using the Databricks provider (v1.96 - latest version) like this: data...
Hi, thanks a lot; pushing the version directly worked. In the future I will use the API to check available versions instead of the Terraform data module.
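For the follow-up idea of checking versions via the API instead of the Terraform data source, here is a small sketch of parsing the response of `GET /api/2.0/clusters/spark-versions`. The two version entries below are illustrative samples, not a real listing.

```python
import json

# Illustrative shape of the /api/2.0/clusters/spark-versions response
# (the entries here are samples, not fetched from a live workspace).
sample = json.loads("""
{"versions": [
  {"key": "15.4.x-scala2.12", "name": "15.4 LTS (includes Apache Spark 3.5.0, Scala 2.12)"},
  {"key": "14.3.x-scala2.12", "name": "14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12)"}
]}
""")

# The "key" value is what cluster resources expect in spark_version.
keys = [v["key"] for v in sample["versions"]]
print(keys)
```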
- 395 Views
- 2 replies
- 3 kudos
A question about Databricks Fine-grained Access Control (FGAC) cost on dedicated compute
Hi all, recently, while testing Fine-grained Access Control (FGAC) on dedicated compute, I came across something that seems a bit unusual, and I'd like to ask if anyone else has seen similar behavior. I created a view with only one record, and had anot...
@Isi, thank you for your practical experiment and for sharing your findings; it really helps everyone get a clearer view of FGAC (Fine-Grained Access Control) in Unity Catalog on Databricks. I also hope Databricks can clarify the pricing more transpare...
- 3243 Views
- 1 replies
- 0 kudos
Dataiku connector limitation
Hello, I'm trying to read data from Unity Catalog and insert it into an Oracle database using an "On Premise" Dataiku instance. It works well for a small dataset (~600 KB / ~150,000 rows). [14:51:20] [INFO] [dku.datasets.sql] - Read 2000 records from DB [14:51:20] [I...
Greetings @MaximeGendre, thanks for the detailed context. A few things here are likely at play. Is a Databricks "staging area" a common behavior? Yes. Many third-party tools and ISV integrations use Unity Catalog (UC) Volumes or cloud object stor...
- 3306 Views
- 1 replies
- 0 kudos
Unable to create workspace using API
Hi all, I'm trying to automate the deployment of Databricks into GCP. In order to streamline the process, I created a standalone project to hold the service accounts SA1 and SA2, with the second one then being manually populated into the Databricks ac...
Greetings @Jeff4, thanks for laying out the setup and symptoms so clearly. Short answer: it's not required that the workspace-creating service account be hosted in the same GCP project as the workspace; cross-project is supported. The failure you'r...
- 293 Views
- 4 replies
- 5 kudos
Resolved! Programmatically activate groups in account
Hi, I am currently trying to use the Accounts SDK to add external groups from Entra ID to functional groups within Databricks. I expect thousands of groups in Entra, and I want to add these groups programmatically (for example) to a group in Databricks...
Great, thank you Louis, for the quick and detailed response! We'll get the account team to go over the use case with us. Cheers, Sven
- 55 Views
- 1 replies
- 1 kudos
Need to claim Azure Databricks account for workspace created via Resource Provider
Hello, our Azure Databricks workspace was deployed by the Azure Databricks Resource Provider. No "Manage Account" option appears in the UI, and no account admin is listed. Please link this workspace's Databricks account to our Azure AD tenant and as...
Hello @JerryAnderson, good day! I understand that you have a brand-new workspace and can't access the admin console. You can view the community solution provided for this issue: https://community.databricks.com/t5/administration-architecture/unable-to-...
- 87 Views
- 1 replies
- 1 kudos
Service Principal with Federated Credentials Can’t Access Full Repo in ADO
Good afternoon, I'm using Databricks with Git integration to Azure DevOps (ADO). Authentication is via Microsoft Entra federated credentials for a service principal (SPN). The SPN has Basic access in ADO, is in the same project groups as my user, and Gi...
The issue stems from a fundamental architectural difference in how Databricks handles Git authentication: 1. Git credential gap: While your SPN successfully authenticates to Databricks via Microsoft Entra federated credentials, it lacks the sec...
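Assuming the truncated reply goes on to suggest registering an explicit Git credential for the SPN, here is a hedged sketch of the payload shape the Git Credentials API (`POST /api/2.0/git-credentials`) expects; the username and PAT values are placeholders, not anything from the thread.

```python
import json

# Hedged sketch: registering a Git credential for the service principal via
# the REST API. Username and token values are placeholders; the field names
# and the azureDevOpsServices provider value follow the Git Credentials API.
payload = {
    "git_provider": "azureDevOpsServices",
    "git_username": "spn-ci-deployer",          # placeholder identity
    "personal_access_token": "<ado-pat-here>",  # placeholder secret
}

# POST this JSON to {workspace}/api/2.0/git-credentials while authenticated
# as the SPN, so Repos operations performed by the SPN can reach ADO.
body = json.dumps(payload)
print(body)
```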
- 3183 Views
- 1 replies
- 0 kudos
From Google Cloud Storage
Hi everyone, I'm new to Databricks and am trying to connect my Google Cloud Storage bucket to my Databricks workspace. I have a 43 GB CSV file stored in a GCP bucket that I want to work with. Here's what I've done so far: Bucket setup: I created a GCP bu...
Hey @refah_1, thanks for laying out the steps; you're very close. Here's a structured checklist to get GCS working with Unity Catalog and a couple of common gotchas to check. What's likely going on: The region mismatch isn't the root cause; docs em...
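A hedged sketch of the Unity Catalog SQL such a checklist usually ends with: an external location over the bucket, then reading the CSV with `read_files`. All object names and the bucket URL below are placeholders.

```python
# Hedged sketch of the SQL to run in a Databricks notebook once a storage
# credential exists. Bucket, location, and credential names are placeholders,
# not values from the thread.
external_location_sql = """
CREATE EXTERNAL LOCATION IF NOT EXISTS my_gcs_location
URL 'gs://my-example-bucket'
WITH (STORAGE CREDENTIAL my_gcs_credential);
"""

read_sql = """
SELECT * FROM read_files('gs://my-example-bucket/big_file.csv', format => 'csv');
"""

# In a notebook these would be executed with spark.sql(...); here we only
# print the statements.
print(external_location_sql.strip())
print(read_sql.strip())
```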
- 2959 Views
- 1 replies
- 0 kudos
Databricks on GCP admin console access
Hi, I'm trying to update the GCP permissions for Databricks as described here: https://docs.databricks.com/gcp/en/admin/cloud-configurations/gcp/gce-update
To be able to do that, I have to log in to the account console here: https://accounts.gcp.databr...
Greetings @borft, it sounds like you're being redirected into a workspace without the right privileges; let's get you into the correct Databricks account console for your GCP Marketplace subscription and identify the right login. What login is requ...
- 4202 Views
- 6 replies
- 1 kudos
Issue with updating email with SCIM Provisioning
Hi all, for our setup we have configured SCIM provisioning using Entra ID; group assignment in Azure is handled by SailPoint IdentityIQ, and we have enabled SSO for Databricks. It has been working fine apart from one scenario. The original email assign...
The other option is to raise a ticket with the Databricks Accounts team. In our case, the Databricks team made the change on the backend and the new email was synced.
- 103 Views
- 3 replies
- 2 kudos
Use wheels from volumes in serverless
Hi everyone! I’m working with a job running on Databricks serverless, and I’d like to know how we can load a wheel file that we have stored in a volume, and then use that wheel as a package within the job itself. Any guidance or examples would be app...
Hi @pablogarcia, you need to configure the serverless environment to achieve that. Refer to the documentation below: Configure the serverless environment | Databricks on AWS. Specifically, these sections: - Configure the serverless environment | Databricks on AW...
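Building on the linked docs, here is a hedged sketch of the `environments` block in a Jobs API 2.1 job definition that installs a wheel from a UC volume on serverless compute; the catalog, schema, volume, wheel, and notebook paths are all placeholders.

```python
import json

# Hedged sketch of a job definition fragment for serverless compute: the
# environment spec lists a wheel in a UC volume as a dependency, and the
# task references that environment by key. Paths are placeholders.
job_fragment = {
    "environments": [
        {
            "environment_key": "default",
            "spec": {
                "client": "1",
                "dependencies": [
                    "/Volumes/my_catalog/my_schema/my_volume/my_pkg-0.1.0-py3-none-any.whl"
                ],
            },
        }
    ],
    "tasks": [
        {
            "task_key": "main",
            "environment_key": "default",  # ties the task to the env above
            "notebook_task": {"notebook_path": "/Workspace/Users/me/run_job"},
        }
    ],
}

print(json.dumps(job_fragment, indent=2))
```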
- 70 Views
- 2 replies
- 1 kudos
Subscription management - Can’t see subscription / Access issue
Hi, I recently upgraded my Azure account from Free Trial to Pay-As-You-Go. The Azure portal shows only “Azure subscription 1 – Don’t see a subscription? Switch to another directory.” I have only one directory (“Default Directory”). Please re-associate m...
@niveditha_tr, can you please share the resolution here and mark it as the solution?