- 594 Views
- 4 replies
- 5 kudos
Resolved! Programmatically activate groups in account
Hi, I am currently trying to use the Accounts SDK to add external groups from Entra ID to functional groups within Databricks. I expect thousands of groups in Entra, and I want to add these groups programmatically (for example) to a group in Databricks...
Great, thank you Louis, for the quick and detailed response! We'll get the account team to go over the use case with us. Cheers, Sven
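The pattern discussed in this thread can be sketched against the account-level SCIM Groups API. This is a minimal illustration, not the solution from the thread: the account host, account ID, token handling, and group IDs are placeholder assumptions.

```python
# Sketch (hypothetical names/IDs): add Entra-synced account groups as
# members of a functional Databricks account group via the SCIM API.
import json
import urllib.request


def scim_member_patch(member_ids):
    """SCIM PATCH body that adds the given group IDs as members."""
    return {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{
            "op": "add",
            "path": "members",
            "value": [{"value": gid} for gid in member_ids],
        }],
    }


def add_members(account_host, account_id, parent_group_id, token, member_ids):
    """PATCH the parent group on the account-level SCIM endpoint."""
    url = (f"{account_host}/api/2.0/accounts/{account_id}"
           f"/scim/v2/Groups/{parent_group_id}")
    req = urllib.request.Request(
        url,
        data=json.dumps(scim_member_patch(member_ids)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/scim+json",
        },
        method="PATCH",
    )
    return urllib.request.urlopen(req)
```

With thousands of Entra groups you would page through them first and batch the PATCH calls; the official databricks-sdk (`AccountClient().groups.patch`) wraps this same endpoint.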
- 261 Views
- 1 replies
- 1 kudos
Resolved! Need to claim Azure Databricks account for workspace created via Resource Provider
Hello, Our Azure Databricks workspace was deployed by the Azure Databricks Resource Provider. No “Manage Account” option appears in the UI, and no Account Admin is listed. Please link this workspace’s Databricks account to our Azure AD tenant and as...
Hello @JerryAnderson, good day! I understand that you have a brand new workspace and can't access the admin console. You can view the community solution provided for this issue: https://community.databricks.com/t5/administration-architecture/unable-to-...
- 260 Views
- 1 replies
- 1 kudos
Service Principal with Federated Credentials Can’t Access Full Repo in ADO
Good afternoon, I'm using Databricks with Git integration to Azure DevOps (ADO). Authentication is via Microsoft Entra federated credentials for a service principal (SPN). The SPN has Basic access in ADO, is in the same project groups as my user, and Gi...
The issue stems from a fundamental architectural difference in how Databricks handles Git authentication: 1. Git Credential Gap: While your SPN successfully authenticates to Databricks via Microsoft Entra federated credentials, it lacks the sec...
- 3540 Views
- 1 replies
- 0 kudos
From Google Cloud Storage
Hi everyone, I'm new to Databricks and am trying to connect my Google Cloud Storage bucket to my Databricks workspace. I have a 43GB CSV file stored in a GCP bucket that I want to work with. Here’s what I've done so far: Bucket setup: I created a GCP bu...
Hey @refah_1, thanks for laying out the steps; you’re very close. Here’s a structured checklist to get GCS working with Unity Catalog, plus a couple of common gotchas to check. What’s likely going on: the region mismatch isn’t the root cause; docs em...
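As a rough illustration of the Unity Catalog route that checklist points at, the snippet below generates the SQL for an external location over a GCS bucket. The bucket, credential, location, and group names are made-up placeholders, not values from the thread.

```python
# Sketch: SQL statements that register a GCS bucket as a Unity Catalog
# external location. All object names here are hypothetical.
def gcs_location_sql(bucket, credential, location_name):
    """Return the CREATE/GRANT statements for a GCS external location."""
    return [
        f"CREATE EXTERNAL LOCATION IF NOT EXISTS {location_name} "
        f"URL 'gs://{bucket}' WITH (STORAGE CREDENTIAL {credential})",
        f"GRANT READ FILES ON EXTERNAL LOCATION {location_name} "
        f"TO `data_engineers`",
    ]
```

A storage credential backed by a GCP service account with access to the bucket must exist before the CREATE statement will succeed.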
- 3299 Views
- 1 replies
- 0 kudos
Databricks on GCP admin console access
Hi, I'm trying to update the GCP permissions for Databricks as described here: https://docs.databricks.com/gcp/en/admin/cloud-configurations/gcp/gce-update. To be able to do that, I have to log in to the account console here: https://accounts.gcp.databr...
Greetings @borft, it sounds like you’re being redirected into a workspace without the right privileges; let’s get you into the correct Databricks account console for your GCP Marketplace subscription and identify the right login. What login is requ...
- 393 Views
- 3 replies
- 2 kudos
Use wheels from volumes in serverless
Hi everyone! I’m working with a job running on Databricks serverless, and I’d like to know how we can load a wheel file that we have stored in a volume, and then use that wheel as a package within the job itself. Any guidance or examples would be app...
Hi @pablogarcia, you need to configure the serverless environment to achieve that. Refer to the documentation below: Configure the serverless environment | Databricks on AWS. Specifically, see these sections: - Configure the serverless environment | Databricks on AW...
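A minimal sketch of what that configuration looks like in a Jobs API payload, assuming a hypothetical wheel path in a Unity Catalog volume and a hypothetical notebook:

```python
# Sketch: serverless tasks pick up wheels through the job's
# `environments` spec; the volume path below is a placeholder.
def serverless_env(wheel_path, key="default"):
    """One entry for the Jobs API `environments` field."""
    return {
        "environment_key": key,
        "spec": {"client": "1", "dependencies": [wheel_path]},
    }


job_settings = {
    "name": "wheel-from-volume",
    "environments": [serverless_env(
        "/Volumes/main/default/libs/my_pkg-1.0-py3-none-any.whl")],
    "tasks": [{
        "task_key": "run",
        "environment_key": "default",  # ties the task to the env above
        "notebook_task": {"notebook_path": "/Workspace/Users/me/nb"},
    }],
}
```

The same `dependencies` list also accepts PyPI requirement strings, so a volume wheel and its extra pinned packages can be declared together.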
- 295 Views
- 2 replies
- 1 kudos
Subscription management - Can’t see subscription / Access issue
Hi, I recently upgraded my Azure account from Free Trial to Pay-As-You-Go. The Azure portal shows only “Azure subscription 1 – Don’t see a subscription? Switch to another directory.” I have only one directory (“Default Directory”). Please re-associate m...
@niveditha_tr Can you please share the resolution here and mark it as the solution?
- 3510 Views
- 1 replies
- 0 kudos
Azure Databricks with VNET injection and SCC
Hi, Azure Databricks with VNET injection and SCC needs to communicate with Azure endpoints for the following: metastore, artifact Blob storage, system tables storage, log Blob storage, and Event Hubs endpoint IP addresses. https://learn.microsoft.com/en-us/a...
Hey @Mendi, here’s how connectivity works for Azure Databricks with VNet injection and Secure Cluster Connectivity (SCC) for the endpoints you listed. Key points from the Microsoft Learn reference: the page lists, per region, the FQDNs and ports f...
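For the endpoints in that per-region table, a quick TCP reachability probe run from a cluster can confirm the route through the NVA/firewall. This is a generic sketch; the FQDN in the example is a placeholder, so substitute the ones your region's table actually lists.

```python
# Sketch: TCP reachability check for control-plane FQDNs (placeholders).
import socket


def probe(fqdn, port=443, timeout=5):
    """Return True if a TCP connection to fqdn:port succeeds."""
    try:
        with socket.create_connection((fqdn, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failure, connection refusal, and timeout
        return False


if __name__ == "__main__":
    # Replace with your region's metastore/artifact/log/Event Hubs FQDNs.
    for host in ["example-metastore.database.windows.net"]:
        print(host, "reachable" if probe(host) else "BLOCKED")
```

Running this from a notebook on an affected cluster quickly separates DNS problems from blocked egress.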
- 4336 Views
- 1 replies
- 2 kudos
Salesforce Marketing Cloud integration
What is the best way to get Salesforce Marketing Cloud data into Databricks? Lakeflow / Federation connectors are limited to Salesforce and Salesforce Data Cloud right now. Are there plans to add Salesforce Marketing Cloud? The only current option w...
Hey @ceceliac, thanks for raising this. Here’s the current picture and the practical paths you can use today. What Databricks supports today: the Lakehouse Federation connector for Salesforce Data Cloud is available and lets you query Data Cloud tabl...
- 800 Views
- 4 replies
- 1 kudos
Resolved! Can a Databricks Workspace be renamed after creation ?
A Databricks workspace has already been created with all configurations completed. The customer has now requested to change the workspace name. Is it possible to rename an existing Databricks workspace after creation?
Yes, you can safely rename it. The workspace name is largely cosmetic: it won't affect the actual workspace functionality, API endpoints, or integrations, since those all rely on the deployment name/URL (which doesn't change). That said, just a heads...
- 409 Views
- 3 replies
- 1 kudos
Resolved! API call to /api/2.0/serving-endpoints/{name}/ai-gateway does not support tokens or principals
From my reading of the documentation, the /api/2.0/serving-endpoints/{name}/ai-gateway endpoint supports a "tokens" and a "principals" attribute in the JSON payload. Documentation link: Update AI Gateway of a serving endpoint | Serving endpoints ...
I have dug a bit deeper on this: these properties are supported, but not as top-level request body fields; instead, they are available as fields of the entry objects under `rate_limits`. The actual payload looks like: ```{ "guardrails": { /* ... */ }, ...
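To make the shape concrete, here is a sketch of a payload builder reflecting the structure described above. The nesting follows the thread's finding and the endpoint path comes from the question, but the field values are hypothetical.

```python
# Sketch: build a rate-limit entry with `tokens`/`principal` nested
# inside `rate_limits`, then PUT it to the AI Gateway endpoint.
import json
import urllib.request


def ai_gateway_body(calls, renewal_period="minute", key="endpoint",
                    tokens=None, principal=None):
    """One entry for the `rate_limits` list; optional fields are omitted."""
    entry = {"calls": calls, "renewal_period": renewal_period, "key": key}
    if tokens is not None:
        entry["tokens"] = tokens
    if principal is not None:
        entry["principal"] = principal
    return {"rate_limits": [entry]}


def update_ai_gateway(host, endpoint_name, token, body):
    """PUT /api/2.0/serving-endpoints/{name}/ai-gateway."""
    req = urllib.request.Request(
        f"{host}/api/2.0/serving-endpoints/{endpoint_name}/ai-gateway",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
    return urllib.request.urlopen(req)
```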
- 367 Views
- 1 replies
- 1 kudos
Using Terraform to GRANT SELECT ON ANY FILE securable
I have a use case where service principals will read .csv files from an Azure Storage Account and create views from them. This used to work in our legacy environment, but we are currently migrating to Unity Catalog, and when we tested our existing jobs we...
You can try this Terraform. Note that `ANY FILE` is a legacy Hive metastore securable, so in the Databricks provider it is granted through `databricks_sql_permissions` rather than `databricks_grants`:

```hcl
resource "databricks_sql_permissions" "any_file_select_grant" {
  any_file = true

  privilege_assignments {
    principal  = "your_user_or_group_name" # replace with the actual user or group
    privileges = ["SELECT"]
  }
}
```
- 345 Views
- 1 replies
- 1 kudos
Resolved! Delta share not showing in delta shared with me
Hi everyone, we just started using Databricks, and we were expecting to receive a Delta Share from a third-party provider. They’ve confirmed that the sharing process has been completed on their end. However, the shared data is not appearing on our porta...
You need the USE PROVIDER privilege on the recipient workspace's assigned metastore (or you need to be a metastore admin). You will then see the provider's Delta Sharing org name in SHOW PROVIDERS, and you can then mount their share as a catalog. Let me know h...
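The sequence in that answer maps to three SQL statements. The provider, share, and catalog names below are placeholders, not values from the thread:

```python
# Sketch: SQL for discovering a provider and mounting its share as a
# catalog (run e.g. via spark.sql). All names are hypothetical.
def mount_share_sql(provider, share, catalog):
    """Return the discovery and mount statements, in order."""
    return [
        "SHOW PROVIDERS",
        f"SHOW SHARES IN PROVIDER `{provider}`",
        f"CREATE CATALOG IF NOT EXISTS {catalog} "
        f"USING SHARE `{provider}`.`{share}`",
    ]
```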
- 4364 Views
- 3 replies
- 2 kudos
Resolved! Restricting Catalog and External Location Visibility Across Databricks Workspaces
Hi Databricks Community, I need some guidance regarding catalogs and external locations across multiple environments. Here's my situation: I've set up a resource group (dev-rg) and created a Databricks workspace where I successfully created catalogs (b...
I am facing the exact same issue. I don't want to create a separate metastore, and I have added the environment name as a prefix to all external locations. All the locations are restricted to their workspaces, so functionality-wise everything is fine. My co...
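One way to get the per-workspace restriction described here without a separate metastore is catalog-workspace binding. The sketch below uses the Unity Catalog REST API; the host, token, catalog name, and workspace IDs are placeholders.

```python
# Sketch: restrict a catalog to specific workspaces via isolation mode
# plus workspace bindings. All identifiers are hypothetical.
import json
import urllib.request


def _call(host, token, path, method, body):
    """Small helper for authenticated JSON calls."""
    req = urllib.request.Request(
        f"{host}{path}", data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method=method)
    return urllib.request.urlopen(req)


def binding_body(workspace_ids):
    """Bind the given workspace IDs with read/write access."""
    return {"add": [
        {"workspace_id": w, "binding_type": "BINDING_TYPE_READ_WRITE"}
        for w in workspace_ids
    ]}


def restrict_catalog(host, token, catalog, workspace_ids):
    # 1) take the catalog out of OPEN (visible-everywhere) mode
    _call(host, token, f"/api/2.1/unity-catalog/catalogs/{catalog}",
          "PATCH", {"isolation_mode": "ISOLATED"})
    # 2) bind only the intended workspaces
    _call(host, token, f"/api/2.1/unity-catalog/bindings/catalog/{catalog}",
          "PATCH", binding_body(workspace_ids))
```

Once a catalog is ISOLATED and bound, it simply stops appearing in workspaces outside the binding list.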
- 973 Views
- 4 replies
- 5 kudos
Resolved! Azure Databricks Control Plane connectivity issue after migrating to vWAN
Hello everyone, recently I received a client request to migrate our Azure Databricks environment from a hub-and-spoke architecture to a vWAN hub architecture with an NVA (Network Virtual Appliance). Here’s a quick overview of the setup: The Databricks ...
@nodeb Can you please mark your reply as the solution? It will help other users find the resolution faster.
| Label | Count |
|---|---|
| Access control | 1 |
| Apache spark | 1 |
| Azure | 7 |
| Azure databricks | 5 |
| Billing | 2 |
| Cluster | 1 |
| Compliance | 1 |
| Data Ingestion & connectivity | 5 |
| Databricks Runtime | 1 |
| Databricks SQL | 2 |
| DBFS | 1 |
| Dbt | 1 |
| Delta Sharing | 1 |
| DLT Pipeline | 1 |
| GA | 1 |
| Gdpr | 1 |
| Github | 1 |
| Partner | 59 |
| Public Preview | 1 |
| Service Principals | 1 |
| Unity Catalog | 1 |
| Workspace | 2 |