Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

lubiarzm1
by Contributor
  • 1773 Views
  • 2 replies
  • 2 kudos

Resolved! Issue with spark version

Hello, I faced an issue with the configuration of IaC using Terraform. Our organization uses IaC as the default method for deploying resources. When I try to specify my Spark version using the Databricks provider (v1.96, the latest version) like this: data...

Latest Reply
lubiarzm1
Contributor
  • 2 kudos

Hi, thanks a lot, the direct push of the version worked. In future I will use an API call to check the version without using the Terraform module.
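For anyone following this thread: the version keys the provider expects can be listed with GET /api/2.0/clusters/spark-versions. A minimal sketch of picking the newest LTS key from that response shape; the sample payload below is illustrative, not a live API response:

```python
# Sketch: pick the newest LTS runtime key from the response shape of
# GET /api/2.0/clusters/spark-versions ({"versions": [{"key": ..., "name": ...}]}).
# The sample payload below is made up for illustration.

def latest_lts_key(versions: list) -> str:
    """Return the key of the highest-numbered LTS runtime."""
    lts = [v for v in versions if "LTS" in v["name"]]
    # Keys look like "15.4.x-scala2.12"; sort on the numeric major.minor prefix.
    def numeric(v):
        major, minor = v["key"].split(".")[:2]
        return (int(major), int(minor))
    return max(lts, key=numeric)["key"]

sample = {
    "versions": [
        {"key": "14.3.x-scala2.12", "name": "14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12)"},
        {"key": "15.4.x-scala2.12", "name": "15.4 LTS (includes Apache Spark 3.5.0, Scala 2.12)"},
        {"key": "16.1.x-scala2.12", "name": "16.1 (includes Apache Spark 3.5.2, Scala 2.12)"},
    ]
}
print(latest_lts_key(sample["versions"]))  # -> 15.4.x-scala2.12
```

The same filtering works on the real response once you fetch it with an authenticated request.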

1 More Replies
MaximeGendre
by New Contributor III
  • 4451 Views
  • 1 replies
  • 0 kudos

Dataiku connector limitation

Hello, I'm trying to read data from Unity Catalog and insert it into an Oracle database using an "On Premise" Dataiku instance. It works well for a small dataset (~600 KB / ~150,000 rows). [14:51:20] [INFO] [dku.datasets.sql] - Read 2000 records from DB [14:51:20] [I...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Greetings @MaximeGendre, thanks for the detailed context — a few things here are likely at play. Is a Databricks "staging area" common behavior? Yes. Many third-party tools and ISV integrations use Unity Catalog (UC) Volumes or cloud object stor...

Jeff4
by New Contributor
  • 4547 Views
  • 1 replies
  • 0 kudos

Unable to create workspace using API

Hi all, I'm trying to automate the deployment of Databricks into GCP. To streamline the process, I created a standalone project to hold the service accounts SA1 and SA2, with the second one then being manually populated into the Databricks ac...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Greetings @Jeff4, thanks for laying out the setup and symptoms so clearly. Short answer: it's not required that the workspace-creating service account be hosted in the same GCP project as the workspace; cross-project is supported. The failure you'r...

Sven_Relijveld
by New Contributor II
  • 1063 Views
  • 4 replies
  • 5 kudos

Resolved! Programmatically activate groups in account

Hi, I am currently trying to use the Accounts SDK to add external groups from Entra ID to functional groups within Databricks. I expect thousands of groups in Entra, and I want to add these groups programmatically (for example) to a group in Databricks...

Latest Reply
SvenRelijveld
New Contributor III
  • 5 kudos

Great, thank you Louis for the quick and detailed response! We'll get the account team to go over the use case with us. Cheers, Sven
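For others landing on this thread: account-level groups can also be created through the SCIM Groups endpoint (POST /api/2.0/accounts/{account_id}/scim/v2/Groups). A minimal sketch of building that request body; the group name, external ID, and member IDs below are placeholders, not values from this thread:

```python
# Sketch of a SCIM 2.0 group payload for the Databricks Accounts API
# (POST /api/2.0/accounts/{account_id}/scim/v2/Groups). All values below
# are placeholders.

def scim_group_payload(display_name: str, external_id: str, member_ids: list) -> dict:
    """Build the JSON body for creating an account-level group."""
    return {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:Group"],
        "displayName": display_name,
        "externalId": external_id,  # e.g. the Entra ID group object ID
        "members": [{"value": mid} for mid in member_ids],
    }

payload = scim_group_payload(
    "functional-analysts",
    "11111111-2222-3333-4444-555555555555",
    ["100", "101"],
)
print(payload["members"])  # -> [{'value': '100'}, {'value': '101'}]
```

For thousands of Entra groups, you would loop over this builder and POST each body with an account-admin token, as the SDK does under the hood.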

3 More Replies
JerryAnderson
by New Contributor
  • 447 Views
  • 1 replies
  • 1 kudos

Resolved! Need to claim Azure Databricks account for workspace created via Resource Provider

Hello, Our Azure Databricks workspace was deployed by the Azure Databricks Resource Provider. No “Manage Account” option appears in the UI, and no Account Admin is listed.  Please link this workspace’s Databricks account to our Azure AD tenant and as...

Latest Reply
Khaja_Zaffer
Esteemed Contributor
  • 1 kudos

Hello @JerryAnderson, good day! I understand that you have a brand-new workspace and can't access the admin console. You can view the community solution provided for this issue: https://community.databricks.com/t5/administration-architecture/unable-to-...

PearceR
by New Contributor III
  • 512 Views
  • 1 replies
  • 1 kudos

Service Principal with Federated Credentials Can’t Access Full Repo in ADO

Good afternoon, I'm using Databricks with Git integration to Azure DevOps (ADO). Authentication is via Microsoft Entra federated credentials for a service principal (SPN). The SPN has Basic access in ADO, is in the same project groups as my user, and Gi...

Latest Reply
AbhaySingh
Databricks Employee
  • 1 kudos

The issue stems from a fundamental architectural difference in how Databricks handles Git authentication: 1. Git Credential Gap: While your SPN successfully authenticates to Databricks via Microsoft Entra federated credentials, it lacks the sec...
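One way to close that gap, sketched under the assumption that your SPN can mint an Entra access token valid for Azure DevOps: register it as a Git credential via POST /api/2.0/git-credentials. The token and client ID below are placeholders:

```python
# Sketch: request body for POST /api/2.0/git-credentials, which registers a
# Git credential for the calling principal. Assumption: an Entra access token
# is passed in place of a PAT, which ADO accepts for Git-over-HTTPS.

def git_credential_payload(entra_access_token: str, spn_client_id: str) -> dict:
    """Build the body that links an ADO credential to the service principal."""
    return {
        "git_provider": "azureDevOpsServices",
        "git_username": spn_client_id,  # placeholder; ADO largely ignores the username
        "personal_access_token": entra_access_token,
    }

body = git_credential_payload(
    "eyJ...placeholder-token",
    "00000000-0000-0000-0000-000000000000",
)
print(body["git_provider"])  # -> azureDevOpsServices
```

The SPN would send this body while authenticated to Databricks as itself, so the credential attaches to the SPN rather than to your user.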

refah_1
by New Contributor
  • 4381 Views
  • 1 replies
  • 0 kudos

From Google Cloud Storage

Hi everyone, I'm new to Databricks and am trying to connect my Google Cloud Storage bucket to my Databricks workspace. I have a 43 GB CSV file stored in a GCP bucket that I want to work with. Here's what I've done so far: Bucket setup: I created a GCP bu...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Hey @refah_1, thanks for laying out the steps — you're very close. Here's a structured checklist to get GCS working with Unity Catalog, plus a couple of common gotchas to check. What's likely going on: the region mismatch isn't the root cause; docs em...
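As a companion to that checklist, here is a sketch of the two Unity Catalog REST payloads usually involved (POST /api/2.1/unity-catalog/storage-credentials, then POST /api/2.1/unity-catalog/external-locations). The names and bucket below are placeholders:

```python
# Sketch: the two request bodies for wiring a GCS bucket into Unity Catalog.
# All names and the bucket are placeholders.

def gcs_credential_payload(name: str) -> dict:
    """Body for POST /api/2.1/unity-catalog/storage-credentials. An empty
    databricks_gcp_service_account asks Databricks to generate a service
    account; you then grant that account access to the bucket in GCP."""
    return {"name": name, "databricks_gcp_service_account": {}}

def external_location_payload(name: str, bucket: str, credential_name: str) -> dict:
    """Body for POST /api/2.1/unity-catalog/external-locations, binding a
    gs:// URL to the storage credential created above."""
    return {"name": name, "url": f"gs://{bucket}", "credential_name": credential_name}

cred = gcs_credential_payload("gcs-cred")
loc = external_location_payload("raw-data", "my-example-bucket", "gcs-cred")
print(loc["url"])  # -> gs://my-example-bucket
```

Once both objects exist (and the generated service account has bucket access), the 43 GB CSV can be read directly from the gs:// path.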

borft
by New Contributor
  • 3890 Views
  • 1 replies
  • 0 kudos

Databricks on GCP admin console access

Hi, I'm trying to update the GCP permissions for Databricks as described here: https://docs.databricks.com/gcp/en/admin/cloud-configurations/gcp/gce-update To do that, I have to log in to the account console here: https://accounts.gcp.databr...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Greetings @borft, it sounds like you're being redirected into a workspace without the right privileges; let's get you into the correct Databricks account console for your GCP Marketplace subscription and identify the right login. What login is requ...

pablogarcia
by New Contributor II
  • 611 Views
  • 3 replies
  • 2 kudos

Use wheels from volumes in serverless

Hi everyone! I’m working with a job running on Databricks serverless, and I’d like to know how we can load a wheel file that we have stored in a volume, and then use that wheel as a package within the job itself. Any guidance or examples would be app...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @pablogarcia, you need to configure the serverless environment to achieve that. Refer to the documentation below: Configure the serverless environment | Databricks on AWS. Specifically these sections: - Configure the serverless environment | Databricks on AW...
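For future readers, a sketch of what the job settings can look like when the wheel lives in a UC Volume: the serverless task references an environment whose spec lists the Volume path as a dependency. The notebook path, job name, and wheel path below are placeholders:

```python
# Sketch: Jobs API settings where a serverless task installs a wheel from a
# Unity Catalog Volume via the task's environment spec. Paths are placeholders.

def serverless_job_settings(wheel_path: str) -> dict:
    """Build job settings with an environment that pulls a wheel from a Volume."""
    return {
        "name": "wheel-from-volume-demo",
        "tasks": [{
            "task_key": "main",
            "environment_key": "default",  # ties the task to the spec below
            "notebook_task": {"notebook_path": "/Workspace/Users/someone@example.com/demo"},
        }],
        "environments": [{
            "environment_key": "default",
            "spec": {
                "client": "1",
                # Volume paths are accepted as pip-installable dependencies.
                "dependencies": [wheel_path],
            },
        }],
    }

settings = serverless_job_settings("/Volumes/main/default/libs/mypkg-1.0-py3-none-any.whl")
print(settings["environments"][0]["spec"]["dependencies"])
```

POSTing this body to /api/2.1/jobs/create (or the equivalent asset bundle config) makes the package importable inside the job's notebook.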

2 More Replies
niveditha_tr
by New Contributor II
  • 412 Views
  • 2 replies
  • 1 kudos

Subscription management - Can’t see subscription / Access issue

Hi, I recently upgraded my Azure account from Free Trial to Pay-As-You-Go. The Azure portal shows only "Azure subscription 1 – Don't see a subscription? Switch to another directory." I have only one directory ("Default Directory"). Please re-associate m...

Latest Reply
nayan_wylde
Esteemed Contributor II
  • 1 kudos

@niveditha_tr Can you please share the resolution here and mark it as the solution?

1 More Replies
Mendi
by New Contributor
  • 4214 Views
  • 1 replies
  • 0 kudos

Azure Databricks with VNET injection and SCC

Hi, Azure Databricks with VNET injection and SCC needs to communicate with Azure endpoints for the following: metastore, artifact Blob storage, system tables storage, log Blob storage, and Event Hubs endpoint IP addresses. https://learn.microsoft.com/en-us/a...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Hey @Mendi, here's how connectivity works for Azure Databricks with VNet injection and Secure Cluster Connectivity (SCC) for the endpoints you listed. Key points from the Microsoft Learn reference: the page lists, per region, the FQDNs and ports f...

ceceliac
by New Contributor III
  • 5570 Views
  • 1 replies
  • 2 kudos

Salesforce Marketing Cloud integration

What is the best way to get Salesforce Marketing Cloud data into Databricks? Lakeflow / Federation connectors are limited to Salesforce and Salesforce Data Cloud right now. Are there plans to add Salesforce Marketing Cloud?  The only current option w...

Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

Hey @ceceliac, thanks for raising this — here's the current picture and the practical paths you can use today. What Databricks supports today: the Lakehouse Federation connector for Salesforce Data Cloud is available and lets you query Data Cloud tabl...

apjeskeaa
by New Contributor II
  • 1804 Views
  • 4 replies
  • 1 kudos

Resolved! Can a Databricks Workspace be renamed after creation ?

A Databricks workspace has already been created with all configurations completed. The customer has now requested to change the workspace name. Is it possible to rename an existing Databricks workspace after creation?

Latest Reply
jeffreyaven
Databricks Employee
  • 1 kudos

Yes, you can safely rename it. The workspace name is largely cosmetic - it won't affect the actual workspace functionality, API endpoints, or integrations since those all rely on the deployment name/URL (which doesn't change). That said, just a heads...

3 More Replies
sfibich1
by New Contributor II
  • 1420 Views
  • 3 replies
  • 1 kudos

Resolved! API call to /api/2.0/serving-endpoints/{name}/ai-gateway does not support tokens or principals

From my reading of the documentation, the /api/2.0/serving-endpoints/{name}/ai-gateway endpoint supports a "tokens" and a "principals" attribute in the JSON payload. Documentation link: Update AI Gateway of a serving endpoint | Serving endpoints ...

Latest Reply
jeffreyaven
Databricks Employee
  • 1 kudos

I have dug a bit deeper on this: these properties are supported, but not as top-level request body fields; instead they are available as object fields under `rate_limits`. The actual payload looks like: ```{    "guardrails": { /* ... */ },    ...
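To make that shape concrete, a sketch of building such a body in Python. The "calls", "key", and "renewal_period" fields follow the documented rate-limit schema; the per-object "principal" field is an assumption pieced together from this (truncated) reply, so verify against the current API reference before relying on it:

```python
# Sketch: request body for PUT /api/2.0/serving-endpoints/{name}/ai-gateway.
# The 'principal' field name inside each rate_limits object is an assumption
# based on this thread, not confirmed documentation.

def ai_gateway_body(calls_per_minute: int, principal=None) -> dict:
    """Build an AI Gateway update body with one per-user rate limit."""
    limit = {"calls": calls_per_minute, "key": "user", "renewal_period": "minute"}
    if principal is not None:
        limit["principal"] = principal  # hypothetical field name
    return {"rate_limits": [limit]}

body = ai_gateway_body(60, principal="my-service-principal")
print(body["rate_limits"][0]["calls"])  # -> 60
```

The point, matching the finding above, is that principal scoping lives inside each `rate_limits` element rather than at the top level of the request body.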

2 More Replies
MMJ
by New Contributor
  • 529 Views
  • 1 replies
  • 1 kudos

Resolved! Delta share not showing in delta shared with me

Hi everyone, we just started using Databricks, and we were expecting to receive a Delta Share from a third-party provider. They've confirmed that the sharing process has been completed on their end. However, the shared data is not appearing in our porta...

Latest Reply
jeffreyaven
Databricks Employee
  • 1 kudos

You need USE PROVIDER privileges on the recipient workspace's assigned metastore (or you need to be a metastore admin). You will then see the provider's Delta Sharing org name in SHOW PROVIDERS, and then you can mount their share as a catalog. Let me know h...
