Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

YugandharG
by New Contributor
  • 685 Views
  • 1 replies
  • 2 kudos

Resolved! Lakebase storage location

Hi, I'm a Solution Architect at a reputed insurance company looking for a few key technical details about the Lakebase architecture. For a fully managed serverless OLTP offering from Databricks, there is no clear documentation that talks about data st...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @YugandharG,
1. Lakebase data is stored in Databricks-managed cloud object storage. There's no option to use customer storage as of now.
2. File format: vanilla Postgres pages. The storage format of Postgres has nothing to do with Parquet/Delta. Wa...

ctgchris
by New Contributor III
  • 1053 Views
  • 9 replies
  • 0 kudos

User Token Forwarding Between Apps?

I have a Streamlit Databricks app that is intended to be a frontend UI app. I also have a FastAPI Databricks app that is intended to be a middleware app. I want my Streamlit app to query the middleware app for all business logic and Databricks queries...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

This post?

8 More Replies
Jhaprakash6608
by New Contributor
  • 718 Views
  • 1 replies
  • 1 kudos

Resolved! Spark executor logs path

We are running Spark workloads and have enabled cluster log delivery to push executor logs to Azure Blob. While that's running fine, I'd also like to know the local path of the executor logs so that I can make use of OneAgent from Dynatrace and send...

Latest Reply
Krishna_S
Databricks Employee
  • 1 kudos

Local Executor Log Path on Azure Databricks
Executor logs are written locally on each executor node under the work directory. The path pattern is:
/databricks/spark/work/<app-id>/<executor-id>
For example: /databricks/spark/work/app-20221121180310-00...
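
(Not from the thread — a minimal sketch of how one could confirm those local paths by listing the work directory from a task on each executor; the partition count is illustrative and it assumes /databricks/spark/work exists on the workers.)

    import os

    def list_executor_logs(_):
        # Runs inside an executor task, so this lists that node's local work dir.
        root = "/databricks/spark/work"
        found = []
        if os.path.isdir(root):
            for app_id in os.listdir(root):
                app_dir = os.path.join(root, app_id)
                if os.path.isdir(app_dir):
                    for executor_id in os.listdir(app_dir):
                        found.append(os.path.join(app_dir, executor_id))
        return [found]

    # One task per executor slot; collect the paths each node reports.
    paths = sc.parallelize(range(sc.defaultParallelism), sc.defaultParallelism) \
              .mapPartitions(list_executor_logs) \
              .collect()
    print(paths)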

ctgchris
by New Contributor III
  • 518 Views
  • 1 replies
  • 1 kudos

User OBO Token Forwarding between apps

Can user OAuth tokens be forwarded between Databricks Apps for on-behalf-of (OBO) authorization? I have two Databricks Apps deployed in the same workspace:
1. **UI App** (Streamlit) - configured with OAuth user authorization
2. **Middleware App** (FastA...

Latest Reply
Khaja_Zaffer
Contributor III
  • 1 kudos

Hello @ctgchris, just bumping this issue for visibility so that someone from Databricks can come up with a solution.
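
While this thread is still open, here is a rough sketch of the mechanics being asked about (it does not confirm that Databricks authorizes app-to-app forwarding): with on-behalf-of-user authorization enabled, the Apps platform passes the signed-in user's token to the app in the X-Forwarded-Access-Token request header, which the Streamlit app could read and forward to the middleware app. The middleware URL and request body below are assumptions for illustration.

    import requests
    import streamlit as st

    # With user authorization enabled, Databricks Apps inject the signed-in
    # user's token into this request header.
    user_token = st.context.headers.get("x-forwarded-access-token")

    # Hypothetical middleware app URL and endpoint, purely for illustration.
    MIDDLEWARE_URL = "https://<middleware-app-url>/api/query"

    resp = requests.post(
        MIDDLEWARE_URL,
        headers={"Authorization": f"Bearer {user_token}"},
        json={"query": "list_orders"},
        timeout=30,
    )
    st.write(resp.json())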

noklamchan
by New Contributor II
  • 3837 Views
  • 4 replies
  • 3 kudos

How to access UnityCatalog's Volume inside Databricks App?

I am more familiar with DBFS, which seems to have been replaced by Unity Catalog Volumes now. When I create a Databricks App, it allowed me to add a resource to pick a UC volume. How do I actually access the volume inside the app? I cannot find any example, the a...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 3 kudos

Apps don't mount /Volumes and don't ship with dbutils, so os.listdir('/Volumes/...') or dbutils.fs.ls(...) won't work inside an App. Use the Files API or Databricks SDK instead to read/write UC Volume files, then work on a local copy. Code using Pytho...
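
A minimal sketch of the SDK route described above; the catalog, schema, volume, and file names are placeholders.

    import io
    from databricks.sdk import WorkspaceClient

    # Inside a Databricks App the client picks up credentials from the environment.
    w = WorkspaceClient()

    volume_path = "/Volumes/<catalog>/<schema>/<volume>/example.csv"  # placeholder

    # Download the volume file and work on a local copy.
    resp = w.files.download(volume_path)
    data = resp.contents.read()
    with open("/tmp/example.csv", "wb") as f:
        f.write(data)

    # Write a file back to the volume.
    w.files.upload(volume_path, io.BytesIO(data), overwrite=True)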

3 More Replies
rabbitturtles
by New Contributor III
  • 1275 Views
  • 5 replies
  • 1 kudos

Resolved! Databricks Apps On behalf of user authorization - General availability date?

Currently, Databricks Apps on-behalf-of-user authorization is in Public Preview. Any idea when this will be generally available, or where I can see its release plan? https://docs.databricks.com/aws/en/release-notes/product/2025/march#databricks-apps-c...

Latest Reply
WiliamRosa
Contributor III
  • 1 kudos

Hi @rabbitturtles, additionally you can subscribe to the Databricks Newsletter and join the Product Roadmap Webinars, where they announce all the latest private previews: https://www.databricks.com/resources?_sft_resource_type=newsletters

4 More Replies
sparkplug
by New Contributor III
  • 2091 Views
  • 5 replies
  • 1 kudos

Resolved! Databricks service principal token federation on Kubernetes

Hi, I am trying to create a service principal federation policy against an AKS cluster, but I am struggling to make it work without any examples. It would be great if you could share examples of how this would work for a service account. Additionally, wha...

Latest Reply
sparkplug
New Contributor III
  • 1 kudos

I am currently using a two-step process: logging in using the Azure library, getting an access token from Azure using the Databricks scope, and then using that token to authorize towards Databricks. I would like to use the `env-oidc` auth type instead, but...
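
For reference, a minimal sketch of the two-step flow described above (not the env-oidc setup being asked about); the workspace URL is a placeholder, and 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the well-known Azure Databricks application ID used as the token scope.

    from azure.identity import DefaultAzureCredential
    from databricks.sdk import WorkspaceClient

    # Step 1: obtain an Entra ID token scoped to the Azure Databricks resource
    # (on AKS, DefaultAzureCredential can pick up workload identity).
    credential = DefaultAzureCredential()
    aad_token = credential.get_token(
        "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default"
    ).token

    # Step 2: use that token to authorize against the workspace.
    w = WorkspaceClient(host="https://<workspace-url>", token=aad_token)
    print(w.current_user.me().user_name)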

4 More Replies
ez
by New Contributor II
  • 1169 Views
  • 5 replies
  • 4 kudos

Resolved! SQLSTATE: 42501 - Missing Privileges for User Groups

Dear All, I'm investigating missing privileges for some of our users. When connecting to an Oracle database via JDBC and attempting to display a DataFrame, we encounter the following error: User does not have permission SELECT on any file. SQLSTATE: 4250...

Latest Reply
ez
New Contributor II
  • 4 kudos

@nayan_wylde thank you, that is exactly what I was looking for and could not find
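
The accepted answer isn't expanded in this listing. For readers hitting the same SQLSTATE on clusters that enforce legacy table ACLs, the commonly suggested remediation is an admin granting SELECT ON ANY FILE; a sketch follows, where the group name is a placeholder and which may not be the exact fix this thread settled on.

    # Run by a workspace admin on the cluster enforcing legacy table ACLs;
    # <group-name> is a placeholder for the affected user group.
    spark.sql("GRANT SELECT ON ANY FILE TO `<group-name>`")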

4 More Replies
vvijay61
by New Contributor II
  • 747 Views
  • 7 replies
  • 1 kudos

SAT Tool Scan other workspaces

Hello Team, I have been setting up SAT in my Databricks workspace and I am able to do it and scan my workspace. I have provided my SP access to all other workspaces as well. When I run the initialize job (SAT Initializer Notebook (one-time)), I c...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

It seems like access is denied by a network policy. You have to update the Network Policy for Serverless at the account level: in Account Console → Cloud Resources → Policies → Serverless Egress Control → default-policy, check the Allow access to all destination...

6 More Replies
mereud
by New Contributor
  • 562 Views
  • 1 replies
  • 0 kudos

Databricks bundle error

Hi everyone, I'm encountering an issue during deployment with Terraform on Databricks. The error I get is: Error: failed to read remote state file: stream error: stream ID 21; NO_ERROR; received from peer. On another attempt (after a manual mistake on my ...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Yes, I also find that the deploy sometimes fails, e.g. when a cluster config using policies has changed, or when you want to use mixed node types in combination with policies. What I do is destroy the bundle or the job that gives issues, then deploy. T...

taksheel-a-n
by New Contributor
  • 499 Views
  • 1 replies
  • 0 kudos

Can a Databricks App (React-based) be published on the Databricks Marketplace?

Hi everyone, I've been exploring Databricks Apps and building a frontend using React within the Databricks environment. I wanted to know if it's currently possible to publish a Databricks App to the Databricks Marketplace, similar to how datasets, not...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 0 kudos

Hi @taksheel-a-n, according to the documentation (https://learn.microsoft.com/en-us/azure/databricks/marketplace/), there's a list of the types of assets that are supported. According to this post, there's not a public-facing roadmap: https://community....

DavidRobinson
by New Contributor
  • 563 Views
  • 1 replies
  • 0 kudos

Transition from SCIM to AIM

We're in the process of transitioning our Azure Databricks instance from SCIM-based provisioning to Automated Identity Management (AIM), now that AIM is generally available. Once enabled, AIM becomes the authoritative source for managing users, group...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 0 kudos

@DavidRobinson Let me know how it goes. This is on my to-do list too, as we are facing a lot of issues with SCIM like nested group sync and SPN syncs. One issue that I can think of is that AIM respects nested groups from Entra, which SCIM didn't. So...

Elben
by New Contributor II
  • 676 Views
  • 4 replies
  • 1 kudos

Delta Sharing Egress Pipeline for Azure

We are currently investigating options for implementing a multi-tenancy solution where clients are separated but share data using Delta Sharing. Is there any way to track the cost of reading data in Azure? It seems like Delta Sharing Egress Pipelin...

Latest Reply
Elben
New Contributor II
  • 1 kudos

Thanks. I'm aware of the very neat features for analyzing cost in Databricks, but we are also interested in monitoring the cost of the underlying storage and network. It seems that this is indeed possible in AWS using S3, but not supported in Azure....

3 More Replies
kebubs
by New Contributor
  • 525 Views
  • 1 replies
  • 1 kudos

Resolved! Difference between AWS Marketplace and direct with Databricks

Hi all, I wanted to check the difference between a direct purchase from Databricks and a purchase through AWS Marketplace, and the difference in deployment between the two. I understand that AWS Marketplace will have an auto-deployment whe...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @kebubs, maybe you will find the thread below useful. According to a Databricks employee, the main difference will be how billing is handled: "Direct Subscription: If you subscribe directly through Databricks, you will manage billing through the Databr...

Kaz1
by New Contributor
  • 2691 Views
  • 1 replies
  • 0 kudos

Can't post to microsoft teams workflow from databricks notebook

When trying to post to a Microsoft Teams webhook from a Databricks notebook, using compute with DBR 12.2, I receive the following error: SSL error: HTTPSConnectionPool(host='prod-104.westeurope.logic.azure.com', port=443): Max retries exceeded with ur...

Latest Reply
Isi
Honored Contributor III
  • 0 kudos

Hello @Kaz1, it's very likely that the issue is related to where the HTTPS request originates, i.e. whether it's coming from the Databricks control plane or your data plane (your own AWS VPC). When you run a local script or call the Teams webhook from a cl...
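
A quick way to see which side is failing is to issue the same request from a notebook cell on the cluster and inspect the exception; a minimal sketch, with the webhook URL left as a placeholder.

    import requests

    webhook_url = "https://<your-teams-workflow-url>"  # placeholder

    try:
        r = requests.post(webhook_url, json={"text": "connectivity test"}, timeout=10)
        print(r.status_code, r.text[:200])
    except requests.exceptions.SSLError as err:
        # An SSL failure here points at egress interception or a proxy in the
        # data plane rather than at the webhook endpoint itself.
        print("SSL error from the cluster:", err)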
