Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

r_w_
by New Contributor II
  • 4756 Views
  • 7 replies
  • 2 kudos

Resolved! Best Practices for Mapping Between Databricks and AWS Accounts

Hi everyone, this is my first post here. I'm doing my best to write in English, so I apologize if anything is unclear. I'm looking to understand the best practices for how many environments to set up when using Databricks on AWS. I'm considering the f...

Latest Reply
Isi
Honored Contributor III
  • 2 kudos

Hey @r_w_, if you think my answer was correct, it would be great if you could mark it as a solution to help future users. Thanks, Isi

6 More Replies
Rvwijk
by New Contributor II
  • 3602 Views
  • 2 replies
  • 0 kudos

Resolved! New default notebook format (IPYNB) causes unintended changes on release

Dear Databricks, we have noticed the following issue since the new default notebook format has been set to IPYNB. When we release our code from (for example) DEV to TST using a release pipeline built in Azure DevOps, we see unintended changes popping ...

Latest Reply
dkushari
Databricks Employee
  • 0 kudos

Hi @Rvwijk, please take a look at this. This should solve your issue. I suspect the mismatch is happening because the previous versions include output for the notebook cells. You may need to perform a rebase of your repository and allow the output to b...
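As one way to act on that suggestion, here is a minimal sketch that strips outputs from every .ipynb file in a repo checkout before the DevOps pipeline compares or deploys it. It assumes the nbformat package and that the script runs from the repository root; it is not taken from the reply itself.

```python
import glob

import nbformat


def strip_outputs(path: str) -> None:
    """Remove cell outputs and execution counts so release diffs only show code changes."""
    nb = nbformat.read(path, as_version=4)
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    nbformat.write(nb, path)


for notebook_path in glob.glob("**/*.ipynb", recursive=True):
    strip_outputs(notebook_path)
    print(f"Stripped outputs from {notebook_path}")
```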

1 More Replies
hasanakhuy
by New Contributor
  • 634 Views
  • 1 reply
  • 1 kudos

Resolved! AIM with Entra ID Groups – Users and Service Principals not visible in Workspace

Hello Community, I am testing Automatic Identity Management (AIM) in Databricks with Unity Catalog enabled. Steps I did:
  • AIM is activated
  • In Microsoft Entra ID I created a group g1 and added user u1 and service principal sp1
  ...

Latest Reply
dkushari
Databricks Employee
  • 1 kudos

When AIM is enabled, Entra users, service principals, and groups become available in Azure Databricks as soon as they're granted permissions. Group memberships, including nested groups, flow directly from Entra ID, so permissions al...
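As a rough illustration of "available as soon as they're granted permissions", the sketch below grants the Entra group g1 from the post a privilege on a catalog via the Python SDK. The catalog name main is hypothetical, and the exact enum names can vary between databricks-sdk versions.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()

# Granting any securable privilege to the Entra group is what makes the group
# (and its members, such as u1 and sp1) show up in the workspace under AIM.
w.grants.update(
    securable_type=catalog.SecurableType.CATALOG,
    full_name="main",  # hypothetical catalog name
    changes=[
        catalog.PermissionsChange(
            principal="g1",
            add=[catalog.Privilege.USE_CATALOG],
        )
    ],
)
```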

help_needed_445
by Contributor
  • 461 Views
  • 1 reply
  • 1 kudos

Questions About Notebook Debugging Tools

I'm researching the different ways to debug in databricks notebooks and have some questions.1. Can the python breakpoint() function be used in notebooks? This article says it can be used https://www.databricks.com/blog/new-debugging-features-databric...

Latest Reply
jack_zaldivar
Databricks Employee
  • 1 kudos

Hi @help_needed_445! Can you give a bit more information on your environment? Which cloud are you operating in where you are not able to use the native debugging tool? I have tested in an Azure workspace by adding a breakpoint in the gutter of a spe...
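For anyone following along, a cell like the sketch below is enough to try both mechanisms from the question, assuming a recent Databricks Runtime and notebook editor where the interactive debugger is enabled; behaviour may differ on older runtimes or other clouds.

```python
def divide(numerator: float, denominator: float) -> float:
    breakpoint()  # built-in Python hook; pauses here if a debugger is attached
    return numerator / denominator


# Alternatively, set a breakpoint in the gutter next to the `return` line and run the cell.
print(divide(4, 2))
```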

eoferreira
by New Contributor
  • 888 Views
  • 3 replies
  • 4 kudos

Lakebase security

Hi team, we are using Databricks Enterprise and noticed that our Lakebase instances are exposed to the public internet. They can be reached through the JDBC endpoint with only basic username and password authentication. Is there a way to restrict acce...

Latest Reply
Sudheer-Reddy
New Contributor II
  • 4 kudos

The Postgres instance is covered by the Private Link you configure for your workspace.

2 More Replies
Daniela_Boamba
by New Contributor III
  • 239 Views
  • 0 replies
  • 0 kudos

Databricks certificate expired

Hello, I have a Databricks workspace with SSO authentication; the IdP is on Azure. The client certificate expired and now I can't log on to Databricks to add the new one. What can I do? Any idea is welcome. Thank you!! Best regards, Daniela

YugandharG
by New Contributor
  • 644 Views
  • 1 reply
  • 2 kudos

Resolved! Lakebase storage location

Hi, I'm a Solution Architect at a reputable insurance company looking for a few key pieces of technical information about the Lakebase architecture. For a fully managed serverless OLTP offering from Databricks, there is no clear documentation that talks about data st...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @YugandharG,
1. Lakebase data is stored in Databricks-managed cloud object storage. There's no option to use customer storage as of now.
2. File format: vanilla Postgres pages. The storage format of Postgres has nothing to do with Parquet/Delta. Wa...

ctgchris
by New Contributor III
  • 988 Views
  • 9 replies
  • 0 kudos

User Token Forwarding Between Apps?

I have a Streamlit Databricks app that is intended to be a frontend UI app. I also have a FastAPI Databricks app that is intended to be a middleware app. I want my Streamlit app to query the middleware app for all business logic and Databricks queries...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

This post?

8 More Replies
Jhaprakash6608
by New Contributor
  • 699 Views
  • 1 reply
  • 1 kudos

Resolved! Spark executor logs path

We are running Spark workloads and have enabled cluster log delivery to push executor logs to Azure Blob. While that's running fine, I'd also like to know the local path of the executor logs so that I can make use of OneAgent from Dynatrace and send...

Latest Reply
Krishna_S
Databricks Employee
  • 1 kudos

Local executor log path on Azure Databricks: executor logs are written locally on each executor node under the work directory. The path pattern is /databricks/spark/work/<app-id>/<executor-id>. For example: /databricks/spark/work/app-20221121180310-00...
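A quick way to confirm that path pattern on your own cluster is to list the work directory from the workers themselves. The sketch below assumes a classic (non-serverless) cluster and a notebook where spark is already defined; it is only a diagnostic aid, not part of the Dynatrace setup.

```python
import glob
import socket


def list_work_dir(_):
    # Runs on whichever worker executes the task; matches
    # /databricks/spark/work/<app-id>/<executor-id>/stdout and .../stderr
    host = socket.gethostname()
    return [(host, path) for path in glob.glob("/databricks/spark/work/*/*/std*")]


sc = spark.sparkContext
found = sc.parallelize(range(16), 16).mapPartitions(list_work_dir).distinct().collect()
for host, path in sorted(set(found)):
    print(host, path)
```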

ctgchris
by New Contributor III
  • 477 Views
  • 1 reply
  • 1 kudos

User OBO Token Forwarding between apps

Can user OAuth tokens be forwarded between Databricks Apps for on-behalf-of (OBO) authorization? I have two Databricks Apps deployed in the same workspace:
1. **UI App** (Streamlit) - configured with OAuth user authorization
2. **Middleware App** (FastA...

Latest Reply
Khaja_Zaffer
Contributor III
  • 1 kudos

Hello @ctgchris, just bumping this issue for visibility so that someone from Databricks can come up with a solution.
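Until there is an official answer, here is a rough sketch of the pattern the question describes. It assumes on-behalf-of-user authorization is enabled on the UI app, that the signed-in user's token arrives in the X-Forwarded-Access-Token request header, and a hypothetical middleware URL and endpoint; whether the middleware app will accept a token minted for another app is exactly the open question here.

```python
import requests
import streamlit as st

MIDDLEWARE_URL = "https://middleware-app-1234567890.azure.databricksapps.com"  # hypothetical

# Streamlit 1.37+ exposes incoming request headers, including the token
# that Databricks Apps forwards for the signed-in user.
user_token = st.context.headers.get("X-Forwarded-Access-Token")

if user_token:
    resp = requests.get(
        f"{MIDDLEWARE_URL}/api/orders",  # hypothetical FastAPI route on the middleware app
        headers={"Authorization": f"Bearer {user_token}"},
        timeout=30,
    )
    st.json(resp.json())
else:
    st.warning("No user token was forwarded to this app.")
```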

noklamchan
by New Contributor II
  • 3763 Views
  • 4 replies
  • 3 kudos

How to access a Unity Catalog Volume inside a Databricks App?

I am more familiar with DBFS, which seems to be replaced by Unity Catalog Volumes now. When I create a Databricks App, it allows me to add a resource to pick a UC volume. How do I actually access the volume inside the app? I cannot find any example; the a...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 3 kudos

Apps don't mount /Volumes and don't ship with dbutils. So os.listdir('/Volumes/...') or dbutils.fs.ls(...) won't work inside an App. Use the Files API or Databricks SDK instead to read/write UC Volume files, then work on a local copy. Code using Pytho...
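To make that concrete, here is a minimal sketch using the Databricks Python SDK's Files API from inside an App. The volume path and file names are hypothetical, and in an App the SDK picks up credentials from the app's environment.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # authenticates from the App's environment

volume_file = "/Volumes/main/default/my_volume/report.csv"  # hypothetical path

# Download the file via the Files API (no dbutils or /Volumes mount needed)
# and work on a local copy inside the App container.
data = w.files.download(volume_file).contents.read()
local_copy = "/tmp/report.csv"
with open(local_copy, "wb") as f:
    f.write(data)

# Upload a result back into the volume.
with open(local_copy, "rb") as f:
    w.files.upload("/Volumes/main/default/my_volume/report_copy.csv", f, overwrite=True)
```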

3 More Replies
rabbitturtles
by New Contributor III
  • 1217 Views
  • 5 replies
  • 1 kudos

Resolved! Databricks Apps On behalf of user authorization - General availability date?

Currently, Databricks Apps on-behalf-of user authorization is in public preview. Any idea when this will be generally available, or where I can see its release plan? https://docs.databricks.com/aws/en/release-notes/product/2025/march#databricks-apps-c...

Latest Reply
WiliamRosa
Contributor III
  • 1 kudos

Hi @rabbitturtles, additionally you can subscribe to the Databricks Newsletter and join the Product Roadmap Webinars, where they announce all the latest private previews: https://www.databricks.com/resources?_sft_resource_type=newsletters

4 More Replies
sparkplug
by New Contributor III
  • 2019 Views
  • 5 replies
  • 1 kudos

Resolved! Databricks service principal token federation on Kubernetes

Hi, I am trying to create a service principal federation policy against an AKS cluster, but I am struggling to make it work without any examples. It would be great if you could share examples of how this would work for a service account. Additionally, wha...

Latest Reply
sparkplug
New Contributor III
  • 1 kudos

I am currently using a two-step process: logging in using the Azure library, then getting an access token from Azure using the Databricks scope, and then using that token to authorize against Databricks. I would like to use the `env-oidc` auth type instead, but...
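For reference, the two-step flow described above looks roughly like the sketch below. It assumes the azure-identity and requests packages and a hypothetical workspace URL, and uses the well-known Azure Databricks resource ID as the token scope.

```python
import requests
from azure.identity import DefaultAzureCredential

# Well-known Azure Databricks first-party application ID used as the token scope.
DATABRICKS_SCOPE = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default"
WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical

# Step 1: log in with the Azure library and request a token for the Databricks scope.
aad_token = DefaultAzureCredential().get_token(DATABRICKS_SCOPE).token

# Step 2: use that token as a bearer token against the Databricks REST API.
resp = requests.get(
    f"{WORKSPACE_URL}/api/2.0/preview/scim/v2/Me",
    headers={"Authorization": f"Bearer {aad_token}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["userName"])
```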

4 More Replies
ez
by New Contributor II
  • 1096 Views
  • 5 replies
  • 4 kudos

Resolved! SQLSTATE: 42501 - Missing Privileges for User Groups

Dear All, I'm investigating missing privileges for some of our users. When connecting to an Oracle database via JDBC and attempting to display a DataFrame, we encounter the following error: User does not have permission SELECT on any file. SQLSTATE: 4250...
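For context, this particular message ("permission SELECT on any file") usually comes from legacy table ACLs on clusters without Unity Catalog. A commonly cited workaround, which is not necessarily the marked solution in this thread, is for an admin to grant the ANY FILE privilege to the affected group, sketched here from a notebook where spark is defined and with a hypothetical group name.

```python
# Grants the legacy table-ACL privilege that the error message refers to.
spark.sql("GRANT SELECT ON ANY FILE TO `data_engineers`")  # hypothetical group name
```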

Latest Reply
ez
New Contributor II
  • 4 kudos

@nayan_wylde thank you, that is exactly what I was looking for and could not find.

4 More Replies
vvijay61
by New Contributor II
  • 707 Views
  • 7 replies
  • 1 kudos

SAT Tool Scan other workspaces

Hello Team, I have been setting up SAT in my Databricks workspace and I am able to do it and scan my workspace. I have provided my SP access to all other workspaces as well. When I run the initialize job (SAT Initializer Notebook (one-time)), I c...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

It seems like access is denied by a network policy. You have to update the Network Policy for Serverless at the account level: in Account Console → Cloud Resources → Policies → Serverless Egress Control → default-policy, check the Allow access to all destination...

6 More Replies