Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

by Chiran-Gajula (New Contributor III)
  • 306 Views
  • 1 reply
  • 1 kudos

Resolved! How safe are Databricks workspaces with user files uploaded to the workspace?

With the growing adoption of diverse machine learning, AI, and data science models available in the market, it has become increasingly challenging to assess the safety of processing these models—especially when considering the potential for malicious...

Latest Reply
stbjelcevic
Databricks Employee

Hi @Chiran-Gajula, Thanks for raising this. There are a few complementary controls that can be put in place across models, inference traffic, files, and observability. Is there currently any mechanism in place within Databricks to track and verify the ...

by maikel (New Contributor III)
  • 695 Views
  • 3 replies
  • 2 kudos

Resolved! Accessing Databricks data outside Databricks

Hi! What is the best way to access Databricks data outside Databricks, e.g. from Python code? The main problem is authentication: I want to access data to which I have permissions, but I would like to generate the token outside Databricks (e.g. via R...

Latest Reply
dkushari
Databricks Employee

Hi @maikel - You can set up a service principal in Databricks with a client ID and client secret. Then set up a Databricks profile and use Python code with that profile. Look at the profile section in step 2 for how the profile can be set up with client ...
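
A minimal sketch of what that setup can look like, assuming a profile named "sp-profile" in ~/.databrickscfg holding the service principal's OAuth credentials (the profile name, host, and credential placeholders below are illustrative, not from the thread):

    # ~/.databrickscfg -- hypothetical profile for the service principal:
    # [sp-profile]
    # host          = https://<your-workspace>.cloud.databricks.com
    # client_id     = <service-principal-client-id>
    # client_secret = <service-principal-client-secret>

    # Python: authenticate with that profile via the Databricks SDK.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient(profile="sp-profile")  # picks up the OAuth M2M credentials
    print(w.current_user.me().user_name)       # sanity check: which identity am I?

From there, the same credentials can be reused by other clients (for example the databricks-sql-connector) to query data the service principal has permissions on.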

2 More Replies
by Barnita (New Contributor III)
  • 726 Views
  • 2 replies
  • 3 kudos

Pre-Commit hook in Databricks

Hi team, does anyone have any idea how to use pre-commit hooks when developing via the Databricks UI? I would specifically want to use something like isort, black, ruff, etc. I have created .pre-commit-config.yaml and pyproject.toml files in my cloned repo folder, b...

Latest Reply
nayan_wylde
Esteemed Contributor

Databricks Repos (Git folders) do not support Git hooks natively. The error you're seeing (git failed. Is it installed, and are you in a Git repository directory?) is expected because: 1. The Databricks notebook environment does not expose a full Git C...
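
Since the hooks themselves won't fire, one workaround sketch (not an official Databricks feature, and the repo path below is hypothetical) is to run the same tools directly from a notebook cell against the Git folder's working copy:

    # Run the formatters/linters manually instead of as Git hooks.
    import subprocess, sys

    repo_path = "/Workspace/Repos/me@example.com/my-repo"  # hypothetical path

    # Install the tools into the notebook's Python environment.
    subprocess.run([sys.executable, "-m", "pip", "install",
                    "black", "isort", "ruff"], check=True)

    # Apply each tool to the working copy; running via "-m" avoids PATH issues.
    for cmd in (["black", repo_path],
                ["isort", repo_path],
                ["ruff", "check", "--fix", repo_path]):
        subprocess.run([sys.executable, "-m", *cmd], check=True)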

1 More Replies
by DataCurious (New Contributor III)
  • 16713 Views
  • 24 replies
  • 19 kudos

How do you disable serverless interactive compute for all users?

I don't want users using serverless interactive compute for their jobs. How do I disable it for everyone, or for specific users?

Latest Reply
timo2022
New Contributor II

At the local university, we have arranged, for the last few years, a course that uses Spark and Databricks for hands-on coding practice. There are 300 students on the course. We have kept the cost under control by having a single common cluster. It has aut...

23 More Replies
by r_w_ (New Contributor II)
  • 4819 Views
  • 7 replies
  • 2 kudos

Resolved! Best Practices for Mapping Between Databricks and AWS Accounts

Hi everyone, this is my first post here. I'm doing my best to write in English, so I apologize if anything is unclear. I'm looking to understand the best practices for how many environments to set up when using Databricks on AWS. I'm considering the f...

Latest Reply
Isi
Honored Contributor III

Hey @r_w_, if you think my answer was correct, it would be great if you could mark it as a solution to help future users. Thanks, Isi

6 More Replies
by Rvwijk (New Contributor II)
  • 3625 Views
  • 2 replies
  • 0 kudos

Resolved! New default notebook format (IPYNB) causes unintended changes on release

Dear Databricks, We have noticed the following issue since the new default notebook format was set to IPYNB. When we release our code from (for example) DEV to TST using a release pipeline built in Azure DevOps, we see unintended changes popping ...

Latest Reply
dkushari
Databricks Employee

Hi @Rvwijk, please take a look at this. This should solve your issue. I suspect the mismatch is happening because the previous commits included output for the notebook cells. You may need to perform a rebase of your repository and allow the output to b...
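
If outputs do end up committed, one generic way to keep them out of release diffs (a workaround sketch using nbformat, not necessarily the fix linked above; the notebook path is hypothetical) is to strip outputs in the pipeline before comparing:

    # Strip outputs from an .ipynb so DEV->TST diffs only show real code changes.
    import nbformat

    nb_path = "notebooks/example.ipynb"  # hypothetical notebook path
    nb = nbformat.read(nb_path, as_version=4)
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.outputs = []            # drop rendered outputs
            cell.execution_count = None  # drop execution counters
    nbformat.write(nb, nb_path)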

1 More Replies
by hasanakhuy (New Contributor)
  • 646 Views
  • 1 reply
  • 1 kudos

Resolved! AIM with Entra ID Groups – Users and Service Principals not visible in Workspace

Hello Community, I am testing Automatic Identity Management (AIM) in Databricks with Unity Catalog enabled. Steps I did: AIM is activated; in Microsoft Entra ID I created a group g1 and added user u1 and service principal sp1 ...

Latest Reply
dkushari
Databricks Employee

When AIM is enabled, Entra users, service principals, and groups become available in Azure Databricks as soon as they’re granted permissions. Group memberships, including nested groups, flow directly from Entra ID, so permissions al...

by help_needed_445 (Contributor)
  • 473 Views
  • 1 reply
  • 1 kudos

Questions About Notebook Debugging Tools

I'm researching the different ways to debug in Databricks notebooks and have some questions. 1. Can the Python breakpoint() function be used in notebooks? This article says it can be used: https://www.databricks.com/blog/new-debugging-features-databric...

Latest Reply
jack_zaldivar
Databricks Employee

Hi @help_needed_445! Can you give a bit more information on your environment? Which cloud are you operating in where you are not able to use the native debugging tool? I have tested in an Azure workspace by adding a breakpoint in the gutter of a spe...
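
For the original question about breakpoint(): it is plain standard-library Python, so a quick self-contained check looks like the sketch below; whether the interactive pdb prompt actually appears in a notebook cell depends on the runtime, as discussed in the linked blog post.

    # breakpoint() drops into pdb when Python reaches this line.
    def divide(a, b):
        breakpoint()  # at the pdb prompt: 'p a' prints a, 'c' continues, 'q' quits
        return a / b

    print(divide(10, 2))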

by eoferreira (New Contributor)
  • 908 Views
  • 3 replies
  • 4 kudos

Lakebase security

Hi team, We are using Databricks Enterprise and noticed that our Lakebase instances are exposed to the public internet. They can be reached through the JDBC endpoint with only basic username and password authentication. Is there a way to restrict acce...

Latest Reply
Sudheer-Reddy
New Contributor II

The Postgres instance is covered by the Private Link connection you configure for your workspace.

2 More Replies
by Daniela_Boamba (New Contributor III)
  • 245 Views
  • 0 replies
  • 0 kudos

Databricks certificate expired

Hello, I have a Databricks workspace with SSO authentication; the IdP is on Azure. The client certificate expired and now I can't log on to Databricks to add the new one. What can I do? Any idea is welcome. Thank you!! Best regards, Daniela

by YugandharG (New Contributor)
  • 670 Views
  • 1 reply
  • 2 kudos

Resolved! Lakebase storage location

Hi, I'm a Solution Architect at a reputed insurance company looking for a few key technical details about the Lakebase architecture. For a fully managed serverless OLTP offering from Databricks, there is no clear documentation that talks about data st...

Latest Reply
szymon_dybczak
Esteemed Contributor III

Hi @YugandharG, 1. Lakebase data is stored in Databricks-managed cloud object storage. There's no option to use customer storage as of now. 2. File format: vanilla Postgres pages. The storage format of Postgres has nothing to do with Parquet/Delta. Wa...

by ctgchris (New Contributor III)
  • 1018 Views
  • 9 replies
  • 0 kudos

User Token Forwarding Between Apps?

I have a Streamlit Databricks App that is intended to be a frontend UI app. I also have a FastAPI Databricks App that is intended to be a middleware app. I want my Streamlit app to query the middleware app for all business logic and Databricks queries...

Latest Reply
NandiniN
Databricks Employee

This post?

8 More Replies
by Jhaprakash6608 (New Contributor)
  • 711 Views
  • 1 reply
  • 1 kudos

Resolved! Spark executor logs path

We are running Spark workloads and have enabled cluster log delivery to push executor logs to Azure Blob. While that's running fine, I'd also like to know the local path of the executor logs so that I can make use of OneAgent from Dynatrace and send...

Latest Reply
Krishna_S
Databricks Employee

Local executor log path on Azure Databricks: executor logs are written locally on each executor node under the work directory. The path pattern is /databricks/spark/work/<app-id>/<executor-id>. For example: /databricks/spark/work/app-20221121180310-00...
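
A small sketch to confirm that layout from a node (assuming shell access on the node, e.g. via an init script; the directory structure is as quoted in the reply, and nodes without running executors may have nothing there):

    # List the Spark work directories where executor stdout/stderr land locally.
    import os

    work_root = "/databricks/spark/work"
    if os.path.isdir(work_root):
        for app_id in sorted(os.listdir(work_root)):
            app_dir = os.path.join(work_root, app_id)
            print(app_id, "->", os.listdir(app_dir))  # <executor-id> subdirectories
    else:
        print("no Spark work directory on this node")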

by ctgchris (New Contributor III)
  • 504 Views
  • 1 reply
  • 1 kudos

User OBO Token Forwarding between apps

Can user OAuth tokens be forwarded between Databricks Apps for on-behalf-of (OBO) authorization? I have two Databricks Apps deployed in the same workspace: 1. **UI App** (Streamlit) - configured with OAuth user authorization; 2. **Middleware App** (FastA...

Latest Reply
Khaja_Zaffer
Contributor III

Hello @ctgchris, just bumping this issue for visibility to others. Hopefully someone from Databricks can come up with a solution.

by noklamchan (New Contributor II)
  • 3802 Views
  • 4 replies
  • 3 kudos

How to access a Unity Catalog Volume inside a Databricks App?

I am more familiar with DBFS, which seems to have been replaced by Unity Catalog Volumes now. When I create a Databricks App, it allowed me to add a resource to pick a UC volume. How do I actually access the volume inside the app? I cannot find any example; the a...

Latest Reply
nayan_wylde
Esteemed Contributor

Apps don’t mount /Volumes and don’t ship with dbutils, so os.listdir('/Volumes/...') or dbutils.fs.ls(...) won’t work inside an App. Use the Files API or Databricks SDK instead to read/write UC Volume files, then work on a local copy. Code using Pytho...
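
A minimal sketch of the SDK route from inside an App, assuming the App's service principal has READ VOLUME on the volume (the volume path and file name below are illustrative):

    # Download a UC Volume file to local disk with the Databricks SDK Files API.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()  # Apps inject credentials via environment variables
    volume_file = "/Volumes/main/default/my_volume/data.csv"  # hypothetical path

    resp = w.files.download(volume_file)
    with open("/tmp/data.csv", "wb") as local_copy:
        local_copy.write(resp.contents.read())  # work on the local copy from here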

3 More Replies