Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

vamsi_simbus
by New Contributor III
  • 322 Views
  • 1 reply
  • 2 kudos

Resolved! Looking for Databricks–Kinaxis Integration or Accelerator Information

Hi Databricks Community, I'm looking for information on the partnership between Databricks and Kinaxis. Specifically: Are there any official integrations or joint solutions available between the two platforms? Does Databricks provide any accelerators, r...

Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

Greetings @vamsi_simbus, I did some digging and have some helpful information for you. Here's a concise summary of what's publicly available today on Databricks + Kinaxis. Official partnership and integration scope: A formal strategic partnersh...

Barnita
by New Contributor III
  • 403 Views
  • 4 replies
  • 2 kudos

Resolved! How to run Black code formatting on notebooks using custom configurations in the UI

Hi all, I'm currently exploring how we can format notebook code using Black (installed via libraries) with specific configurations. I understand that we can configure Black locally using a pyproject.toml file. However, I'd like to know if there's a way...

Latest Reply
Barnita
New Contributor III
  • 2 kudos

Hi @szymon_dybczak, thanks for your response. My team has been using the same setup you mentioned. I'd like to know if there's a way to override the default configuration that Black uses in a cluster environment — for example, adjusting the line-leng...

3 More Replies
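For reference, when Black is run directly (e.g. `black .` from the repo root), it reads its settings from `pyproject.toml`; a minimal sketch is below, where the `line-length` and `target-version` values are purely illustrative. Whether the notebook UI's built-in formatter honors this file is exactly the open question in this thread.

```toml
# pyproject.toml — Black picks this section up automatically
# when invoked from the repository root
[tool.black]
line-length = 100
target-version = ["py310"]
```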
MaximeGendre
by New Contributor III
  • 542 Views
  • 4 replies
  • 4 kudos

Resolved! Disable SQL Warehouse during weekends

Hello, I massively deployed SQL Warehouses in our data platform. Right now, most of them are running every hour (with some inactivity phases) because of Power BI report/job schedules. To limit cost, I would like to stop/disable some of them on Friday e...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 4 kudos

I'd also like to provide you with some alternate options. Tagging & monitoring: use tags and cost dashboards to monitor weekend usage and identify high-cost warehouses for manual intervention. Serverless SQL Warehouses: if not already in use, consider swi...

3 More Replies
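The stop/start scheduling asked about here can be driven by a small scheduled job that checks whether the current time falls inside a weekend window; a minimal sketch follows, where the window boundaries (Friday 19:00 to Monday 07:00) and the REST endpoint mentioned in the comment are assumptions, not an official recipe.

```python
from datetime import datetime

def in_weekend_window(now: datetime) -> bool:
    """True from Friday 19:00 through Monday 07:00 (boundaries are illustrative)."""
    wd, hr = now.weekday(), now.hour  # Monday == 0
    if wd == 4:               # Friday evening
        return hr >= 19
    if wd in (5, 6):          # all of Saturday and Sunday
        return True
    if wd == 0:               # Monday before start of business
        return hr < 7
    return False

# A scheduled job could call the SQL Warehouses REST API when the window is
# active, e.g. POST /api/2.0/sql/warehouses/{warehouse_id}/stop (endpoint per
# the public REST docs; authentication and warehouse IDs are assumed here).
```

Pairing this with a job that runs hourly would approximate the Friday-to-Monday shutdown described in the question.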
Angus-Dawson
by Contributor
  • 1580 Views
  • 5 replies
  • 3 kudos

Databricks Runtime 16.4 LTS has inconsistent Spark and Delta Lake versions

Per the release notes for Databricks Runtime 16.4 LTS, the environment has Apache Spark 3.5.2 and Delta Lake 3.3.1: https://docs.databricks.com/aws/en/release-notes/runtime/16.4lts. However, Delta Lake 3.3.1 is built on Spark 3.5.3; the newest version o...

Latest Reply
saurabh18cs
Honored Contributor II
  • 3 kudos

Hi @Angus-Dawson, use Databricks Connect for local development/testing against a remote Databricks cluster; this ensures your code runs in the actual Databricks environment, on Databricks-managed runtimes, which differ from the open-source versions (DBR...

4 More Replies
chandru44
by New Contributor II
  • 4362 Views
  • 3 replies
  • 2 kudos

Resolved! Networking Challenges with Databricks Serverless Compute (Control Plane) When Connecting to On-Prem

Hi Databricks Community, I'm working through some networking challenges when connecting Databricks clusters to various data sources and wanted to get advice or best practices from others who may have faced similar issues. Current setup: I have four type...

[Attachment: Databricks Serverless Community Post.drawio (2).png]
Latest Reply
bitc
New Contributor II
  • 2 kudos

Thank you Louis for the detailed explanation and guidance!

2 More Replies
Chiran-Gajula
by New Contributor III
  • 266 Views
  • 1 reply
  • 1 kudos

Resolved! How safe are Databricks workspaces with user files uploaded to the workspace?

With the growing adoption of diverse machine learning, AI, and data science models available in the market, it has become increasingly challenging to assess the safety of processing these models—especially when considering the potential for malicious...

Latest Reply
stbjelcevic
Databricks Employee
  • 1 kudos

Hi @Chiran-Gajula, thanks for raising this. There are a few complementary controls that can be put in place across models, inference traffic, files, and observability. Is there currently any mechanism in place within Databricks to track and verify the ...

maikel
by New Contributor III
  • 630 Views
  • 3 replies
  • 2 kudos

Resolved! Accessing Databricks data outside Databricks

Hi! What is the best way to access Databricks data from outside Databricks, e.g. from Python code? The main problem is authentication, so that I can access data to which I have permissions, but I would like to generate the token outside Databricks (e.g. via R...

Latest Reply
dkushari
Databricks Employee
  • 2 kudos

Hi @maikel, you can set up a service principal in Databricks with a client ID and client secret, then set up a Databricks profile and use Python code with that profile. Look at the profile section in step 2 for how the profile can be set up with client ...

2 More Replies
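The profile-based setup described in the reply typically lives in `~/.databrickscfg`; a sketch is below, assuming the OAuth machine-to-machine fields supported by unified Databricks client authentication (field names per the SDK documentation; the profile name and all values are placeholders).

```ini
; ~/.databrickscfg — profile name and values here are placeholders
[my-sp-profile]
host          = https://<workspace-url>
client_id     = <service-principal-application-id>
client_secret = <oauth-secret>
```

Client code (Python SDK, SQL connector, or the CLI) can then select this profile by name instead of embedding credentials.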
Barnita
by New Contributor III
  • 619 Views
  • 2 replies
  • 3 kudos

Pre-Commit hook in Databricks

Hi team, does anyone have any idea how to use pre-commit hooks when developing via the Databricks UI? I would specifically want to use something like isort, black, ruff, etc. I have created .pre-commit-config.yaml and pyproject.toml files in my cloned repo folder, b...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 3 kudos

Databricks Repos (Git folders) do not support Git hooks natively. The error you're seeing (git failed. Is it installed, and are you in a Git repository directory?) is expected because: 1. The Databricks notebook environment does not expose a full Git C...

1 More Reply
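Since hooks don't fire inside the workspace UI, one common workaround is to keep the pre-commit config for local clones and run the formatters there before pushing. A sketch of such a config is below; the pinned `rev` versions are illustrative, though the repository URLs and hook ids match the projects' published pre-commit hooks.

```yaml
# .pre-commit-config.yaml — runs on local clones only; per the reply above,
# Databricks Git folders do not execute Git hooks
repos:
  - repo: https://github.com/psf/black
    rev: 24.8.0        # illustrative pin
    hooks:
      - id: black
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.6.9        # illustrative pin
    hooks:
      - id: ruff
  - repo: https://github.com/pycqa/isort
    rev: 5.13.2        # illustrative pin
    hooks:
      - id: isort
```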
DataCurious
by New Contributor III
  • 16013 Views
  • 24 replies
  • 19 kudos

How do you disable serverless interactive compute for all users

I don't want users using serverless interactive compute for their jobs. How do I disable it for everyone, or for specific users?

Latest Reply
timo2022
New Contributor II
  • 19 kudos

At our local university we have, for the last few years, run a course which uses Spark and Databricks for hands-on coding practice. There are 300 students on the course. We have controlled the price by having a single common cluster. It has aut...

23 More Replies
r_w_
by New Contributor II
  • 4455 Views
  • 7 replies
  • 2 kudos

Resolved! Best Practices for Mapping Between Databricks and AWS Accounts

Hi everyone, this is my first post here. I'm doing my best to write in English, so I apologize if anything is unclear. I'm looking to understand the best practices for how many environments to set up when using Databricks on AWS. I'm considering the f...

Latest Reply
Isi
Honored Contributor III
  • 2 kudos

Hey @r_w_, if you think my answer was correct, it would be great if you could mark it as a solution to help future users. Thanks, Isi

6 More Replies
Rvwijk
by New Contributor II
  • 3525 Views
  • 2 replies
  • 0 kudos

Resolved! New default notebook format (IPYNB) causes unintended changes on release

Dear Databricks, we have noticed the following issue since the new default notebook format was set to IPYNB. When we release our code from (for example) DEV to TST using a release pipeline built in Azure DevOps, we see unintended changes popping ...

Latest Reply
dkushari
Databricks Employee
  • 0 kudos

Hi @Rvwijk, please take a look at this; it should solve your issue. I suspect the mismatch is happening because the previous commits include output for the notebook cells. You may need to perform a rebase of your repository and allow the output to b...

1 More Reply
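One way to keep IPYNB release diffs limited to source changes is to strip cell outputs before committing. Tools like `nbstripout` do this; the sketch below is a hand-rolled equivalent operating on the notebook's JSON structure, not the workspace setting the reply points to.

```python
import json

def strip_outputs(nb: dict) -> dict:
    """Remove outputs and execution counts from an .ipynb structure so that
    only source changes appear in a release diff."""
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    return nb

# Example: a minimal notebook with one executed code cell
nb = {"cells": [{"cell_type": "code",
                 "source": ["print(1)"],
                 "outputs": [{"output_type": "stream", "text": ["1\n"]}],
                 "execution_count": 3}],
      "nbformat": 4, "nbformat_minor": 5}
# json round-trip gives a deep copy, leaving the original dict untouched
cleaned = strip_outputs(json.loads(json.dumps(nb)))
```

Running such a step in the DevOps pipeline (or as a local filter) makes DEV and TST copies of a notebook byte-comparable on source alone.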
hasanakhuy
by New Contributor
  • 561 Views
  • 1 reply
  • 1 kudos

Resolved! AIM with Entra ID Groups – Users and Service Principals not visible in Workspace

Hello Community, I am testing Automatic Identity Management (AIM) in Databricks with Unity Catalog enabled. Steps I did: AIM is activated; in Microsoft Entra ID I created a group g1 and added user u1 and service principal sp1 ...

Latest Reply
dkushari
Databricks Employee
  • 1 kudos

In Azure Databricks, when AIM is enabled, Entra users, service principals, and groups are available in Azure Databricks as soon as they’re granted permissions. Group memberships, including nested groups, flow directly from Entra ID, so permissions al...

help_needed_445
by Contributor
  • 416 Views
  • 1 reply
  • 1 kudos

Questions About Notebook Debugging Tools

I'm researching the different ways to debug in Databricks notebooks and have some questions. 1. Can the Python breakpoint() function be used in notebooks? This article says it can be used: https://www.databricks.com/blog/new-debugging-features-databric...

Latest Reply
jack_zaldivar
Databricks Employee
  • 1 kudos

Hi @help_needed_445! Can you give a bit more information on your environment? Which cloud are you operating in where you are not able to use the native debugging tool? I have tested in an Azure workspace by adding a breakpoint in the gutter of a spe...

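On question 1, it may help to know that `breakpoint()` itself is environment-agnostic: it dispatches through `sys.breakpointhook` (PEP 553), so whether it drops into pdb, an IDE debugger, or a notebook debugger depends on which hook the frontend installs. The sketch below demonstrates the dispatch with a stand-in hook; it is an illustration of the mechanism, not of any Databricks-specific behavior.

```python
import sys

# breakpoint() calls sys.breakpointhook (PEP 553); a notebook kernel can
# install its own hook, which is why behavior varies between environments.
hits = []
def recording_hook(*args, **kwargs):
    hits.append("breakpoint reached")

sys.breakpointhook = recording_hook   # swap in our stand-in hook

def halve(x):
    breakpoint()          # dispatches to recording_hook instead of pdb
    return x // 2

result = halve(10)
sys.breakpointhook = sys.__breakpointhook__  # restore the default hook
```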
eoferreira
by New Contributor
  • 808 Views
  • 3 replies
  • 4 kudos

Lakebase security

Hi team, we are using Databricks Enterprise and noticed that our Lakebase instances are exposed to the public internet. They can be reached through the JDBC endpoint with only basic username and password authentication. Is there a way to restrict acce...

Latest Reply
Sudheer-Reddy
New Contributor II
  • 4 kudos

The Postgres instance is covered by the Private Link you configure for your workspace.

2 More Replies
Daniela_Boamba
by New Contributor III
  • 216 Views
  • 0 replies
  • 0 kudos

Databricks certificate expired

Hello, I have a Databricks workspace with SSO authentication; the IdP is on Azure. The client certificate expired, and now I can't log on to Databricks to add the new one. What can I do? Any idea is welcome. Thank you!! Best regards, Daniela
