Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Angus-Dawson
by New Contributor III
  • 370 Views
  • 1 reply
  • 1 kudos

Databricks Runtime 16.4 LTS has inconsistent Spark and Delta Lake versions

Per the release notes for Databricks Runtime 16.4 LTS, the environment has Apache Spark 3.5.2 and Delta Lake 3.3.1: https://docs.databricks.com/aws/en/release-notes/runtime/16.4lts However, Delta Lake 3.3.1 is built on Spark 3.5.3; the newest version o...

Latest Reply
Rjdudley
Honored Contributor
  • 1 kudos

We saw the same thing in previous runtime versions, and even a point-point version broke our code. We actually log the Spark version in one pipeline and see different versions popping up from time to time. Apparently the long-term goal is to move t...
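A minimal sketch of that version logging, assuming it runs in a notebook where Databricks provides the spark session; the cluster-tag config key is an assumption worth verifying on your runtime:

# Log what the runtime actually ships (sketch; key name is an assumption)
print("Apache Spark:", spark.version)
print("Databricks Runtime:", spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion"))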

m2chrisp
by New Contributor II
  • 227 Views
  • 0 replies
  • 0 kudos

Setting catalog isolation mode and workspace bindings within a notebook using Python SDK

Hi, I have a set of notebooks which configure new catalogs, set permissions, create default schemas, attach Azure Storage accounts as external volumes, create Git Folders and set current branches, etc. All this works just fine. One thing I'm trying to a...
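For readers with the same goal, a hedged sketch of the two calls involved using the databricks-sdk package; the catalog name and workspace ID are placeholders, and the exact enum names vary between SDK versions:

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()

# Flip the catalog from OPEN to ISOLATED so workspace bindings take effect
w.catalogs.update(name="my_catalog",  # placeholder
                  isolation_mode=catalog.CatalogIsolationMode.ISOLATED)

# Bind the catalog to a single workspace (ID is a placeholder)
w.workspace_bindings.update_bindings(
    securable_type=catalog.UpdateBindingsSecurableType.CATALOG,
    securable_name="my_catalog",
    add=[catalog.WorkspaceBinding(workspace_id=1234567890123456)],
)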

Marthinus
by New Contributor III
  • 1002 Views
  • 2 replies
  • 0 kudos

Resolved! External Locations to Azure Storage via Private Endpoint

When working with Azure Databricks (with VNet injection) to connect securely to an Azure Storage account via private endpoint, there are a few locations it needs to connect to: firstly, the VNet that Databricks is connected to, which works well when con...

Latest Reply
Marthinus
New Contributor III
  • 0 kudos

I've read that in the documentation, and when I now tried with an Access Connector for Azure Databricks instead of my own service principal, it seems to have worked, shockingly, even if I completely block network access on the storage account with ze...
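For anyone reproducing this, a hedged sketch of registering an Access Connector as a Unity Catalog storage credential with the databricks-sdk package; the resource ID is a placeholder and the request class name has shifted across SDK versions:

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()
cred = w.storage_credentials.create(
    name="adls_cred",
    azure_managed_identity=catalog.AzureManagedIdentityRequest(
        # placeholder Access Connector resource ID
        access_connector_id="/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Databricks/accessConnectors/<name>"
    ),
)
print(cred.name)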

1 More Reply
PradeepPrabha
by New Contributor II
  • 1086 Views
  • 5 replies
  • 0 kudos

Any documentation mentioning connectivity from Azure SQL Database to Azure Databricks

Is there any documentation available for connecting from the Azure SQL database to an Azure Databricks SQL workspace? We created a SQL warehouse personal access token for a user in a different team who can connect from his on-prem SQL DB to Databricks using the conn...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

Here are some considerations:
  • SQL Trigger: Define a trigger in Azure SQL that activates on specific DML operations (e.g., INSERT, UPDATE).
  • External Call: The trigger can log events to an intermediate service (like a control table or Event Grid). ...
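Back on the original question of connecting in with that personal access token, a minimal sketch using the databricks-sql-connector package; hostname, HTTP path, and token are placeholders:

from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/abc1234567890def",              # placeholder warehouse path
    access_token="dapi...",                                        # the PAT mentioned above
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT current_timestamp()")
        print(cur.fetchall())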

4 More Replies
Maxadbuser
by New Contributor III
  • 802 Views
  • 3 replies
  • 2 kudos

Resolved! Compute page Resolver error

I have an issue with my Databricks workspace. When I try to attach a notebook to my cluster, the cluster seems to have disappeared. After navigating to the Compute page, it gives me a 'Resolver Error' and no further information. None of the computes/clu...

(screenshot attached)
Latest Reply
Advika
Databricks Employee
  • 2 kudos

Hello @Maxadbuser! This was a known issue, and the engineering team has implemented a fix. Please check and confirm if you're still experiencing the problem.

2 More Replies
gillesfromparis
by New Contributor II
  • 519 Views
  • 3 replies
  • 1 kudos

Resolved! NAT Gateway IP update

Hi, my Databricks (Premium) account was deployed on AWS. It was provisioned a few months ago from the AWS Marketplace with the QuickStart method, based on CloudFormation. The NAT Gateway initially created by the CloudFormation stack has been incidenta...

Latest Reply
gillesfromparis
New Contributor II
  • 1 kudos

Thanks for your answer. Yes, I did all of that, as well as allowing the traffic coming from my NAT Instance as an inbound rule of the Security Group of the private instances (and the other way around), with no restriction on the outbound traffic. I suspe...

2 More Replies
Edyta
by New Contributor II
  • 26140 Views
  • 6 replies
  • 0 kudos

Resolved! Delete Databricks account

Hi everyone, as in the topic, I would like to delete an unnecessarily created account. I have found outdated solutions (e.g. https://community.databricks.com/t5/data-engineering/how-to-delete-databricks-account/m-p/6323#M2501), but they do not work anym...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Edyta,
FOR AWS: Manage your subscription | Databricks on AWS: "Before you delete a Databricks account, you must first cancel your Databricks subscription and delete all Unity Catalog metastores in the account. After you delete all metastores associ...
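A hedged sketch of the metastore cleanup step those docs require, using the databricks-sdk at account level; authentication is assumed to come from environment variables:

from databricks.sdk import AccountClient

a = AccountClient()  # assumes DATABRICKS_ACCOUNT_ID and credentials in the environment
for m in a.metastores.list():
    print(m.name, m.metastore_id)
    # Destructive; uncomment only after unassigning workspaces:
    # a.metastores.delete(metastore_id=m.metastore_id, force=True)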

5 More Replies
Aminsn
by New Contributor III
  • 324 Views
  • 1 reply
  • 0 kudos

Is it possible to let multiple Apps share the same compute?

Apparently, for every app deployed on Databricks, a separate VM is allocated, costing 0.5 DBU/hour. This seems inefficient: why can't a single VM support multiple apps? It feels like a waste of money and resources to allocate independent VMs per app ...

Latest Reply
Shua42
Databricks Employee
  • 0 kudos

Hi @Aminsn, your understanding is correct in that only one running app can be deployed per app instance. If you want to maximize the utilization of the compute, one option could be to create a multi-page app, where the landing page will direct users...
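A minimal sketch of that multi-page pattern with Streamlit's navigation API (available in recent Streamlit releases); the page files are hypothetical:

import streamlit as st

# One app instance, one compute, several tools behind a sidebar
nav = st.navigation([
    st.Page("reporting.py", title="Reporting"),      # hypothetical page script
    st.Page("forecasting.py", title="Forecasting"),  # hypothetical page script
])
nav.run()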

JanJaros
by New Contributor
  • 963 Views
  • 0 replies
  • 0 kudos

Databricks OAuth (OIDC) with Ory Network

Hi, we are trying to set up OIDC auth for Databricks with our Ory Network account. So far we have been using it without any issues with all of our apps, and now we wanted to set it up for Databricks as well. Unfortunately, after many attempts with different...

AnkurMittal008
by New Contributor III
  • 3855 Views
  • 2 replies
  • 0 kudos

Databricks Apps: Limitations

I have some questions regarding Databricks Apps.
1) Can we use a framework other than those mentioned in the documentation (Streamlit, Flask, Dash, Gradio, Shiny)?
2) Can we allocate more compute than 2 vCPU and 6 GB memory to any app?
3) Any other programming language o...

Latest Reply
Ismael-K
Databricks Employee
  • 0 kudos

1.) You can use most Python-based application frameworks, including some beyond those mentioned above (Reference here).
2.) Currently, app capacity is limited to 2 vCPUs and 6 GB of RAM. However, future updates may introduce options for scaling out an...
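As a concrete example of a Python-based framework beyond the listed ones, a minimal Flask sketch; the port environment variable name is an assumption based on the Apps docs, with 8080 as a fallback:

import os
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello from a Databricks App"

if __name__ == "__main__":
    # DATABRICKS_APP_PORT is an assumption; verify against the Apps docs
    app.run(host="0.0.0.0", port=int(os.environ.get("DATABRICKS_APP_PORT", "8080")))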

1 More Reply
andreapeterson
by New Contributor III
  • 1055 Views
  • 4 replies
  • 2 kudos

Resolved! OAuth API for service user

Is there a way to programmatically create an OAuth secret for a workspace service principal via API/SDK? As of now, the only way I can see doing this is through the UI.

Latest Reply
vr
Contributor III
  • 2 kudos

@andreapeterson isn't it the API you are looking for? https://docs.databricks.com/api/azure/account/serviceprincipalsecrets/create It is an account-level API, but, counterintuitively, when we create service principals in the workspace, they propagate i...
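For the SDK route, a hedged sketch against that account-level endpoint; the account ID and the numeric service principal ID are placeholders:

from databricks.sdk import AccountClient

a = AccountClient(
    host="https://accounts.azuredatabricks.net",
    account_id="00000000-0000-0000-0000-000000000000",  # placeholder
)
resp = a.service_principal_secrets.create(service_principal_id=1234567890)  # placeholder numeric ID
print(resp.secret)  # returned only once, at creation time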

3 More Replies
aswinkks
by New Contributor III
  • 501 Views
  • 1 reply
  • 0 kudos

Load assignment during distributed training

Hi, I wanted to confirm: in distributed training, is there any way that I can control what kind/amount of load/data can be sent to specific worker nodes manually? Or is it completely automatically handled by Spark's scheduler, and we don't have...

Latest Reply
Renu_
Contributor III
  • 0 kudos

From what I know, Spark automatically handles how data and workload are distributed across worker nodes during distributed training; you can't manually control exactly what or how much data goes to a specific node. You can still influence the distrib...
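A minimal sketch of that indirect control: repartitioning changes how many tasks the scheduler spreads across workers, even though you still can't pin data to a named node:

df = spark.range(1_000_000)       # toy dataset; spark is the notebook session
df = df.repartition(64)           # more, smaller tasks for the scheduler to distribute
print(df.rdd.getNumPartitions())  # 64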

BalajiM
by New Contributor
  • 527 Views
  • 1 reply
  • 0 kudos

Running driver-intensive workloads on all-purpose compute

Recently we observed that when we run driver-intensive code on an all-purpose compute, parallel runs of jobs of the same pattern/kind are failing. Example: a job triggered on all-purpose compute with 4 cores and 8 GB RAM for the driver. Let's s...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

This will help you:

# Cluster config adjustments (note: spark.driver.memory cannot be changed from a
# running session; set both of these in the cluster's Spark config and restart)
spark.conf.set("spark.driver.memory", "16g")        # double the current allocation
spark.conf.set("spark.driver.maxResultSize", "8g")  # prevent failures on large collects

Inna_M
by New Contributor III
  • 757 Views
  • 3 replies
  • 3 kudos

Resolved! How to modify workspace creator

Hi. Our 3 workspaces were created by a consultant who is no longer with us. The workspace still shows his name. How can we change it? Also, what kind of account should be the creator of a workspace: service principal, AD account, AD group? Please see b...

(screenshot attached)
Latest Reply
MoJaMa
Databricks Employee
  • 3 kudos

You can file an Azure ticket and request them to contact Databricks Support, who will contact Databricks Engineering, to change that 'workspace owner'. If you have a Databricks Account Team, they can also file an internal ticket to Databricks Enginee...

2 More Replies
vr
by Contributor III
  • 617 Views
  • 4 replies
  • 0 kudos

Terraforming Git credentials for service principals

I am terraforming service principals in my Databricks workspace and it works great until I need to assign Git credentials to my SP. In the UI we have these options to configure credentials on the service principal page (see screenshot). However, the Terraform resource I fo...

(screenshot attached)
Latest Reply
Rjdudley
Honored Contributor
  • 0 kudos

You're a little bit ahead of me in this process, so I haven't tried the solution yet, but it looks like you create a git credential resource for the service principal.  This requires a token, which I think must be generated in the console.  My refere...
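The resource under discussion is Terraform's, but the same idea sketched with the Python SDK, authenticated as the service principal; the provider string, username, and token are placeholders:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # assumes environment variables authenticate as the service principal
cred = w.git_credentials.create(
    git_provider="gitHub",            # provider name per the Git credentials API
    git_username="my-bot",            # placeholder
    personal_access_token="ghp_...",  # placeholder token generated in the Git provider
)
print(cred.credential_id)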

3 More Replies