Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

mereud
by New Contributor
  • 545 Views
  • 1 reply
  • 0 kudos

Databricks bundle error

Hi everyone, I'm encountering an issue during deployment with Terraform on Databricks. The error I get is: Error: failed to read remote state file: stream error: stream ID 21; NO_ERROR; received from peer. On another attempt (after a manual mistake on my ...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Yes, I also find that the deploy sometimes fails, e.g. when a cluster config using policies has changed or when you want to use mixed node types in combination with policies. What I do is destroy the bundle or the job that gives issues. Then deploy. T...

taksheel-a-n
by New Contributor
  • 484 Views
  • 1 reply
  • 0 kudos

Can a Databricks App (React-based) be published on the Databricks Marketplace?

Hi everyone, I've been exploring Databricks Apps and building a frontend using React within the Databricks environment. I wanted to know if it's currently possible to publish a Databricks App to the Databricks Marketplace, similar to how datasets, not...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 0 kudos

Hi @taksheel-a-n According to the documentation (https://learn.microsoft.com/en-us/azure/databricks/marketplace/), here's a list of the types of assets that are supported. According to this post, there's no public-facing roadmap: https://community....

DavidRobinson
by New Contributor
  • 520 Views
  • 1 reply
  • 0 kudos

Transition from SCIM to AIM

We're in the process of transitioning our Azure Databricks instance from SCIM-based provisioning to Automated Identity Management (AIM), now that AIM is generally available. Once enabled, AIM becomes the authoritative source for managing users, group...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 0 kudos

@DavidRobinson Let me know how it goes. This is on my to-do list too, as we are facing a lot of issues with SCIM, like nested group sync and SPN syncs. One issue that I can think of is that AIM respects nested groups from Entra, which SCIM didn't. So...

Elben
by New Contributor II
  • 632 Views
  • 4 replies
  • 1 kudos

Delta Sharing Egress Pipeline for Azure

We are currently investigating the options for implementing a multi-tenancy solution where clients are separated but share data using Delta Sharing. Is there any way to track the cost of reading data in Azure? It seems like Delta Sharing Egress Pipelin...

Latest Reply
Elben
New Contributor II
  • 1 kudos

Thanks. I'm aware of the very neat features for analyzing cost in Databricks, but we are also interested in monitoring the cost of the underlying storage and network. It seems that this is indeed possible in AWS using S3, but not supported in Azure....

3 More Replies
kebubs
by New Contributor
  • 472 Views
  • 1 reply
  • 1 kudos

Resolved! Difference between AWS Marketplace and direct with Databricks

Hi all, I wanted to check the difference between purchasing directly from Databricks and through the AWS Marketplace, and the difference in deployment between the two. I understand that the AWS Marketplace route will have an auto-deployment whe...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @kebubs, Maybe you will find the thread below useful. According to a Databricks employee, the main difference is how billing is handled: "Direct Subscription: If you subscribe directly through Databricks, you will manage billing through the Databr...

Kaz1
by New Contributor
  • 2678 Views
  • 1 reply
  • 0 kudos

Can't post to Microsoft Teams workflow from Databricks notebook

When trying to post to a Microsoft Teams webhook from a Databricks notebook, using compute with DBR 12.2, I receive the following error: SSL error: HTTPSConnectionPool(host='prod-104.westeurope.logic.azure.com', port=443): Max retries exceeded with ur...

Latest Reply
Isi
Honored Contributor III
  • 0 kudos

Hello @Kaz1 It's very likely that the issue is related to where the HTTPS request originates: whether it's coming from the Databricks control plane or your data plane (your own AWS VPC). When you run a local script or call the Teams webhook from a cl...

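To narrow down where the request is being blocked, it can help to call the webhook directly from a notebook cell and see which step fails. A minimal sketch using only the standard library; the webhook URL is a placeholder, and the `{"text": ...}` body is the simple payload Teams incoming webhooks accept:

```python
import json
import urllib.request

# Placeholder: replace with your own Teams workflow/webhook URL
WEBHOOK_URL = "https://prod-00.westeurope.logic.azure.com/workflows/your-workflow-id"

def build_teams_payload(message: str) -> dict:
    """Minimal body accepted by a Teams incoming webhook / workflow trigger."""
    return {"text": message}

def post_to_teams(url: str, message: str, timeout: int = 10) -> int:
    """POST the message. An SSL or connection error raised here points at
    egress rules on the data-plane VPC rather than at the webhook itself."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_teams_payload(message)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.status
```

If the same call succeeds from a laptop but fails from the cluster, the egress path from the VPC (firewall, proxy, private link) is the place to look.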
m2chrisp
by New Contributor II
  • 1238 Views
  • 1 reply
  • 1 kudos

Resolved! Setting catalog isolation mode and workspace bindings within a notebook using Python SDK

Hi, I have a set of notebooks which configure new catalogs, set permissions, create default schemas, attach Azure Storage accounts as external volumes, create Git folders and set current branches, etc. All this works just fine. One thing I'm trying to a...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

The error occurs because the Databricks Python SDK (databricks-sdk) and the authentication method within an Azure Databricks notebook use a special “db-internal” token for user-based notebook execution, which does not have permission to perform some ...

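A common workaround for the notebook token limitation is to call the Unity Catalog REST endpoints directly with a personal access token or service-principal token. The sketch below only builds the requests with the standard library; the endpoint paths and the ISOLATED isolation mode reflect my understanding of the catalogs and workspace-bindings APIs, so verify them against the REST reference before relying on this.

```python
import json
import urllib.request

def isolation_request(host: str, catalog: str, token: str) -> urllib.request.Request:
    """Build a PATCH that sets a catalog's isolation mode to ISOLATED
    (Unity Catalog catalogs endpoint, as I understand it)."""
    req = urllib.request.Request(
        url=f"{host}/api/2.1/unity-catalog/catalogs/{catalog}",
        data=json.dumps({"isolation_mode": "ISOLATED"}).encode(),
        method="PATCH",
    )
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Content-Type", "application/json")
    return req

def binding_request(host: str, catalog: str, workspace_id: int, token: str) -> urllib.request.Request:
    """Build a PATCH that binds a workspace to the catalog
    (workspace-bindings endpoint, as I understand it)."""
    req = urllib.request.Request(
        url=f"{host}/api/2.1/unity-catalog/workspace-bindings/catalogs/{catalog}",
        data=json.dumps({"assign_workspaces": [workspace_id]}).encode(),
        method="PATCH",
    )
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Content-Type", "application/json")
    return req

# urllib.request.urlopen(isolation_request(...)) executes the call; run it with
# a PAT or service-principal token, not the notebook's internal token.
```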
IoanT
by New Contributor
  • 970 Views
  • 1 reply
  • 2 kudos

Resolved! databricks bundle validate: Recommendation: permissions section should explicitly include the curren

Starting from 10/07/2025, my validation step in databricks bundle deploy fails with the following message: 2025-07-11T07:07:18.5175554Z Recommendation: permissions section should explicitly include the current deployment identity '***' or one of...

Latest Reply
mark_ott
Databricks Employee
  • 2 kudos

The error message in your Databricks bundle deploy validation step: Recommendation: permissions section should explicitly include the current deployment identity '***' or one of its groups. If it is not included, CAN_MANAGE permissions are...

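The recommendation can usually be satisfied by listing the deployment identity (or one of its groups) in the top-level permissions section of databricks.yml. A minimal sketch with hypothetical identity values:

```yaml
# databricks.yml (top level); the SPN application ID and group name are placeholders
permissions:
  - service_principal_name: "12345678-aaaa-bbbb-cccc-123456789012"  # deployment identity
    level: CAN_MANAGE
  - group_name: data-platform-admins
    level: CAN_MANAGE
```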
Luke_Kociuba
by New Contributor
  • 787 Views
  • 3 replies
  • 5 kudos

Resolved! dbt + Databricks

Hi! I will use dbt + Databricks in my new project. I'm running dbt training and have a Databricks free/trial account so far. I want to connect dbt with Databricks and create a data warehouse from the data sets below: jaffle_shop_customers.csv, jaffle_shop_orders.csv...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 5 kudos

Great solution @szymon_dybczak. I don't know too much about dbt yet. Is there much difference between connecting/interacting with dbt Cloud vs dbt Core? All the best, BS

2 More Replies
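For dbt Core, the connection to Databricks lives in the dbt-databricks adapter's profiles.yml; a minimal sketch with placeholder host, warehouse ID, and token (a SQL warehouse's "Connection details" tab supplies the real host and http_path):

```yaml
# ~/.dbt/profiles.yml (all values are placeholders)
jaffle_shop:
  target: dev
  outputs:
    dev:
      type: databricks
      catalog: main
      schema: jaffle_shop
      host: adb-1234567890123456.7.azuredatabricks.net
      http_path: /sql/1.0/warehouses/abcdef1234567890
      token: "{{ env_var('DATABRICKS_TOKEN') }}"
```

dbt Cloud captures the same connection fields through its web UI instead of profiles.yml, which in practice is the main difference in how the two connect.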
HitMah
by New Contributor II
  • 703 Views
  • 3 replies
  • 2 kudos

Transitioning Approach for Evolving EDW

As EDW will continue to evolve with new data and business logic during the multi-phased migration, what architectural strategies and design patterns can minimize rework when migrating from an evolving Enterprise Data Warehouse (EDW) to Databricks? I ...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

There is no single approach; it depends on your organization. First you have the 'impact axis': lean-and-mean vs big-bang. Next you also have bottom-up (first fix bronze and work upwards) or top-down (focus on gold and read data from your legac...

2 More Replies
danielrodri
by New Contributor III
  • 2023 Views
  • 6 replies
  • 1 kudos

Resolved! Payment receipts of Databricks payments

Hello experts, I am trying to get receipts for the monthly payments made to Databricks. I need them for the financial department of the organization I am working for. The only billing information I have access to is the usage dashboards and the tables ...

Latest Reply
DatabricksEddy
New Contributor II
  • 1 kudos

Hello everyone! I'd like to know how I can resolve a payment issue with Databricks. My credit card is registered to be charged automatically, but it hasn't been charged, and I received a message from Databricks asking me to make the payment. Can any...

5 More Replies
Awoke101
by New Contributor III
  • 4008 Views
  • 1 reply
  • 0 kudos

Resolved! Ray cannot detect GPU on the cluster

I am trying to run Ray on Databricks for chunking and embedding tasks. The cluster I'm using is: g4dn.xlarge, 1-4 workers with 4-16 cores, 1 GPU and 16 GB memory. I have currently set spark.task.resource.gpu.amount to 0.5. This is how I have set up my Ray clus...

Latest Reply
Krishna_S
Databricks Employee
  • 0 kudos

I have replicated all your steps and created the Ray cluster exactly as you have done. Also, I have set: spark.conf.set("spark.task.resource.gpu.amount", "0.5") and I see a warning showing that I don't allocate any GPU for Spark (as 1), even tho...

noorbasha534
by Valued Contributor II
  • 1222 Views
  • 4 replies
  • 2 kudos

Oauth Token federation

Dear all, Has anyone tried OAuth token federation for authentication with the Databricks REST APIs? I would appreciate a reusable code snippet to achieve the same.

Latest Reply
nayan_wylde
Esteemed Contributor
  • 2 kudos

@noorbasha534 Here is sample Python code I use for getting an OAuth token from Azure Active Directory and then passing the token to the Databricks API. The prerequisite is that the SPN needs to be an admin in the workspace. import requests # Azure AD credentials tena...

3 More Replies
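The truncated snippet above follows the standard client-credentials flow. Here is a self-contained sketch along the same lines, using the standard library instead of requests so it runs anywhere; tenant/client values are placeholders passed in by the caller, while 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the well-known AzureDatabricks application ID used as the token scope:

```python
import json
import urllib.parse
import urllib.request

# Well-known application ID of the AzureDatabricks first-party app,
# used as the token scope in the client-credentials flow.
DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

def token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the Entra ID client-credentials token request (URL + form body)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    data = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{DATABRICKS_RESOURCE_ID}/.default",
    }
    return url, data

def fetch_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """Execute the token request and return the bearer token."""
    url, data = token_request(tenant_id, client_id, client_secret)
    with urllib.request.urlopen(url, data=urllib.parse.urlencode(data).encode()) as resp:
        return json.loads(resp.read())["access_token"]

def list_clusters(workspace_url: str, token: str) -> dict:
    """Call a Databricks REST API with the bearer token (clusters/list shown)."""
    req = urllib.request.Request(f"{workspace_url}/api/2.0/clusters/list")
    req.add_header("Authorization", f"Bearer {token}")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Note this is plain Entra ID token passthrough, as in the reply; OAuth token federation proper (exchanging a third-party IdP token via a Databricks federation policy) is a separate setup.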
ekmazars
by New Contributor II
  • 458 Views
  • 2 replies
  • 1 kudos

Resolved! SQLSTATE HY000 after upgrading from Databricks 15.4 to 16.4

After upgrading from Databricks 15.4 to 16.4, without changing our Python code, we suddenly get SQL timeouts; see below. Is there some new timeout default that we don't know about and need to increase with the new version? After a quick search I...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

After upgrading to Databricks 16.4, there is a notable change in SQL timeout behavior. The default timeout for SQL statements and objects like materialized views and streaming tables is now set to two days (172,800 seconds). This system-wide default ...

1 More Replies
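If the new two-day default (or a stricter warehouse setting) needs to be overridden, Databricks SQL exposes a STATEMENT_TIMEOUT configuration parameter. A sketch, assuming the session-level SET syntax applies to your warehouse; the parameter can also be set workspace-wide in the SQL warehouse admin settings:

```sql
-- Override the statement timeout for the current session (value in seconds)
SET STATEMENT_TIMEOUT = 3600;

-- Inspect the effective value
SET STATEMENT_TIMEOUT;
```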
noorbasha534
by Valued Contributor II
  • 561 Views
  • 2 replies
  • 0 kudos

View Refresh Frequency

Dear all, we have around 5000+ finished data products (aka views) in several schemas of Unity Catalog. One question that comes from business users frequently is: how frequently do these get refreshed? The answer is not simple, as the underlying t...

Latest Reply
saurabh18cs
Honored Contributor II
  • 0 kudos

Hi @noorbasha534 just a pseudocode:
for view in all_views:
    lineage = get_lineage(view)  # Use Unity Catalog API
    base_tables = extract_base_tables(lineage)
    refresh_times = []
    for table in base_tables:
        job = find_job_refreshing_table(table)  # Custom logic/met...

1 More Replies
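The get_lineage step in the pseudocode above can be fleshed out against the lineage-tracking REST API. The sketch below only builds the request and parses a response; the endpoint path and the upstreams/tableInfo response shape are my understanding of that API, so check the lineage REST reference before depending on them:

```python
import urllib.parse
import urllib.request

def lineage_request(host: str, table_name: str, token: str) -> urllib.request.Request:
    """Build a GET against the table-lineage endpoint
    (lineage-tracking REST API path, as I understand it)."""
    query = urllib.parse.urlencode(
        {"table_name": table_name, "include_entity_lineage": "true"}
    )
    req = urllib.request.Request(
        f"{host}/api/2.0/lineage-tracking/table-lineage?{query}"
    )
    req.add_header("Authorization", f"Bearer {token}")
    return req

def upstream_tables(lineage: dict) -> list:
    """Extract fully qualified upstream table names from a lineage response;
    the 'upstreams'/'tableInfo' keys are an assumption about the payload shape."""
    names = []
    for up in lineage.get("upstreams", []):
        info = up.get("tableInfo")
        if info:
            names.append(
                f"{info.get('catalog_name')}.{info.get('schema_name')}.{info.get('name')}"
            )
    return names
```

From there, mapping each upstream table to the job that refreshes it (the find_job_refreshing_table step) stays custom logic, e.g. a lookup over your jobs' task definitions.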