- 258 Views
- 4 replies
- 1 kudos
Delta Sharing Egress Pipeline for Azure
We are currently investigating options for implementing a multi-tenancy solution where clients are separated but share data using Delta Sharing. Is there any way to track the cost of reading data in Azure? It seems like Delta Sharing Egress Pipelin...
Thanks. I'm aware of the very neat features for analyzing cost in Databricks, but we are also interested in monitoring the cost of the underlying storage and network. It seems that this is indeed possible in AWS using S3, but not supported in Azure....
- 3992 Views
- 2 replies
- 2 kudos
Unity Catalog Volume mounting broken by cluster environment variables (http proxy)
Hello all, I have a slightly niche issue here, albeit one that others are likely to run into. Using Databricks on Azure, my organisation has extended our WAN into the cloud, so that all compute clusters are granted a private IP address that ca...
Unfortunately, the only solution I found was to not use the proxy globally. Good luck!
- 129 Views
- 1 reply
- 1 kudos
Difference between AWS Marketplace and direct with Databricks
Hi all, I wanted to check the difference between a direct purchase from Databricks and a purchase through AWS Marketplace, and the difference in deployment between the two. I understand that AWS Marketplace will have an auto-deployment whe...
Hi @kebubs, maybe you will find the thread below useful. According to a Databricks employee, the main difference will be how billing is handled: "Direct Subscription: If you subscribe directly through Databricks, you will manage billing through the Databr...
- 2359 Views
- 1 reply
- 0 kudos
Can't post to Microsoft Teams workflow from Databricks notebook
When trying to post to a Microsoft Teams webhook from a Databricks notebook, using compute with DBR 12.2, I receive the following error: SSL error: HTTPSConnectionPool(host='prod-104.westeurope.logic.azure.com', port=443): Max retries exceeded with ur...
Hello @Kaz1, it's very likely that the issue is related to where the HTTPS request originates: the Databricks control plane or your data plane (your own AWS VPC). When you run a local script or call the Teams webhook from a cl...
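To help isolate the failure, here is a minimal, hypothetical sketch of posting to a Teams webhook with `requests` (the URL and payload helper are placeholders, not the poster's actual code). If this raises an SSL error from a Databricks cluster but works from a local machine, the network path (proxy, egress firewall, TLS inspection) is the likely cause rather than the payload:

```python
import json

import requests

def build_payload(text: str) -> dict:
    # Minimal JSON body accepted by Teams incoming webhooks / workflow triggers.
    return {"text": text}

def post_to_teams(webhook_url: str, text: str, timeout: int = 10) -> int:
    # POST the payload; SSL/connection errors propagate as exceptions so the
    # caller can distinguish network-path problems from HTTP-level failures.
    resp = requests.post(
        webhook_url,
        data=json.dumps(build_payload(text)),
        headers={"Content-Type": "application/json"},
        timeout=timeout,
    )
    resp.raise_for_status()
    return resp.status_code
```

Usage would be `post_to_teams("https://prod-104.westeurope.logic.azure.com/workflows/...", "Job finished")`, replacing the URL with your own flow endpoint.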
- 656 Views
- 1 reply
- 1 kudos
Resolved! Setting catalog isolation mode and workspace bindings within a notebook using Python SDK
Hi, I have a set of notebooks which configure new catalogs, set permissions, create default schemas, attach Azure Storage accounts as external volumes, create Git folders and set current branches, etc. All this works just fine. One thing I'm trying to a...
The error occurs because the Databricks Python SDK (databricks-sdk) and the authentication method within an Azure Databricks notebook use a special “db-internal” token for user-based notebook execution, which does not have permission to perform some ...
- 542 Views
- 1 reply
- 2 kudos
Resolved! databricks bundle validate: Recommendation: permissions section should explicitly include the curren
Starting from 10/07/2025, my bundle validation step from databricks bundle deploy fails with the following message: 2025-07-11T07:07:18.5175554Z Recommendation: permissions section should explicitly include the current deployment identity '***' or one of...
The message in your databricks bundle deploy validation step reads: "Recommendation: permissions section should explicitly include the current deployment identity '***' or one of its groups. If it is not included, CAN_MANAGE permissions are...
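The reply is truncated, but the fix it points at is to list the deploying identity explicitly in the bundle's top-level permissions section. A minimal, hypothetical databricks.yml fragment (the service principal application ID and group name are placeholders to replace with your own deployment identity):

```yaml
# databricks.yml -- illustrative fragment, not the poster's actual bundle.
permissions:
  - service_principal_name: "00000000-0000-0000-0000-000000000000"
    level: CAN_MANAGE
  - group_name: data-platform-admins
    level: CAN_MANAGE
```

With the identity that runs `databricks bundle deploy` covered by one of these entries, the validation recommendation should no longer fire.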
- 303 Views
- 3 replies
- 5 kudos
dbt + Databricks
Hi! I will use dbt + Databricks in my new project. I'm running dbt training and have a Databricks free/trial account so far. I want to connect dbt with Databricks and create a data warehouse from the data sets below: jaffle_shop_customers.csv jaffle_shop_orders.csv...
Great solution @szymon_dybczak. I don't know too much about dbt yet. Is there much difference between connecting/interacting with dbt Cloud vs dbt Core? All the best, BS
- 301 Views
- 3 replies
- 2 kudos
Transitioning Approach for Evolving EDW
As the EDW will continue to evolve with new data and business logic during the multi-phased migration, what architectural strategies and design patterns can minimize rework when migrating from an evolving Enterprise Data Warehouse (EDW) to Databricks? I ...
There is no single approach; it depends on your organization. First you have the 'impact axis': lean-and-mean vs big bang. Next you also have bottom-up (first fix bronze and work upwards) or top-down (focus on gold and read data from your legac...
- 1576 Views
- 6 replies
- 1 kudos
Resolved! Payment receipts of Databricks payments
Hello experts, I am trying to get receipts for the monthly payments made to Databricks. I need them for the financial department of the organization I am working for. The only billing information I have access to is the usage dashboards and the tables ...
Hello everyone! I'd like to know how I can resolve a payment issue with Databricks. My credit card is registered to be charged automatically, but it hasn't been charged, and I received a message from Databricks asking me to make the payment, etc. Can any...
- 1113 Views
- 4 replies
- 3 kudos
Databricks Runtime 16.4 LTS has inconsistent Spark and Delta Lake versions
Per the release notes for Databricks Runtime 16.4 LTS, the environment has Apache Spark 3.5.2 and Delta Lake 3.3.1: https://docs.databricks.com/aws/en/release-notes/runtime/16.4lts However, Delta Lake 3.3.1 is built on Spark 3.5.3; the newest version o...
Hi @Angus-Dawson, use Databricks Connect for local development/testing against a remote Databricks cluster; this ensures your code runs in the actual Databricks environment with Databricks-managed runtimes, which are different from the open-source versions (DBR...
- 222 Views
- 1 reply
- 1 kudos
Resolved! Problem with Metastore
Hello community. We are facing an issue when deploying and configuring the metastore using Terraform. We are using an Azure DevOps pipeline for deployment. The identity running the pipeline is a managed identity, and it's set as account admin in the Account porta...
Greetings @jzu, I did some digging around in internal docs and references and put together some helpful tips and things to consider. This is a common authorization issue related to permission propagation delays and ownership configuration when m...
- 3575 Views
- 1 reply
- 0 kudos
Resolved! Ray cannot detect GPU on the cluster
I am trying to run Ray on Databricks for chunking and embedding tasks. The cluster I'm using is g4dn.xlarge, 1-4 workers with 4-16 cores, 1 GPU and 16 GB memory. I have set spark.task.resource.gpu.amount to 0.5 currently. This is how I have set up my Ray clus...
I have replicated all your steps and created the Ray cluster exactly as you have done. Also, I have set: spark.conf.set("spark.task.resource.gpu.amount", "0.5") And I see a warning showing that I don't allocate any GPU for Spark (as 1), even tho...
- 957 Views
- 4 replies
- 2 kudos
OAuth token federation
Dear all, has anyone tried OAuth token federation for authentication with the Databricks REST APIs? I'd appreciate a reusable code snippet to achieve the same.
@noorbasha534 Here is sample Python code I use for getting an OAuth token from Azure Active Directory and then passing the token to the Databricks API. A prerequisite is that the SPN needs to be an admin in the workspace. import requests # Azure AD credentials tena...
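The snippet above is cut off, so here is a hedged reconstruction of the same flow (not the poster's exact code): a client-credentials token request against the Azure AD v2.0 endpoint using the well-known AzureDatabricks resource scope, then a plain REST call with the bearer token. Tenant ID, client ID, secret, and workspace URL are placeholders:

```python
import requests

# Well-known application ID of the AzureDatabricks resource; the /.default
# suffix requests all statically configured permissions for that resource.
DATABRICKS_SCOPE = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default"

def token_request_body(client_id: str, client_secret: str) -> dict:
    # Form body for the client-credentials grant on the v2.0 token endpoint.
    return {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": DATABRICKS_SCOPE,
    }

def get_databricks_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    resp = requests.post(url, data=token_request_body(client_id, client_secret), timeout=30)
    resp.raise_for_status()
    return resp.json()["access_token"]

def list_clusters(workspace_url: str, token: str) -> dict:
    # Any Databricks REST call works the same way once the bearer token is set.
    resp = requests.get(
        f"{workspace_url}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```

As the reply notes, the service principal must have access to the workspace (admin for the endpoints shown here) or the API calls will return 403s even with a valid token.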
- 280 Views
- 2 replies
- 1 kudos
Resolved! SQLSTATE HY000 after upgrading from Databricks 15.4 to 16.4
After upgrading from Databricks 15.4 to 16.4, without changing our Python code, we suddenly get SQL timeouts; see below. Is there some new timeout default that we don't know about and need to increase with the new version? After a quick search I...
After upgrading to Databricks 16.4, there is a notable change in SQL timeout behavior. The default timeout for SQL statements and objects like materialized views and streaming tables is now set to two days (172,800 seconds). This system-wide default ...
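If the two-day default described above is the culprit, Databricks SQL exposes a STATEMENT_TIMEOUT configuration parameter that can be inspected and adjusted per session (and at the warehouse level by admins). Treat this as a sketch to verify against your runtime's documentation:

```sql
-- Show the currently effective timeout (in seconds).
SET STATEMENT_TIMEOUT;

-- Raise it for the current session, e.g. back to two days.
SET STATEMENT_TIMEOUT = 172800;
```

Session-level settings only apply to statements issued in that session; long-running jobs would need the warehouse- or workspace-level setting changed instead.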
- 369 Views
- 2 replies
- 0 kudos
View Refresh Frequency
Dear all, we have around 5,000+ finished data products (aka views) in several schemas of Unity Catalog. One question that comes from business users frequently is: how frequently do these get refreshed? The answer is not simple, as the underlying t...
Hi @noorbasha534, just pseudocode:

for view in all_views:
    lineage = get_lineage(view)  # Use Unity Catalog API
    base_tables = extract_base_tables(lineage)
    refresh_times = []
    for table in base_tables:
        job = find_job_refreshing_table(table)  # Custom logic/met...
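To make the pseudocode concrete, a hedged Python sketch: the freshness rule (a view is only as fresh as its stalest base table) is pure logic, while `get_upstream_tables` is a hypothetical wrapper over the Unity Catalog table-lineage REST endpoint, whose exact path and response shape you should verify for your workspace:

```python
from datetime import datetime
from typing import Optional

import requests

def view_refresh_time(base_table_refreshes: dict) -> Optional[datetime]:
    # A view is only as fresh as its stalest upstream table, so report the
    # minimum (oldest) refresh timestamp across all base tables.
    if not base_table_refreshes:
        return None
    return min(base_table_refreshes.values())

def get_upstream_tables(workspace_url: str, token: str, table_name: str) -> list:
    # Hypothetical wrapper over the Unity Catalog table-lineage endpoint;
    # confirm the path and response keys against your workspace's API docs.
    resp = requests.get(
        f"{workspace_url}/api/2.0/lineage-tracking/table-lineage",
        headers={"Authorization": f"Bearer {token}"},
        params={"table_name": table_name, "include_entity_lineage": "true"},
        timeout=30,
    )
    resp.raise_for_status()
    return [u.get("tableInfo", {}) for u in resp.json().get("upstreams", [])]
```

With ~5,000 views, you would cache lineage lookups and batch the job-metadata queries rather than calling the API per view on every request.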