- 42 Views
- 4 replies
- 1 kudos
Delta Sharing Egress Pipeline for Azure
We are currently investigating options for implementing a multi-tenancy solution where clients are separated but share data using Delta Sharing. Is there any way to track the cost of reading data in Azure? It seems like Delta Sharing Egress Pipelin...
- 1 kudos
Thanks. I'm aware of the very neat features for analyzing cost in Databricks, but we are also interested in monitoring the cost of the underlying storage and network. It seems that this is indeed possible in AWS using S3, but not supported in Azure....
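For reference, the in-platform cost analysis mentioned above can be done from a notebook against the Unity Catalog billing system table. A minimal sketch, assuming system tables are enabled on the account and run in a Databricks notebook where `spark` is predefined; note it covers DBU consumption only, not the Azure storage or network egress cost this thread is asking about:

```python
# Sketch: summarize DBU usage per day and SKU from the billing system table.
# Assumes system.billing.usage is enabled and accessible to the current user.
usage = spark.sql("""
    SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 30)
    GROUP BY usage_date, sku_name
    ORDER BY usage_date, sku_name
""")
usage.show(truncate=False)
```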
- 3867 Views
- 2 replies
- 2 kudos
Unity Catalog Volume mounting broken by cluster environment variables (http proxy)
Hello all, I have a slightly niche issue here, albeit one that others are likely to run into. Using Databricks on Azure, my organisation has extended our WAN into the cloud, so that all compute clusters are granted a private IP address that ca...
- 2 kudos
Unfortunately, the only solution I found was to not use the proxy globally. Good luck!
- 20 Views
- 1 replies
- 0 kudos
I need a switch to turn off Data Apps in Databricks workspaces
Hi, how do I disable Data Apps in my workspace? It is really annoying that Databricks pushes new features without any option to disable them. At least you should provide some tools to control access before rolling it out. It seems you only care about fe...
- 0 kudos
And this website has really become sluggish, or is it just me?
- 23 Views
- 1 replies
- 0 kudos
Difference between AWS Marketplace and direct with Databricks
Hi all, I wanted to check the difference between purchasing directly from Databricks and through AWS Marketplace, and the difference in deployment between the two. I understand that AWS Marketplace will have an auto-deployment whe...
- 0 kudos
Hi @kebubs, maybe you will find the thread below useful. According to a Databricks employee, the main difference will be how billing is handled: "Direct Subscription: If you subscribe directly through Databricks, you will manage billing through the Databr...
- 2254 Views
- 1 replies
- 0 kudos
Can't post to Microsoft Teams workflow from Databricks notebook
When trying to post to a Microsoft Teams webhook from a Databricks notebook, using compute with DBR 12.2, I receive the following error: SSL error: HTTPSConnectionPool(host='prod-104.westeurope.logic.azure.com', port=443): Max retries exceeded with ur...
- 0 kudos
Hello @Kaz1, it's very likely that the issue is related to where the HTTPS request originates, whether it is coming from the Databricks control plane or from your data plane (your own AWS VPC). When you run a local script or call the Teams webhook from a cl...
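A quick way to narrow this down is to test reachability of the Logic Apps endpoint from the cluster itself. A minimal sketch to run in a notebook cell; the host is taken from the error message above:

```python
import socket
import requests

host = "prod-104.westeurope.logic.azure.com"  # host from the SSL error above

# If DNS resolution or the TLS handshake fails here, the problem is network
# egress or SSL inspection on the data plane, not the notebook code itself.
# The HTTP status code does not matter; only whether the connection succeeds.
print(socket.gethostbyname(host))
resp = requests.get(f"https://{host}", timeout=10)
print(resp.status_code)
```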
- 554 Views
- 1 replies
- 1 kudos
Setting catalog isolation mode and workspace bindings within a notebook using Python SDK
Hi, I have a set of notebooks which configure new catalogs, set permissions, create default schemas, attach Azure Storage accounts as external volumes, create Git Folders and set current branches, etc. All this works just fine. One thing I'm trying to a...
- 1 kudos
The error occurs because the Databricks Python SDK (databricks-sdk) and the authentication method within an Azure Databricks notebook use a special “db-internal” token for user-based notebook execution, which does not have permission to perform some ...
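One workaround consistent with the explanation above is to construct the SDK client with explicit service principal credentials instead of the notebook's default authentication. A minimal sketch only; the host, credentials, and workspace ID are placeholders, and the exact enum and parameter names (e.g. CatalogIsolationMode) can vary between databricks-sdk versions, so treat them as assumptions:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import CatalogIsolationMode  # name may differ by SDK version

# Authenticate as a service principal (OAuth M2M) rather than relying on the
# notebook's internal user token, which lacks the required permissions.
w = WorkspaceClient(
    host="https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder
    client_id="<service-principal-application-id>",
    client_secret="<service-principal-secret>",
)

# Set the catalog to isolated so it is only visible to explicitly bound workspaces.
w.catalogs.update(name="my_catalog", isolation_mode=CatalogIsolationMode.ISOLATED)

# Bind the catalog to a specific workspace (the workspace ID is a placeholder).
w.workspace_bindings.update(
    name="my_catalog",
    assign_workspaces=[1234567890123456],
)
```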
- 459 Views
- 1 replies
- 2 kudos
databricks bundle validate: Recommendation: permissions section should explicitly include the curren
Starting from 10/07/2025, the bundle validation step of my databricks bundle deploy fails with the following message: 2025-07-11T07:07:18.5175554Z Recommendation: permissions section should explicitly include the current deployment identity '***' or one of...
- 2 kudos
The error message in your Databricks bundle deploy validation step reads: "Recommendation: permissions section should explicitly include the current deployment identity '***' or one of its groups. If it is not included, CAN_MANAGE permissions are...
- 126 Views
- 3 replies
- 5 kudos
dbt + Databricks
Hi! I will use dbt + Databricks in my new project. I'm running dbt training and have a Databricks free/trial account so far. I want to connect and link dbt with Databricks and create a data warehouse from the data sets below: jaffle_shop_customers.csv, jaffle_shop_orders.csv...
- 5 kudos
Great solution @szymon_dybczak. I don't know too much about dbt yet. Is there much difference between connecting/interacting with dbt Cloud vs dbt Core? All the best, BS
- 121 Views
- 3 replies
- 2 kudos
Transitioning Approach for Evolving EDW
As the EDW will continue to evolve with new data and business logic during the multi-phased migration, what architectural strategies and design patterns can minimize rework when migrating from an evolving Enterprise Data Warehouse (EDW) to Databricks? I ...
- 2 kudos
There is no single approach; it depends on your organization. First you have the 'impact axis', which is lean-and-mean vs big bang. Next you also have bottom-up (first fix bronze and work upwards) or top-down (focus on gold and read data from your legac...
- 1425 Views
- 6 replies
- 1 kudos
Resolved! Payment receipts of Databricks payments
Hello experts, I am trying to get receipts for the monthly payments made to Databricks. I need them for the financial department of the organization I am working for. The only billing information I get access to is the usage dashboards and the tables ...
- 1 kudos
Hello everyone! I'd like to know how I can resolve a payment issue with Databricks. My credit card is registered to be charged automatically, but it hasn't been charged, and I received a message from Databricks asking me to make the payment, etc. Can any...
- 958 Views
- 4 replies
- 3 kudos
Databricks Runtime 16.4 LTS has inconsistent Spark and Delta Lake versions
Per the release notes for Databricks Runtime 16.4 LTS, the environment has Apache Spark 3.5.2 and Delta Lake 3.3.1: https://docs.databricks.com/aws/en/release-notes/runtime/16.4lts However, Delta Lake 3.3.1 is built on Spark 3.5.3; the newest version o...
- 3 kudos
Hi @Angus-Dawson, use Databricks Connect for local development and testing against a remote Databricks cluster; this ensures your code runs in the actual Databricks environment with Databricks-managed DBRs, which differ from the open-source versions (DBR...
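For completeness, a minimal Databricks Connect sketch along the lines suggested above, assuming the newer databricks-connect package (for DBR 13+) is installed locally and authentication is configured via a profile or DATABRICKS_* environment variables:

```python
from databricks.connect import DatabricksSession

# Builds a Spark session that executes remotely on the configured cluster,
# so version-dependent behavior matches the actual Databricks runtime.
spark = DatabricksSession.builder.getOrCreate()

# Reports the Spark version the cluster is actually running.
spark.sql("SELECT version()").show(truncate=False)
```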
- 100 Views
- 1 replies
- 1 kudos
Problem with Metastore
Hello community. We are facing an issue when deploying and configuring a metastore using Terraform. We are using an Azure DevOps pipeline for deployment. The identity running the pipeline is a managed identity, and it's set as account admin in the Account porta...
- 1 kudos
Greetings @jzu, I did some digging around with internal docs and references and put together some helpful tips and things to consider. This is a common authorization issue related to permission propagation delays and ownership configuration when m...
- 3489 Views
- 1 replies
- 0 kudos
Ray cannot detect GPU on the cluster
I am trying to run Ray on Databricks for chunking and embedding tasks. The cluster I'm using is: g4dn.xlarge, 1-4 workers with 4-16 cores, 1 GPU and 16GB memory. I have set spark.task.resource.gpu.amount to 0.5 currently. This is how I have set up my Ray clus...
- 0 kudos
I have replicated all your steps and created the Ray cluster exactly as you have done. Also, I have set: spark.conf.set("spark.task.resource.gpu.amount", "0.5") And I see a warning showing that I don't allocate any GPU for Spark (as 1), even tho...
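If it helps with debugging, here is a small check to run once the Ray-on-Spark cluster from the original post is up, to see whether any GPUs were actually handed to Ray. A sketch only, and it assumes PyTorch is installed on the cluster:

```python
import ray

# Attach to the Ray cluster that was already started on the Databricks cluster.
ray.init(ignore_reinit_error=True)

# If GPU passthrough worked, a 'GPU' entry with a non-zero value shows up here.
print(ray.cluster_resources())
print(ray.available_resources())

@ray.remote(num_gpus=0.5)  # request a fractional GPU for the task
def gpu_probe():
    import torch  # assumes torch is installed on the workers
    return torch.cuda.is_available(), torch.cuda.device_count()

# Note: if the cluster has no GPU resources at all, this task stays pending.
print(ray.get(gpu_probe.remote()))
```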
- 880 Views
- 4 replies
- 2 kudos
OAuth token federation
Dear all, has anyone tried OAuth token federation for authentication with the Databricks REST APIs? I would appreciate it if there is a reusable code snippet to achieve the same.
- 2 kudos
@noorbasha534 Here is sample Python code I use to get an OAuth token from Azure Active Directory and then pass the token to the Databricks API. The prerequisite is that the SPN needs to be an admin in the workspace. import requests # Azure AD credentials tena...
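Since the snippet above is cut off in the preview, here is a self-contained sketch of the same approach, not the original author's exact code. The tenant, client, and workspace values are placeholders; 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the well-known Azure Databricks resource application ID:

```python
import requests

tenant_id = "<tenant-id>"
client_id = "<application-client-id>"        # the SPN registered in the workspace
client_secret = "<client-secret>"
workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder

# Request an Azure AD access token scoped to the Azure Databricks resource.
token_resp = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
    },
    timeout=30,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Use the token as a bearer token against a Databricks REST API.
clusters = requests.get(
    f"{workspace_url}/api/2.1/clusters/list",
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=30,
)
clusters.raise_for_status()
print(clusters.json())
```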
- 166 Views
- 2 replies
- 1 kudos
Resolved! SQLSTATE HY000 after upgrading from Databricks 15.4 to 16.4
After upgrading from Databricks 15.4 to 16.4, without changing our Python code, we suddenly get SQL timeouts; see below. Is there some new timeout default that we don't know about and need to increase with the new version? After a quick search I...
- 1 kudos
After upgrading to Databricks 16.4, there is a notable change in SQL timeout behavior. The default timeout for SQL statements and objects like materialized views and streaming tables is now set to two days (172,800 seconds). This system-wide default ...
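If you need a longer limit, the timeout can in principle be raised per session. A sketch only, run in a notebook where `spark` is predefined; the configuration key below is an assumption based on the behavior described above, so verify it against the DBR 16.4 release notes before relying on it:

```python
# Assumed configuration key; confirm against the 16.4 documentation for your workspace.
spark.conf.set("spark.databricks.execution.timeout", "604800")  # seconds (7 days, as an example)
print(spark.conf.get("spark.databricks.execution.timeout"))
```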