- 180 Views
- 5 replies
- 0 kudos
Proxy configuration while bootstrapping
I am trying to start a cluster in Azure Databricks; our policy is to use a proxy for outbound traffic. I have configured http_proxy, https_proxy, HTTP_PROXY, HTTPS_PROXY, no_proxy, and NO_PROXY in environment variables and globally. Made sure the proxy is bypassin...
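For context, proxy variables like these are often applied through a cluster-scoped init script so every process on the node inherits them. A minimal sketch, assuming a placeholder proxy at proxy.example.com:8080 (note an init script only covers processes started after bootstrap, which is exactly where this question gets tricky):

```bash
#!/bin/bash
# Hypothetical cluster-scoped init script: persist proxy settings so that
# processes started after bootstrap inherit them. Host, port, and bypass
# list are placeholders, not values from the original post.
cat >> /etc/environment <<'EOF'
http_proxy=http://proxy.example.com:8080
https_proxy=http://proxy.example.com:8080
HTTP_PROXY=http://proxy.example.com:8080
HTTPS_PROXY=http://proxy.example.com:8080
no_proxy=localhost,127.0.0.1
NO_PROXY=localhost,127.0.0.1
EOF
```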
- 103 Views
- 1 reply
- 0 kudos
Databricks import directory false positive import
Hello everyone, I'm using the Databricks CLI to move several directories from Azure Repos to the Databricks Workspace. The problem is that files are not updating properly, with no error displayed. The self-hosted agent in the pipeline I'm using has installed the ...
- 0 kudos
Hi @Giuseppe_C, the Databricks CLI is not syncing updates during your pipeline runs. Several teams we work with have faced the same issue with legacy CLI versions and workspace import behavior. We’ve helped them stabilize CI/CD pipelines for Databricks, i...
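One thing worth checking (an assumption on my part, since the reply is truncated): the import must be told to overwrite existing files, otherwise a pipeline that re-imports a directory can silently leave stale copies behind. With the current CLI that would look like:

```bash
# Recursively import a local directory into the workspace, overwriting
# files that already exist (both paths are placeholders).
databricks workspace import-dir ./src /Workspace/Projects/my-project --overwrite
```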
- 97 Views
- 1 reply
- 0 kudos
Databricks Job: Unable to read Databricks job run parameters in Scala code and SQL query
We created a Databricks job with a JAR (Scala code), provided parameters/JAR parameters, and we are able to read those as arguments in the main method. When we run the job with parameters (run parameters / job parameters), those parameters are not able to re...
- 0 kudos
Hey @kumarV, I did some digging and here are some hints/tips to help you troubleshoot further. Yep, this really comes down to how parameters flow through Lakeflow Jobs depending on the task type. JAR tasks are the odd duck: they don't get the same ...
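A common pattern here (hedged, since the reply is truncated): forward job-level parameters into the JAR task's parameters array via dynamic value references such as {{job.parameters.env}}, then parse them in main. A minimal Scala sketch with hypothetical names:

```scala
object MainApp {
  // JAR tasks only see what is listed in the task's "parameters" array, so a
  // job parameter "env" must be forwarded there explicitly, e.g. as
  // ["--env", "{{job.parameters.env}}"] in the job definition.
  def main(args: Array[String]): Unit = {
    val params = args.sliding(2, 2).collect {
      case Array(key, value) if key.startsWith("--") => key.drop(2) -> value
    }.toMap
    val env = params.getOrElse("env", "dev")
    println(s"Running with env=$env")
  }
}
```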
- 116 Views
- 2 replies
- 2 kudos
Resolved! Model serving with provisioned throughput fails
I'm trying to serve a model with provisioned throughput but I'm getting this error: Build could not start due to an internal error. If you are serving a model from UC and Azure storage firewall or Private Link is configured on your storage account, pl...
- 2 kudos
Hi team, creating an endpoint in your workspace requires serverless compute, so you need to update the storage account’s firewall to allow Databricks serverless compute via your workspace’s Network Connectivity Configuration (NCC). If the storage account f...
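If the NCC uses private endpoints, the storage side must also approve the pending connection. A hedged sketch with the Azure CLI (resource IDs are placeholders and the exact flow depends on your setup):

```bash
# List pending private endpoint connections on the storage account
az network private-endpoint-connection list \
  --id /subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>

# Approve the Databricks NCC private endpoint connection by its resource id
az network private-endpoint-connection approve \
  --id <connection-resource-id> --description "Approved for Databricks NCC"
```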
- 92 Views
- 1 reply
- 1 kudos
Updating projects created from Databricks Asset Bundles
Hi all, we are using Databricks Asset Bundles for our data science / ML projects. The asset bundle we have has spawned quite a few projects by now, but now we need to make some updates to the asset bundle. The updates should also be added to the spaw...
- 1 kudos
Greetings @Sleiny, here’s what’s really going on, plus a pragmatic, field-tested plan you can actually execute without tearing up your repo strategy. What’s happening: Databricks Asset Bundle templates are used at initialization time ...
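One pragmatic approach along those lines (an assumption, as the reply is truncated before its plan): re-render the updated template into a scratch directory and diff it against each spawned project, then port the deltas by hand. Sketch with placeholder paths:

```bash
# Render the updated template once into a scratch directory
databricks bundle init ./my-bundle-template --output-dir /tmp/rendered
# Compare against an existing project and apply the differences manually
diff -ru /tmp/rendered/my_project ./projects/my_project
```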
- 183 Views
- 5 replies
- 6 kudos
Resolved! AbfsRestOperationException when adding privatelink.dfs.core.windows.net
Hey Databricks forum, I have been searching a lot but can't find a solution. I have the following setup: a VNet connected to the Databricks workspace, with a public subnet (delegated to Microsoft.Databricks/workspaces) and an NSG, and a private subnet (d...
- 6 kudos
Yes, that's the solution! I thought I had tested this (maybe some caching..). When I changed it to abfss://metastore@<storageaccount>.dfs.core.windows.net it still failed with: Failed to access cloud storage: [AbfsRestOperationException] The storage publ...
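For reference, the working URI shape from this thread, as it would look when registering the container as a Unity Catalog external location (location and credential names are hypothetical):

```sql
-- Register the metastore container via its dfs endpoint; the private DNS
-- zone resolves <storageaccount>.dfs.core.windows.net to the private endpoint.
CREATE EXTERNAL LOCATION IF NOT EXISTS metastore_root
URL 'abfss://metastore@<storageaccount>.dfs.core.windows.net/'
WITH (STORAGE CREDENTIAL my_storage_credential);
```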
- 125 Views
- 3 replies
- 7 kudos
Issue when creating Salesforce Connector
Hi, I'm trying to create a Salesforce Connector in Lakeflow. In the "Salesforce authentication" step, I'm entering my Salesforce username and password and then I get stuck with the following error message: "OAUTH_APPROVAL_ERROR_GENERIC". My Salesforce...
- 7 kudos
Hi guys, thank you so much for these prerequisites to check! Whether he likes it or not, my Salesforce admin will have some tasks to do.
- 3726 Views
- 6 replies
- 1 kudos
Unable to destroy NCC private endpoint
Hi Team, accidentally we removed one of the NCC private endpoints from our storage account that was created using Terraform. When I tried to destroy and recreate it, I encountered the following error. According to some articles, the private endpoint w...
- 1 kudos
Just let the state forget about it: run terraform state rm 'your_module.your_terraformresource' (the commands are collected below). You can find that Terraform resource by using terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule and later validating the id with terraform stat...
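Collected from the reply above into runnable form (the resource address is the poster's placeholder):

```bash
# Find the state address of the orphaned NCC private endpoint rule
terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule
# Drop it from state so Terraform stops trying to destroy the deleted endpoint
terraform state rm 'your_module.your_terraformresource'
```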
- 41 Views
- 0 replies
- 0 kudos
Delta Sharing from Databricks to SAP BDC fails with invalid_client error
Context: we are in the process of extracting data between SAP BDC Datasphere and Databricks (brownfield implementation). SAP Datasphere is hosted in AWS (eu10), Databricks is hosted in Azure (West Europe), and the BDC Connect system is located in the same regio...
- 933 Views
- 3 replies
- 3 kudos
Resolved! Lakebase -- Enable RLS in synced Table
Dear all, I am currently testing Lakebase for integration in our overall system. In particular, I need to enable RLS on a Lakebase table which is synced from a "Delta Streaming Table" in UC. Setting up the data sync was no trouble; in UC I am the owne...
- 3 kudos
Hello @DaPo! Could you please confirm whether you are the owner of the table within the Lakebase Postgres (not just in Unity Catalog)? Also, can you try creating a view on the synced table and then configuring RLS on that view?
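For reference, plain Postgres RLS syntax looks like the sketch below (table, policy, and column names are hypothetical; whether Lakebase permits this on a synced table, versus a view over it, is exactly what this thread is probing):

```sql
-- Requires table ownership in Postgres, hence the question above
ALTER TABLE synced_orders ENABLE ROW LEVEL SECURITY;

-- Hypothetical policy: each user sees only rows tagged with their user name
CREATE POLICY orders_by_owner ON synced_orders
  USING (owner_name = current_user);
```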
- 115 Views
- 1 reply
- 1 kudos
My trial is about to expire
I'm aware my workspace/subscription will be converted into a 'pay-as-you-go' model. That's okay; however, I wonder why you don't provide a non-restricted plan just for learning. I'm sure there are ways to block commercial use. However, that's not my...
- 1 kudos
Hi @quakenbush, in the past you had to create a new VNet-injected workspace and migrate all workloads from the existing managed workspace to enable VNet injection. This process was necessary because there was no direct way to convert a managed worksp...
- 373 Views
- 6 replies
- 0 kudos
Skepticism about U2M OAuth: Does Snowflake Federation Actually Switch User Identity per Query?
Hi everyone, I'm currently setting up Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth). However, I'm skeptical that the connection truly switches the user identity dynamically for each Databricks user (https://docs.databricks....
- 0 kudos
Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth) is intended to support per-user identity propagation—that is, each Databricks user is supposed to have queries executed under their own Snowflake identity at query time, rather...
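One hedged way to test the skepticism empirically: run the same federated query as two different Databricks users, then inspect Snowflake's own query history, whose USER_NAME column records the identity that actually executed each query (the table name in the filter is hypothetical, and ACCOUNT_USAGE views can lag by up to about 45 minutes):

```sql
-- Run on the Snowflake side after issuing federated queries from Databricks
SELECT user_name, query_text, start_time
FROM snowflake.account_usage.query_history
WHERE query_text ILIKE '%my_federated_table%'
ORDER BY start_time DESC
LIMIT 10;
```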
- 178 Views
- 2 replies
- 3 kudos
Resolved! Azure Databricks Meters vs Databricks SKUs from system.billing table
When it comes to DBU, I am being charged by Azure for the following meters:
- Premium Jobs Compute DBU <-- DBUs that my job computes are spending
- Premium Serverless SQL DBU <-- DBUs that the SQL Warehouse compute is spending
- Premium All-Purpose Phot...
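To line the Azure meters up against Databricks SKUs, the usual starting point is aggregating the system billing table by SKU; a minimal sketch using the documented system.billing.usage columns:

```sql
-- DBUs per SKU for the current month; sku_name values correspond to the
-- meters above (Jobs Compute, Serverless SQL, All-Purpose Photon, ...).
SELECT sku_name,
       SUM(usage_quantity) AS dbus
FROM system.billing.usage
WHERE usage_date >= date_trunc('MONTH', current_date())
GROUP BY sku_name
ORDER BY dbus DESC;
```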
- 738 Views
- 5 replies
- 0 kudos
Databricks Asset Bundle Deployment Fails in GitHub Actions with Federated Identity Credentials
I am using a service principal with workspace admin access to deploy Databricks Asset Bundles. The deployment works successfully via Jenkins using the same credentials and commands. However, when attempting the deployment through GitHub Actions, I en...
- 0 kudos
Environment variables override .databrickscfg; that's probably why OIDC authentication is failing. Make sure that you have the correct specification in your databricks.yml so it will be the source of truth, something like the step reconstructed below.
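Reconstructing the flattened snippet from the reply (the host value was truncated in the original, so it stays elided; the run command is my assumption about the step's body):

```yaml
- name: Deploy bundle
  env:
    DATABRICKS_HOST: ...   # truncated in the original reply
  run: databricks bundle deploy
```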
- 186 Views
- 4 replies
- 1 kudos
TCO calculator for Databricks Analytics
Similar to the cloud infra calculators, does a TCO calculator exist for Databricks? Let's say we have inputs such as number of source tables, data pipelines (estimated number), data growth per day, transformation complexity, and target reports a...
- 1 kudos
@szymon_dybczak - I am aware of that calculator; however, the challenge is how to even calculate the number of DBUs it will consume based on the volume of data processing etc. The tool starts with the infra and compute inputs. However, my question i...