Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

dbxsteve
by New Contributor II
  • 178 Views
  • 5 replies
  • 0 kudos

Proxy configuration - while bootstraping

I am trying to start a cluster in Azure Databricks; our policy is to use a proxy for outbound traffic. I have configured http_proxy, https_proxy, HTTP_PROXY, HTTPS_PROXY, no_proxy and NO_PROXY in env variables and globally. Made sure the proxy is bypassin...
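For comparison, a minimal sketch of how those variables are often pushed to every node with a cluster-scoped init script (hypothetical script; host, port, and bypass list are placeholders):

    #!/bin/bash
    # hypothetical cluster-scoped init script; proxy host/port are placeholders
    PROXY="http://proxy.corp.example:8080"
    NOPROXY="localhost,127.0.0.1,.azuredatabricks.net"
    {
      echo "http_proxy=$PROXY";  echo "HTTP_PROXY=$PROXY"
      echo "https_proxy=$PROXY"; echo "HTTPS_PROXY=$PROXY"
      echo "no_proxy=$NOPROXY";  echo "NO_PROXY=$NOPROXY"
    } >> /etc/environment

Note that init scripts only run after a node has bootstrapped, so a proxy required during bootstrap itself usually has to be handled at the VNet routing/firewall level instead.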

Latest Reply
dbxsteve
New Contributor II
  • 0 kudos

@Debayan

4 More Replies
Giuseppe_C
by New Contributor
  • 103 Views
  • 1 reply
  • 0 kudos

Databricks import directory false positive import

Hello everyone, I'm using the Databricks CLI to move several directories from Azure Repos to the Databricks Workspace. The problem is that the files are not updating properly, with no error displayed. The self-hosted agent in the pipeline I'm using has installed the ...

Latest Reply
Abeshek
New Contributor
  • 0 kudos

Hi @Giuseppe_C, the Databricks CLI is not syncing updates during your pipeline runs. Several teams we work with have faced the same issue with legacy CLI versions and workspace import behavior. We’ve helped them stabilize CI/CD pipelines for Databricks, i...
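For reference, a hedged sketch of the pipeline step in question, assuming the current Databricks CLI and illustrative paths; without --overwrite, files that already exist in the workspace are not replaced:

    # illustrative: push a repo directory into the workspace,
    # replacing files that already exist there
    databricks workspace import-dir ./notebooks /Workspace/Shared/my-project --overwrite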

kumarV
by New Contributor
  • 97 Views
  • 1 reply
  • 0 kudos

Databricks Job: Unable to read Databricks job run parameters in Scala code and SQL query

We created a Databricks job with a JAR (Scala code) and provided parameters/JAR parameters, and we are able to read those as arguments in the main method. When we run the job with parameters (run parameters / job parameters), those parameters are not able to re...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Hey @kumarV, I did some digging and here are some hints/tips to help you troubleshoot further. Yep, this really comes down to how parameters flow through Lakeflow Jobs depending on the task type. JAR tasks are the odd duck: they don’t get the same ...
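To make the "odd duck" concrete: a JAR task receives values only as plain arguments to its main method, typically passed as jar_params at run time. A hedged sketch via the Jobs API through the CLI (job ID and values are placeholders):

    # illustrative: jar_params arrive as args(0), args(1), ... in the Scala main()
    databricks api post /api/2.1/jobs/run-now --json '{
      "job_id": 123,
      "jar_params": ["2024-01-01", "full-refresh"]
    }'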

PNC
by New Contributor III
  • 115 Views
  • 2 replies
  • 2 kudos

Resolved! Model serving with provisioned throughput fails

I'm trying to serve a model with provisioned throughput but I'm getting this error: Build could not start due to an internal error. If you are serving a model from UC and Azure storage firewall or Private Link is configured on your storage account, pl...

Latest Reply
iyashk-DB
Databricks Employee
  • 2 kudos

Hi team, creating an endpoint in your workspace needs Serverless, so you need to update the storage account’s firewall to allow Databricks serverless compute via your workspace’s Network Connectivity Configuration (NCC). If the storage account f...
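On the Azure side, that allow-listing step often looks like the sketch below, assuming the storage firewall uses virtual-network rules and the subnet resource ID is copied from the workspace's NCC page (all names are placeholders):

    # illustrative: allow one of the NCC's serverless subnets through the
    # storage account firewall (repeat for each subnet listed in the NCC)
    az storage account network-rule add \
      --resource-group my-rg \
      --account-name mystorageacct \
      --subnet "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Network/virtualNetworks/<vnet>/subnets/<ncc-subnet>"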

1 More Replies
Sleiny
by New Contributor
  • 92 Views
  • 1 reply
  • 1 kudos

Updating projects created from Databricks Asset Bundles

Hi all, we are using Databricks Asset Bundles for our data science / ML projects. The asset bundle we have has spawned quite a few projects by now, but now we need to make some updates to the asset bundle. The updates should also be added to the spaw...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Greetings @Sleiny, here’s what’s really going on, plus a pragmatic, field-tested plan you can actually execute without tearing up your repo strategy. Let’s dig in. What’s happening: Databricks Asset Bundles templates are used at initialization time ...
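One way to retrofit template changes, consistent with the reply's point that templates are only applied at init time (paths and template location below are illustrative): re-render the updated template into a scratch directory and diff it against each spawned project.

    # illustrative: re-render the updated template, then review the drift
    databricks bundle init ./my-bundle-template --output-dir /tmp/rendered
    diff -ru /tmp/rendered/my_project ./projects/my_project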

fabian564
by New Contributor
  • 180 Views
  • 5 replies
  • 6 kudos

Resolved! AbfsRestOperationException when adding privatelink.dfs.core.windows.net

Hey Databricks forum, I have been searching a lot but can't find a solution. I have the following setup:
- a VNet connected to the Databricks workspace, with
  - a public subnet (delegated to Microsoft.Databricks/workspaces) and an NSG
  - a private subnet (d...

Latest Reply
fabian564
New Contributor
  • 6 kudos

Yes, that's the solution! I thought I had tested this (maybe some caching...). When I changed it to abfss://metastore@<storageaccount>.dfs.core.windows.net it still failed with: Failed to access cloud storage: [AbfsRestOperationException] The storage publ...
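For later readers, a sketch of the URI shape that resolved this (angle-bracket values are placeholders): the container name comes first, then the storage account's dfs endpoint.

    # illustrative ABFSS URI shape for a metastore root / external location
    STORAGE_ROOT="abfss://<container>@<storage-account>.dfs.core.windows.net/<path>"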

4 More Replies
PatHua
by New Contributor
  • 125 Views
  • 3 replies
  • 7 kudos

Issue when creating Salesforce Connector

Hi, I'm trying to create a Salesforce Connector in Lakeflow. In the "Salesforce authentication" step, I'm entering my Salesforce username and password and then I get stuck with the following error message: "OAUTH_APPROVAL_ERROR_GENERIC". My Salesforce...

Latest Reply
PatHua
New Contributor
  • 7 kudos

Hi guys, thank you so much for these prerequisites to check! Whether he likes it or not, my Salesforce admin will have some tasks to do.

2 More Replies
Dnirmania
by Contributor
  • 3726 Views
  • 6 replies
  • 1 kudos

Unable to destroy NCC private endpoint

Hi Team, we accidentally removed one of the NCC private endpoints from our storage account that was created using Terraform. When I tried to destroy and recreate it, I encountered the following error. According to some articles, the private endpoint w...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 1 kudos

Just let the state forget about it: terraform state rm 'your_module.your_terraformresource'. You can find that Terraform resource using: terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule, and later validate the id: terraform stat...
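Spelled out end to end (the resource address below is hypothetical; use whatever terraform state list actually returns):

    # find the orphaned endpoint rule in state
    terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule
    # inspect it, then drop it from state so Terraform forgets the deleted endpoint
    terraform state show 'module.ncc.databricks_mws_ncc_private_endpoint_rule.storage'
    terraform state rm 'module.ncc.databricks_mws_ncc_private_endpoint_rule.storage'
    terraform apply   # recreate the endpoint from configuration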

5 More Replies
DaPo
by New Contributor III
  • 932 Views
  • 3 replies
  • 3 kudos

Resolved! Lakebase -- Enable RLS in synced Table

Dear all, I am currently testing Lakebase for integration into our overall system. In particular, I need to enable RLS on a Lakebase table which is synced from a "Delta Streaming Table" in UC. Setting up the data sync was no trouble; in UC I am the owne...

Latest Reply
Advika
Databricks Employee
  • 3 kudos

Hello @DaPo! Could you please confirm whether you are the owner of the table within the Lakebase Postgres (not just in Unity Catalog)? Also, can you try creating a view on the synced table and then configuring RLS on that view?

2 More Replies
quakenbush
by Contributor
  • 115 Views
  • 1 reply
  • 1 kudos

My trial is about to expire

I'm aware my workspace/subscription will be converted into a 'pay-as-you-go' model. That's okay; however, I wonder why you don't provide a non-restricted plan just for learning. I'm sure there are ways to block commercial use. However, that's not my...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @quakenbush, in the past you had to create a new VNet-injected workspace and migrate all workloads from the existing managed workspace to enable VNet injection. This process was necessary because there was no direct way to convert a managed worksp...

martkev
by New Contributor III
  • 372 Views
  • 6 replies
  • 0 kudos

Skepticism about U2M OAuth: Does Snowflake Federation Actually Switch User Identity per Query?

Hi everyone, I'm currently setting up Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth). However, I'm skeptical that the connection truly switches the user identity dynamically for each Databricks user (https://docs.databricks....

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth) is intended to support per-user identity propagation—that is, each Databricks user is supposed to have queries executed under their own Snowflake identity at query time, rather...

5 More Replies
Escarigasco
by New Contributor III
  • 178 Views
  • 2 replies
  • 3 kudos

Resolved! Azure Databricks Meters vs Databricks SKUs from system.billing table

When it comes to DBUs, I am being charged by Azure for the following meters:
- Premium Jobs Compute DBU <-- DBUs that my job computes are spending
- Premium Serverless SQL DBU <-- DBUs that the SQL Warehouse compute is spending
- Premium All-Purpose Phot...

Latest Reply
Escarigasco
New Contributor III
  • 3 kudos

Thank you Bianca, great answer!

1 More Replies
Nisha_Tech
by New Contributor II
  • 738 Views
  • 5 replies
  • 0 kudos

Databricks Asset Bundle Deployment Fails in GitHub Actions with Federated Identity Credentials

I am using a service principal with workspace admin access to deploy Databricks asset bundles. The deployment works successfully via Jenkins using the same credentials and commands. However, when attempting the deployment through GitHub Actions, I en...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 0 kudos

Environment variables override .databrickscfg; that's probably why OIDC is failing. Make sure you have the correct specification in your databricks.yml so it will be the source of truth. Something like: - name: Deploy bundle env: DATABRICKS_HOST: ...
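The same idea as a plain shell step, with placeholder values; the point is that the variables the job exports are the only auth source, and nothing stale (like a leftover DATABRICKS_TOKEN) can override the OIDC flow:

    # illustrative: make the environment the single source of truth for auth
    unset DATABRICKS_TOKEN                      # a leftover PAT would win over OIDC
    export DATABRICKS_HOST="https://adb-1234567890123456.7.azuredatabricks.net"
    export DATABRICKS_CLIENT_ID="<service-principal-application-id>"
    databricks bundle deploy --target prod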

4 More Replies
Raman_Unifeye
by Contributor III
  • 186 Views
  • 4 replies
  • 1 kudos

TCO calculator for Databricks Analytics

Similar to the cloud infra calculators, does a TCO calculator exist for Databricks? Let's say we have inputs such as the number of source tables, data pipelines (estimated number), data growth per day, transformation complexity, and target reports a...

Latest Reply
Raman_Unifeye
Contributor III
  • 1 kudos

@szymon_dybczak - I am aware of that calculator; however, the challenge is how to even estimate the number of DBUs it will consume based on the volume of data processing etc. The tool starts with the infra and compute inputs. However, my question i...
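Absent such a tool, a back-of-envelope estimate is the usual bridge from volume to DBUs; an illustrative sketch with made-up numbers (real DBU rates depend on SKU and instance type):

    # illustrative only: 4-node job cluster at ~2 DBU/node/hour, 3-hour nightly run
    echo "DBUs per run:   $((4 * 2 * 3))"       # 24
    echo "DBUs per month: $((4 * 2 * 3 * 30))"  # 720; multiply by the SKU's $/DBU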

3 More Replies