Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Sleiny
by New Contributor
  • 1196 Views
  • 1 reply
  • 1 kudos

Resolved! Updating projects created from Databricks Asset Bundles

Hi all, We are using Databricks Asset Bundles for our data science / ML projects. The asset bundle we have has spawned quite a few projects by now, but now we need to make some updates to the asset bundle. The updates should also be added to the spaw...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Greetings @Sleiny, here’s what’s really going on, plus a pragmatic, field-tested plan you can actually execute without tearing up your repo strategy. Let’s dig in. What’s happening: Databricks Asset Bundles templates are used at initialization time ...

fabian564
by New Contributor III
  • 1625 Views
  • 5 replies
  • 6 kudos

Resolved! AbfsRestOperationException when adding privatelink.dfs.core.windows.net

Hey Databricks forum, I have been searching a lot, but can't find a solution. I have the following setup: a VNet connected to the Databricks workspace with public-subnet (delegated to Microsoft.Databricks/workspaces) and an NSG, and private-subnet (d...

Latest Reply
fabian564
New Contributor III
  • 6 kudos

Yes, that's the solution! I thought I had tested this (maybe some caching...). When I changed it to abfss://metastore@<storageaccount>.dfs.core.windows.net it still failed with: Failed to access cloud storage: [AbfsRestOperationException] The storage publ...

4 More Replies
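For reference, the ABFSS URI format the reply points to can be sketched like this (the container, account, and path names below are placeholders for illustration, not values from the thread):

```shell
# Build an ABFSS URI of the form the reply describes:
#   abfss://<container>@<account>.dfs.core.windows.net/<path>
build_abfss_uri() {
  container="$1"; account="$2"; path="$3"
  printf 'abfss://%s@%s.dfs.core.windows.net/%s\n' "$container" "$account" "$path"
}

build_abfss_uri metastore mystorageacct metastore_root
# -> abfss://metastore@mystorageacct.dfs.core.windows.net/metastore_root
```

When the private DNS zone for privatelink.dfs.core.windows.net is configured, the same hostname should resolve to the private endpoint without changing the URI itself.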
Dnirmania
by Contributor
  • 4993 Views
  • 6 replies
  • 1 kudos

Unable to destroy NCC private endpoint

Hi Team, Accidentally we removed one of the NCC private endpoints from our storage account that was created using Terraform. When I tried to destroy and recreate it, I encountered the following error. According to some articles, the private endpoint w...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 1 kudos

Just let the state forget about it: terraform state rm 'your_module.your_terraformresource'. You can find that Terraform resource by using terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule, and later validate the id with terraform stat...

5 More Replies
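To make the state surgery above concrete, a minimal sketch of the workflow might look like this (the module and resource address is hypothetical; substitute whatever address terraform state list prints for your endpoint rule):

```shell
# 1. Find the state address of the orphaned NCC private endpoint rule.
terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule

# 2. Inspect it to confirm the id matches the endpoint that was deleted.
terraform state show 'module.ncc.databricks_mws_ncc_private_endpoint_rule.storage'

# 3. Remove it from state only; this makes no API call to Databricks or Azure,
#    so nothing real is touched -- Terraform just forgets the resource.
terraform state rm 'module.ncc.databricks_mws_ncc_private_endpoint_rule.storage'

# 4. Re-plan: Terraform now sees the resource as missing and will recreate it.
terraform plan
```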
quakenbush
by Contributor
  • 564 Views
  • 1 reply
  • 1 kudos

Resolved! My trial is about to expire

I'm aware my workspace/subscription will be converted into a 'pay-as-you-go' model. That's okay; however, I wonder why you don't provide a non-restricted plan just for learning. I'm sure there are ways to block commercial use. However, that's not my...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @quakenbush, In the past you had to create a new VNet-injected workspace and migrate all workloads from the existing managed workspace to enable VNet injection. This process was necessary because there was no direct way to convert a managed worksp...

martkev
by New Contributor III
  • 1222 Views
  • 6 replies
  • 0 kudos

Skepticism about U2M OAuth: Does Snowflake Federation Actually Switch User Identity per Query?

Hi everyone, I'm currently setting up Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth). However, I'm skeptical that the connection truly switches the user identity dynamically for each Databricks user (https://docs.databricks....

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth) is intended to support per-user identity propagation—that is, each Databricks user is supposed to have queries executed under their own Snowflake identity at query time, rather...

5 More Replies
Escarigasco
by New Contributor III
  • 925 Views
  • 2 replies
  • 3 kudos

Resolved! Azure Databricks Meters vs Databricks SKUs from system.billing table

When it comes to DBUs, I am being charged by Azure for the following meters:
- Premium Jobs Compute DBU <-- DBUs that my job compute is spending
- Premium Serverless SQL DBU <-- DBUs that the SQL Warehouse compute is spending
- Premium All-Purpose Phot...

Latest Reply
Escarigasco
New Contributor III
  • 3 kudos

Thank you Bianca, great answer!

1 More Replies
Nisha_Tech
by New Contributor II
  • 1713 Views
  • 5 replies
  • 0 kudos

Databricks Asset Bundle Deployment Fails in GitHub Actions with Federated Identity Credentials

I am using a service principal with workspace admin access to deploy Databricks asset bundles. The deployment works successfully via Jenkins using the same credentials and commands. However, when attempting the deployment through GitHub Actions, I en...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 0 kudos

Environment variables override .databrickscfg; that's probably why OIDC is failing. Make sure that you have the correct specification in your databricks.yml so it will be the source of truth. Something like:
- name: Deploy bundle
  env:
    DATABRICKS_HOST: ...

4 More Replies
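Because environment variables take precedence over .databrickscfg profiles in the CLI's auth resolution, one way to debug this from inside the GitHub Actions runner is to check what is actually exported before the deploy step. The host and client id values below are placeholders, not values from the thread:

```shell
# Show any Databricks auth variables the runner already exports; a leftover
# DATABRICKS_TOKEN here would take precedence and silently bypass OIDC.
env | grep -E '^DATABRICKS_' || echo "no DATABRICKS_* variables set"

# Keep only the OIDC-relevant variables for the deploy step.
unset DATABRICKS_TOKEN
export DATABRICKS_HOST='https://adb-1234567890123456.7.azuredatabricks.net'
export DATABRICKS_CLIENT_ID='00000000-0000-0000-0000-000000000000'

databricks bundle deploy --target prod
```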
Raman_Unifeye
by Honored Contributor III
  • 1305 Views
  • 4 replies
  • 1 kudos

TCO calculator for Databricks Analytics

Similar to the cloud infra calculators, does a TCO calculator exist for Databricks? Let's say we have inputs such as the number of source tables, data pipelines (estimated number), data growth per day, transformation complexity, and target reports a...

Latest Reply
Raman_Unifeye
Honored Contributor III
  • 1 kudos

@szymon_dybczak - I am aware of that calculator; however, the challenge is how to even calculate the number of DBUs it will consume based on the volume of data processing etc. The tool starts with the infra and compute inputs. However, my question i...

3 More Replies
IvanPopov
by New Contributor
  • 407 Views
  • 1 reply
  • 0 kudos

Does Databricks support HNS in GCP?

Hello, I need to set up some buckets in GCP which will be used as an analytics and production data lake. I am getting diverging feedback on whether hierarchical namespaces (HNS) should be enabled for these buckets. On one hand, HNS is advisable for ana...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @IvanPopov, According to the docs, Google Cloud Storage hierarchical namespace (HNS) is not supported with external locations. You must disable hierarchical namespace before creating an external location.

nk-five1
by New Contributor III
  • 1959 Views
  • 7 replies
  • 3 kudos

UC volumes not useable in Apps?

I have to install a custom library in a Python Databricks App. According to the documentation, this should be possible through UC volumes: https://docs.databricks.com/aws/en/dev-tools/databricks-apps/dependencies#install-wheel-files-from-unity-catal...

Administration & Architecture
App
UC volumes
Unity Catalog
Latest Reply
Hubert-Dudek
Databricks MVP
  • 3 kudos

 

6 More Replies
AlekseiDiaz
by New Contributor II
  • 404 Views
  • 2 replies
  • 0 kudos

Internet Access from Serverless Databricks - free trial

Hi community. I started to use the Databricks quick-setup free trial and I have been trying to access the internet from a Python notebook, but I haven't been able to do so. Even my UI is different. Is it because I am using the free trial?

Latest Reply
AlekseiDiaz
New Contributor II
  • 0 kudos

I changed the setup and linked it to an AWS workspace. It doesn't raise any error now. But I was using requests

1 More Replies
rmusti
by New Contributor II
  • 972 Views
  • 3 replies
  • 1 kudos

What is the maximum number of workspaces per account on GCP

I found this in the docs: "you can create at most 200 workspaces per week in the same Google Cloud project". But that directly contradicts the limit of 20 that is mentioned in the resource limits docs. And Azure has no limit, while AWS has a limit of 50. So...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @rmusti, This is a bit confusing but not contradictory. Here's the important line in the docs: "For limits where Fixed is No, you can request a limit increase through your Databricks account team." So, below that you have a table with resource limits. In...

2 More Replies
AxelM
by New Contributor
  • 5110 Views
  • 3 replies
  • 0 kudos

Only absolute paths are currently supported. Paths must begin with '/'

I am facing the above issue when using the Python Databricks SDK. I retrieve the job definition with "client.jobs.get()" and then try to create it on another workspace with "client.jobs.create()". Therefore the job definition is correct and working fine o...

Latest Reply
iyashk-DB
Databricks Employee
  • 0 kudos

You’re hitting a Jobs validation rule that depends on where the notebook is sourced from. With Git-sourced jobs, notebook paths must be relative; with workspace-sourced jobs, paths must be absolute and start with “/”. If a task’s source is treated as...

2 More Replies
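As an illustration of the rule above, a tiny helper like this (hypothetical, not part of the SDK) can normalize workspace-sourced notebook paths taken from jobs.get() before the job spec is passed to jobs.create() on the target workspace:

```shell
# Ensure a workspace-sourced notebook path is absolute (starts with "/").
# Git-sourced tasks, by contrast, must keep their paths relative.
ensure_absolute() {
  case "$1" in
    /*) printf '%s\n' "$1" ;;    # already absolute: leave untouched
    *)  printf '/%s\n' "$1" ;;   # prepend the missing leading slash
  esac
}

ensure_absolute 'Shared/etl/main'    # -> /Shared/etl/main
ensure_absolute '/Repos/team/proj'   # -> /Repos/team/proj (unchanged)
```

The same check can be applied to each task's notebook_path in the exported job definition before re-creating the job.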
SantoshMundhe
by New Contributor II
  • 2159 Views
  • 2 replies
  • 1 kudos

Resolved! Databricks On prem version

Hello, does Databricks offer an on-premises deployment option? If so, does the on-prem version have any restrictions? If not, is there a way around it? Thank you.

Latest Reply
SantoshMundhe
New Contributor II
  • 1 kudos

Thank you 

1 More Replies
158576
by New Contributor
  • 291 Views
  • 1 reply
  • 0 kudos

mount cifs volume on all purpose compute results in permission denied

I have all the networking already set: nslookup resolves the NAS server IP, and connectivity is enabled from the worker nodes to the NAS server. I am able to mount the same NAS drive outside of Databricks, I mean a standalone Linux VM in the same VPC where worker nodes...

Latest Reply
siva-anantha
Databricks Partner
  • 0 kudos

Hello, could you provide more information about why you want to attach a NAS drive to a Databricks cluster, please? I am no expert in storage, but as far as I understand, NAS will suffer from IO and replication bottlenecks when attached to distributed ...
