Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

DaPo
by New Contributor III
  • 975 Views
  • 3 replies
  • 3 kudos

Resolved! Lakebase -- Enable RLS in synced Table

Dear all, I am currently testing Lakebase for integration in our overall system. In particular, I need to enable RLS on a Lakebase table, which is synced from a "Delta Streaming Table" in UC. Setting up the data sync was no trouble; in UC I am the owne...

Latest Reply
Advika
Community Manager
  • 3 kudos

Hello @DaPo! Could you please confirm whether you are the owner of the table within the Lakebase Postgres (not just in Unity Catalog)? Also, can you try creating a view on the synced table and then configuring RLS on that view?
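For reference, a minimal sketch of the view-based filtering pattern suggested above, run against the Lakebase Postgres endpoint. All connection details and object names are hypothetical placeholders; note that native CREATE POLICY row-level security can only be configured by the Postgres owner of the table, which is why the ownership question matters.

```python
# Hedged sketch: per-user row filtering via a view on a Lakebase synced table.
# Host, credentials, schema, table, column, and role names are all placeholders.
import psycopg2

conn = psycopg2.connect(
    host="<lakebase-instance-host>",   # placeholder
    dbname="databricks_postgres",      # placeholder
    user="owner@example.com",          # must own the objects being secured
    password="<oauth-token>",          # Lakebase uses token-based auth
    sslmode="require",
)

with conn, conn.cursor() as cur:
    # A security_barrier view filters rows per session user and prevents
    # leaky functions in outer queries from bypassing the filter.
    cur.execute("""
        CREATE VIEW app.sales_filtered WITH (security_barrier) AS
        SELECT * FROM synced_schema.sales
        WHERE region_owner = current_user;
    """)
    cur.execute("GRANT SELECT ON app.sales_filtered TO app_readers;")
```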

2 More Replies
quakenbush
by Contributor
  • 133 Views
  • 1 reply
  • 1 kudos

My trial is about to expire

I'm aware, my workspace/subscription will be converted into a 'pay-as-you-go' model. That's okay - however I wonder why you don't provide a non-restricted plan just for learning. I'm sure there are ways to block commercial use. However, that's not my...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @quakenbush, In the past you had to create a new VNet-injected workspace and migrate all workloads from the existing managed workspace to enable VNet injection. This process was necessary because there was no direct way to convert a managed worksp...

martkev
by New Contributor III
  • 401 Views
  • 6 replies
  • 0 kudos

Skepticism about U2M OAuth: Does Snowflake Federation Actually Switch User Identity per Query?

Hi everyone, I'm currently setting up Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth). However, I'm skeptical that the connection truly switches the user identity dynamically for each Databricks user (https://docs.databricks....

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth) is intended to support per-user identity propagation; that is, each Databricks user is supposed to have queries executed under their own Snowflake identity at query time, rather...
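Since the preview cuts off, here is one way to sanity-check whether identity propagation is actually happening, sketched under the assumption of a foreign catalog named snowflake_fed (hypothetical) and a Databricks notebook where spark is predefined:

```python
# Hedged sketch: verify per-user identity propagation for Snowflake federation.
# Catalog/schema/table names are placeholders.

# 1. Confirm who you are on the Databricks side.
print(spark.sql("SELECT current_user()").first()[0])

# 2. Push an easily recognizable query through the foreign catalog.
spark.sql("SELECT COUNT(*) FROM snowflake_fed.sales.orders").show()

# 3. Then, in Snowflake itself, inspect the query history, e.g.:
#    SELECT user_name, query_text
#    FROM snowflake.account_usage.query_history
#    WHERE query_text ILIKE '%orders%'
#    ORDER BY start_time DESC;
# With working U2M propagation, user_name should differ per Databricks user
# rather than showing one shared service identity for everyone.
```

Repeating the test as two different Databricks users is the decisive check.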

5 More Replies
Escarigasco
by New Contributor III
  • 206 Views
  • 2 replies
  • 3 kudos

Resolved! Azure Databricks Meters vs Databricks SKUs from system.billing table

When it comes to DBUs, I am being charged by Azure for the following meters:
  • Premium Jobs Compute DBU <-- DBUs that my job compute is spending
  • Premium Serverless SQL DBU <-- DBUs that the SQL Warehouse compute is spending
  • Premium All-Purpose Phot...

Latest Reply
Escarigasco
New Contributor III
  • 3 kudos

Thank you, Bianca, great answer!

1 More Replies
Nisha_Tech
by New Contributor II
  • 780 Views
  • 5 replies
  • 0 kudos

Databricks Asset Bundle Deployment Fails in GitHub Actions with Federated Identity Credentials

I am using a service principal with workspace admin access to deploy Databricks asset bundles. The deployment works successfully via Jenkins using the same credentials and commands. However, when attempting the deployment through GitHub Actions, I en...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 0 kudos

Environment variables override .databrickscfg; that's probably why OIDC authentication is failing. Make sure you have the correct specification in your databricks.yml so it is the source of truth. Something like: - name: Deploy bundle env: DATABRICKS_HOST: ...
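As a quick way to see which overriding variables are present on the GitHub Actions runner, a small diagnostic step can be run before the deploy (the prefixes below are the usual suspects; ARM_* applies on Azure):

```python
# Hedged diagnostic sketch: list auth-related environment variables that can
# override .databrickscfg / databricks.yml and break OIDC-based auth in CI.
import os

PREFIXES = ("DATABRICKS_", "ARM_", "ACTIONS_ID_TOKEN_")

for key in sorted(os.environ):
    if key.startswith(PREFIXES):
        # Print only the name and value length, never the secret itself.
        print(f"{key} is set ({len(os.environ[key])} chars)")
```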

4 More Replies
Raman_Unifeye
by Contributor III
  • 210 Views
  • 4 replies
  • 1 kudos

TCO calculator for Databricks Analytics

Similar to the cloud infra calculators, does a TCO calculator exist for Databricks? Let's say we have inputs such as the number of source tables, data pipelines (estimated number), data growth per day, transformation complexity, and target reports a...

Latest Reply
Raman_Unifeye
Contributor III
  • 1 kudos

@szymon_dybczak - I am aware of that calculator; however, the challenge is how to even calculate the number of DBUs it will consume based on the volume of data processing, etc. The tool starts with the infra and compute inputs. However, my question i...
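To illustrate the gap Raman_Unifeye describes, a back-of-envelope estimate has to start from assumed throughput, not from known infrastructure. Every number in this sketch is a made-up placeholder, not a Databricks rate:

```python
# Hedged back-of-envelope DBU estimate. All rates below are assumptions;
# actual DBU/hour depends on SKU and instance type, and $/DBU on the plan.
daily_gb = 500                 # assumed data processed per day
gb_per_node_hour = 100         # assumed per-worker throughput
nodes = 4
dbu_per_node_hour = 2.0        # placeholder rate
usd_per_dbu = 0.30             # placeholder price

cluster_hours = daily_gb / (gb_per_node_hour * nodes)
daily_dbus = cluster_hours * nodes * dbu_per_node_hour
print(f"~{daily_dbus:.1f} DBUs/day, ~${daily_dbus * usd_per_dbu:.2f}/day")
```

The weak link is gb_per_node_hour, which is exactly the input that depends on transformation complexity and usually has to come from a pilot run.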

3 More Replies
IvanPopov
by New Contributor
  • 126 Views
  • 1 reply
  • 0 kudos

Does Databricks support HNS in GCP?

Hello, I need to set up some buckets in GCP that will be used as an analytics and production data lake. I am getting conflicting feedback on whether hierarchical namespaces (HNS) should be enabled for these buckets. On one hand, HNS is advisable for ana...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @IvanPopov, According to the docs, Google Cloud Storage hierarchical namespace (HNS) is not supported with external locations. You must disable hierarchical namespace before creating an external location.
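For completeness, a minimal sketch of creating a bucket intended for a UC external location with the google-cloud-storage client; hierarchical namespace is off by default on new buckets, so simply not enabling it is enough (bucket name and location are placeholders):

```python
# Hedged sketch: create a plain GCS bucket (HNS disabled by default) to back
# a Unity Catalog external location. Name and location are placeholders.
from google.cloud import storage

client = storage.Client()
bucket = client.create_bucket("my-uc-external-location", location="EU")
print(f"Created gs://{bucket.name} with hierarchical namespace disabled.")
```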

nk-five1
by New Contributor II
  • 551 Views
  • 7 replies
  • 3 kudos

UC volumes not usable in Apps?

I have to install a custom library in a Python Databricks App. According to the documentation, this should be possible through UC volumes: https://docs.databricks.com/aws/en/dev-tools/databricks-apps/dependencies#install-wheel-files-from-unity-catal...

Administration & Architecture
App
UC volumes
Unity Catalog
Latest Reply
Hubert-Dudek
Databricks MVP
  • 3 kudos

 

6 More Replies
skarpeck
by New Contributor III
  • 106 Views
  • 0 replies
  • 0 kudos

Lakebase not accessible in Private Network

We have a VNet-injected workspace in Azure. There are multiple SQL Warehouses that are easily accessible from the private network, both directly from a VM and via VPN on a client's machine. We deployed Lakebase. Inside the workspace, connectivity is working...

AlekseiDiaz
by New Contributor II
  • 219 Views
  • 2 replies
  • 0 kudos

Internet Access from Serverless Databricks - free trial

Hi community. I started using the Databricks quick-setup free trial and have been trying to access the internet from a Python notebook, but I haven't been able to do so. Even my UI is different. Is it because I am using the free trial?

Latest Reply
AlekseiDiaz
New Contributor II
  • 0 kudos

I changed the setup and linked it to an AWS workspace. It doesn't raise any errors now. But I was using requests
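For anyone hitting the same thing, a minimal connectivity probe like the sketch below separates a networking block from a code problem (the URL is arbitrary):

```python
# Hedged sketch: outbound connectivity check. On serverless/trial compute,
# a timeout or connection error here points at egress policy, not your code.
import requests

try:
    resp = requests.get("https://pypi.org", timeout=10)
    print("outbound OK:", resp.status_code)
except requests.exceptions.RequestException as exc:
    print("outbound blocked or failing:", exc)
```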

1 More Replies
rmusti
by New Contributor II
  • 230 Views
  • 3 replies
  • 1 kudos

What is the maximum number of workspaces per account on GCP?

I found this in the docs: "you can create at most 200 workspaces per week in the same Google Cloud project". But that directly contradicts the 20 limit mentioned in the resource limits docs. Meanwhile, Azure has no limit, and AWS has a limit of 50. So...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @rmusti, This is a bit confusing but not contradictory. Here's the important line in the docs: "For limits where Fixed is No, you can request a limit increase through your Databricks account team." So, below you have a table with resource limits. In...

2 More Replies
AxelM
by New Contributor
  • 3803 Views
  • 3 replies
  • 0 kudos

Only absolute paths are currently supported. Paths must begin with '/'

I am facing the above issue when using the Python Databricks SDK. I retrieve the job definition with "client.jobs.get()" and then try to create it on another workspace with "client.jobs.create()". Therefore, the job definition is correct and working fine o...

Latest Reply
iyashk-DB
Databricks Employee
  • 0 kudos

You’re hitting a Jobs validation rule that depends on where the notebook is sourced from. With Git-sourced jobs, notebook paths must be relative; with workspace-sourced jobs, paths must be absolute and start with “/”. If a task’s source is treated as...
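A sketch of how that plays out when copying a job between workspaces with the Python SDK; the profile names and job_id are placeholders, and only workspace-sourced (non-Git) notebook tasks are touched:

```python
# Hedged sketch: copy a job across workspaces, making notebook paths absolute
# for workspace-sourced tasks. Git-sourced tasks must keep relative paths.
from databricks.sdk import WorkspaceClient

src = WorkspaceClient(profile="source-workspace")   # placeholder profiles
dst = WorkspaceClient(profile="target-workspace")

settings = src.jobs.get(job_id=123).settings        # placeholder job_id

for task in settings.tasks or []:
    nb = task.notebook_task
    if nb and not settings.git_source and not nb.notebook_path.startswith("/"):
        nb.notebook_path = "/" + nb.notebook_path   # enforce absolute path

created = dst.jobs.create(
    name=settings.name,
    tasks=settings.tasks,
    job_clusters=settings.job_clusters,
)
print("created job", created.job_id)
```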

2 More Replies
158576
by New Contributor
  • 133 Views
  • 1 reply
  • 0 kudos

Mounting a CIFS volume on all-purpose compute results in permission denied

I have all networking already set up; nslookup resolves the NAS server IP, and connectivity is enabled from the worker nodes to the NAS server. I am able to mount the same NAS drive outside of Databricks, I mean on a standalone Linux VM in the same VPC where the worker nodes...

Latest Reply
siva-anantha
Contributor
  • 0 kudos

Hello, could you provide more information about why you want to attach a NAS drive to a Databricks cluster, please? I am no expert in storage, but as far as I understand, NAS will suffer from IO and replication bottlenecks when attached to distributed...

zaicnupagadi
by New Contributor II
  • 278 Views
  • 2 replies
  • 4 kudos

Resolved! Reaching out to Azure Storage with IP from Private VNET pool

Hey all, is there a way for Databricks to reach out to Azure Storage using a private endpoint? We would like to omit enabling access by "all trusted services". All resources are in the same VNET; however, when Databricks tries to reach out to Storage, instead...

Latest Reply
zaicnupagadi
New Contributor II
  • 4 kudos

Sorry for the late reply - thank you for your help, Nayan!

1 More Replies