- 975 Views
- 3 replies
- 3 kudos
Resolved! Lakebase -- Enable RLS in synced Table
Dear all, I am currently testing Lakebase for integration in our overall system. In particular, I need to enable RLS on a Lakebase table which is synced from a "Delta Streaming Table" in UC. Setting up the data sync was no trouble; in UC I am the owne...
Hello @DaPo! Could you please confirm whether you are the owner of the table within the Lakebase Postgres (not just in Unity Catalog)? Also, can you try creating a view on the synced table and then configuring RLS on that view?
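The suggestion above can be sketched in code. This is a hedged, hypothetical sketch (the table, view, and column names `orders`, `tenant_orders`, and `tenant_name` are invented): in Postgres, `CREATE POLICY` applies only to tables, so when the synced table is owned by the sync pipeline, a filtered `security_barrier` view is one common fallback.

```python
# Hypothetical sketch: Postgres row-level security is attached to tables,
# not views, so two routes are shown. All object names are invented.

def rls_policy_ddl(table: str, column: str) -> list[str]:
    """DDL to enable RLS on a table you own in Lakebase Postgres."""
    return [
        f"ALTER TABLE {table} ENABLE ROW LEVEL SECURITY;",
        f"CREATE POLICY per_tenant ON {table} "
        f"USING ({column} = current_user);",
    ]

def filtered_view_ddl(view: str, table: str, column: str) -> str:
    """Fallback when you cannot alter the synced table itself: a
    security_barrier view that filters rows by the connected role."""
    return (
        f"CREATE VIEW {view} WITH (security_barrier) AS "
        f"SELECT * FROM {table} WHERE {column} = current_user;"
    )

statements = rls_policy_ddl("orders", "tenant_name")
statements.append(filtered_view_ddl("tenant_orders", "orders", "tenant_name"))
```

The generated statements would then be executed against the Lakebase Postgres endpoint by a role with sufficient ownership privileges.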
- 133 Views
- 1 reply
- 1 kudos
My trial is about to expire
I'm aware my workspace/subscription will be converted into a 'pay-as-you-go' model. That's okay; however, I wonder why you don't provide a non-restricted plan just for learning. I'm sure there are ways to block commercial use. However, that's not my...
Hi @quakenbush, In the past you had to create a new VNet-injected workspace and migrate all workloads from the existing managed workspace to enable VNet injection. This process was necessary because there was no direct way to convert a managed worksp...
- 401 Views
- 6 replies
- 0 kudos
Skepticism about U2M OAuth: Does Snowflake Federation Actually Switch User Identity per Query?
Hi everyone, I'm currently setting up Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth). However, I'm skeptical that the connection truly switches the user identity dynamically for each Databricks user (https://docs.databricks....
Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth) is intended to support per-user identity propagation—that is, each Databricks user is supposed to have queries executed under their own Snowflake identity at query time, rather...
- 206 Views
- 2 replies
- 3 kudos
Resolved! Azure Databricks Meters vs Databricks SKUs from system.billing table
When it comes to DBU, I am being charged by Azure for the following meters:
- Premium Jobs Compute DBU <-- DBUs that my job computes are spending
- Premium Serverless SQL DBU <-- DBUs that the SQL Warehouse compute is spending
- Premium All-Purpose Phot...
- 780 Views
- 5 replies
- 0 kudos
Databricks Asset Bundle Deployment Fails in GitHub Actions with Federated Identity Credentials
I am using a service principal with workspace admin access to deploy Databricks asset bundles. The deployment works successfully via Jenkins using the same credentials and commands. However, when attempting the deployment through GitHub Actions, I en...
Environment variables override .databrickscfg, which is probably why OIDC is failing. Make sure you have the correct specification in your databricks.yml so it is the source of truth. Something like: - name: Deploy bundle env: DATABRICKS_HOST: ...
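Expanded, the flattened step in the reply might look roughly like the following. This is a hypothetical sketch only: the secret names and the deploy target are assumptions, and with federated (OIDC) credentials the exact variables depend on your identity setup.

```yaml
# Hypothetical GitHub Actions step; secret names and target are assumptions.
- name: Deploy bundle
  env:
    DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
    DATABRICKS_CLIENT_ID: ${{ secrets.DATABRICKS_CLIENT_ID }}
  run: databricks bundle deploy -t prod
```

The point of the reply stands either way: any DATABRICKS_* variables set in the job environment take precedence over profiles in .databrickscfg, so they must match what databricks.yml expects or authentication will silently resolve differently than it does in Jenkins.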
- 210 Views
- 4 replies
- 1 kudos
TCO calculator for Databricks Analytics
Similar to the cloud infra calculators, does a TCO calculator exist for Databricks? Let's say we have inputs such as the number of source tables, data pipelines (estimated number), data growth per day, transformation complexity, and target reports a...
@szymon_dybczak - I am aware of that calculator; however, the challenge is how to even calculate the number of DBUs it will consume based on the volume of data processing, etc. The tool starts with the infra and compute inputs. However, my question i...
- 126 Views
- 1 reply
- 0 kudos
Does Databricks support HNS in GCP?
Hello, I need to set up some buckets in GCP which will be used as an analytics and production data lake. I am getting diverging feedback on whether hierarchical namespaces (HNS) should be enabled for these buckets. On one hand, HNS is advisable for ana...
Hi @IvanPopov, According to the docs, Google Cloud Storage hierarchical namespace (HNS) is not supported with external locations. You must disable hierarchical namespace before creating an external location.
- 551 Views
- 7 replies
- 3 kudos
UC volumes not useable in Apps?
I have to install some custom library in a Python Databricks App. According to the documentation this should be possible through UC volumes: https://docs.databricks.com/aws/en/dev-tools/databricks-apps/dependencies#install-wheel-files-from-unity-catal...
- 106 Views
- 0 replies
- 0 kudos
Lakebase not accessible in Private Network
We have a VNET-injected workspace in Azure. There are multiple SQL Warehouses which are easily accessible from the private network, both directly from a VM and via VPN on the client's machine. We deployed Lakebase. Inside the workspace, connectivity is working...
- 219 Views
- 2 replies
- 0 kudos
Internet Access from Serverless Databricks - free trial
Hi community. I started to use the Databricks quick-setup free trial and I have been trying to access the internet from a Python notebook, but I haven't been able to do so. Even my UI is different. Is it because I am using the free trial?
I changed the setup and linked it to an AWS workspace. It doesn't raise any error now. But I was using requests
- 230 Views
- 3 replies
- 1 kudos
What is the maximum number of workspaces per account on GCP
I found this in the docs: "you can create at most 200 workspaces per week in the same Google Cloud project" - but that directly contradicts the limit of 20 mentioned in the resource limits docs. But Azure has no limit, and AWS has a limit of 50. So...
Hi @rmusti, This is a bit confusing but not contradictory. Here's the important line in the docs: "For limits where Fixed is No, you can request a limit increase through your Databricks account team." So, below you have a table with resource limits. In...
- 3803 Views
- 3 replies
- 0 kudos
Only absolute paths are currently supported. Paths must begin with '/'
I am facing the above issue when using the Python Databricks SDK. I retrieve the job definition with "client.jobs.get()" and then try to create it on another workspace with "client.jobs.create()". Therefore the job definition is correct and working fine o...
You’re hitting a Jobs validation rule that depends on where the notebook is sourced from. With Git-sourced jobs, notebook paths must be relative; with workspace-sourced jobs, paths must be absolute and start with “/”. If a task’s source is treated as...
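As a hedged sketch of that rule (the source labels and the idea of normalizing before calling client.jobs.create() are assumptions here, not the SDK's own API), a pre-flight fix-up of each task's notebook path might look like:

```python
def normalize_notebook_path(path: str, source: str) -> str:
    """Hypothetical sketch of the validation rule described above:
    Git-sourced tasks want relative paths, workspace-sourced tasks
    want absolute paths beginning with '/'."""
    if source == "GIT":
        # Strip any leading slashes so the path is repo-relative.
        return path.lstrip("/")
    if source == "WORKSPACE":
        # Ensure the path is absolute.
        return path if path.startswith("/") else "/" + path
    raise ValueError(f"unknown task source: {source!r}")
```

Applying something like this to every notebook task copied from `client.jobs.get()` before re-creating the job on the target workspace would reconcile the two path conventions, whichever source the target job ends up using.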
- 331 Views
- 2 replies
- 1 kudos
Resolved! Databricks On prem version
Hello, does Databricks offer an on-premises deployment option? If so, does the on-prem version have any restrictions? If not, is there a way around it? Thank you.
- 133 Views
- 1 reply
- 0 kudos
mount cifs volume on all purpose compute results in permission denied
I have all networking already set up; nslookup resolves the NAS server IP and connectivity is enabled from worker nodes to the NAS server. I am able to mount the same NAS drive outside of Databricks, I mean a standalone Linux VM in the same VPC where worker nodes...
Hello, could you provide more information about why you want to attach a NAS drive to a Databricks cluster, please? I am no expert in storage. As far as I understand, NAS will suffer from IO and replication bottlenecks when attached to distributed ...
- 278 Views
- 2 replies
- 4 kudos
Resolved! Reaching out to Azure Storage with IP from Private VNET pool
Hey all, is there a way for Databricks to reach out to Azure Storage using a private endpoint? We would like to omit enabling access by "all trusted services". All resources are in the same VNET; however, when Databricks tries to reach out to Storage instead...
Sorry for late reply - thank you for your help Nayan!
| Label | Count |
|---|---|
| Access control | 1 |
| Apache spark | 1 |
| Azure | 7 |
| Azure databricks | 5 |
| Billing | 2 |
| Cluster | 1 |
| Compliance | 1 |
| Data Ingestion & connectivity | 5 |
| Databricks Runtime | 1 |
| Databricks SQL | 2 |
| DBFS | 1 |
| Dbt | 1 |
| Delta Sharing | 1 |
| DLT Pipeline | 1 |
| GA | 1 |
| Gdpr | 1 |
| Github | 1 |
| Partner | 59 |
| Public Preview | 1 |
| Service Principals | 1 |
| Unity Catalog | 1 |
| Workspace | 2 |