- 197 Views
- 3 replies
- 2 kudos
Resolved! Removing access to Lakehouse and only allowing Databricks One?
Hello, I am trying to set up a user group for business users in our Azure Databricks that will only be able to query data. It looks like Databricks One is the solution to use. So I followed the documentation and granted the user group Consumer Access...
Yeah, that was it. I had set up Databricks with Entra groups from the beginning and had done all my permission work there. I didn't even think of checking the default groups. Thank you!
- 137 Views
- 1 replies
- 1 kudos
how to complete connection to snowflake
Hi all, a simple test creating a connection to my Snowflake trial account was successful, and.. Anyway: dbx > external data > connections > create connection > ... > [Test] successful, but I still see the last 2 steps to do (5 access, 6 metadata) and don't see any ...
Hi @emanueol, from your description it seems that you were able to set up the connection. Not every step in the wizard has a "next" button. For example, in step 4 (catalog basic), if you create a catalog you will be automatically transferred to the next ...
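For reference, the same wizard steps can be sketched in SQL. This is a hedged sketch with placeholder names and credentials (`my_snowflake`, `snowflake_cat`, the host, warehouse, and group are all invented for illustration); step 5 roughly corresponds to granting access on the new catalog, and step 6 is Databricks crawling Snowflake metadata once the foreign catalog exists.

```sql
-- Placeholder connection details: substitute your trial account's values
CREATE CONNECTION IF NOT EXISTS my_snowflake TYPE snowflake
OPTIONS (
  host 'xy12345.snowflakecomputing.com',
  port '443',
  sfWarehouse 'COMPUTE_WH',
  user 'TRIAL_USER',
  password 'TRIAL_PASSWORD'
);

-- Mirror a Snowflake database as a Unity Catalog foreign catalog
CREATE FOREIGN CATALOG IF NOT EXISTS snowflake_cat
USING CONNECTION my_snowflake
OPTIONS (database 'SNOWFLAKE_SAMPLE_DATA');

-- "Access" step: grant read access to a (hypothetical) consumer group
GRANT USE CATALOG, USE SCHEMA, SELECT ON CATALOG snowflake_cat TO `business-users`;
```

Exact option names can differ by Databricks release, so treat this as a starting point rather than a verified recipe.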
- 253 Views
- 4 replies
- 3 kudos
Resolved! AI/BI Dashboard embed issue in Databricks App
Hi everyone, I've created an AI/BI Dashboard in Azure Databricks, successfully published it, and generated an embed link. My goal is to embed this dashboard inside a Databricks App (Streamlit) using an iframe. However, when I try to render the dashboard...
Hi @Louis_Frolio, I made changes to my master menu with page navigation and used an iframe inside the submenu, and it works. Thanks for your insightful solution.
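A minimal sketch of the iframe approach: the helper below builds the iframe HTML from the embed link copied out of the dashboard's share dialog. The function name and the example URL are hypothetical, not part of any Databricks API.

```python
# Hypothetical helper: wrap a published AI/BI dashboard embed link in an
# iframe snippet. Copy the real embed_url from the dashboard's share dialog.
def dashboard_iframe(embed_url: str, height: int = 600) -> str:
    """Return an iframe tag sized to fill the container width."""
    return (
        f'<iframe src="{embed_url}" width="100%" '
        f'height="{height}" frameborder="0"></iframe>'
    )
```

In a Streamlit-based Databricks App this string would typically be rendered with `streamlit.components.v1.html(dashboard_iframe(url), height=620)`; the workspace must also allow the App's domain in the dashboard's embedding settings, or the iframe stays blank.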
- 203 Views
- 2 replies
- 1 kudos
Resolved! Databricks Asset Bundles capability for cross cloud migration
Hi everyone,We are planning a migration from Azure Databricks to GCP Databricks and would like to understand whether Databricks Asset Bundles (DAB) can be used to migrate workspace assets such as jobs, pipelines, notebooks, and custom serving endpoin...
@iyashk-DB Thanks for the details, it helps.
- 176 Views
- 1 replies
- 0 kudos
Databricks import directory false positive import
Hello everyone, I'm using the Databricks CLI to move several directories from Azure Repos to the Databricks Workspace. The problem is that files are not updating properly, with no error displayed. The self-hosted agent in the pipeline I'm using has installed the ...
Hi @Giuseppe_C,Databricks CLI is not syncing updates during your pipeline runs. Several teams we work with have faced the same issue with legacy CLI versions and workspace import behavior. We’ve helped them stabilize CI/CD pipelines for Databricks, i...
- 3895 Views
- 6 replies
- 1 kudos
Unable to destroy NCC private endpoint
Hi Team, accidentally we removed one of the NCC private endpoints from our storage account that was created using Terraform. When I tried to destroy and recreate it, I encountered the following error. According to some articles, the private endpoint w...
Just let the state forget about it: terraform state rm 'your_module.your_terraformresource'. You can find that Terraform resource by using: terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule, and later validating the id: terraform stat...
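The commands in that reply can be sketched end-to-end. The resource address below (`databricks_mws_ncc_private_endpoint_rule.storage_pe`) is hypothetical; substitute whatever `terraform state list` actually prints for your module.

```shell
# Find the orphaned endpoint rule still tracked in Terraform state
terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule

# Inspect it to confirm the id matches the endpoint deleted manually
terraform state show 'databricks_mws_ncc_private_endpoint_rule.storage_pe'

# Drop it from state only (no API call is made against the endpoint),
# then re-apply so Terraform recreates it cleanly
terraform state rm 'databricks_mws_ncc_private_endpoint_rule.storage_pe'
terraform apply
```

`state rm` only edits the local/remote state file, so it is safe when the real resource is already gone; the subsequent `apply` sees the rule as missing and creates it again.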
- 449 Views
- 6 replies
- 0 kudos
Skepticism about U2M OAuth: Does Snowflake Federation Actually Switch User Identity per Query?
Hi everyone, I'm currently setting up Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth). However, I'm skeptical that the connection truly switches the user identity dynamically for each Databricks user (https://docs.databricks....
Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth) is intended to support per-user identity propagation—that is, each Databricks user is supposed to have queries executed under their own Snowflake identity at query time, rather...
- 224 Views
- 2 replies
- 3 kudos
Resolved! Azure Databricks Meters vs Databricks SKUs from system.billing table
When it comes to DBU, I am being charged by Azure for the following meters:
- Premium Jobs Compute DBU <-- DBUs that my job compute is spending
- Premium Serverless SQL DBU <-- DBUs that the SQL Warehouse compute is spending
- Premium All-Purpose Phot...
- 806 Views
- 5 replies
- 0 kudos
Databricks Asset Bundle Deployment Fails in GitHub Actions with Federated Identity Credentials
I am using a service principal with workspace admin access to deploy Databricks asset bundles. The deployment works successfully via Jenkins using the same credentials and commands. However, when attempting the deployment through GitHub Actions, I en...
Environment variables override .databrickscfg; that's why OIDC is probably failing. Make sure that you have the correct specification in your databricks.yml so it will be the source of truth. Something like: - name: Deploy bundle env: DATABRICKS_HOST: ...
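A hedged sketch of what such a GitHub Actions step could look like. The workspace URL, secret names, and the `DATABRICKS_AUTH_TYPE` value are assumptions (check your CLI version's documentation for the exact auth-type string for workload identity federation); setting the variables on the step itself ensures they win over any stale `.databrickscfg` on the runner.

```yaml
# Hypothetical deploy step; names and URL are placeholders
- name: Deploy bundle
  env:
    DATABRICKS_HOST: https://adb-1234567890.12.azuredatabricks.net
    DATABRICKS_CLIENT_ID: ${{ secrets.DATABRICKS_CLIENT_ID }}
    DATABRICKS_AUTH_TYPE: github-oidc   # assumed value; verify per CLI docs
  run: databricks bundle deploy -t prod
```

The same pinning can be done under `workspace:` in `databricks.yml` instead; the point is to make one place authoritative so Jenkins and GitHub Actions resolve identical credentials.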
- 228 Views
- 2 replies
- 0 kudos
Internet Access from Serverless Databricks - free trial
Hi community. I started to use the Databricks quick-setup free trial and I have been trying to access the internet from a Python notebook, but I haven't been able to do so. Even my UI is different. Is it because I am using the free trial?
I changed the setup and linked it to an AWS workspace. It doesn't raise any error now. But I was using requests
- 138 Views
- 1 replies
- 0 kudos
mount cifs volume on all purpose compute results in permission denied
I have all networking already set: nslookup resolves the NAS server IP and connectivity is enabled from the worker nodes to the NAS server. I am able to mount the same NAS drive outside of Databricks, i.e. on a standalone Linux VM in the same VPC where the worker nodes...
Hello, could you provide more information about why you want to attach a NAS drive to a Databricks cluster, please? I am no expert in storage. As far as I understand, NAS will suffer from IO and replication bottlenecks when attached to distributed ...
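For anyone reproducing the "permission denied", a hedged sketch of what a cluster init script attempting the mount might look like (server name, share, and credentials are placeholders). One likely cause worth checking: `mount` requires CAP_SYS_ADMIN, and Databricks worker containers may not grant it, so the syscall can be denied even when networking and credentials are fine.

```shell
# Hypothetical init script; //nas-server.internal/share and svc_user
# are placeholders for your environment
sudo apt-get update && sudo apt-get install -y cifs-utils
sudo mkdir -p /mnt/nas
sudo mount -t cifs //nas-server.internal/share /mnt/nas \
  -o username=svc_user,password="$NAS_PASSWORD",vers=3.0,uid=1000,gid=1000
```

If the mount is blocked by container privileges, accessing the share over SMB with a userspace client (or staging the data into cloud storage) avoids the kernel mount entirely.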
- 316 Views
- 2 replies
- 1 kudos
Task Hanging issue on DBR 15.4
Hello, I am running a structured streaming pipeline with 5 models loaded using pyfunc.spark_udf. Lately we have been noticing a very strange issue of tasks hanging, and the batch is taking a very long time to finish its execution. CPU utilization is around...
On DBR 15.4 the DeadlockDetector: TASK_HANGING message usually just means Spark has noticed some very long-running tasks and is checking for deadlocks. With multiple pyfunc.spark_udf models in a streaming query the tasks often appear “stuck” because ...
- 318 Views
- 3 replies
- 0 kudos
Azure VM quota for databricks jobs - demand prediction
Hey folks, a quick check - wanted to gather thoughts on how you manage demand for Azure VM quota so you don't run into quota-limit issues. In our case, we have several data domains (finance, master data, supply chain...) executing their projects in Dat...
Yes, Azure Databricks compute policies let you define “quota-like” limits, but only within Databricks, not Azure subscription quotas themselves. You still rely on Azure’s own quota system for vCPU/VM core limits at the subscription level. What you c...
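A hedged sketch of the "quota-like" limits a compute policy can express, in the policy-definition JSON format. Node types, limits, and the tag key are placeholders; this caps instance choices and cluster size per domain, which indirectly bounds how much Azure vCPU quota one domain can consume, but it does not touch the subscription-level quota itself.

```json
{
  "node_type_id": {
    "type": "allowlist",
    "values": ["Standard_D4ds_v5", "Standard_D8ds_v5"]
  },
  "autoscale.max_workers": {
    "type": "range",
    "maxValue": 10,
    "defaultValue": 4
  },
  "custom_tags.domain": {
    "type": "fixed",
    "value": "finance"
  }
}
```

One policy per domain, assigned to that domain's group, plus the fixed tag, also makes per-domain consumption visible in Azure cost reports so quota increases can be requested ahead of demand.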
- 226 Views
- 3 replies
- 0 kudos
Cap on OIDC (max 20) Enable workload identity federation for GitHub Actions
Hi Databricks community, I have followed the page below and created GitHub OIDC federation policies, but there seems to be a cap on how many OIDCs a Service Principal can create (20 max). Is there any workaround for this, or some other solution apart from using Client ID an...
I can't speak for specifically why, but allowing wildcards creates security risks and most identity providers and standards guidance require exact, pre-registered URLs.
- 356 Views
- 5 replies
- 3 kudos
Prevent Access to AI Functions Execution
As a workspace admin, I want to prevent unexpected API costs from unrestricted usage of AI Functions (AI_QUERY() etc.). How can we control that only a particular group of users can execute AI Functions? I understand the function execution cost can be vi...
OK, so it has to be done at the individual endpoint and function level.
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (61)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)