- 47 Views
- 1 reply
- 0 kudos
Issue Using Private CA Certificates for Databricks Serverless Private Git → On-Prem GitLab Connection
Hi everyone, I'm trying to properly configure Databricks Serverless Private Git to connect to our on-premises GitLab, but I'm running into issues with private CA certificates. Following the latest Databricks recommendations, our connection to GitLab go...
Hello @kfadratek, thanks for the detailed context. Let's take a look at what could be causing the SSL verification to fail with a custom CA in Serverless Private Git and discuss some approaches that might resolve it. What's likely going wrong: B...
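For reference, a quick way to sanity-check the CA bundle itself (a hedged diagnostic sketch, not the resolution from the truncated reply above; the URL and bundle path are placeholders):

```python
# Diagnostic sketch: confirm the private CA bundle validates the on-prem
# GitLab endpoint's certificate chain. URL and bundle path are placeholders.
import requests

resp = requests.get(
    "https://gitlab.internal.example.com",    # placeholder GitLab URL
    verify="/path/to/private-ca-bundle.pem",  # full chain: root + intermediates
    timeout=10,
)
print(resp.status_code)  # an SSLError here points at a chain/bundle problem
```

If this fails outside Databricks too, the bundle is likely missing an intermediate certificate.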
- 451 Views
- 3 replies
- 3 kudos
A question about Databricks Fine-grained Access Control (FGAC) cost on dedicated compute
Hi All, recently, while testing Fine-grained Access Control (FGAC) on dedicated compute, I came across something that seems a bit unusual, and I'd like to ask if anyone else has seen similar behavior. I created a view with only one record, and had anot...
You’ve observed that Fine-grained Access Control (FGAC) queries on Databricks dedicated compute can be billed in a way that seems disproportionate to actual execution time: a very short query (2.39s) results in a 10-minute usage window and a higher-t...
- 67 Views
- 2 replies
- 3 kudos
Resolved! Azure Databricks Cluster Pricing
Hi, I am trying to work out a rough total price for an Azure Databricks cluster using the following assumptions. I want to spin up a cluster on D13 v2 VMs with 9 executors, so in total 1+9 = 10 nodes. I want to use the cluster for 10 hours a day, 30 hours a...
Here is the simple calculation I use, based on dollars and assuming the infra is in EUS.

Cost components

1. Azure VM cost (D13 v2)
   - On-demand price: $0.741/hour per VM
   - Monthly VM cost: 10 VMs × 300 hours × $0.741 = $2,223
   - Yearly VM cost: 10 × 3,600 × $0.741 = $26,676
2. Dat...
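For anyone who wants to reproduce the arithmetic, a small sketch (prices are the reply's East US assumptions; DBU charges per workload tier are not included):

```python
# Reproduce the VM-cost arithmetic from the reply above.
vm_price_per_hour = 0.741   # D13 v2 on-demand, $/hour (assumed EUS price)
nodes = 10                  # 1 driver + 9 executors
hours_per_month = 300       # 10 hours/day x 30 days

monthly_vm_cost = nodes * hours_per_month * vm_price_per_hour
yearly_vm_cost = monthly_vm_cost * 12
print(f"Monthly VM cost: ${monthly_vm_cost:,.0f}")  # $2,223
print(f"Yearly VM cost:  ${yearly_vm_cost:,.0f}")   # $26,676
```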
- 578 Views
- 6 replies
- 3 kudos
Resolved! I need a switch to turn off Data Apps in databricks workspaces
Hi, how do I disable Data Apps on my workspace? It is really annoying that Databricks pushes new features without any option to disable them. At least you should have some tools to control access before rolling them out. It seems you only care about fe...
It is presently not an option at the Workspace level. Regards, Louis.
- 150 Views
- 1 reply
- 1 kudos
Resolved! Delta share not showing in delta shared with me
Hi everyone, we just started using Databricks, and we were expecting to receive a Delta Share from a third-party provider. They've confirmed that the sharing process has been completed on their end. However, the shared data is not appearing on our porta...
You need USE PROVIDER privileges on the recipient workspace's assigned metastore (or you need to be a metastore admin). You will then see the provider's Delta Sharing org name in SHOW PROVIDERS, and you can then mount their share as a catalog. Let me know h...
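As a sketch of those steps from a notebook (the provider, share, and catalog names below are placeholders):

```python
# Requires USE PROVIDER on the metastore (or metastore admin).
spark.sql("SHOW PROVIDERS").show()

# List the shares this provider has granted to your metastore
spark.sql("SHOW SHARES IN PROVIDER `provider_org_name`").show()

# Mount a share as a catalog so it appears under "Delta Shared With Me"
spark.sql(
    "CREATE CATALOG IF NOT EXISTS vendor_shared_data "
    "USING SHARE `provider_org_name`.`their_share_name`"
)
```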
- 4116 Views
- 3 replies
- 2 kudos
Resolved! Networking Challenges with Databricks Serverless Compute (Control Plane) When Connecting to On-Prem
Hi Databricks Community, I'm working through some networking challenges when connecting Databricks clusters to various data sources and wanted to get advice or best practices from others who may have faced similar issues. Current setup: I have four type...
Thank you Louis for the detailed explanation and guidance!
- 3415 Views
- 2 replies
- 0 kudos
Resolved! New default notebook format (IPYNB) causes unintended changes on release
Dear Databricks, we have noticed the following issue since the new default notebook format was set to IPYNB. When we release our code from (for example) DEV to TST using a release pipeline built in Azure DevOps, we see unintended changes popping ...
Hi @Rvwijk, please take a look at this; it should solve your issue. I suspect the mismatch is happening because the previous versions of the notebooks include output for the cells. You may need to perform a rebase of your repository and allow the output to b...
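If the unintended diffs are indeed cell outputs, one generic option is to strip outputs before committing. A minimal sketch using the nbformat package (a general Jupyter approach, not a Databricks-specific setting; the file name is a placeholder):

```python
# Strip code-cell outputs from an .ipynb so DEV -> TST releases don't show
# output-only diffs.
import nbformat

def strip_outputs(path: str) -> None:
    nb = nbformat.read(path, as_version=4)
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.outputs = []
            cell.execution_count = None
    nbformat.write(nb, path)

strip_outputs("my_notebook.ipynb")  # placeholder file name
```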
- 710 Views
- 3 replies
- 4 kudos
Lakebase security
Hi team, we are using Databricks Enterprise and noticed that our Lakebase instances are exposed to the public internet. They can be reached through the JDBC endpoint with only basic username and password authentication. Is there a way to restrict acce...
The Postgres instance is covered by the Private Link you configure for your workspace.
- 345 Views
- 1 reply
- 1 kudos
User OBO Token Forwarding between apps
Can user OAuth tokens be forwarded between Databricks Apps for on-behalf-of (OBO) authorization? I have two Databricks Apps deployed in the same workspace:
1. **UI App** (Streamlit) - configured with OAuth user authorization
2. **Middleware App** (FastA...
Hello @ctgchris, just bumping this issue for visibility so that someone from Databricks can come up with a solution.
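One pattern that may be worth testing in the meantime (a hedged sketch, not a confirmed Databricks recommendation): Databricks Apps pass the signed-in user's token in the x-forwarded-access-token request header, which the UI app could read and forward to the middleware app. The middleware URL is a placeholder, and this assumes a Streamlit version that exposes st.context.headers.

```python
# UI App (Streamlit) side: read the user's forwarded token and pass it on.
import requests
import streamlit as st

# Databricks Apps inject the signed-in user's token into this header
user_token = st.context.headers.get("x-forwarded-access-token")

resp = requests.get(
    "https://middleware-app.example.databricksapps.com/api/data",  # placeholder
    headers={"Authorization": f"Bearer {user_token}"},
    timeout=30,
)
st.write(resp.json())
```

Whether the middleware app can then exchange or replay that token downstream is exactly the open question here.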
- 434 Views
- 3 replies
- 0 kudos
Databricks Asset Bundle Deployment Fails in GitHub Actions with Federated Identity Credentials
I am using a service principal with workspace admin access to deploy Databricks asset bundles. The deployment works successfully via Jenkins using the same credentials and commands. However, when attempting the deployment through GitHub Actions, I en...
Hi @Nisha_Tech, it seems that for some reason GitHub Actions wants to authenticate using OAuth token federation: Authenticate access to Databricks using OAuth token federation | Databricks on AWS. I guess that you want to authenticate using the SP. Could y...
- 3404 Views
- 4 replies
- 3 kudos
How to access UnityCatalog's Volume inside Databricks App?
I am more familiar with DBFS, which seems to have been replaced by Unity Catalog Volumes now. When I create a Databricks App, it allows me to add a resource to pick a UC volume. How do I actually access the volume inside the app? I cannot find any example; the a...
Apps don't mount /Volumes and don't ship with dbutils, so os.listdir('/Volumes/...') or dbutils.fs.ls(...) won't work inside an App. Use the Files API or the Databricks SDK instead to read/write UC Volume files, then work on a local copy. Code using Pytho...
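Since the code in the reply is cut off, here is a minimal sketch of the SDK route (catalog, schema, and volume names are placeholders; inside an App the SDK resolves credentials automatically, provided the App's resource grants access to the volume):

```python
# Read and write UC Volume files from inside a Databricks App via the
# Files API exposed by the Databricks SDK. Paths are placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # credentials resolved from the App environment

# Download and work on a local copy
resp = w.files.download("/Volumes/main/default/my_volume/data.csv")
with open("/tmp/data.csv", "wb") as local_file:
    local_file.write(resp.contents.read())

# Upload a local file back to the volume
with open("/tmp/result.csv", "rb") as local_file:
    w.files.upload(
        "/Volumes/main/default/my_volume/result.csv",
        local_file,
        overwrite=True,
    )
```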
- 235 Views
- 1 reply
- 1 kudos
Workload identity federation policy
Dear all, can I create a single workload federation policy for all DevOps pipelines? Our set-up: we have code version-controlled in GitHub repos, and we currently use Azure DevOps pipelines to authenticate with Databricks via a service principal, and d...
Hi @noorbasha534, the docs give the following example of subject requirements for Azure DevOps: the subject (sub) claim must uniquely identify the workload. So as long as all of your pipelines reside in the same organization, same project ...
- 3627 Views
- 8 replies
- 2 kudos
Resolved! Reporting serverless costs to azure costs
So, we've just recently applied serverless budget policies to some of our vector searches and apps. At the moment they're all going to Azure under one general tag that we created. However, we needed more definition, so I added the serverless budget pol...
Check your Azure billing data, or set up explicit export pipelines. Also check whether your serverless budget policy tags are under a different namespace in Azure, as sometimes they show up nested.
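On the Databricks side, the billing system table can confirm how serverless usage is tagged before it ever reaches Azure (a sketch; requires access to system.billing.usage, and the product filter is only an example):

```python
# Inspect serverless usage rows and their tags in the billing system table.
df = spark.sql("""
    SELECT usage_date, sku_name, custom_tags, usage_quantity
    FROM system.billing.usage
    WHERE billing_origin_product = 'VECTOR_SEARCH'   -- example filter
    ORDER BY usage_date DESC
    LIMIT 100
""")
display(df)
```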
- 903 Views
- 3 replies
- 3 kudos
Resolved! AWS-Databricks' workspace attached to a NCC doesn't generate Egress Stable IPs
I am facing an issue when configuring a Databricks workspace on AWS with a Network Connectivity Configuration (NCC). Even after attaching the NCC, the workspace does not generate Egress Stable IPs as expected. In the workspace configuration tab, under ...
Hi @ricelso, sorry to hear you are still facing this issue. This behaviour isn't expected; I would suggest you raise it with your Databricks Account Executive, and they can open a support request to get it investigated further. Please let ...
- 758 Views
- 2 replies
- 0 kudos
Resolved! Using pip cache for pypi compute libraries
I am able to configure pip's behavior w.r.t. the index URL by setting PIP_INDEX_URL, PIP_TRUSTED_HOST, etc. I would like to cache compute-wide PyPI libraries to improve cluster startup performance and reliability. However, I notice that PIP_CACHE_DIR has no...
Hi Isi, we moved away from Docker images for the reasons you mention, and because they otherwise caused issues for us. We are already using Artifactory (as hinted by the environment variables mentioned in my post). I wanted to try further improving the s...
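For anyone landing here, pip variables like these are typically applied compute-wide as cluster environment variables; a sketch via the Databricks SDK (cluster settings and URLs are placeholders, and note that per this thread PIP_CACHE_DIR is currently ignored):

```python
# Set pip configuration cluster-wide via spark_env_vars. All values are
# placeholders; per the thread, PIP_CACHE_DIR currently has no effect.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.clusters.edit(
    cluster_id="0123-456789-abcdef",
    cluster_name="pip-mirror-test",
    spark_version="15.4.x-scala2.12",
    node_type_id="Standard_DS3_v2",
    num_workers=2,
    spark_env_vars={
        "PIP_INDEX_URL": "https://artifactory.example.com/api/pypi/pypi-remote/simple",
        "PIP_TRUSTED_HOST": "artifactory.example.com",
    },
)
```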
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (47)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)