- 480 Views
- 3 replies
- 4 kudos
Lakebase security
Hi team, we are using Databricks Enterprise and noticed that our Lakebase instances are exposed to the public internet. They can be reached through the JDBC endpoint with only basic username and password authentication. Is there a way to restrict acce...
The Postgres instance is covered by the Private Link you configure for your workspace.
- 321 Views
- 4 replies
- 2 kudos
I need a switch to turn off Data Apps in Databricks workspaces
Hi, how do I disable Data Apps in my workspace? It is really annoying that Databricks pushes new features without any option to disable them. At least you should have some tools to control access before rolling them out. It seems you only care about fe...
Hi @Louis_Frolio, thanks for your reply. I have checked the previews regularly but never found an option at workspace level to disable serverless. See the screenshot for how it looks for me. I can reach out to the account team if that's the best option.
- 160 Views
- 1 replies
- 1 kudos
User OBO Token Forwarding between apps
Can user OAuth tokens be forwarded between Databricks Apps for on-behalf-of (OBO) authorization? I have two Databricks Apps deployed in the same workspace:
1. **UI App** (Streamlit) - configured with OAuth user authorization
2. **Middleware App** (FastA...
Hello @ctgchris, just bumping this issue for visibility so that someone from Databricks can come up with a solution.
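The question above is truncated, but the pattern it describes (a Streamlit UI app calling a FastAPI middleware app with the end user's identity) can be sketched roughly as below. This is a hypothetical sketch, not a confirmed answer to the thread: it assumes user authorization is enabled on the UI app, that the per-user token arrives in the x-forwarded-access-token request header, and that the middleware URL and payload are placeholders.

```python
# Hypothetical sketch for the UI (Streamlit) side: read the per-user token that
# Databricks Apps forward in the request headers and pass it to the middleware
# app as a bearer token. Header name, URL and payload are assumptions to verify.
import requests
import streamlit as st

MIDDLEWARE_URL = "https://<middleware-app-url>/api/query"  # placeholder

# With user authorization enabled, the app receives the user's token in a header
# (check the exact header name/capitalization for your workspace).
user_token = st.context.headers.get("x-forwarded-access-token")

if user_token is None:
    st.error("No user token found; is user authorization enabled for this app?")
else:
    resp = requests.post(
        MIDDLEWARE_URL,
        headers={"Authorization": f"Bearer {user_token}"},
        json={"question": "example payload"},  # placeholder body
        timeout=30,
    )
    st.write(resp.status_code, resp.text)
```

Whether the middleware app will accept a token minted for the UI app depends on the token's audience and scopes, which is exactly the open question in the thread.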
- 183 Views
- 3 replies
- 0 kudos
Databricks Asset Bundle Deployment Fails in GitHub Actions with Federated Identity Credentials
I am using a service principal with workspace admin access to deploy Databricks asset bundles. The deployment works successfully via Jenkins using the same credentials and commands. However, when attempting the deployment through GitHub Actions, I en...
Hi @Nisha_Tech, it seems that for some reason GitHub Actions wants to authenticate using OAuth token federation: Authenticate access to Databricks using OAuth token federation | Databricks on AWS. I guess that you want to authenticate using the SP. Could y...
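One way to rule out ambiguity between OIDC/token federation and plain service-principal auth in the runner is a small check like the sketch below. This is a debugging aid under assumptions, not the thread's confirmed fix: the DATABRICKS_* environment variable names are the standard ones the CLI and SDK share, and pinning auth_type to "oauth-m2m" is the assumption being tested.

```python
# Hypothetical debugging snippet for the GitHub Actions runner: force OAuth M2M
# (service principal client ID + secret) and fail fast if that path is broken,
# instead of letting the unified auth chain fall back to other methods.
import os
from databricks.sdk.core import Config

cfg = Config(
    host=os.environ["DATABRICKS_HOST"],
    client_id=os.environ["DATABRICKS_CLIENT_ID"],          # SP application ID
    client_secret=os.environ["DATABRICKS_CLIENT_SECRET"],  # SP OAuth secret
    auth_type="oauth-m2m",  # pin the auth method so federation is not attempted
)

# authenticate() returns the resolved auth headers; an exception here means the
# SP credentials themselves are the problem rather than the bundle definition.
print("Resolved auth type:", cfg.auth_type)
print("Authorization header present:", "Authorization" in cfg.authenticate())
```

If this succeeds but `databricks bundle deploy` still fails, the difference is likely in which environment variables the GitHub workflow exports compared with the Jenkins job.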
- 3529 Views
- 1 replies
- 0 kudos
Networking Challenges with Databricks Serverless Compute (Control Plane) When Connecting to On-Prem
Hi Databricks Community, I'm working through some networking challenges when connecting Databricks clusters to various data sources and wanted to get advice or best practices from others who may have faced similar issues. Current setup: I have four type...
Thank you for posting this question. I am encountering the exact same scenarios with Databricks serverless compute while trying to connect to on-prem systems via site-to-site VPN as well as third party SaaS applications requiring IP-based access con...
- 1737 Views
- 4 replies
- 3 kudos
How to access a Unity Catalog Volume inside a Databricks App?
I am more familiar with DBFS, which seems to be replaced by Unity Catalog Volumes now. When I create a Databricks App, it allows me to add a resource to pick a UC volume. How do I actually access the volume inside the app? I cannot find any example, the a...
Apps don't mount /Volumes and don't ship with dbutils. So os.listdir('/Volumes/...') or dbutils.fs.ls(...) won't work inside an App. Use the Files API or Databricks SDK instead to read/write UC Volume files, then work on a local copy. Code using Pytho...
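The reply's code sample is truncated above; a minimal sketch along the same lines, using the Databricks SDK's Files API, is shown below. The catalog, schema, volume and file names are placeholders, and it assumes the app's identity has the relevant READ VOLUME / WRITE VOLUME privileges.

```python
# Minimal sketch: read a file from a UC Volume into a local copy, then write a
# result back. Volume and file paths are placeholders.
import io
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # inside an App this picks up the injected credentials

src = "/Volumes/main/default/my_volume/input.csv"   # placeholder
dst = "/Volumes/main/default/my_volume/output.csv"  # placeholder

# Download: the Files API returns a response whose `contents` is a binary stream.
data = w.files.download(src).contents.read()
with open("/tmp/input.csv", "wb") as f:
    f.write(data)

# ... process the local copy here ...

# Upload the (possibly modified) bytes back to the volume.
w.files.upload(dst, io.BytesIO(data), overwrite=True)

# Listing a directory in the volume also goes through the Files API,
# not os.listdir or dbutils.
for entry in w.files.list_directory_contents("/Volumes/main/default/my_volume"):
    print(entry.path)
```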
- 167 Views
- 1 replies
- 1 kudos
Workload identity federation policy
Dear all, can I create a single workload federation policy for all DevOps pipelines? Our set-up: we have code version controlled in GitHub repos, and we use Azure DevOps pipelines to authenticate with Databricks via a service principal currently and d...
Hi @noorbasha534, the docs give the following example of subject requirements for Azure DevOps. The subject (sub) claim must uniquely identify the workload, so as long as all of your pipelines reside in the same organization, same project ...
- 3442 Views
- 8 replies
- 2 kudos
Resolved! Reporting serverless costs to Azure costs
So, we've just recently applied serverless budget policies to some of our vector searches and apps. At the moment they're all going to Azure under one general tag that we created. However, we needed more definition, so I added the serverless budget pol...
Use the billing system tables or set up explicit export pipelines. Also check whether your serverless budget policy tags sit under a different namespace in Azure, as they sometimes show up nested.
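One way to verify where the budget policy tags actually land, before reconciling with the Azure cost export, is to inspect the billing system tables. The sketch below is meant to be run in a notebook and assumes access to system.billing.usage; the billing_origin_product filter values are assumptions to adjust for your own workloads.

```python
# Rough sketch: group recent serverless usage by the custom tags it carries,
# so you can see whether the budget-policy tag is present and under which key.
# Column names follow the documented system.billing.usage schema; the product
# filter values below are assumptions.
usage_by_tag = spark.sql("""
    SELECT
        usage_date,
        billing_origin_product,
        custom_tags,
        SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 30)
      AND billing_origin_product IN ('VECTOR_SEARCH', 'APPS')
    GROUP BY usage_date, billing_origin_product, custom_tags
    ORDER BY usage_date DESC
""")
display(usage_by_tag)
```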
- 690 Views
- 3 replies
- 3 kudos
Resolved! AWS Databricks workspace attached to an NCC doesn't generate Egress Stable IPs
I am facing an issue when configuring a Databricks workspace on AWS with a Network Connectivity Configuration (NCC). Even after attaching the NCC, the workspace does not generate Egress Stable IPs as expected. In the workspace configuration tab, under ...
Hi @ricelso, sorry to hear you are still facing this issue. This behaviour isn't expected - I would suggest you kindly raise this with your Databricks Account Executive, and they can raise a support request to get it investigated further. Please let ...
- 641 Views
- 2 replies
- 0 kudos
Resolved! Using pip cache for pypi compute libraries
I am able to configure pip's behavior w.r.t. the index URL by setting PIP_INDEX_URL, PIP_TRUSTED_HOST, etc. I would like to cache compute-wide PyPI libraries to improve cluster startup performance and reliability. However, I notice that PIP_CACHE_DIR has no...
Hi Isi, we moved away from Docker images for the reasons you mention, and because they otherwise had issues for us. We are already using Artifactory (as hinted by the environment variables mentioned in my post). I wanted to try further improving the s...
- 1828 Views
- 7 replies
- 2 kudos
Unable to see Manage Account option in the Databricks Workspace
Hi, I have an organizational account that is the owner of the Databricks workspace (Premium) and is also the global administrator. Still, I don't see the "Account Console" option in Databricks after clicking the "Manage account" option. I have tried to cl...
Hi @szymon_dybczak, yes, I have a Databricks Premium account and admin rights as well.
- 459 Views
- 3 replies
- 0 kudos
databricks terraform provider, databricks_credential resource, service
I cannot make the databricks_credential resource create a service credential. It works fine with storage credentials. However, when I put `purpose = "SERVICE"` plus aws_iam_role and comment, the apply phase fails with `Error: cannot create cred...
I have the same error message now when trying to create a USE_SCHEMA grant for a service principal as in https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/grant#schema-grants . I create a new service principal and th...
- 621 Views
- 3 replies
- 3 kudos
Resolved! Missing configured "sql" scope in Databricks Apps User Token
I have user authorization for apps enabled in my workspace. I have added the sql scope to my app. However, when making SQL queries from my app, authorization errors ensue: Error during request to server: : Provided OAuth token does not have required scop...
@Advika thanks. It looks like this was only a temporary issue; I had already restarted the app, but today it is working. I will mark your answer as accepted. The problem may have been due to recreating the app (using bundles), which reset the user sc...
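For reference, the shape of the on-behalf-of query that needs the sql scope looks roughly like the sketch below. The hostname, warehouse HTTP path and the way the forwarded user token is obtained are all placeholders/assumptions, not details confirmed by the thread.

```python
# Hypothetical sketch: run a warehouse query with the user's forwarded token.
# If the token lacks the "sql" scope, this is where an error like the one in
# the thread ("Provided OAuth token does not have required scope") surfaces.
from databricks import sql

user_token = "<value of the X-Forwarded-Access-Token request header>"  # placeholder

with sql.connect(
    server_hostname="<workspace-host>.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/<warehouse-id>",           # placeholder
    access_token=user_token,
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT current_user()")
        print(cur.fetchone())
```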
- 552 Views
- 1 replies
- 0 kudos
Databricks Publish to PowerBI feature - Security aspect
Can someone please explain what access Databricks requires to publish UC tables to the Power BI service. In the above snapshot I see it says read all workspaces - so are these Power BI workspaces or all Databricks workspaces? If I enable this request, will the publish...
@bharatn At the bottom of your picture it says "Show Details"; perhaps clicking on that will provide some of the granularity you're looking for. If it's Databricks requesting access from Microsoft, it will be Databricks being able to see the Power BI workspaces. I think the bottom...
- 320 Views
- 1 replies
- 1 kudos
Clarification on Data Privacy with ai_query Models
Hi everyone, we've had a client ask about the use of the Claude 3.7 Sonnet model (and others) in the Databricks SQL editor via the ai_query function. Specifically, they want to confirm whether any data passed to these models is ringfenced, i.e., not ...
Hi @boitumelodikoko, the documentation you've provided is official confirmation from Databricks (otherwise they wouldn't put it in public documentation in the first place). Every customer that uses AI functions within Databricks should expect that any ...
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (43)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)