- 953 Views
- 1 replies
- 1 kudos
Questions About Notebook Debugging Tools
I'm researching the different ways to debug in Databricks notebooks and have some questions. 1. Can the Python breakpoint() function be used in notebooks? This article says it can: https://www.databricks.com/blog/new-debugging-features-databric...
Hi @help_needed_445! Can you give a bit more information on your environment? Which cloud are you operating in where you are not able to use the native debugging tool? I have tested in an Azure workspace by adding a breakpoint in the gutter of a spe...
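For readers trying this out: `breakpoint()` is plain Python, so the mechanics can be sketched without any Databricks-specific API. A minimal example; the function and values here are made up for illustration, and whether the call opens the notebook debugger pane or a console pdb session depends on your runtime:

```python
def running_total(values):
    """Sum a list step by step, with a spot to pause in a debugger."""
    total = 0
    for v in values:
        # Uncomment to pause here on each iteration. In recent Databricks
        # Runtimes this should open the interactive notebook debugger; in
        # plain Python it drops into pdb. Setting the environment variable
        # PYTHONBREAKPOINT=0 disables every breakpoint() call without edits.
        # breakpoint()
        total += v
    return total

print(running_total([1, 2, 3]))  # → 6
```

This is one reason `breakpoint()` (added in Python 3.7) is preferred over `import pdb; pdb.set_trace()`: the hook is swappable and can be disabled per environment.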
- 2383 Views
- 9 replies
- 0 kudos
User Token Forwarding Between Apps?
I have a Streamlit Databricks App that is intended to be a frontend UI app. I also have a FastAPI Databricks App that is intended to be a middleware app. I want my Streamlit app to query the middleware app for all business logic and Databricks queries...
- 1375 Views
- 1 replies
- 1 kudos
Resolved! Spark executor logs path
We are running Spark workloads and have enabled cluster log delivery to push executor logs to Azure Blob. While that's running fine, I'd also like to know the local path of the executor logs so that I can make use of OneAgent from Dynatrace and send...
Local Executor Log Path on Azure Databricks

Executor logs are written locally on each executor node under the work directory. The path pattern is:

/databricks/spark/work/&lt;app-id&gt;/&lt;executor-id&gt;

For example: /databricks/spark/work/app-20221121180310-00...
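If you need to hand that path to an agent config programmatically, it is just string composition over the pattern in the answer above. A tiny helper; the app id in the example call is hypothetical, not the (truncated) one from the answer:

```python
def executor_log_dir(app_id, executor_id):
    """Local directory where Spark writes an executor's logs on a
    Databricks worker node, per the pattern
    /databricks/spark/work/<app-id>/<executor-id>.
    stdout/stderr files live inside this directory.
    """
    return f"/databricks/spark/work/{app_id}/{executor_id}"

# Hypothetical app id for illustration only:
print(executor_log_dir("app-20240101120000-0000", 0))
```

A glob such as `/databricks/spark/work/*/*/stderr` is often easier to give a log collector than a concrete app id, since the app id changes per application.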
- 1401 Views
- 1 replies
- 1 kudos
User OBO Token Forwarding between apps
Can user OAuth tokens be forwarded between Databricks Apps for on-behalf-of (OBO) authorization? I have two Databricks Apps deployed in the same workspace: 1. **UI App** (Streamlit) - configured with OAuth user authorization 2. **Middleware App** (FastA...
Hello @ctgchris Just bumping this issue for visibility so that someone from Databricks can offer a solution.
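Pending an official answer on whether OBO forwarding is supported, the receiving (middleware) side is plain HTTP header handling. A framework-agnostic sketch that accepts the token either from a standard `Authorization` header set by the calling app or from the `x-forwarded-access-token` header Databricks Apps set on user-authorized requests; the helper name is hypothetical:

```python
def extract_obo_token(headers):
    """Middleware-side sketch: pull the user's bearer token from the
    request headers. Prefers an explicit Authorization header, then
    falls back to x-forwarded-access-token. Header names are matched
    case-insensitively. Returns None if neither is present.
    """
    lowered = {k.lower(): v for k, v in headers.items()}
    auth = lowered.get("authorization", "")
    if auth.startswith("Bearer "):
        return auth[len("Bearer "):]
    return lowered.get("x-forwarded-access-token")

print(extract_obo_token({"Authorization": "Bearer tok"}))  # → tok
```

Note this only extracts the token; whether Databricks REST endpoints will honor it when replayed from a second app (token audience, scopes) is the unresolved part of the question above.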
- 5125 Views
- 4 replies
- 3 kudos
How to access UnityCatalog's Volume inside Databricks App?
I am more familiar with DBFS, which seems to be replaced by Unity Catalog Volumes now. When I created a Databricks App, it allowed me to add a resource to pick a UC volume. How do I actually access the volume inside the app? I cannot find any example; the a...
Apps don't mount /Volumes and don't ship with dbutils, so os.listdir('/Volumes/...') or dbutils.fs.ls(...) won't work inside an App. Use the Files API or the Databricks SDK instead to read/write UC Volume files, then work on a local copy. Code using Pytho...
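A stdlib-only sketch of the Files API route the answer above refers to, assuming the `GET /api/2.0/fs/files/{path}` endpoint; the host, catalog, schema, volume, and file names in the example are hypothetical, and the actual download call is defined but not executed here:

```python
import urllib.request

def volume_file_url(host, catalog, schema, volume, path):
    """Files API URL for a file stored in a UC Volume (assumed route):
    https://<host>/api/2.0/fs/files/Volumes/<catalog>/<schema>/<volume>/<path>
    """
    return f"https://{host}/api/2.0/fs/files/Volumes/{catalog}/{schema}/{volume}/{path}"

def download_volume_file(host, token, catalog, schema, volume, path):
    """Fetch the file bytes with a PAT or OAuth token. Network call,
    shown for shape only and not executed in this snippet."""
    req = urllib.request.Request(
        volume_file_url(host, catalog, schema, volume, path),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

print(volume_file_url("example.cloud.databricks.com",
                      "main", "default", "landing", "data.csv"))
```

Inside an App, the Databricks SDK (`WorkspaceClient().files`) wraps this same API and picks up the app's credentials automatically, which is usually the more convenient route.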
- 2056 Views
- 5 replies
- 1 kudos
Resolved! Databricks Apps On behalf of user authorization - General availability date?
Currently, Databricks Apps on-behalf-of user authorization is in public preview. Any idea when this will be generally available, or where I can see its release plan? https://docs.databricks.com/aws/en/release-notes/product/2025/march#databricks-apps-c...
Hi @rabbitturtles Additionally, you can subscribe to the Databricks Newsletter and join the Product Roadmap Webinars, where they announce all the latest private previews: https://www.databricks.com/resources?_sft_resource_type=newsletters
- 4150 Views
- 5 replies
- 1 kudos
Resolved! Databricks service principal token federation on Kubernetes
Hi, I am trying to create a service principal federation policy against an AKS cluster, but I am struggling to make it work without any examples. It would be great if you could share examples of how this would work for a service account. Additionally, wha...
I am currently using a two-step process: logging in using the Azure library, getting an access token from Azure using the Databricks scope, and then using that to authorize towards Databricks. I would like to use the `env-oidc` auth type instead, but...
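The two-step flow described in the reply above can be sketched with the stdlib. This builds the Microsoft Entra client-credentials token request scoped to Azure Databricks (the `2ff814a6-...` GUID is the well-known Entra application ID for Azure Databricks); the tenant/client values are hypothetical, the POST itself is not executed here, and note this does not cover the `env-oidc` federation the poster ultimately wants:

```python
import urllib.parse

# Well-known Entra (Azure AD) application ID for the Azure Databricks resource.
DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

def token_request(tenant_id, client_id, client_secret):
    """Return (url, form-encoded body) for the Entra client-credentials
    grant scoped to Azure Databricks. POSTing this yields an access token
    usable as the Databricks bearer token (step two of the flow)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{DATABRICKS_RESOURCE_ID}/.default",
    })
    return url, body

url, body = token_request("my-tenant-id", "my-app-id", "s3cret")
print(url)
```

With workload identity federation, the `client_secret` field would be replaced by a `client_assertion` carrying the Kubernetes-issued OIDC token, which is what a service principal federation policy is meant to validate.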
- 2599 Views
- 5 replies
- 4 kudos
Resolved! SQLSTATE: 42501 - Missing Privileges for User Groups
Dear all, I'm investigating missing privileges for some of our users. When connecting to an Oracle database via JDBC and attempting to display a DataFrame, we encounter the following error: User does not have permission SELECT on any file. SQLSTATE: 4250...
@nayan_wylde thank you, that is exactly what I was looking for and could not find.
- 2095 Views
- 7 replies
- 1 kudos
SAT Tool Scan other workspaces
Hello Team, I have been setting up SAT in my Databricks workspace and am able to run a scan in my workspace. I have provided my SP access to all other workspaces as well. When I run the initialize job (SAT Initializer Notebook (one-time)), I c...
It seems like access is being denied by a network policy. You have to update the Network Policy for Serverless at the account level: in Account Console → Cloud Resources → Policies → Serverless Egress Control → default-policy, check the Allow access to all destination...
- 879 Views
- 1 replies
- 0 kudos
Databricks bundle error
Hi everyone, I'm encountering an issue during deployment with Terraform on Databricks. The error I get is: Error: failed to read remote state file: stream error: stream ID 21; NO_ERROR; received from peer. On another attempt (after a manual mistake on my ...
Yes, I also find that the deploy sometimes fails, e.g. when a cluster config using policies has changed or when you want to use mixed node types in combination with policies. What I do is destroy the bundle, or just the job that gives issues, and then deploy. T...
- 865 Views
- 1 replies
- 0 kudos
Can a Databricks App (React-based) be published on the Databricks Marketplace?
Hi everyone, I've been exploring Databricks Apps and building a frontend using React within the Databricks environment. I wanted to know if it's currently possible to publish a Databricks App to the Databricks Marketplace, similar to how datasets, not...
Hi @taksheel-a-n According to the documentation (https://learn.microsoft.com/en-us/azure/databricks/marketplace/), here's a list of the types of assets that are supported. According to this post, there's no public-facing roadmap: https://community....
- 1367 Views
- 1 replies
- 0 kudos
Transition from SCIM to AIM
We're in the process of transitioning our Azure Databricks instance from SCIM-based provisioning to Automated Identity Management (AIM), now that AIM is generally available. Once enabled, AIM becomes the authoritative source for managing users, group...
@DavidRobinson Let me know how it goes. This is on my to-do list too, as we are facing a lot of issues with SCIM, like nested group sync and SPN syncs. One of the issues I can think of is that AIM respects nested groups from Entra, which SCIM didn't. So...
- 1251 Views
- 4 replies
- 1 kudos
Delta Sharing Egress Pipeline for Azure
We are currently investigating options for implementing a multi-tenancy solution where clients are separated but share data using Delta Sharing. Is there any way to track the cost of reading data in Azure? It seems like the Delta Sharing Egress Pipelin...
Thanks. I'm aware of the very neat features for analyzing cost in Databricks, but we are also interested in monitoring the cost of the underlying storage and network. It seems that this is indeed possible in AWS using S3, but not supported in Azure....
- 1500 Views
- 1 replies
- 1 kudos
Resolved! Difference between AWS Marketplace and direct with Databricks
Hi all, I wanted to check the difference between a direct purchase from Databricks and one through AWS Marketplace, and the difference in deployment between the two. I understand that AWS Marketplace will have an auto-deployment whe...
Hi @kebubs, Maybe you will find the thread below useful. According to a Databricks employee, the main difference is how billing is handled: "Direct Subscription: If you subscribe directly through Databricks, you will manage billing through the Databr...
- 2977 Views
- 1 replies
- 0 kudos
Can't post to Microsoft Teams workflow from Databricks notebook
When trying to post to a Microsoft Teams webhook from a Databricks notebook, using compute with DBR 12.2, I receive the following error: SSL error: HTTPSConnectionPool(host='prod-104.westeurope.logic.azure.com', port=443): Max retries exceeded with ur...
Hello @Kaz1 It's very likely that the issue is related to where the HTTPS request originates: whether it's coming from the Databricks control plane or your data plane (your own AWS VPC). When you run a local script or call the Teams webhook from a cl...
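To isolate the SSL failure, it helps to reproduce the call with the stdlib alone (no requests/urllib3 pooling) from the same cluster. A sketch with a minimal payload; the webhook URL and message text are hypothetical, and the POST itself is defined but not executed here:

```python
import json
import urllib.request

def teams_payload(title, text):
    """Minimal MessageCard-style body that many Teams incoming webhooks
    accept; newer Workflows-based webhooks may expect Adaptive Card JSON
    instead, so adjust to whatever your endpoint requires."""
    return {"title": title, "text": text}

def post_to_teams(webhook_url, payload):
    """POST the JSON payload (network call, not executed in this snippet).
    If this raises an SSL error from a notebook but works from a laptop,
    suspect a proxy/firewall intercepting TLS in the cluster's VNet/VPC
    rather than the webhook itself."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status

print(teams_payload("Job done", "ETL run finished"))
```

Comparing the certificate chain seen from the cluster against the one seen locally (e.g. with `ssl.get_server_certificate`) usually confirms or rules out interception.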
Labels
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (79)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)