- 866 Views
- 2 replies
- 0 kudos
Webhooks for Microsoft Teams are not being received from Databricks jobs
We have configured a couple of webhooks in Teams channels and added the URLs to Databricks under Settings > Notifications, but our jobs do not post anything into the Teams channels. This used to work, but now nothing happens.
- 0 kudos
Hello marcusfox, I am encountering the same issue you did. If you have already solved it, could you please share the solution here? Thank you in advance for any guidance.
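One quick check that leaves Databricks out of the picture entirely is to POST a test message straight to the webhook URL; if that fails, the webhook itself (not the Databricks notification settings) is the problem. A minimal sketch using only the Python standard library; the empty URL is a placeholder you would replace with your own Teams incoming-webhook URL:

```python
import json
import urllib.request

def build_teams_payload(message: str) -> bytes:
    """Build the minimal JSON body a Teams incoming webhook accepts."""
    return json.dumps({"text": message}).encode("utf-8")

def post_to_webhook(webhook_url: str, message: str) -> int:
    """POST the message to the webhook and return the HTTP status code."""
    req = urllib.request.Request(
        webhook_url,
        data=build_teams_payload(message),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    url = ""  # paste your Teams webhook URL here before running
    if url:
        print(post_to_webhook(url, "Test message from Databricks"))
```

If the direct POST succeeds but job notifications still don't arrive, the issue is more likely on the Databricks side (for example, the notification destination configuration or the job's notification settings).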
- 2925 Views
- 6 replies
- 1 kudos
Resolved! Compute cluster in Azure workspace is unable to access Unity Catalog volume on storage account
Hi, I'm setting up a workspace in Azure with VNet injection. I'm able to upload files to a Unity Catalog managed storage account volume through the web UI, and access them from notebooks using serverless compute, for example, `dbutils.fs.list("/Volume...
- 1 kudos
Could the classic cluster still be using the old ABFS driver instead of the managed identity?
- 1053 Views
- 3 replies
- 3 kudos
Resolved! Missing configured "sql" scope in Databricks Apps User Token
I have User authorization for apps enabled in my workspace. I have added the sql scope to my app. However, when my app makes SQL queries, authorization errors ensue: Error during request to server: Provided OAuth token does not have required scop...
- 3 kudos
@Advika thanks. It looks like this was only a temporary issue; I had already restarted the app, but today it is working. I will mark your answer as accepted. The problem may have been due to recreating the app (using bundles), which reset the user sc...
- 889 Views
- 1 replies
- 0 kudos
Databricks Publish to PowerBI feature - Security aspect
Can someone please explain what access Databricks requires to publish UC tables to the Power BI service? In the above snapshot it says "read all workspace", so are these Power BI workspaces or all Databricks workspaces? If I enable this request, will the publish...
- 0 kudos
@bharatn, at the bottom of your picture it says "Show Details"; perhaps clicking on that will provide some of the granularity you're looking for. If it's Databricks making the request to Microsoft, it'll be Databricks being able to see the Power BI workspaces. I think the bottom...
- 848 Views
- 2 replies
- 3 kudos
Resolved! Databricks Free Edition - compute does not start
Hello, I am using the Databricks Free Edition, but for the past couple of days I have been facing a problem where I am unable to start serverless compute. It takes forever to initiate. When I check the log, I see the message that the compute <...
- 3 kudos
Hello @JrVerbiest! Thanks for sharing this, @BS_THE_ANALYST. This is a known issue with the Free Edition, and the team is already aware and working on a fix. It should be resolved soon. Thanks for your patience in the meantime!
- 2037 Views
- 6 replies
- 5 kudos
Resolved! Databricks CLI is encoding the secrets in Base64 automatically
Hello everyone, I am using my Mac terminal to save a Databricks primary key using a specific scope and secret. Everything runs smoothly, except when I get to the step where I generate a secret. The problem is that my primary key is, for example, "test12...
- 5 kudos
Hi @spearitchmeta, yes, you're right. To read the value of a secret using the Databricks CLI, you must decode the Base64-encoded value. You can use jq to extract the value and base64 --decode to decode it: databricks secrets get-secret <scope-n...
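Putting the reply's steps together, the pipeline might look like the commented lines below; the scope and key names are placeholders, and the hard-coded string at the end just demonstrates the decoding step in isolation:

```shell
# With the Databricks CLI (requires auth), the full pipeline would be:
#   databricks secrets get-secret <scope-name> <key-name> | jq -r '.value' | base64 --decode
# get-secret returns JSON whose "value" field is the Base64 encoding of the stored secret.

# Decoding step on its own: "dGVzdDEyMw==" is the Base64 encoding of "test123".
printf 'dGVzdDEyMw==' | base64 --decode
```

Note that the secret itself is not stored in Base64; the API simply returns it Base64-encoded, so decoding on read recovers the original value exactly.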
- 1730 Views
- 3 replies
- 0 kudos
How to run n processes in parallel in Databricks
Requirement: I have a volume into which random txt files containing random numbers arrive from MQ. In my workspace I have a Python script. I have also created a job that triggers automatically when a new file arrives in the volume. My requirement is that I need...
- 0 kudos
Hello @jitenjha11: You can do it the same way @MujtabaNoori highlighted, but you have to call the process twice. Sharing sample reference code below, iterating through the files in each directory: for directory in direct...
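As a sketch of the parallel-processing idea (the volume path and the per-file work are placeholder assumptions, not from the thread), Python's `concurrent.futures` can fan the files out across a thread pool:

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def process_file(path: Path) -> str:
    """Placeholder per-file work: count the whitespace-separated numbers in the file."""
    numbers = path.read_text().split()
    return f"{path.name}: {len(numbers)} values"

def process_in_parallel(paths, max_workers=4):
    """Run process_file over all paths concurrently and collect the results in order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(process_file, paths))

if __name__ == "__main__":
    # On Databricks, a UC volume is visible as a normal path, e.g.
    # /Volumes/<catalog>/<schema>/<volume> (names here are hypothetical).
    root = Path("/Volumes/my_catalog/my_schema/my_volume")
    if root.exists():
        for line in process_in_parallel(sorted(root.glob("*.txt"))):
            print(line)
```

Threads work well when the per-file work is I/O-bound; for CPU-bound work, a `ProcessPoolExecutor` or distributing the file list over Spark would be the usual alternatives.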
- 797 Views
- 3 replies
- 2 kudos
Provision users and groups from an Identity Provider (IdP)
In our organization, SCIM is not supported for user and group provisioning. I’d like to know what other options are available to provision users and groups from an Identity Provider (IdP) into Databricks. Are there alternative methods (e.g., JIT provi...
- 2 kudos
Thank you for sharing this information. I would like to inform you that our environment is Databricks on AWS, and our IdP is Ping Federate. Could you please advise if there are equivalent solutions or recommended best practices for this setup?
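If SCIM provisioning from the IdP itself isn't available, one alternative is to script against the Databricks SCIM REST API from your own automation (for example, triggered by events from Ping Federate). A hedged sketch; the endpoint path in the docstring, the entitlement value, and the user fields shown are illustrative assumptions:

```python
import json

# Standard SCIM 2.0 user schema URN (RFC 7643).
USER_SCHEMA = "urn:ietf:params:scim:schemas:core:2.0:User"

def build_scim_user(user_name: str, display_name: str, entitlements=()) -> dict:
    """Build a minimal SCIM user payload, suitable for POSTing to a workspace
    SCIM users endpoint such as /api/2.0/preview/scim/v2/Users."""
    body = {
        "schemas": [USER_SCHEMA],
        "userName": user_name,
        "displayName": display_name,
    }
    if entitlements:
        body["entitlements"] = [{"value": e} for e in entitlements]
    return body

if __name__ == "__main__":
    payload = build_scim_user("jane@example.com", "Jane Doe",
                              entitlements=("workspace-access",))
    print(json.dumps(payload, indent=2))
```

The same payload shape can be driven from a CSV export or an IdP webhook, which is roughly what JIT-style provisioning scripts end up doing when native SCIM sync isn't an option.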
- 1992 Views
- 5 replies
- 4 kudos
Resolved! Setting up observability for serverless Databricks
I’m looking for best practices and guidance on setting up observability for serverless Databricks. Specifically, I’d like to know: how to capture and monitor system-level metrics (CPU, memory, network, disk) in a serverless setup; how to configure and ...
- 4 kudos
Ok, got it. I will start exploring it and get back to you.
- 4098 Views
- 1 replies
- 1 kudos
Help with cookiecutter Prompts in Databricks Notebooks
Hi everyone! I’m working on using cookiecutter to help us set up consistent project templates on Databricks. So far it’s mostly working, but I’m struggling with the prompts – they’re not displaying well in the Databricks environment. I’ve tried some ...
- 1 kudos
Hi! Great initiative using cookiecutter to standardize project templates on Databricks; I’ve been exploring similar workflows. I ran into the same issue with prompts not displaying well in the notebook environment. One workaround I found effective wa...
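The reply's workaround is cut off above, so the following is only one commonly used approach and not necessarily the same one: cookiecutter's Python API accepts `no_input=True` together with an `extra_context` dict, which skips the interactive prompts entirely; values can be gathered with `dbutils.widgets` (or any other mechanism) instead. The template URL and widget names below are hypothetical:

```python
def collect_context(widget_values: dict, defaults: dict) -> dict:
    """Merge values gathered from notebook widgets over the template defaults,
    dropping blanks so the template's own defaults still apply."""
    return {**defaults, **{k: v for k, v in widget_values.items() if v}}

def scaffold(template_url: str, context: dict) -> None:
    """Render the template without any interactive prompts."""
    # Lazy import: cookiecutter is a third-party package (pip install cookiecutter).
    from cookiecutter.main import cookiecutter
    # no_input=True suppresses the prompts that render poorly in notebooks.
    cookiecutter(template_url, no_input=True, extra_context=context)

# Example wiring (on Databricks the widget values would come from dbutils.widgets.get):
context = collect_context(
    widget_values={"project_name": "demo_pipeline", "author": ""},
    defaults={"project_name": "my_project", "author": "data-team"},
)
# scaffold("https://github.com/example/your-template.git", context)
```

This keeps the template's `cookiecutter.json` as the single source of defaults while moving the interactive part into notebook widgets, which display reliably.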
- 1197 Views
- 5 replies
- 3 kudos
How to embed a databricks dashboard in an iframe when running on localhost?
Hello! I'm working on embedding one of our Databricks dashboards into an iframe, but even after adding localhost to the Approved domains list, I'm getting the message that embedding for this domain isn't possible (see below). We need to be able to devel...
- 3 kudos
So for tunnelling, something like what is described here: https://dev.to/tahsin000/free-services-to-expose-localhost-to-https-a-comparison-5c19
- 829 Views
- 1 replies
- 1 kudos
Clarification on Data Privacy with ai_query Models
Hi everyone, we've had a client ask about the use of the Claude 3.7 Sonnet model (and others) in the Databricks SQL editor via the ai_query function. Specifically, they want to confirm whether any data passed to these models is ringfenced, i.e., not ...
- 1 kudos
Hi @boitumelodikoko, the documentation you've provided is official confirmation from Databricks (otherwise they wouldn't put it in public documentation in the first place). Every customer that uses AI functions within Databricks should expect that any ...
- 2290 Views
- 4 replies
- 1 kudos
Connecting to Databricks from Workato. JDBCDriver 500593 Communication Link failure
I'm trying to connect to Databricks from Workato, to pull data in as part of a Workato Recipe. I'm getting the following error when I test the connection: "Database bridge error: Failed since could not connect to the database - Failed to initialize p...
- 1 kudos
Hi WiliamRosa, I am in fact trying to connect via the recommended method of using the built-in Databricks connector. When I provide all the relevant details for the configuration, I get the error mentioned in the original post. I don't understand why i ...
- 1145 Views
- 2 replies
- 2 kudos
Resolved! Network Connectivity Configurations : "Default rules" tab not visible
Hi, I'm trying to enable static outbound IPs for serverless compute in order to whitelist them on an external SaaS. I created an NCC in the same AWS region as my workspace (customer-managed VPC). According to the documentation (Serverless Firewall), I expe...
- 2 kudos
Hello @toko_chi! Yes, as noted in the documentation, this feature is currently in Public Preview and requires enablement by your Databricks account team. Since you're not seeing the Default rules tab and your API response shows an empty egress_config,...
- 473 Views
- 1 replies
- 0 kudos
Unity Catalog Backup
Dear all, has anyone explored backing up Unity Catalog? Because of gaps in the security model of Databricks, we had to make certain user groups the schema owners, and schema owners can drop the schema. I always wonder how to recover the schem...
- 0 kudos
Hi @noorbasha534, currently there's no native backup/restore. The pragmatic approach is to treat Unity Catalog metadata as code (Terraform / SQL DDL checked into Git), regularly export UC object definitions with the REST API, and lock down schema ownership s...
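A rough sketch of the "export UC object definitions with the REST API" part, using only the standard library; it assumes a workspace URL and PAT in environment variables, and the output file naming is my own invention (the endpoint paths are the documented Unity Catalog list APIs):

```python
import json
import os
import urllib.parse
import urllib.request

def uc_endpoint(host: str, resource: str, **params) -> str:
    """Build a Unity Catalog REST API URL, e.g. for 'catalogs' or 'schemas'."""
    url = f"{host}/api/2.1/unity-catalog/{resource}"
    if params:
        url += "?" + urllib.parse.urlencode(params)
    return url

def fetch(url: str, token: str) -> dict:
    """GET a UC API endpoint with bearer-token auth and parse the JSON body."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__" and "DATABRICKS_HOST" in os.environ:
    host = os.environ["DATABRICKS_HOST"]   # e.g. https://<workspace>.cloud.databricks.com
    token = os.environ["DATABRICKS_TOKEN"]
    for catalog in fetch(uc_endpoint(host, "catalogs"), token).get("catalogs", []):
        name = catalog["name"]
        schemas = fetch(uc_endpoint(host, "schemas", catalog_name=name), token)
        # Persist each catalog's schema definitions as a JSON snapshot.
        with open(f"uc_backup_{name}.json", "w") as f:
            json.dump(schemas, f, indent=2)
```

Snapshots like these won't restore data, but they give you the DDL-level information (comments, owners, properties) needed to recreate a dropped schema's structure and then restore tables from Delta history or storage-level backups.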
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (64)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)