- 1077 Views
- 3 replies
- 2 kudos
Provision users and groups from an Identity Provider (IdP)
In our organization, SCIM is not supported for user and group provisioning. I’d like to know what other options are available to provision users and groups from an Identity Provider (IdP) into Databricks. Are there alternative methods (e.g., JIT provi...
Thank you for sharing this information. Our environment is Databricks on AWS, and our IdP is PingFederate. Could you please advise whether there are equivalent solutions or recommended best practices for this setup?
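When the IdP cannot push SCIM, a common fallback is to drive the Databricks SCIM-compatible Users/Groups endpoints yourself from a script fed by an IdP export. A minimal sketch, assuming a workspace-level SCIM endpoint; the helper names and the example user are illustrative, and the payload shape should be checked against the current Databricks SCIM API docs:

```python
# Sketch: provision users without IdP-driven SCIM by calling the
# Databricks SCIM-compatible Users API yourself. The endpoint path and
# payload follow the SCIM 2.0 convention; verify both against the
# current API reference before relying on this.

def scim_user_payload(user_name, display_name, groups=None):
    """Build a SCIM 2.0 user payload (hypothetical helper)."""
    payload = {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,
        "displayName": display_name,
    }
    if groups:
        payload["groups"] = [{"display": g} for g in groups]
    return payload

def scim_users_url(workspace_host):
    """Workspace-level SCIM Users endpoint."""
    return f"https://{workspace_host}/api/2.0/preview/scim/v2/Users"

# Example: one user exported from the IdP (names are placeholders)
payload = scim_user_payload("jane@example.com", "Jane Doe",
                            groups=["data-engineers"])
url = scim_users_url("dbc-example.cloud.databricks.com")
```

POSTing that payload to the URL with a PAT would create the user; the same pattern applies to the Groups endpoint.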
- 3228 Views
- 5 replies
- 4 kudos
Resolved! Setting up observability for serverless Databricks
I’m looking for best practices and guidance on setting up observability for serverless Databricks. Specifically, I’d like to know: how to capture and monitor system-level metrics (CPU, memory, network, disk) in a serverless setup; how to configure and ...
OK, got it. I will start exploring it and get back to you.
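Since serverless compute does not expose the nodes, you generally cannot scrape CPU/memory/disk directly; observability is usually built on Databricks system tables instead. A minimal sketch, assuming a Unity Catalog-enabled account where `system.billing.usage` is available (verify the table and column names in your account); the helper is illustrative:

```python
# Sketch: serverless observability via system tables rather than
# node-level metrics. system.billing.usage is the documented billing
# system table; check its schema in your account before using this.

def usage_by_sku_query(days=7):
    """Build a SQL query summarizing DBU usage per SKU per day."""
    return (
        "SELECT sku_name, usage_date, SUM(usage_quantity) AS dbus\n"
        "FROM system.billing.usage\n"
        f"WHERE usage_date >= current_date() - INTERVAL {days} DAYS\n"
        "GROUP BY sku_name, usage_date\n"
        "ORDER BY usage_date"
    )

query = usage_by_sku_query(days=30)
# Run this on a SQL warehouse and feed the result into a dashboard or alert.
```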
- 4330 Views
- 1 reply
- 1 kudos
Help with cookiecutter Prompts in Databricks Notebooks
Hi everyone! I’m working on using cookiecutter to help us set up consistent project templates on Databricks. So far, it’s mostly working, but I’m struggling with the prompts – they’re not displaying well in the Databricks environment. I’ve tried some ...
Hi! Great initiative using cookiecutter to standardize project templates on Databricks — I’ve been exploring similar workflows. I ran into the same issue with prompts not displaying well in the notebook environment. One workaround I found effective wa...
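One common way around interactive prompts in a notebook is to collect answers up front (for example via notebook widgets) and hand them to cookiecutter non-interactively. A minimal sketch: `merge_context` is a hypothetical helper, while cookiecutter's `no_input` and `extra_context` parameters are real library features:

```python
# Sketch: avoid cookiecutter's interactive prompts in a notebook by
# collecting answers first and passing them as extra_context with
# no_input=True. merge_context is a hypothetical helper.

def merge_context(defaults, overrides):
    """Merge template defaults with user-supplied answers,
    ignoring blank overrides."""
    context = dict(defaults)
    context.update({k: v for k, v in overrides.items() if v})
    return context

defaults = {"project_name": "my_project", "cloud": "aws"}
answers = {"project_name": "churn_model", "cloud": ""}  # e.g. from widgets
context = merge_context(defaults, answers)

# In a notebook you would then run (requires the cookiecutter package;
# template URL is a placeholder):
# from cookiecutter.main import cookiecutter
# cookiecutter("https://github.com/org/template.git",
#              no_input=True, extra_context=context)
```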
- 1626 Views
- 5 replies
- 3 kudos
How to embed a Databricks dashboard in an iframe when running on localhost?
Hello! I'm working on embedding one of our Databricks dashboards into an iframe, but even after adding localhost to the Approved domains list, I'm getting the message that embedding for this domain isn't possible (see below). We need to be able to devel...
So for tunnelling, something like what is described here: https://dev.to/tahsin000/free-services-to-expose-localhost-to-https-a-comparison-5c19
- 1425 Views
- 1 reply
- 1 kudos
Clarification on Data Privacy with ai_query Models
Hi everyone, We've had a client ask about the use of the Claude 3.7 Sonnet model (and others) in the Databricks SQL editor via the ai_query function. Specifically, they want to confirm whether any data passed to these models is ringfenced — i.e., not ...
Hi @boitumelodikoko, The documentation you've provided is official confirmation from Databricks (otherwise they wouldn't put it in public documentation in the first place). Every customer that uses AI functions within Databricks should expect that any ...
- 2730 Views
- 4 replies
- 1 kudos
Connecting to Databricks from Workato. JDBCDriver 500593 Communication Link failure
I'm trying to connect to Databricks from Workato, to pull data in as part of a Workato Recipe. I'm getting the following error when I test the connection:"Database bridge error: Failed since could not connect to the database - Failed to initialize p...
Hi WiliamRosa, I am in fact trying to connect via the recommended method of using the built-in Databricks connector. When I provide all the relevant details for the configuration, I get the error mentioned in the original post. I don't understand why i ...
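A communication-link failure from a middleware tool often comes down to a malformed JDBC URL or blocked outbound traffic on port 443, so it can help to assemble and test the URL explicitly. A minimal sketch, assuming PAT authentication; the URL shape follows the Databricks JDBC driver convention, but check the docs for your exact driver version, and the host, HTTP path, and token below are placeholders:

```python
# Sketch: build a Databricks JDBC URL with personal-access-token auth.
# Verify the parameter names against your JDBC driver's documentation.

def databricks_jdbc_url(host, http_path, token):
    """Assemble a Databricks JDBC URL (PAT auth, AuthMech=3)."""
    return (
        f"jdbc:databricks://{host}:443/default;"
        "transportMode=http;ssl=1;AuthMech=3;"
        f"httpPath={http_path};UID=token;PWD={token}"
    )

url = databricks_jdbc_url(
    "dbc-example.cloud.databricks.com",   # placeholder workspace host
    "/sql/1.0/warehouses/abc123",         # placeholder warehouse HTTP path
    "dapiXXXX",                           # placeholder token
)
```

If a plain JDBC client connects with this URL but Workato still fails, the problem is more likely network egress from the Workato side than the credentials.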
- 1488 Views
- 2 replies
- 2 kudos
Resolved! Network Connectivity Configurations : "Default rules" tab not visible
Hi, I'm trying to enable static outbound IPs for Serverless Compute in order to whitelist them on external SaaS. I created an NCC in the same AWS region as my workspace (customer-managed VPC). According to the documentation (Serverless Firewall), I expe...
Hello @toko_chi! Yes, as noted in the documentation, this feature is currently in Public Preview and requires enablement by your Databricks account team. Since you’re not seeing the Default rules tab and your API response shows an empty egress_config,...
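Checking the NCC state over the account-level REST API is a quick way to confirm whether the feature is enabled. A minimal sketch: the endpoint path follows the documented network-connectivity-configs API, and the account and NCC IDs are placeholders:

```python
# Sketch: inspect one network connectivity config via the account API.
# GET this URL with an account-admin token; an empty egress_config in
# the response suggests the static-egress feature still needs
# enablement by the account team.

def ncc_url(account_id, ncc_id):
    """Account-level endpoint for a single NCC."""
    return (
        "https://accounts.cloud.databricks.com"
        f"/api/2.0/accounts/{account_id}"
        f"/network-connectivity-configs/{ncc_id}"
    )

url = ncc_url("11111111-2222-3333-4444-555555555555", "ncc-abc")
```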
- 678 Views
- 1 reply
- 0 kudos
Unity Catalog Backup
Dear all, Has anyone explored backing up Unity Catalog? Because of gaps in the security model of Databricks, we had to make certain user groups the schema owners, and the schema owners can drop the schema. I always wonder how to recover the schem...
Hi @noorbasha534, Currently, there’s no native backup/restore. The pragmatic approach is to treat Unity Catalog metadata as code (Terraform / SQL DDL checked into Git), regularly export UC object definitions with the REST API, and lock down schema ownership s...
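The "metadata as code" idea amounts to periodically serializing the re-creatable parts of each UC object so ownership and grants can be re-applied after an accidental drop. A minimal sketch: `snapshot_schema` is a hypothetical helper, the object values are illustrative, and the actual HTTP calls to the `/api/2.1/unity-catalog/...` endpoints are omitted:

```python
# Sketch: capture a schema definition as JSON so it can be versioned in
# Git and replayed after an accidental DROP SCHEMA. The fields shown are
# illustrative; extend with whatever your DDL replay needs.
import json

def snapshot_schema(catalog, schema, owner, grants):
    """Capture the re-creatable parts of a schema definition."""
    return {
        "full_name": f"{catalog}.{schema}",
        "owner": owner,
        "grants": grants,  # e.g. [{"principal": ..., "privileges": [...]}]
    }

snap = snapshot_schema(
    "main", "sales", "data-platform-admins",
    [{"principal": "analysts", "privileges": ["USE_SCHEMA", "SELECT"]}],
)
backup = json.dumps(snap, indent=2)  # write to versioned storage / Git
```

Combined with restricting who holds schema ownership, a nightly export like this gives you a recovery path even without native backup/restore.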
- 2189 Views
- 4 replies
- 2 kudos
Resolved! VNet Injected Workspace trouble connecting to the Storage Account of a catalog (UC),
Hi! I have some issues with setting up my workspace and storage account within a virtual network and letting them connect to each other. Background setup (all done in Terraform): - Databricks workspace with VNet Injection and Unity Catalog enabled. (3...
It was correctly linked. Turned out we were missing one extra private endpoint of type 'dfs'. So for our storage account we needed to create 2 private endpoints, both configured on the same subnet, with one of subresource type 'blob' and one subresource...
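The fix above boils down to one private endpoint per storage subresource: ADLS Gen2 storage behind Unity Catalog needs both 'blob' and 'dfs'. A minimal sketch that makes the requirement explicit by generating both configs from a list; the field names are illustrative, not a Terraform schema:

```python
# Sketch: generate one private-endpoint config per required storage
# subresource. ADLS Gen2 needs both 'blob' and 'dfs'; missing the 'dfs'
# endpoint produces exactly the connectivity failure described above.

def private_endpoint_configs(storage_account, subnet_id):
    """One endpoint each for the 'blob' and 'dfs' subresources."""
    return [
        {
            "name": f"pe-{storage_account}-{sub}",
            "subnet_id": subnet_id,
            "subresource": sub,
        }
        for sub in ("blob", "dfs")
    ]

configs = private_endpoint_configs("ucstorage", "subnet-123")  # placeholders
```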
- 1256 Views
- 3 replies
- 1 kudos
Databricks in AWS K8 cluster
Hi. I have a question. I have recently started using Databricks. I need Databricks to be deployed on an AWS Kubernetes cluster. Where can I find the sources?
- 741 Views
- 3 replies
- 3 kudos
Databricks Workspace APIs
Hello, I'm trying to collect metadata about the files stored in a Workspace Repo Folder, but it looks like the "size" field is always missing. The field is listed in a few SDKs (for example, in the Python documentation: https://databricks-sdk-py.read...
@szymon_dybczak, Help me understand if this makes sense: what if I use the Files API instead of the Workspace API to get the size?
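For comparison, here are the two endpoints in question side by side: the Workspace API returns object metadata (where the "size" field may be absent), while the Files API serves file content and metadata directly. A minimal sketch with placeholder host and paths; whether the Files API covers workspace/Repos paths, as opposed to UC volume paths, is worth verifying against the current docs:

```python
# Sketch: two ways to ask about a file. The Workspace API describes a
# workspace object; the Files API (HEAD request, Content-Length header)
# can report a file's size. Paths below are placeholders.

def workspace_status_url(host, path):
    """Workspace API: object metadata for a path."""
    return f"https://{host}/api/2.0/workspace/get-status?path={path}"

def files_api_url(host, path):
    """Files API: HEAD this URL and read Content-Length for the size."""
    return f"https://{host}/api/2.0/fs/files{path}"

host = "dbc-example.cloud.databricks.com"
meta_url = workspace_status_url(host, "/Repos/team/project/data.csv")
size_url = files_api_url(host, "/Volumes/main/default/vol/data.csv")
```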
- 1193 Views
- 2 replies
- 1 kudos
Verbose logging
Hello! I want to audit access to some Delta tables in Unity Catalog from jobs and notebooks. We enabled audit verbose logging, but not all queries that were run from notebooks are written to the system.access.audit, column_lineage, and table_lineage tables.
They are queries against a registered table name.
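When checking coverage, it helps to filter the audit system table down to events for the one table in question. A minimal sketch: the table and column names follow the documented audit-log schema, but verify them against your system tables, and the helper and example table name are illustrative:

```python
# Sketch: narrow system.access.audit to recent events that reference a
# specific registered table. Column names should be checked against the
# documented audit schema in your account.

def table_audit_query(full_table_name, days=7):
    """SQL over system.access.audit for one registered table."""
    return (
        "SELECT event_time, user_identity.email, action_name\n"
        "FROM system.access.audit\n"
        f"WHERE event_time >= current_date() - INTERVAL {days} DAYS\n"
        f"  AND request_params.full_name_arg = '{full_table_name}'"
    )

query = table_audit_query("main.sales.orders", days=14)  # placeholder table
```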
- 1918 Views
- 6 replies
- 2 kudos
Getting 403 Forbidden Error When Joining Tables Across Two Unity Catalogs in the Same Workspace
Hi everyone, I’m facing an unusual issue in my Databricks environment and would appreciate your guidance. I’m using a consumer workspace with access to two different Unity Catalogs. I can successfully query tables from both catalogs individually wit...
Hi @vishnuvardhan, I appreciate the information isn't directly going to answer your question, but I wanted to share what I've found. Firstly, I've not heard of a consumer Databricks workspace prior to your post. I'm not sure if you've looked into the ...
- 1718 Views
- 2 replies
- 0 kudos
Unity Catalog federation with Snowflake-managed Iceberg table fails with Universal Format conversion
Hello All, Error:[DELTA_UNIVERSAL_FORMAT_CONVERSION_FAILED] Failed to convert the table version 3 to the universal format iceberg. Clone validation failed - Size and number of data files in target table should match with source table. srcTableSize: 7...
@fhameed The error occurs if the Iceberg metadata written by Snowflake does not match the number of files in object storage. When attempting to read the table in Databricks, there is a verification process that checks to see if the Iceberg metadata ...
- 1214 Views
- 3 replies
- 6 kudos
Resolved! Add a tag to a catalog with REST API
Good day, I have an Azure DevOps pipeline with which I provision our Databricks workspaces. With this pipeline I also create the catalogs through the Databricks API (/api/2.1/unity-catalog/catalogs). Create a catalog | Catalogs API | REST API reference ...
Hi @Marco37, Unfortunately, I believe this is not currently possible through the REST API. The only options you have are via the UI or using SQL. Unfortunately, it's also not possible to do this using Terraform. There's a PR that has been hanging for some time, bu...
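Since the create-catalog REST payload does not accept tags, a pipeline can fall back to issuing SQL right after creation. `ALTER CATALOG ... SET TAGS` is documented Unity Catalog SQL; in the minimal sketch below the statement-building helper, catalog name, and tag values are illustrative, and running the statement (for example via a SQL warehouse and the statement-execution API) is left out:

```python
# Sketch: tag a catalog from a provisioning pipeline by generating an
# ALTER CATALOG ... SET TAGS statement and executing it on a SQL
# warehouse after the REST create-catalog call succeeds.

def set_catalog_tags_sql(catalog, tags):
    """Build an ALTER CATALOG ... SET TAGS statement."""
    pairs = ", ".join(f"'{k}' = '{v}'" for k, v in sorted(tags.items()))
    return f"ALTER CATALOG {catalog} SET TAGS ({pairs})"

stmt = set_catalog_tags_sql("finance_prod", {"env": "prod", "owner": "finance"})
```

Note the helper does not escape quotes in tag values; for arbitrary input, parameterize or sanitize before executing.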
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (75)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)