- 273 Views
- 1 replies
- 0 kudos
Unity Catalog Backup
Dear all, has anyone explored backing up Unity Catalog? Because of gaps in the security model of Databricks, we had to make certain user groups the schema owners, and the schema owners can drop the schema. I always wonder how to recover the schem...
Hi @noorbasha534, currently there's no native backup/restore. The pragmatic approach is to treat Unity Catalog metadata as code (Terraform / SQL DDL checked into Git), regularly export UC object definitions with the REST API, and lock down schema ownership s...
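As a rough illustration of the export step, here is a minimal sketch using the databricks-sdk for Python (the output filename is a placeholder) that walks catalogs, schemas, and tables and dumps their definitions to JSON for checking into Git:

```python
# Minimal sketch: snapshot Unity Catalog object definitions to a JSON file.
# Assumes a configured databricks-sdk environment; "uc_snapshot.json" is a placeholder.
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

snapshot = []
for catalog in w.catalogs.list():
    for schema in w.schemas.list(catalog_name=catalog.name):
        for table in w.tables.list(catalog_name=catalog.name,
                                   schema_name=schema.name):
            # as_dict() serializes the SDK dataclass, including columns and properties
            snapshot.append(table.as_dict())

with open("uc_snapshot.json", "w") as f:
    json.dump(snapshot, f, indent=2)
```

Restoring then means replaying the checked-in DDL or Terraform, since there is no native restore.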
- 969 Views
- 4 replies
- 2 kudos
Resolved! VNet Injected Workspace trouble connecting to the Storage Account of a catalog (UC)
Hi! I have some issues with setting up my workspace and storage account within a virtual network and letting them connect to each other. Background setup (all done in Terraform): a Databricks workspace with VNet Injection and Unity Catalog enabled. (3...
It was correctly linked. It turned out we were missing one extra private endpoint of type 'dfs'. So for our storage account we needed to create two private endpoints, both configured on the same subnet, with one of subresource type 'blob' and one of subresource...
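For anyone scripting the same fix, here is a minimal sketch with the azure-mgmt-network Python SDK (resource IDs, names, and region are placeholders) that creates both endpoints, one per subresource type, on the same subnet:

```python
# Minimal sketch: create 'blob' and 'dfs' private endpoints for one storage
# account. All IDs, names, and the region below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import (
    PrivateEndpoint, PrivateLinkServiceConnection, Subnet,
)

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")
storage_id = ("/subscriptions/<subscription-id>/resourceGroups/<rg>"
              "/providers/Microsoft.Storage/storageAccounts/<account>")
subnet_id = ("/subscriptions/<subscription-id>/resourceGroups/<rg>"
             "/providers/Microsoft.Network/virtualNetworks/<vnet>/subnets/<subnet>")

# One endpoint per subresource type; both live on the same subnet.
for group_id in ("blob", "dfs"):
    network.private_endpoints.begin_create_or_update(
        "<rg>",
        f"pe-storage-{group_id}",
        PrivateEndpoint(
            location="westeurope",
            subnet=Subnet(id=subnet_id),
            private_link_service_connections=[
                PrivateLinkServiceConnection(
                    name=f"plsc-{group_id}",
                    private_link_service_id=storage_id,
                    group_ids=[group_id],
                )
            ],
        ),
    ).result()
```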
- 725 Views
- 3 replies
- 1 kudos
Databricks in AWS K8 cluster
Hi, I have a question. I have recently started using Databricks. I need Databricks to be deployed on an AWS K8s cluster. Where can I find the sources?
- 347 Views
- 3 replies
- 3 kudos
Databricks Workspace APIs
Hello, I'm trying to collect metadata about the files stored in a Workspace Repo Folder, but it looks like the "size" field is always missing. The field is listed in a few SDKs (for example, in the Python documentation: https://databricks-sdk-py.read...
@szymon_dybczak, help me understand if this makes sense: what if I use the Files API instead of the Workspace API to get the size?
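A minimal sketch of the comparison with the databricks-sdk for Python (paths are placeholders; note the Files API is documented for file paths such as UC volumes, so whether it accepts a workspace repo path is exactly the open question here):

```python
# Minimal sketch comparing the two APIs; all paths are placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Workspace API: 'size' is declared on ObjectInfo but may come back unset.
for obj in w.workspace.list("/Repos/<user>/<repo>"):
    print(obj.path, obj.object_type, obj.size)

# Files API: returns metadata, including content length, for a single file.
meta = w.files.get_metadata("/Volumes/<catalog>/<schema>/<volume>/<file>")
print(meta.content_length, meta.last_modified)
```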
- 605 Views
- 2 replies
- 1 kudos
Verbose logging
Hello! I want to audit access to some Delta tables in Unity Catalog from jobs and notebooks. We enabled verbose audit logging, but not all queries that were run from notebooks are written to the system.access.audit, column_lineage, and table_lineage tables.
These are queries against a registered table name.
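As a starting point for checking what did land in the log, here is a minimal sketch (the service/action filter reflects what verbose audit logging records for notebook commands; adjust the filters and limit to taste):

```python
# Minimal sketch: list recent notebook commands captured by verbose audit
# logging. request_params is a map column, hence the bracket access.
df = spark.sql("""
    SELECT event_time,
           user_identity.email,
           request_params['commandText'] AS command_text
    FROM system.access.audit
    WHERE service_name = 'notebook'
      AND action_name = 'runCommand'
    ORDER BY event_time DESC
    LIMIT 50
""")
df.show(truncate=False)
```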
- 974 Views
- 6 replies
- 2 kudos
Getting 403 Forbidden Error When Joining Tables Across Two Unity Catalogs in the Same Workspace
Hi everyone, I’m facing an unusual issue in my Databricks environment and would appreciate your guidance. I’m using a consumer workspace with access to two different Unity Catalogs. I can successfully query tables from both catalogs individually wit...
Hi @vishnuvardhan, I appreciate this isn't directly going to answer your question, but I wanted to share what I've found. Firstly, I've not heard of a consumer Databricks workspace prior to your post. I'm not sure if you've looked into the ...
- 813 Views
- 2 replies
- 0 kudos
Unity Catalog federation with Snowflake-managed Iceberg table fails with Universal Format conversion
Hello All, Error: [DELTA_UNIVERSAL_FORMAT_CONVERSION_FAILED] Failed to convert the table version 3 to the universal format iceberg. Clone validation failed - Size and number of data files in target table should match with source table. srcTableSize: 7...
@fhameed The error occurs if the Iceberg metadata written by Snowflake does not match the number of files in object storage. When attempting to read the table in Databricks, there is a verification process that checks to see if the Iceberg metadata ...
- 464 Views
- 3 replies
- 6 kudos
Resolved! Add a tag to a catalog with REST API
Good day, I have an Azure DevOps pipeline with which I provision our Databricks workspaces. With this pipeline I also create the catalogs through the Databricks API (/api/2.1/unity-catalog/catalogs). Create a catalog | Catalogs API | REST API reference ...
Hi @Marco37, unfortunately I believe this is not currently possible through the REST API. The only options you have are via the UI or using SQL. Unfortunately, it's also not possible to do this using Terraform. There's a PR that has been hanging for some time, bu...
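Since the SQL route does work, here is a minimal sketch of driving it from the same pipeline via the SQL Statement Execution API with the databricks-sdk for Python (warehouse ID, catalog name, and tag values are placeholders):

```python
# Minimal sketch: tag a catalog with ALTER CATALOG ... SET TAGS, executed on a
# SQL warehouse. The warehouse ID, catalog name, and tag are placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
resp = w.statement_execution.execute_statement(
    warehouse_id="<warehouse-id>",
    statement="ALTER CATALOG my_catalog SET TAGS ('env' = 'prod')",
)
print(resp.status.state)
```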
- 237 Views
- 1 replies
- 0 kudos
Is There a Way to Test Subscriptions to the Databricks Status Page?
Hi everyone,I’ve subscribed to the Databricks Status page and set up a webhook for notifications. I’m wondering if there’s any way to test that the subscription is working correctly—perhaps by triggering a test notification or receiving a sample JSON...
Hi @KingAJ, there's no such mechanism that Databricks provides, but you can try a similar approach to the one below and test it on your own: https://github.com/jvargh/Webhooks-For-Status-Alerts. So, basically this PowerShell script generates pay...
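A Python equivalent of that idea, as a minimal sketch: POST a hand-crafted, Statuspage-style payload at your own receiver to confirm it parses and alerts. The payload shape is an assumption modeled on typical Statuspage notifications, not an official sample, and the URL is a placeholder.

```python
# Minimal sketch: send a synthetic incident payload to your webhook receiver.
# The payload shape is assumed, not an official Statuspage sample.
import json
import urllib.request

WEBHOOK_URL = "https://example.com/my-webhook"  # placeholder receiver

payload = {
    "incident": {
        "name": "Test incident (synthetic)",
        "status": "investigating",
        "impact": "minor",
        "shortlink": "https://status.databricks.com/",
    }
}

req = urllib.request.Request(
    WEBHOOK_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # expect 2xx if the receiver accepted the test payload
```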
- 733 Views
- 1 replies
- 1 kudos
Resolved! Databricks Apps On Aws
Context: Migrating from Azure Databricks (Premium) to AWS Databricks (Premium) in eu-west-2 (London) with Unity Catalog attached. On Azure, Databricks Apps are available (Compute → Apps and New → App). Goal: run the same Streamlit apps on AWS. What we...
[Resolved]. The root cause was that Databricks Apps (including Serverless Apps) are not available in all AWS regions. My primary workspace was created in eu-west-2 (London), where this feature is not supported. After creating a new workspace in eu-west...
- 804 Views
- 5 replies
- 1 kudos
Cross-region serverless compute network access to Azure storage account
Given we have Unity Catalog and an Azure Databricks workspace, both in the Azure West US region, and we want to allow serverless compute to access data in catalogs that use external locations on an Azure Storage account in West US 3, how can we get t...
Turns out we don't have a Databricks-managed NAT Gateway because our workspace is deployed in our own VNet and we have SCC enabled. I opened a ticket with Microsoft Support and will be working with them today; if we get it figured out I'll share the i...
- 1505 Views
- 6 replies
- 2 kudos
Resolved! Delta Sharing Error from Azure Databricks - "received more than two lines"
Hello,I am trying to query a Delta table located on AWS S3 from Azure Databricks using Delta Sharing.My setup includes a Delta Sharing server running on AWS Fargate. The server itself is running correctly, and I can successfully query it from my loca...
I was able to connect after changing the delta-sharing-server version to 1.1.0. Thank you for your kind help!
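For reference, here is a minimal sketch of the client-side check with the delta-sharing Python connector (the profile path and share/schema/table names are placeholders):

```python
# Minimal sketch: query a shared table through Delta Sharing to verify the
# server responds correctly. Profile path and table coordinates are placeholders.
import delta_sharing

profile = "/path/to/config.share"  # profile file pointing at the sharing server
table_url = profile + "#my_share.my_schema.my_table"

df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```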
- 1111 Views
- 2 replies
- 0 kudos
Terraform Databricks Integration - especially for Unity Catalog in AWS S3
We are attempting to provision Unity Catalog using Terraform, but we're encountering issues with establishing authentication with AWS through IAM Roles and Policies. For EC2/Cluster instances, the instance profile works fine with a trust relationship ...
Is this work tested? I'm getting an error: Error: Self-referential block, on index.tf line 31, in resource "aws_iam_role" "reader": 31: "arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/${aws_iam_role.reader.name}" Conf...
- 794 Views
- 5 replies
- 5 kudos
Resolved! Recent Databricks UI issue with mouse
For the past week or so, I've been having a weird UI issue with Databricks. Frequently, when I try to select text within a notebook, by dragging the mouse, it behaves as if I have clicked on a notebook within the LHS explorer pane, and loads it. So, ...
Thanks for the tip! I'm a bit surprised that I've only just started triggering this - and am now seeing it all the time. But yes, closing the explorer window is probably the way to go if it can't be fixed
- 3513 Views
- 14 replies
- 1 kudos
Running jobs as service principal, while pulling code from Azure DevOps
In our Dataplatform, our jobs are defined in a dataplatform_jobs.yml within a Databricks Asset Bundle, and then pushed to Databricks via an Azure Devops Pipeline (Azure Devops is where our codebase resides). Currently, this results in workflows looki...
Hi @LuukDSL, have you tried the solution I provided above?