Yes, there are several ways to automate volume syncing in Databricks. Here are the main approaches:

1. Databricks Jobs with Scheduled Triggers (see the sketch below)
2. Using Delta Live Tables (DLT) for Data Syncing
3. Workflow Orchestration with Databricks Workflows
4. Real-t...
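To illustrate the first approach, here is a minimal sketch of creating a scheduled sync job with the Databricks Python SDK. The notebook path, cluster ID, and cron expression are placeholders, and it assumes a sync notebook that copies files into the target volume already exists:

```python
# A minimal sketch of approach 1 using the databricks-sdk package.
# The notebook path, cluster ID, and schedule below are placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # reads host/token from env vars or ~/.databrickscfg

# Create a job that runs the sync notebook every hour on a schedule.
job = w.jobs.create(
    name="volume-sync",
    tasks=[
        jobs.Task(
            task_key="sync",
            notebook_task=jobs.NotebookTask(
                notebook_path="/Workspace/Users/me@example.com/sync_volume"
            ),
            existing_cluster_id="1234-567890-abcdefgh",
        )
    ],
    schedule=jobs.CronSchedule(
        quartz_cron_expression="0 0 * * * ?",  # hourly, Quartz cron syntax
        timezone_id="UTC",
    ),
)
print(f"Created job {job.job_id}")
```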
Hi @data-wrangler

Dataset-Specific Catalogs

Unfortunately, Databricks doesn't support dataset-level scoping in the CREATE FOREIGN CATALOG command for BigQuery. The catalog always tries to discover all datasets in the specified project. The options are ...
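For context, here is a sketch of the command in question (catalog and connection names are placeholders, and exact option keys can vary by Databricks release). Note there is no key to restrict the catalog to a single dataset:

```python
# A sketch, run from a Databricks notebook where `spark` is available.
# bq_catalog and my_bigquery_connection are placeholder names. There is no
# OPTIONS key to scope this to one dataset: every dataset in the connection's
# GCP project is discovered and surfaced as a schema.
spark.sql("""
    CREATE FOREIGN CATALOG IF NOT EXISTS bq_catalog
    USING CONNECTION my_bigquery_connection
""")

# Each BigQuery dataset then appears as a schema inside the catalog:
spark.sql("SHOW SCHEMAS IN bq_catalog").show()
```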
Hi @Malthe

You're absolutely right - it's completely reasonable to want to verify constraint integrity without relying on the optimizer's assumptions. This is a classic challenge with query optimizers that use constraint information for optimization. C...
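As one workaround (not necessarily the prescribed method; the table `orders` and key column `order_id` are hypothetical), you can verify the constraint directly with a plain aggregation, which bypasses any optimizer assumptions:

```python
# A hedged sketch: verifying an informational PRIMARY KEY constraint directly,
# rather than trusting the optimizer's assumption that it holds.
# Table name (orders) and key column (order_id) are placeholders.
dupes = spark.sql("""
    SELECT order_id, COUNT(*) AS n
    FROM orders
    GROUP BY order_id
    HAVING COUNT(*) > 1
""")

if dupes.limit(1).count() > 0:
    raise AssertionError("PRIMARY KEY assumption violated: duplicate order_id values")
```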
Hi @nayan1

Yes, this is a common challenge when transitioning to Unity Catalog (UC) enabled clusters. The installation of Maven packages from Artifactory repositories does work differently in UC environments, but there are several approaches you can us...
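As one illustration, here is a hedged sketch of installing a Maven artifact from Artifactory as a cluster library via the Python SDK. The repository URL, coordinates, and cluster ID are placeholders, and on UC shared clusters the coordinates may additionally need to be allowlisted by a metastore admin:

```python
# A sketch of installing a Maven package from a private Artifactory repo as a
# cluster library via databricks-sdk. All names and URLs are placeholders; on
# UC shared clusters the coordinates may also need a metastore allowlist entry.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import Library, MavenLibrary

w = WorkspaceClient()

w.libraries.install(
    cluster_id="1234-567890-abcdefgh",
    libraries=[
        Library(
            maven=MavenLibrary(
                coordinates="com.example:my-lib:1.2.3",
                repo="https://artifactory.example.com/artifactory/maven-remote",
            )
        )
    ],
)
```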
Hi @MauricioS

Yes, you can achieve similar reprocessing functionality with DLT streaming tables, but it requires a different approach than your current batch process. Here are the main strategies:

1. CDC Pattern with Tombstone Records (sketch below)

The most common ap...
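As a rough sketch of the CDC/tombstone strategy (the source view name, key, sequencing column, and the `op = 'DELETE'` tombstone marker are all placeholders), DLT's apply_changes can apply tombstone records as deletes:

```python
# A hedged sketch of the CDC/tombstone pattern. Runs inside a DLT pipeline,
# where the dlt module is available. Names below are placeholders.
import dlt
from pyspark.sql.functions import expr

dlt.create_streaming_table("customers")

dlt.apply_changes(
    target="customers",
    source="customers_cdc_feed",        # streaming source with an 'op' column
    keys=["customer_id"],
    sequence_by="event_ts",
    apply_as_deletes=expr("op = 'DELETE'"),  # tombstone rows remove the record
    except_column_list=["op"],               # don't persist the CDC marker
)
```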