@Pratikmsbsvm
You can use the architecture below as a reference for your solution.
Your Setup at a Glance
- Sources: SAP, Salesforce, Adobe
- Targets: Hightouch, Mad Mobile, another Databricks workspace
- Core Requirement: Cross-Workspace Data Sharing
Recommended Architecture (High-Level View)
[ SAP / Salesforce / Adobe ]
            │
            ▼
Ingestion Layer (via ADF / Synapse / Partner Connectors / REST API)
            │
            ▼
┌─────────────────────────────┐
│   Azure Data Lake Gen2      │  (Storage layer - centralized)
│   + Delta Lake for ACID     │
└─────────────────────────────┘
            │
            ▼
Azure Databricks (Primary Workspace)
  ├── Bronze: Raw Data
  ├── Silver: Cleaned & Transformed
  └── Gold: Aggregated / Business Logic Applied
            │
            ├──> Load to Hightouch / Mad Mobile (via REST APIs / Hightouch Sync)
            └──> Share curated Delta Tables to the other Databricks workspace (via Delta Sharing or external table mount)
Key Components & Patterns
1. Ingestion Options
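For API-based sources (e.g. Salesforce or Adobe REST endpoints), ingestion usually means paging through an API and landing the records in Bronze. A hedged sketch of that loop, with the actual HTTP call injected so the pattern is visible (the function name and stopping rule are my assumptions):

```python
# Hypothetical paged-ingestion helper: call fetch_page(page) until an empty
# page comes back, collecting everything for landing in the Bronze layer.
def fetch_all_pages(fetch_page, max_pages=100):
    """fetch_page(page) -> list of record dicts; stops on the first empty page."""
    records = []
    for page in range(1, max_pages + 1):
        batch = fetch_page(page)
        if not batch:
            break
        records.extend(batch)
    return records

# Real usage would wrap an authenticated requests.get(...) call per page;
# here a stub fetcher can be injected for testing.
```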
2. Storage & Processing Layer
5. Cross-Workspace Databricks Access (this is your core challenge, and the most important piece)
Option A: Delta Sharing (Recommended if in different orgs/subscriptions)
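On the recipient side, Delta Sharing tables are addressed as `<profile-file>#<share>.<schema>.<table>`. A small sketch of building that locator; the profile path and share/schema/table names are placeholders, not from your setup:

```python
# Delta Sharing consumer sketch (recipient workspace).
def sharing_url(profile_path, share, schema, table):
    """Build the '<profile>#<share>.<schema>.<table>' locator the
    delta-sharing client expects."""
    return f"{profile_path}#{share}.{schema}.{table}"

url = sharing_url("/dbfs/FileStore/config.share", "gold_share", "gold", "accounts")

# On the recipient workspace (requires the delta-sharing client package):
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(url)
```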
Option B: Access the ADLS storage account via a service principal or mount (only if both workspaces are under the same Azure AD tenant)
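For Option B, the second workspace reads the same ADLS Gen2 account directly using OAuth with a service principal. These are the standard Spark configuration keys for ABFS OAuth access; the storage account, client, and tenant values are placeholders (and the secret should come from a secret scope, not plain text):

```python
# Build the Spark conf entries for service-principal (OAuth) access to ADLS Gen2.
# All identifier values are placeholders.
def adls_oauth_conf(storage_account, client_id, client_secret, tenant_id):
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# In a notebook: for key, value in adls_oauth_conf(...).items(): spark.conf.set(key, value)
```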
Option C: Data Replication with Jobs
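For Option C, a scheduled Databricks job can replicate a curated Gold table into a location the other workspace can read, e.g. with Delta's `DEEP CLONE`. A sketch with hypothetical table names and paths:

```python
# Replication-job sketch: deep-clone a curated table to shared storage.
# Table names and the abfss path are illustrative assumptions.
def deep_clone_sql(source_table, target_table, target_location):
    return (
        f"CREATE OR REPLACE TABLE {target_table} "
        f"DEEP CLONE {source_table} "
        f"LOCATION '{target_location}'"
    )

stmt = deep_clone_sql(
    "gold.accounts",
    "shared.accounts",
    "abfss://shared@storageacct.dfs.core.windows.net/accounts",
)
# In a scheduled Databricks job: spark.sql(stmt), run on whatever cadence the consumer needs.
```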
Governance / Security Recommendations
Use Unity Catalog (if available) for fine-grained access control
Encrypt data at rest (ADLS) and in transit
Use service principals or managed identities for secure access between services
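With Unity Catalog, fine-grained access for the consuming team reduces to a few grants. A sketch, assuming a `main.gold` schema and a consumer group (both names are placeholders):

```python
# Unity Catalog grant sketch: read-only access to Gold for a consumer group.
# Catalog, schema, table, and group names are hypothetical.
grants = [
    "GRANT USE CATALOG ON CATALOG main TO `analytics-consumers`",
    "GRANT USE SCHEMA ON SCHEMA main.gold TO `analytics-consumers`",
    "GRANT SELECT ON TABLE main.gold.accounts TO `analytics-consumers`",
]
# In a notebook: for g in grants: spark.sql(g)
```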
Summary Visual (Simplified)
Sources → Ingestion → Delta Lakehouse → Destinations
[SAP, SFDC, Adobe]   [ADF, APIs]   [Bronze, Silver, Gold]   [Hightouch, Mad Mobile, Other DBX]
                                                  ▲
                                                  │
                       Cross-Workspace Access (Delta Sharing / Mounting / Jobs)
Let me know if this helps 🙂
Databricks Solution Architect