Hi @ad_k,

To build a data model from Unity Catalog tables and store it in Azure Data Lake Storage in Delta format, you can use a Databricks notebook with PySpark or SQL. The process has three steps:

1. Read the raw data from the Unity Catalog tables.
2. Transform it into fact and dimension tables.
3. Write the transformed tables to Azure Data Lake Storage in Delta format.

In a PySpark notebook this means getting (or reusing) the Spark session, reading the source tables with their three-level names (catalog.schema.table), applying the transformations, and saving the results with the Delta writer to an `abfss://` path, replacing the placeholders with your own Azure details.
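Here is a minimal sketch of those steps. All names are placeholders: the catalog/schema/table names (`main.raw.orders`, `main.raw.customers`), the column names, and the storage account and container (`mystorageacct`, `gold`) are assumptions you would replace with your own. It also assumes the cluster is already configured with credentials to write to the ADLS Gen2 account. The Spark logic is wrapped in a function so the path-building helper can be used on its own:

```python
def adls_path(container: str, account: str, folder: str) -> str:
    """Build an abfss:// URI for an ADLS Gen2 location."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{folder}"

def build_star_schema(account: str, container: str) -> None:
    """Read raw Unity Catalog tables, build one fact and one dimension table,
    and write them to Azure Data Lake in Delta format.

    Requires a Databricks (or pyspark + delta) runtime; table and column
    names below are illustrative placeholders.
    """
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # On Databricks a session already exists; getOrCreate() reuses it.
    spark = SparkSession.builder.getOrCreate()

    # 1. Read raw data from Unity Catalog (three-level namespace).
    orders = spark.table("main.raw.orders")        # placeholder table
    customers = spark.table("main.raw.customers")  # placeholder table

    # 2. Transform into a dimension and a fact table.
    dim_customer = (
        customers.select("customer_id", "name", "country")
                 .dropDuplicates(["customer_id"])
    )
    fact_orders = (
        orders.join(dim_customer.select("customer_id"), "customer_id")
              .select("order_id", "customer_id", "order_date",
                      F.col("amount").cast("decimal(18,2)").alias("amount"))
    )

    # 3. Write both tables to ADLS Gen2 in Delta format.
    dim_customer.write.format("delta").mode("overwrite").save(
        adls_path(container, account, "dim_customer"))
    fact_orders.write.format("delta").mode("overwrite").save(
        adls_path(container, account, "fact_orders"))

# Example call on a configured cluster:
# build_star_schema(account="mystorageacct", container="gold")
```

You could also register the outputs back in Unity Catalog as external tables pointing at those Delta paths, so downstream users can query the model by name instead of by path.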