Databricks Academy offers Build Data Pipelines with Lakeflow Spark Declarative Pipelines, a course covering the core concepts of Lakeflow Spark Declarative Pipelines on the Databricks Data Intelligence Platform. The course takes a simple, SQL-based approach, with Python code examples also provided.
You’ll learn to:
- Build incremental batch and streaming pipelines with Lakeflow Spark Declarative Pipelines
- Combine streaming tables, materialized views, and temporary views to power end-to-end pipelines
- Configure compute, data assets, trigger modes, and data quality expectations
- Monitor pipelines with event logs and metrics, and handle slowly changing dimensions with AUTO CDC INTO
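To give a flavor of the declarative, SQL-based style the course teaches, here is a minimal sketch of a two-step pipeline: a streaming table that ingests raw files and a materialized view with a data quality expectation. The catalog, schema, path, and column names are hypothetical placeholders, and the exact syntax may vary by Databricks runtime version.

```sql
-- Ingest raw JSON files incrementally into a streaming table
-- (the volume path below is a hypothetical example)
CREATE OR REFRESH STREAMING TABLE raw_orders
AS SELECT *
FROM STREAM read_files('/Volumes/demo/raw/orders', format => 'json');

-- Aggregate into a materialized view, dropping rows that
-- fail the data quality expectation on order amount
CREATE OR REFRESH MATERIALIZED VIEW daily_order_totals
(CONSTRAINT valid_amount EXPECT (amount > 0) ON VIOLATION DROP ROW)
AS SELECT order_date, SUM(amount) AS total_amount
FROM raw_orders
GROUP BY order_date;
```

In a declarative pipeline, you define the target datasets and their dependencies; the pipeline engine determines execution order, manages incremental processing, and enforces the declared expectations.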
Designed for:
- Data engineers building or maintaining data pipelines on Databricks
- Learners with Databricks basics (workspaces, Delta Lake, Medallion Architecture, Unity Catalog)
- SQL users with experience ingesting data into Delta tables
Course format & details:
- Syllabus: 3 Sections | 22 Lessons
- Duration: 2 hours
- Skill Level: Associate
- Cost: Free
🔗 Enroll Now 👈