Databricks has launched the Public Preview of Lakeflow Designer, a visual, no‑code, AI‑native tool that lets analysts and business users prepare and analyze data directly on Databricks, without leaving the governed environment of Unity Catalog.
Key highlights
- Visual, no‑code experience: Build data prep and analytics workflows with a drag‑and‑drop canvas and natural language, instead of hand‑writing SQL or Python.
- Built into Databricks: Work on native lakehouse data with Unity Catalog governance, lineage, and permissions from day one; no data movement to an external tool is required.
- AI‑assisted authoring with Genie Code: Describe what you want in plain English and let Databricks’ agentic coding assistant generate or modify workflows that understand your tables, metadata, and lineage.
- Step‑by‑step, reviewable transformations: Each change is a visual operator with data previews, making AI‑generated logic easier to inspect, explain, and trust.
- Production‑ready under the hood: Every visual flow emits real Python code that can be versioned, scheduled with Lakeflow Jobs, and integrated into broader production pipelines, with no per‑user licenses to manage.
In the full post, you’ll see how teams across consulting, financial services, and business functions are using Lakeflow Designer to clean data, combine sources, and power semantic models and dashboards, all while easing the “SQL bottleneck” and keeping work inside Databricks.
🔗 Read the full post here 👈