This page brings together externally published articles written by our MVPs. Discover expert perspectives, real-world guidance, and community contributions from leaders across the ecosystem.
Databricks One is now Genie. And it's a big deal. Not just a rebrand: a completely new experience for every employee who has ever been told "you need to ask an analyst for that." Here's what just shipped: Account-Level Genie is GA: one Genie across all...
I recently published a practical write-up on using Databricks + Lovable to quickly turn data processing and ML outputs into a working MVP: Databricks + Lovable: A Practical Case Study of Building an MVP and Managing Costs. At first, I thought the Datab...
This article is all about Lakeflow Designer — Visual Data Prep. Instead of delving into theory, I’ll keep it practical and to the point: a quick guide to help you get started and quickly prepare the data you need. Getting Started with Lakeflow Designe...
Most people spin up Lakebase and hit surprises they didn't see coming.
Here's everything I wish I'd known before shipping packed into 5 slides.
What's inside:
What is Lakebase?
Fully managed Postgres on Databricks. OLTP for the Lakehouse. No ETL p...
The ATOMIC Compound Statement in Databricks SQL is a block of statements that executes as a single, all-or-nothing transaction. If any statement inside the block fails, the entire block is rolled back. Read the complete article here: Databricks SQL Runtim...
An ATOMIC compound statement in Databricks SQL runs multiple SQL commands as one transaction. If any statement fails, everything is rolled back; if all succeed, all changes are committed together.
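To make the all-or-nothing behavior concrete, here is a minimal sketch of an ATOMIC compound statement. The table names (`orders`, `orders_audit`) and statuses are illustrative, not from the articles above.

```sql
-- Sketch of an ATOMIC compound statement; table names are illustrative.
-- If the UPDATE fails, the preceding INSERT is rolled back too:
-- both changes commit together or not at all.
BEGIN ATOMIC
  INSERT INTO orders_audit
  SELECT * FROM orders WHERE status = 'shipped';

  UPDATE orders
  SET status = 'archived'
  WHERE status = 'shipped';
END;
```

Without ATOMIC, a failure in the second statement could leave the audit copy written but the source rows unchanged.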
As organisations increasingly move toward AI-driven analytics, the need to bring insights closer to end users is more important than ever. With Databricks Genie Spaces, you can enable natural language interactions over your data, allowing users to as...
What is a Metric View? (Think of it as a virtual report definition.) Metric Views are a first-class object in Databricks Unity Catalog that allow you to define reusable, governed business metrics on top of your existing tables and views. Think of them a...
Azure Databricks now allows organisations to configure a custom URL at the account level, providing a unified and branded access point for all users. Instead of navigating multiple workspace-specific URLs, users can log in once using a single custom U...
Sharing the latest update on Databricks Genie. The written article is in 3 versions to make it easy to follow the journey:
* Version 1 covers the initial release
* Version 2 includes the major improvements
* Version 3 has the latest updates till 2026 Agent...
I’ve been exploring Databricks Genie Code and wanted to share a few practical observations from early usage. What stands out to me is that Genie Code feels less like a traditional coding assistant and more like an agentic workflow assistant. It does n...
I’ve been testing Genie Code in Databricks and wanted to understand not just the UX, but the actual cost behavior. My impression so far: for simple code edits / code assistance, it looks almost free, but if Genie Code starts doing things that involve re...
I was looking at the cost incurred by Genie, but I don't understand how Genie usage is calculated. I looked at the system.billing.usage table; can you tell me whether I can use it to filter Genie usage?
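One way to approach the question above: the system.billing.usage table carries a billing_origin_product column that attributes usage to a product. A hedged sketch, assuming Genie usage is attributed through that column (the exact product value should be verified against the distinct values in your account):

```sql
-- Step 1: discover which product labels actually appear in your account.
SELECT DISTINCT billing_origin_product
FROM system.billing.usage;

-- Step 2: aggregate usage for the Genie-related label.
-- Assumption: Genie usage is tagged with a product name containing 'GENIE';
-- replace the filter with the exact value found in step 1.
SELECT usage_date,
       sku_name,
       SUM(usage_quantity) AS total_usage
FROM system.billing.usage
WHERE billing_origin_product ILIKE '%GENIE%'
GROUP BY usage_date, sku_name
ORDER BY usage_date;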
Modern data platforms require constant monitoring and maintenance. From pipeline failures to schema changes, data engineers often spend a large portion of their time reacting to operational issues rather than building new solutions. This is where bac...
Many teams think migrating from Azure Data Factory to Databricks Lakeflow Jobs is just a pipeline rewrite. It’s not. It’s about:
* Simplifying over-engineered architectures
* Reducing orchestration and monitoring overhead
* Aligning data engineering with...
Hi everyone, I wanted to share a quick technical tip for those looking to optimize their SQL Warehouse configurations, especially when dealing with intermittent ad hoc queries or AI/BI Dashboards. The Challenge: When setting up a SQL Warehouse, the D...
Recently, Databricks introduced System Tags, allowing you to explicitly label objects as Certified, Deprecated, or None. Certification status system tag: The certified status system tag allows users to label objects, such as catalogs, schemas, tables, da...
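As a rough illustration of how tagging an object looks in practice, here is a sketch using the standard Unity Catalog SET TAGS syntax. The specific tag key shown ('system.Certified') and the table name are assumptions for illustration; check the System Tags release notes for the reserved key Databricks actually uses.

```sql
-- Hedged sketch: label a table as certified via Unity Catalog tags.
-- 'system.Certified' is an assumed key, not confirmed from the article;
-- main.sales.orders is an illustrative table name.
ALTER TABLE main.sales.orders
  SET TAGS ('system.Certified' = 'true');

-- Unsetting the tag returns the object to no certification status.
ALTER TABLE main.sales.orders
  UNSET TAGS ('system.Certified');
```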