This page brings together externally published articles written by our MVPs. Discover expert perspectives, real-world guidance, and community contributions from leaders across the ecosystem.
Delta now supports VOID columns: empty columns in a Delta table that can be kept for future use or for schema matching. VOID is a new data type; the only accepted value is NULL.
https://databrickster.medium.com/databricks-news-watermark-based-...
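A minimal sketch of what this could look like; the table name is hypothetical:

```python
# A VOID column accepts only NULL; it can act as a reserved slot for
# future use or to match another schema. (Table name is hypothetical.)
spark.sql("""
    CREATE TABLE main.demo.events (
        id BIGINT,
        payload STRING,
        reserved VOID  -- new data type: only NULL is accepted
    ) USING DELTA
""")

# Inserting NULL works; any non-NULL value for `reserved` would fail.
spark.sql("INSERT INTO main.demo.events VALUES (1, 'hello', NULL)")
```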
Thanks to Skills, we can finally implement an enterprise naming convention, not only through Genie but also through an agent that audits all our schemas. #databricks
https://databrickster.medium.com/implementing-enterprise-naming-convention-agentic-way-3...
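As a rough illustration of the auditing part (not the Skills definition itself), here is a minimal sketch that walks all schemas with the Databricks SDK and flags names that break a convention; the regex and the convention itself are assumptions:

```python
import re
from databricks.sdk import WorkspaceClient

# Built-in auth: credentials are resolved from the notebook context / environment.
w = WorkspaceClient()

# Hypothetical convention: lowercase snake_case with a domain prefix, e.g. sales_raw.
NAMING_RULE = re.compile(r"^[a-z]+(_[a-z0-9]+)+$")

violations = []
for catalog in w.catalogs.list():
    for schema in w.schemas.list(catalog_name=catalog.name):
        if not NAMING_RULE.match(schema.name):
            violations.append(schema.full_name)

print("Schemas violating the naming convention:", violations)
```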
With the readChangeFeed flag set to AUTO, CDC processing automatically reads data from the Delta Change Data Feed (CDF). Thanks to the new flag and the ability to orchestrate the pipeline from a SQL warehouse, processing the Delta CDF is faster than ever. #databricks
https://www.sunnydat...
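Based on the blurb, a minimal sketch of reading with the new flag value; the table names and checkpoint path are hypothetical:

```python
# With "auto", the reader decides whether to consume the Change Data Feed;
# previously readChangeFeed had to be set to true explicitly.
changes = (
    spark.readStream
    .option("readChangeFeed", "auto")
    .table("main.demo.orders")  # hypothetical source table with CDF enabled
)

(changes.writeStream
    .option("checkpointLocation", "/tmp/checkpoints/orders_cdc")  # assumed path
    .toTable("main.demo.orders_changes"))
```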
AUTO CDC made me curious about one practical question: if AUTO CDC is now one of the easiest ways to process the CDF, is it also the cheapest? To answer that, I compared three approaches:
- AUTO CDC pipeline (in standard and performance mode)
- Spark Structur...
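For context, a minimal sketch of the first approach, an AUTO CDC flow in a declarative pipeline; the table names and keys are assumptions, and create_auto_cdc_flow is, as I understand it, the newer name for what used to be apply_changes:

```python
import dlt
from pyspark.sql.functions import col

# Target table, kept in sync from the source table's Change Data Feed.
dlt.create_streaming_table("orders_silver")

dlt.create_auto_cdc_flow(
    target="orders_silver",
    source="orders_bronze",               # hypothetical CDF-enabled source
    keys=["order_id"],                    # hypothetical primary key
    sequence_by=col("_commit_timestamp"), # CDF commit timestamp orders changes
    stored_as_scd_type=1,
)
```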
We can ask Genie to explain the chart or its changes, such as spikes. There is a new button directly in the chart corner to start a conversation.
more news https://databrickster.medium.com/
We can now define Data Quality Alerts and schedule them, and we will be notified when an anomaly is detected. This was possible before, but it required writing a custom query against system tables. Additionally, a SQL alert is now a normal Lakeflow job task, ...
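A minimal sketch of scheduling an alert as a job task via the Python SDK; the job name, alert ID, warehouse ID, and cron expression are placeholders:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import (
    CronSchedule, SqlTask, SqlTaskAlert, Task,
)

w = WorkspaceClient()

job = w.jobs.create(
    name="nightly-data-quality-alert",
    schedule=CronSchedule(
        quartz_cron_expression="0 0 6 * * ?",  # every day at 06:00
        timezone_id="UTC",
    ),
    tasks=[
        Task(
            task_key="run_alert",
            sql_task=SqlTask(
                alert=SqlTaskAlert(alert_id="<alert-id>"),  # placeholder
                warehouse_id="<warehouse-id>",              # placeholder
            ),
        )
    ],
)
print(f"Created job {job.job_id}")
```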
There is a new account-level experience, Databricks One. It brings together all assets from all workspaces the user has access to in one place. It is available through https://accounts.azuredatabricks.net/one or https://accounts.cloud.databricks.com/one
more news http...
Quality monitoring just got a big upgrade. Intuitive traffic lights make it easy to spot issues instantly, with detailed insights available on hover. Plus, a dedicated Quality tab and new checks (like null values) bring everything into one clear, act...
There are more and more resources available in DABS, and I have to say, defining them is much nicer and easier to manage than in Terraform. We will continue using Terraform to deploy Azure or AWS resources, but we need to pass data from Terraform to ...
A new dynamic drop-down filter is available in the SQL editor. It takes its values from the first column of another saved query we point it to.
https://databrickster.medium.com/databricks-news-2026-week-13-23-march-2026-to-29-march-2026-24f99a978752
Databricks is entering a new market: cybersecurity. It’s one of the fastest-growing markets, alongside AI. The choice was obvious; Databricks already has a strong foundation with agents and the Lakehouse architecture. Many companies are already stori...
From DABS, you can pass a Git branch. It is also a really useful best practice, as this way you define that only the given branch can be deployed to a target (e.g., only main to the prod target; otherwise the deployment fails). #databricks
https://databrickste...
Do not get the current access token from entry_point or variables. The Databricks SDK has built-in authentication, which can be used even for REST API calls. #databricks
https://databrickster.medium.com/just-because-you-can-do-it-in-databricks-doesnt-mea...
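A minimal sketch of the recommended pattern, using the SDK's built-in authentication instead of scraping a token; the REST endpoint shown is just an example:

```python
from databricks.sdk import WorkspaceClient

# WorkspaceClient resolves credentials itself (notebook context, env vars,
# config profiles, ...) -- no need to pull a token out of entry_point.
w = WorkspaceClient()

# Prefer the typed SDK methods where they exist:
me = w.current_user.me()
print(me.user_name)

# For endpoints without a typed wrapper, reuse the same authenticated client:
resp = w.api_client.do("GET", "/api/2.1/jobs/list")
print(resp)
```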
Do not use entry_point to get workspace_id, job_id, run_id, and other metadata. There is a ready, stable way to do that.
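One stable option is to pass the metadata in as job parameters using dynamic value references and read them in the task; the parameter names here are our own choice:

```python
# Configure job parameters with dynamic value references, e.g.:
#   job_id -> {{job.id}},  run_id -> {{job.run_id}},  workspace_id -> {{workspace.id}}
# Then read them inside the notebook task instead of digging into entry_point:
job_id = dbutils.widgets.get("job_id")
run_id = dbutils.widgets.get("run_id")
workspace_id = dbutils.widgets.get("workspace_id")

print(f"job={job_id} run={run_id} workspace={workspace_id}")
```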
More good/bad practices on:
https://www.sunnydata.ai/blog/databricks-multi-statement-transactions
https://databrickster.medi...