MVP Articles
This page brings together externally published articles written by our MVPs. Discover expert perspectives, real-world guidance, and community contributions from leaders across the ecosystem.

Forum Posts

Hubert-Dudek
by Databricks MVP
  • 52 Views
  • 0 replies
  • 1 kudos

Void

Delta now supports VOID columns: empty columns that can be kept for future use or for schema matching. VOID is a new data type whose only accepted value is NULL. https://databrickster.medium.com/databricks-news-watermark-based-...

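The VOID type described above can be sketched in Databricks SQL. This is a minimal sketch, assuming a VOID column can be declared like any other column type; the table and column names are hypothetical:

```sql
-- Reserve a placeholder column for future use; VOID accepts only NULL.
CREATE TABLE main.sales.orders (
  order_id BIGINT,
  amount   DECIMAL(10, 2),
  reserved VOID  -- hypothetical column kept for schema matching
);

-- NULL is the only value a VOID column can hold:
INSERT INTO main.sales.orders VALUES (1, 99.90, NULL);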
Hubert-Dudek
by Databricks MVP
  • 148 Views
  • 0 replies
  • 1 kudos

Implementing a naming convention

Thanks to Skills, we can finally implement an enterprise naming convention, not only through Genie but also through an agent that audits all our schemas. #databricks https://databrickster.medium.com/implementing-enterprise-naming-convention-agentic-way-3...

Hubert-Dudek
by Databricks MVP
  • 142 Views
  • 0 replies
  • 0 kudos

readChangeFeed flag

With the readChangeFeed flag set to AUTO, AUTO CDC automatically reads data from the Delta Change Data Feed (CDF). Thanks to the new flag and the ability to orchestrate the pipeline from a SQL warehouse, processing the Delta CDF is faster than ever. #databricks https://www.sunnydat...

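As a rough sketch, the standard way to read the Delta Change Data Feed from Spark uses the readChangeFeed read option. The "auto" value below is an assumption based on the post (the classic documented value is "true"), and the table name is hypothetical:

```python
# Sketch only: assumes a Databricks/Spark session and a CDF-enabled Delta table.
changes = (
    spark.readStream
        .format("delta")
        .option("readChangeFeed", "auto")  # "auto" per the post; "true" is the classic value
        .table("main.sales.orders_raw")    # hypothetical table name
)
```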
Hubert-Dudek
by Databricks MVP
  • 256 Views
  • 2 replies
  • 3 kudos

Notebook tags

You can now also tag notebooks, which is especially useful if you process any PII data. #databricks More news on https://databrickster.medium.com/

Latest Reply
Sumit_7
Honored Contributor III
  • 3 kudos

Nice one @Hubert-Dudek 

Hubert-Dudek
by Databricks MVP
  • 173 Views
  • 0 replies
  • 1 kudos

Change Data Feed - ingestion test

AUTO CDC made me curious about one practical question: if AUTO CDC is now one of the easiest ways to process the CDF, is it also the cheapest? To answer that, I compared three approaches: an AUTO CDC pipeline (in standard and performance mode), Spark Structur...

Hubert-Dudek
by Databricks MVP
  • 217 Views
  • 1 reply
  • 1 kudos

Dashboards - ask Genie

We can ask Genie to explain a chart or its changes, such as spikes. There is a new button directly in the chart corner to start a conversation. More news: https://databrickster.medium.com/

Latest Reply
Sumit_7
Honored Contributor III
  • 1 kudos

Good one to share @Hubert-Dudek 

Hubert-Dudek
by Databricks MVP
  • 213 Views
  • 0 replies
  • 0 kudos

Data Quality Alerts

We can now define Data Quality Alerts and schedule them, and we will be notified when an anomaly is detected. This was possible before, but it required a custom query against system tables. Additionally, a SQL alert is now a normal Lakeflow job task, ...

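Running a SQL alert as a regular job task can be sketched with the Jobs `sql_task` shape; this is a sketch in which the warehouse and alert IDs are placeholders you would already have:

```yaml
# Sketch of a Lakeflow job task that runs an existing SQL alert on a schedule.
tasks:
  - task_key: data_quality_alert
    sql_task:
      warehouse_id: "<your-warehouse-id>"  # placeholder
      alert:
        alert_id: "<your-alert-id>"        # placeholder
```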
Hubert-Dudek
by Databricks MVP
  • 181 Views
  • 0 replies
  • 1 kudos

Databricks One Account-level

There is a new account-level experience, Databricks One. It brings together, in one place, all assets from all workspaces the user has access to. It is available through https://accounts.azuredatabricks.net/one or https://accounts.cloud.databricks.com/one. More news: http...

Hubert-Dudek
by Databricks MVP
  • 224 Views
  • 0 replies
  • 1 kudos

Quality monitoring improvements

Quality monitoring just got a big upgrade. Intuitive traffic lights make it easy to spot issues instantly, with detailed insights available on hover. Plus, a dedicated Quality tab and new checks (like null values) bring everything into one clear, act...

Hubert-Dudek
by Databricks MVP
  • 201 Views
  • 0 replies
  • 1 kudos

How to Pass Terraform Outputs to Databricks’ DABS

There are more and more resources available in DABS, and I have to say, defining them is much nicer and easier to manage than in Terraform. We will continue using Terraform to deploy Azure or AWS resources, but we need to pass data from Terraform to ...

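One way to pass a Terraform output into a bundle is through a bundle variable. A minimal sketch, assuming a variable named `storage_account` is declared under `variables:` in databricks.yml and that a Terraform output of the same (hypothetical) name exists:

```sh
# Read the Terraform output and hand it to the bundle deploy as a variable.
STORAGE_ACCOUNT=$(terraform -chdir=infra output -raw storage_account)
databricks bundle deploy --target prod --var="storage_account=${STORAGE_ACCOUNT}"
```

The variable must be declared in databricks.yml before `--var` can set it; inside the bundle it is then referenced as `${var.storage_account}`.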
Hubert-Dudek
by Databricks MVP
  • 426 Views
  • 0 replies
  • 1 kudos

Dynamic drop-down filter

A new dynamic drop-down filter is available in the SQL editor. It takes its values from the first column of another saved query we point to. https://databrickster.medium.com/databricks-news-2026-week-13-23-march-2026-to-29-march-2026-24f99a978752

Hubert-Dudek
by Databricks MVP
  • 544 Views
  • 0 replies
  • 2 kudos

Lakewatch

Databricks is entering a new market: cybersecurity. It’s one of the fastest-growing markets, alongside AI. The choice was obvious; Databricks already has a strong foundation with agents and the Lakehouse architecture. Many companies are already stori...

Hubert-Dudek
by Databricks MVP
  • 205 Views
  • 0 replies
  • 1 kudos

DABS and git branch

In DABS, you can pin a git branch. This is also a really useful best practice: it ensures that only the given branch can be deployed to a target (e.g., main only to the prod target; otherwise the deploy fails). #databricks https://databrickste...

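A sketch of what the post describes in databricks.yml; the per-target `git.branch` field is how a branch gets pinned, though the exact field placement here is an assumption and should be checked against the bundle schema:

```yaml
# Sketch: pin the prod target to main, so deploying from any other branch fails.
targets:
  prod:
    git:
      branch: main   # only 'main' may be deployed to this target
    workspace:
      host: https://example.cloud.databricks.com   # placeholder host
```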
Hubert-Dudek
by Databricks MVP
  • 197 Views
  • 0 replies
  • 0 kudos

Access token from entry_point vs. SDK built-in authentication

Do not get the current access token from entry_point or variables. The Databricks SDK has built-in authentication, which can be used even for raw REST API calls. #databricks https://databrickster.medium.com/just-because-you-can-do-it-in-databricks-doesnt-mea...

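A minimal sketch of the recommended pattern, assuming databricks-sdk is installed and credentials come from the ambient environment (notebook context, environment variables, or a config profile), so no token is ever read by hand:

```python
# Sketch only: requires a Databricks workspace and configured authentication.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # unified authentication: no explicit token handling

# Typed SDK call:
for cluster in w.clusters.list():
    print(cluster.cluster_name)

# Raw REST call through the same authenticated client:
me = w.api_client.do("GET", "/api/2.0/preview/scim/v2/Me")
```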
Hubert-Dudek
by Databricks MVP
  • 204 Views
  • 0 replies
  • 1 kudos

Get job and other metadata from notebook

Do not use entry_point to get workspace_id, job_id, run_id, and other metadata. There is a ready, stable way to do that. More good/bad practices on: https://www.sunnydata.ai/blog/databricks-multi-statement-transactions https://databrickster.medi...

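One stable alternative to entry_point can be sketched with dynamic value references passed in as notebook parameters; the job name, task key, and notebook path below are hypothetical:

```yaml
# Sketch: inject job metadata as parameters instead of reading entry_point.
resources:
  jobs:
    metadata_demo:
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ../src/main_notebook   # placeholder path
            base_parameters:
              job_id: "{{job.id}}"      # dynamic value reference
              run_id: "{{job.run_id}}"  # dynamic value reference
```

Inside the notebook, `dbutils.widgets.get("job_id")` then returns the resolved value at run time.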