This page brings together externally published articles written by our MVPs. Discover expert perspectives, real-world guidance, and community contributions from leaders across the ecosystem.
With the readChangeFeed flag set to AUTO, AUTO CDC automatically reads data from the Delta Change Data Feed (CDF). Thanks to the new flag and the ability to orchestrate the pipeline from the SQL warehouse, processing the Delta CDF is faster than ever. #databricks
https://www.sunnydat...
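Delta CDF change rows carry the metadata columns `_change_type`, `_commit_version`, and `_commit_timestamp`. As a plain-Python illustration of what CDF processing does conceptually (this is a sketch, not the Databricks or Spark API; the `id`/`name` columns and the dict-based target are invented for the example):

```python
# Illustrative sketch: apply a batch of Delta CDF change rows to a target
# table, modeled here as a dict keyed by primary key. The _change_type and
# _commit_version column names are real Delta CDF metadata columns;
# everything else is a simplification for illustration.

def apply_cdf_batch(target: dict, changes: list[dict]) -> dict:
    # Replay changes in commit order so later versions win.
    for row in sorted(changes, key=lambda r: r["_commit_version"]):
        kind = row["_change_type"]
        if kind in ("insert", "update_postimage"):
            # Upsert the row, dropping the CDF metadata columns.
            target[row["id"]] = {k: v for k, v in row.items()
                                 if not k.startswith("_")}
        elif kind == "delete":
            target.pop(row["id"], None)
        # "update_preimage" rows describe the old value; nothing to apply.
    return target

changes = [
    {"id": 1, "name": "a", "_change_type": "insert", "_commit_version": 1},
    {"id": 1, "name": "b", "_change_type": "update_postimage", "_commit_version": 2},
    {"id": 2, "name": "c", "_change_type": "insert", "_commit_version": 2},
    {"id": 2, "name": "c", "_change_type": "delete", "_commit_version": 3},
]
print(apply_cdf_batch({}, changes))  # {1: {'id': 1, 'name': 'b'}}
```

In a real pipeline, AUTO CDC does this merge for you against a streaming table; the sketch only shows the semantics of the change types.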
If you’ve worked with Azure Databricks for a while, you’ve probably noticed one small but persistent friction point: the URLs. They’re long, system-generated, and not exactly memorable. Something like: Now imagine sharing that with business users, analy...
This looks very promising! Question: how do the workspace URLs work once the custom domain at the account level has been configured? If one goes to the development workspace, will the new URL look like "https://contoso.databricks.com/WORKSPACE_ID"?
AUTO CDC made me curious about one practical question: if AUTO CDC is now one of the easiest ways to process CDF, is it also the cheapest? To answer that, I compared three approaches:
- AUTO CDC pipeline (in standard and performance mode)
- Spark Structur...
We can ask Genie to explain the chart or its changes, such as spikes. There is a new button directly in the chart corner to start a conversation.
more news https://databrickster.medium.com/
What is a Metric View? (Think of it as a virtual report definition.) Metric Views are a first-class object in Databricks Unity Catalog that allow you to define reusable, governed business metrics on top of your existing tables and views. Think of them a...
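A metric view is defined declaratively with a YAML body. A hedged sketch of the shape such a definition takes is below; the catalog, schema, table, and measure names are invented for the example, and the exact YAML spec fields may differ from what ships in your workspace:

```sql
-- Hedged sketch of a Unity Catalog metric view; all object names
-- (main.analytics.sales_metrics, fact_orders, amount) are illustrative.
CREATE VIEW main.analytics.sales_metrics
WITH METRICS
LANGUAGE YAML
AS $$
version: 0.1
source: main.analytics.fact_orders
dimensions:
  - name: order_date
    expr: order_date
  - name: region
    expr: region
measures:
  - name: total_revenue
    expr: SUM(amount)
  - name: order_count
    expr: COUNT(1)
$$;
```

Consumers then query the governed measures by name instead of re-deriving the aggregation logic in every report.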
Azure Databricks now allows organisations to configure a custom URL at the account level, providing a unified and branded access point for all users. Instead of navigating multiple workspace-specific URLs, users can log in once using a single custom U...
We can now define Data Quality Alerts and schedule them. We will be notified when an anomaly is detected. It was possible before, but required setting a custom query and using system tables. Additionally, SQL Alert is now a normal Lakeflow job task, ...
There is a new account-level experience, Databricks One. It includes all assets from all workspaces the user has access to in one place. It is available through https://accounts.azuredatabricks.net/one or https://accounts.cloud.databricks.com/one
more news http...
Are you managing multiple LLMs on Databricks with no visibility or control? If the answer is yes, don't worry, you are not alone. I have been in multiple discussions and got the same question about managing multiple LLMs in one place. Now, we have Dat...
Quality monitoring just got a big upgrade. Intuitive traffic lights make it easy to spot issues instantly, with detailed insights available on hover. Plus, a dedicated Quality tab and new checks (like null values) bring everything into one clear, act...
There are more and more resources available in DABS (Databricks Asset Bundles), and I have to say, defining them is much nicer and easier to manage than in Terraform. We will continue using Terraform to deploy Azure or AWS resources, but we need to pass data from Terraform to ...
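For readers who have not used bundles yet, the sketch below shows the general shape of a `databricks.yml` that defines a job resource. The bundle name, job name, and notebook path are invented for illustration:

```yaml
# Hedged sketch of a Databricks Asset Bundle configuration (databricks.yml).
# All names and paths here are illustrative, not from the article.
bundle:
  name: my_bundle

resources:
  jobs:
    nightly_refresh:
      name: nightly-refresh
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/refresh.py
```

The same declarative file can hold jobs, pipelines, and other workspace resources, which is what makes it easier to manage than wiring each one up in Terraform.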
A new dynamic drop-down filter is available in the SQL editor. It takes its values from the first column of another saved query that we point it to.
https://databrickster.medium.com/databricks-news-2026-week-13-23-march-2026-to-29-march-2026-24f99a978752
Databricks is entering a new market: cybersecurity. It’s one of the fastest-growing markets, alongside AI. The choice was obvious; Databricks already has a strong foundation with agents and the Lakehouse architecture. Many companies are already stori...
From DABS, you can pass a Git branch. It is also a really useful best practice: this way you define that only the given branch can be deployed to the target (e.g., main only to the prod target; otherwise the deployment will fail). #databricks
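A hedged sketch of what pinning a branch in `databricks.yml` looks like is below. The `bundle.git.branch` mapping is the documented place for Git metadata; the bundle name is invented, and the exact nesting for a per-target override may differ from what is shown here:

```yaml
# Hedged sketch: pin the deployable Git branch in databricks.yml.
# Deploying from a different branch should fail validation.
bundle:
  name: my_bundle
  git:
    branch: main   # e.g., only main may be deployed to prod
```

Combined with separate targets (dev, prod), this gives a cheap guardrail against deploying a feature branch to production by accident.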
https://databrickste...