This page brings together externally published articles written by our MVPs. Discover expert perspectives, real-world guidance, and community contributions from leaders across the ecosystem.
The primary goal of Databricks Assistant was to help users write code, debug issues, and fix errors directly inside notebooks. It acted as an AI-powered helper that simplified development for data engineers, data scientists, and analysts working on t...
I think it is time to move out of ADF now. If Databricks is your main platform, you can go to Databricks Lakeflow Jobs or to Fabric ADF. Obviously the first choice makes more sense, especially if you orchestrate Databricks and don't want to spend unneces...
@Hubert-Dudek Really insightful article! I’ve worked quite a lot with ADF over the past few years, but with the recent advances in orchestration in Databricks I’m starting to feel the same way — it might be time to move on from ADF. Your point about...
Unity Catalog is getting serious and becoming more business-friendly: a new discovery page with business domains, with everything, of course, governed by tags. #databricks
More news: https://databrickster.medium.com/databricks-news-2026-week-9-23-february-2026-to...
Granular permissions for access tokens are now available in the Databricks workspace. I hope that someday this will also become a general entitlement setting for users/groups (not only for their access tokens). #databricks
More recent news: https://databrickster.m...
If you are new to Databricks and want to get started, I have started a playlist. 26 videos already published, more in the pipeline. Happy learning!! LinkedIn post: https://www.linkedin.com/posts/sudarshan-koirala_databricks-dataengineering-datascienc...
Declarative pipelines are among the best ways to deduplicate your data, especially for dimensions — from AUTO_CDC() to advanced deduplication quality checks. #databricks
https://databrickster.medium.com/deduplicating-data-on-the-databricks-lakehouse-5-w...
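As an illustration, here is a minimal declarative-pipeline sketch of keep-latest-per-key deduplication for a dimension. Table and column names are hypothetical, and the example uses the long-standing APPLY CHANGES INTO form of the CDC API (now renamed AUTO CDC), so treat it as a sketch rather than the article's exact code:

```sql
-- Target dimension table, maintained by the pipeline
CREATE OR REFRESH STREAMING TABLE dim_customer;

-- Rows sharing the same key collapse to a single record;
-- SEQUENCE BY resolves duplicates and out-of-order updates,
-- keeping only the latest version per customer_id
APPLY CHANGES INTO dim_customer
FROM stream(raw_customer_feed)
KEYS (customer_id)
SEQUENCE BY updated_at
STORED AS SCD TYPE 1;
```

With STORED AS SCD TYPE 2 instead, the same declaration would keep history rows rather than overwriting them.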
Published a new blog detailing how I used Codex to configure Databricks AI Dev Kit on my local Mac and then implemented a simple tool-calling agent on Databricks in a step-by-step workflow. This post is part of my ongoing series focused on practical d...
Codex, Claude, Gemini blocked? No problem. Route everything through Databricks AI Gateway. #databricks
https://databrickster.medium.com/databricks-news-2026-week-8-16-february-2026-to-22-february-2026-f2ec48bc234f
More under DABS! External locations are now available as DABS code. I hope that credentials will be available soon, too, so it will be possible to reference the credential resource from an external location. #databricks
https://medium.com/@databricks...
This is great — so eventually DAB might help replace the Terraform overhead for these platform resources and for other infrastructure maintenance and governance.
Many teams think migrating from Azure Data Factory to Databricks Lakeflow Jobs is just a pipeline rewrite. It’s not. It’s about:
* Simplifying over-engineered architectures
* Reducing orchestration and monitoring overhead
* Aligning data engineering with...
Catalogs are now supported in DABS, and I am happy to say goodbye to Terraform and manage all UC grants in DABS. #databricks
https://databrickster.medium.com/databricks-news-2026-week-8-16-february-2026-to-22-february-2026-f2ec48bc234f
Hi everyone, I wanted to share a quick technical tip for those looking to optimize their SQL Warehouse configurations, especially when dealing with intermittent ad hoc queries or AI/BI Dashboards. The Challenge: When setting up a SQL Warehouse, the D...
It is possible to tag queries. That functionality is also supported by external clients (JDBC, dbt, Power BI, etc.). #databricks
https://databrickster.medium.com/databricks-news-2026-week-8-16-february-2026-to-22-february-2026-f2ec48bc234f
Did you know that instead of SELECT * FROM table, you can just use TABLE? TABLE is part of pipe syntax, so you can always add another step after the pipe. Thanks to Martin Debus for noticing the possibility of using just TABLE. #databricks
https:...
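For illustration (table and column names hypothetical): TABLE t is shorthand for SELECT * FROM t, and because it is valid at the start of a pipe chain, you can keep extending it with |> operators:

```sql
-- Shorthand for SELECT * FROM sales
TABLE sales;

-- The same shorthand as the start of a pipe-syntax query
TABLE sales
|> WHERE amount > 0
|> AGGREGATE SUM(amount) AS total GROUP BY region;
```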
I am back, and Runtime 18.1 is here, and with it comes INSERT WITH SCHEMA EVOLUTION.
https://databrickster.medium.com/databricks-news-2026-week-8-16-february-2026-to-22-february-2026-f2ec48bc234f
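A quick sketch of the new statement (table names hypothetical): the WITH SCHEMA EVOLUTION clause lets the INSERT add source columns that are missing from the target, instead of failing on a schema mismatch:

```sql
-- If source_table has columns target_table lacks, they are added
-- to the target's schema automatically instead of raising an error
INSERT INTO target_table WITH SCHEMA EVOLUTION
SELECT * FROM source_table;
```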