Runtime 18 / Spark 4.1 brings literal string coalescing everywhere, which lets you make your code more readable. Useful, for example, for table comments. #databricks
Latest updates:
Read: https://databrickster.medium.com/databricks-news-week-1...
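Python has long had the same readability trick: adjacent string literals are coalesced at parse time, so a long comment can be split across lines without any `+` concatenation. A minimal sketch of the idea (the table comment text is made up for illustration):

```python
# Adjacent string literals are joined by the parser into one string,
# so a long table comment stays readable in source without runtime cost.
comment = (
    "Daily sales aggregates per store. "
    "Partitioned by event_date; "
    "refreshed by a nightly MERGE job."
)

print(comment)
```

The same pattern is what makes coalescing handy for table comments: you keep each clause on its own line in source, but the engine sees a single literal.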
We can see new grant types in Unity Catalog. It seems that secrets are coming to UC, and I especially love the "Reference Secret" grant. #databricks
Read more: https://databrickster.medium.com/databricks-news-week-1-29-december-2025-to-4-january-202...
DABS deployment from a JSON plan is one of my favourite new options. You can review the changes or even integrate the plan with your CI/CD process. #databricks
Read more: https://databrickster.medium.com/databricks-news-week-1-29-december-2025-to-4-...
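Reviewing a JSON plan in CI/CD can be as simple as counting the planned actions before approving a deployment. A minimal sketch — note that the `{"resources": [{"name": ..., "action": ...}]}` shape below is a hypothetical plan schema for illustration, not the actual bundle plan format:

```python
import json
from collections import Counter

def summarize_plan(plan_json: str) -> Counter:
    """Count planned actions (create/update/delete) in a deployment plan.

    Assumes a hypothetical plan shape: {"resources": [{"name": ..., "action": ...}]}.
    """
    plan = json.loads(plan_json)
    return Counter(r["action"] for r in plan.get("resources", []))

# Hypothetical plan a CI step might receive before deployment:
sample = '{"resources": [{"name": "job_a", "action": "create"}, {"name": "job_b", "action": "update"}]}'
summary = summarize_plan(sample)
```

A CI gate could then, for example, require manual approval whenever the summary contains any `delete` actions.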
We can ingest Excel into Databricks, including natively from SharePoint. It was top news in December, but in fact it is part of a bigger strategy that will allow us to ingest any format from anywhere in Databricks. The foundation is already built, as there is ...
Dashboards now offer more flexibility, allowing us to use another field or expression to label or sort the chart.
See the demo at:
- https://www.youtube.com/watch?v=4ngQUkdmD3o&t=893s
- https://databrickster.medium.com/databricks-news-week-52-22-december-2...
When something goes wrong and your jobs follow a daily MERGE pattern, backfill jobs let you reload multiple days in a single execution without writing custom scripts or manually triggering runs.
Read more: https://www.sunnydata.a...
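The core of any daily backfill is enumerating the days to reprocess so each one can parameterize a single MERGE run. A minimal sketch of that date expansion (the function name and the `event_date` column are illustrative, not from any Databricks API):

```python
from datetime import date, timedelta

def backfill_dates(start: date, end: date) -> list[date]:
    """Return every day in [start, end], inclusive.

    Each returned date would parameterize one MERGE run,
    e.g. filtering the source on event_date = :d.
    """
    days = (end - start).days
    return [start + timedelta(days=i) for i in range(days + 1)]

# Reload three days in one execution instead of three manual runs:
dates = backfill_dates(date(2025, 1, 1), date(2025, 1, 3))
```

A backfill job effectively does this expansion for you, running the same parameterized task once per date.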
I just logged in to the Community Edition for the last time and spun up the cluster one final time. Today is the last day, but it's still there. I haven't logged in there for a while, as the Free Edition offers much more, but it is a place where man...
I recently saw a business case in which an external orchestrator accounted for nearly 30% of their total Databricks job costs. That's when it hit me: we're often paying a premium for complexity we don't need. Besides FinOps, I tried to gather all the...
There is a new direct mode in Databricks Asset Bundles: the main difference is that there is no Terraform anymore, just a simple JSON state. It offers a few significant benefits:
- No requirement to download Terraform and terraform-provider-databr...
I am still amazed by Lakebase and all the possible use cases that we can achieve. Integration of Lakebase with Lakehouse is the innovation of the year. Please read my blog posts to see why it is the best of two worlds. #databricks
Read here: https:/...
Recently, it has become difficult not only to get a quota in some regions, but even having one doesn't guarantee that VMs are actually available. Even with a quota, you may need to move your bundles to a different subscription when different ...
Thanks for sharing, @Hubert-Dudek. This is a common challenge users face, and flexible node types can significantly improve compute launch reliability.
From hardcoded IDs, through lookups, to finally referencing resources. I think almost everyone, including me, wants to go through such a journey with Databricks Asset Bundles. #databricks
In the article below, I am looking at how to reference a resou...
Our calendar is coming to an end. One of the most significant innovations of last year is Agent Bricks. We received a few ready-made solutions for deploying agents. As the Agents ecosystem becomes more complex, one of my favourites is the Multi-Agent...
During the last two weeks, five new Lakeflow Connect connectors were announced. They allow easy incremental ingestion of data. In the coming weeks, there will be more announcements about Lakeflow Connect, and we can expect Databricks to ...