Community Articles
Dive into a collaborative space where members like YOU can exchange knowledge, tips, and best practices. Join the conversation today and unlock a wealth of collective wisdom to enhance your experience and drive success.

Forum Posts

Divaker_Soni
by New Contributor III
  • 179 Views
  • 1 reply
  • 2 kudos

Databricks Table Protection Features

This article provides an overview of key Databricks features and best practices that protect Gold tables from accidental deletion. It also covers the implications if both the Gold and Landing layers are deleted without active retention or backup. Cor...
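For readers who want to try the safeguards the article describes, here is a minimal sketch: extending Delta retention so Time Travel history survives longer, and recovering a recently dropped Unity Catalog managed table with UNDROP. The catalog, schema, and table names are illustrative, not from the article, and the snippet assumes a Databricks notebook where `spark` is predefined.

```python
# Minimal sketch; table name main.gold.sales is illustrative.

# Keep Delta history and deleted files around longer so Time Travel and RESTORE
# remain possible well after an accidental overwrite or delete.
spark.sql("""
    ALTER TABLE main.gold.sales SET TBLPROPERTIES (
        'delta.logRetentionDuration' = 'interval 30 days',
        'delta.deletedFileRetentionDuration' = 'interval 30 days'
    )
""")

# A recently dropped Unity Catalog managed table can be brought back with UNDROP.
spark.sql("UNDROP TABLE main.gold.sales")
```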

Latest Reply
Sanjeeb2024
Contributor III
  • 2 kudos

Thanks for sharing this. Time Travel applies to all tables in Databricks, not just the Gold layer.
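To illustrate the point, a quick sketch of Time Travel on a non-Gold Delta table; the table name and version number are made up for the example, and `spark` is assumed to be the notebook's session.

```python
# Time Travel works on any Delta table, regardless of medallion layer.
# Table name main.bronze.events and version 5 are illustrative.

# Inspect the table history to pick a version or timestamp to go back to.
spark.sql("DESCRIBE HISTORY main.bronze.events").show(truncate=False)

# Read the table as of an earlier version...
events_v5 = spark.read.option("versionAsOf", 5).table("main.bronze.events")

# ...or roll the table back in place.
spark.sql("RESTORE TABLE main.bronze.events TO VERSION AS OF 5")
```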

Hubert-Dudek
by Databricks MVP
  • 208 Views
  • 0 replies
  • 2 kudos

Databricks Asset Bundles Direct Mode

There is a new direct mode in Databricks Asset Bundles: the main difference is that there is no Terraform anymore, just a simple state file in JSON. It offers a few significant benefits: - No requirement to download Terraform and terraform-provider-databr...

news_direct_mode.png
joseph_in_sf
by New Contributor III
  • 239 Views
  • 1 reply
  • 1 kudos

Version 1.1: Data Isolation and Governance within PySpark DataFrames

See the comments below for a runnable notebook. Throughout my career I have worked at several companies that handle sensitive data, including PII, PHI, EMR, HIPAA, Class I/II/III FOMC - Internal (FR). One entity I worked at even required a Department O...

Latest Reply
SamanthaGivings
New Contributor II
  • 1 kudos

We just failed a HIPAA audit. The auditors asked why our pipelines carried patient names if the pipelines didn't need that info, and they recommended encrypting it. We thought S3 encryption was good enough. We implemented row-level encryption by extending ...
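As an illustration only (not the exact approach described above, which is truncated here), a column-level encryption sketch in PySpark using a Fernet UDF. The table, column, and secret scope names are assumptions, the `cryptography` package must be installed on the cluster, and `spark`/`dbutils` are the objects predefined in a Databricks notebook.

```python
from cryptography.fernet import Fernet
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

# Key comes from a secret scope (scope/key names are illustrative); never hardcode it.
key = dbutils.secrets.get(scope="pii", key="fernet-key")

@F.udf(returnType=StringType())
def encrypt_value(value):
    # Construct Fernet per call so only the key string is shipped to executors.
    if value is None:
        return None
    return Fernet(key).encrypt(value.encode("utf-8")).decode("utf-8")

# Encrypt patient_name before it reaches pipelines that don't need the plaintext.
patients = spark.table("main.silver.patients")
protected = patients.withColumn("patient_name", encrypt_value(F.col("patient_name")))
protected.write.mode("overwrite").saveAsTable("main.silver.patients_protected")
```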

Divaker_Soni
by New Contributor III
  • 339 Views
  • 1 reply
  • 4 kudos

Designing Reliable Stream–Stream Joins with Watermarks in Databricks

Stream–stream joins are one of the most powerful features in Databricks Structured Streaming – and also one of the easiest to misconfigure. As soon as you move from simple append-only pipelines to real-time correlations across multiple streams (order...
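For context on what a correctly bounded stream-stream join looks like, here is a minimal sketch. The source tables, column names, 15-minute bounds, and checkpoint path are illustrative assumptions, not taken from the article.

```python
from pyspark.sql import functions as F

# Both sides need a watermark so Spark knows when buffered join state can be dropped.
orders = (
    spark.readStream.table("main.bronze.orders")
    .withWatermark("order_ts", "15 minutes")
    .alias("orders")
)
payments = (
    spark.readStream.table("main.bronze.payments")
    .withWatermark("payment_ts", "15 minutes")
    .alias("payments")
)

# The time-bounded condition is what keeps the join state finite.
joined = orders.join(
    payments,
    F.expr("""
        orders.order_id = payments.order_id AND
        payments.payment_ts BETWEEN orders.order_ts
                                AND orders.order_ts + INTERVAL 15 MINUTES
    """),
    "inner",
)

result = joined.select(
    F.col("orders.order_id").alias("order_id"),
    F.col("orders.order_ts"),
    F.col("payments.payment_ts"),
)

(result.writeStream
    .option("checkpointLocation", "/Volumes/main/default/chk/orders_payments")  # illustrative path
    .toTable("main.silver.orders_with_payments"))
```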

Latest Reply
TejeshS
Contributor
  • 4 kudos

Interesting and very insightful, @Divaker_Soni.

Harun
by Honored Contributor
  • 9084 Views
  • 4 replies
  • 2 kudos

Optimizing Costs in Databricks by Dynamically Choosing Cluster Sizes

Databricks is a popular unified data analytics platform known for its powerful data processing capabilities and seamless integration with Apache Spark. However, managing and optimizing costs in Databricks can be challenging, especially when it comes ...

Latest Reply
mame17
New Contributor II
  • 2 kudos

@Second Reply You're right, just printing out selected_pool isn't enough to actually leverage dynamic cluster sizing at runtime. In practice, the value of selected_pool would feed directly into your Databricks cluster creation API or workflow automati...
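As a rough sketch of that hand-off (the workspace host, token handling, pool IDs, and the shape of the sizing output are all assumptions, not the author's code), the selected pool can be passed straight into the Clusters REST API:

```python
import os
import requests

# Output of the sizing logic discussed in the thread; values are illustrative.
selected_pool = {"instance_pool_id": "pool-small-etl", "num_workers": 2}

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

# Create a cluster from the chosen pool; the node type comes from the pool itself.
resp = requests.post(
    f"{host}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "cluster_name": "dynamic-etl-cluster",
        "spark_version": "15.4.x-scala2.12",
        "instance_pool_id": selected_pool["instance_pool_id"],
        "num_workers": selected_pool["num_workers"],
        "autotermination_minutes": 30,
    },
)
resp.raise_for_status()
print("Created cluster:", resp.json()["cluster_id"])
```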

3 More Replies
Yogesh_Verma_
by Contributor II
  • 306 Views
  • 2 replies
  • 6 kudos

Meta Ads is now a native data source in Databricks Lakeflow Connect

Meta Ads is now a native data source in Databricks. Databricks just announced a Meta Ads connector (Beta) powered by Lakeflow Connect, making it easy to ingest advertising data directly into Databricks: no custom APIs, no CSV exports, no brittle scripts...

Yogesh_Verma__0-1766916082806.png
Latest Reply
Hubert-Dudek
Databricks MVP
  • 6 kudos

I have to test it. The Confluence Lakeflow Connect connector is amazing, so I suspect this one will be too!

1 More Reply
Hubert-Dudek
by Databricks MVP
  • 137 Views
  • 0 replies
  • 2 kudos

Lakebase Use Cases

I am still amazed by Lakebase and all the possible use cases that we can achieve. Integration of Lakebase with the Lakehouse is the innovation of the year. Please read my blog posts to see why it is the best of both worlds. #databricks Read here: https:/...

lakebase_benefits.png
Hubert-Dudek
by Databricks MVP
  • 190 Views
  • 1 reply
  • 2 kudos

Flexible Node Types

Recently, it has not only become difficult to get a quota in some regions, but even if you have one, it doesn't mean that VMs are actually available. With a quota in hand, you may still need to move your bundles to a different subscription when different ...

watch_flexible.png
Latest Reply
Advika
Community Manager
  • 2 kudos

Thanks for sharing, @Hubert-Dudek. This is a common challenge users face, and flexible node types can significantly improve compute launch reliability.

Hubert-Dudek
by Databricks MVP
  • 168 Views
  • 0 replies
  • 1 kudos

DABs: Referencing Your Resources

From hardcoded IDs, through lookups, to finally referencing resources. I think almost everyone, including me, goes through this journey with Databricks Asset Bundles. #databricks In the article below, I look at how to reference a resou...

resources.png
Hubert-Dudek
by Databricks MVP
  • 191 Views
  • 0 replies
  • 1 kudos

Confluence Lakeflow Connector

Incrementally upload data from Confluence. I remember a few times in my life when I spent weeks on this. Now, it is incredible how simple it is to implement with Lakeflow Connect. Additionally, I love the DABs-first approach for connectors,...

confluence.png
Lakshmipriya
by New Contributor III
  • 463 Views
  • 4 replies
  • 4 kudos

From Learning to Enablement: My 2025 Databricks Journey

Celebrating platform capabilities, community impact, and responsible adoption. In 2025, my Databricks journey evolved from mastering features to empowering outcomes. What became clear this year is that Databricks isn't just a powerful platform; it's a ...

Community Articles
CommunityEnablement
DatabricksChampion
DatabricksMVP
DataLeadership
Latest Reply
Louis_Frolio
Databricks Employee
  • 4 kudos

Nice write-up, @Lakshmipriya. I really like this framing. The “Builder” vs “Strategist” distinction maps almost perfectly to how Databricks shows up in the real world. You can move fast and iterate in notebooks, but the same Lakehouse naturally nudg...

3 More Replies
Hubert-Dudek
by Databricks MVP
  • 202 Views
  • 0 replies
  • 1 kudos

Databricks Advent Calendar 2025 #23

Our calendar is coming to an end. One of the most significant innovations of the past year is Agent Bricks, which gave us a few ready-made solutions for deploying agents. As the agent ecosystem becomes more complex, one of my favourites is the Multi-Agent...

2025_23.png
