Community Articles
Dive into a collaborative space where members like YOU can exchange knowledge, tips, and best practices. Join the conversation today and unlock a wealth of collective wisdom to enhance your experience and drive success.

Forum Posts

Hubert-Dudek
by Databricks MVP
  • 828 Views
  • 0 replies
  • 3 kudos

Goodbye community edition, Long live the free edition

I just logged in to the community edition for the last time and spun up the cluster for the last time. Today is the last day, but it's still there. Haven't logged in there for a while, as the free edition offers much more, but it is a place where man...

Divaker_Soni
by Databricks Partner
  • 855 Views
  • 1 replies
  • 2 kudos

Databricks Table Protection Features

This article provides an overview of key Databricks features and best practices that protect Gold tables from accidental deletion. It also covers the implications if both the Gold and Landing layers are deleted without active retention or backup. Cor...

Latest Reply
Sanjeeb2024
Valued Contributor
  • 2 kudos

Thanks for sharing this. Time Travel applies to all tables in Databricks; it is not restricted to Gold.

Hubert-Dudek
by Databricks MVP
  • 563 Views
  • 0 replies
  • 2 kudos

Databricks Asset Bundles Direct Mode

There is a new direct mode in Databricks Asset Bundles: the main difference is that there is no Terraform anymore, just a simple JSON state. It offers a few significant benefits: - No requirement to download Terraform and terraform-provider-databr...

joseph_in_sf
by New Contributor III
  • 1066 Views
  • 1 reply
  • 1 kudos

Version 1.1: Data Isolation and Governance within PySpark DataFrames

See the comments below for a runnable notebook. Throughout my career I have worked at several companies that handle sensitive data, including PII, PHI, EMR, HIPAA, Class I/II/III FOMC - Internal (FR). One entity I worked at even required a Department O...

Latest Reply
SamanthaGivings
New Contributor II
  • 1 kudos

We just failed a HIPAA audit. They asked why our pipelines had patients' names in them if the pipeline didn't need that info, and recommended the data be encrypted. We thought S3 encryption was good enough. We implemented row-level encryption by extending ...
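The point raised in this reply (keep raw PII out of pipelines that don't need it) can be sketched in plain Python. This is only an illustration using deterministic HMAC pseudonymization from the standard library, not the poster's actual row-level encryption code; the `pseudonymize` and `strip_pii` helpers and the field names are invented for the example:

```python
import hmac
import hashlib

def pseudonymize(value: str, key: bytes) -> str:
    """Deterministically tokenize a PII value with a keyed hash (HMAC-SHA256).
    The same value and key always yield the same token, so joins and group-bys
    still work downstream, but the raw name never enters the pipeline."""
    return hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()

def strip_pii(rows, pii_fields, key):
    """Replace PII fields with tokens before rows reach downstream stages."""
    cleaned = []
    for row in rows:
        out = dict(row)
        for field in pii_fields:
            if field in out:
                out[field] = pseudonymize(out[field], key)
        cleaned.append(out)
    return cleaned
```

Unlike at-rest encryption (e.g. on S3), the token replaces the value everywhere in the pipeline, so an auditor never sees the original name in intermediate tables.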

Divaker_Soni
by Databricks Partner
  • 1746 Views
  • 1 reply
  • 4 kudos

Designing Reliable Stream–Stream Joins with Watermarks in Databricks

Stream–stream joins are one of the most powerful features in Databricks Structured Streaming – and also one of the easiest to misconfigure. As soon as you move from simple append-only pipelines to real-time correlations across multiple streams (order...
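As a toy illustration of the watermark semantics the article covers (this models the idea, not Spark's implementation), a small buffer can show how the watermark trails the maximum event time seen by a configured delay, drops late events, and lets buffered join state be evicted; the `WatermarkBuffer` class and its `delay` parameter are invented for this sketch:

```python
from datetime import datetime, timedelta

class WatermarkBuffer:
    """Toy model of Structured Streaming watermark semantics: the watermark
    trails the max event time seen by `delay`; events older than the watermark
    are treated as late and dropped, and buffered state older than the
    watermark is evicted (in Spark, this bounds stream-stream join state)."""

    def __init__(self, delay: timedelta):
        self.delay = delay
        self.max_event_time = None
        self.buffer = []  # events still eligible to join

    @property
    def watermark(self):
        if self.max_event_time is None:
            return None
        return self.max_event_time - self.delay

    def ingest(self, event_time: datetime, payload):
        # Advance the watermark as newer events arrive.
        if self.max_event_time is None or event_time > self.max_event_time:
            self.max_event_time = event_time
        # Late event: it arrived after the watermark passed its timestamp.
        if event_time < self.watermark:
            return False  # dropped
        self.buffer.append((event_time, payload))
        # Evict buffered state the engine no longer needs.
        self.buffer = [(t, p) for (t, p) in self.buffer if t >= self.watermark]
        return True
```

The key trade-off the article alludes to is visible here: a larger `delay` tolerates more late data but keeps more state buffered.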

Latest Reply
TejeshS
Contributor
  • 4 kudos

Interesting and very insightful, @Divaker_Soni.

Harun
by Honored Contributor
  • 10165 Views
  • 4 replies
  • 2 kudos

Optimizing Costs in Databricks by Dynamically Choosing Cluster Sizes

Databricks is a popular unified data analytics platform known for its powerful data processing capabilities and seamless integration with Apache Spark. However, managing and optimizing costs in Databricks can be challenging, especially when it comes ...

Latest Reply
mame17
New Contributor II
  • 2 kudos

@Second Reply You're right, just printing out selected_pool isn't enough to actually leverage dynamic cluster sizing at runtime. In practice, the value of selected_pool would feed directly into your Databricks cluster creation API or workflow automati...
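To make the reply concrete, a minimal sketch of feeding a size selection into a cluster-creation payload might look like the following. The thresholds, pool names, node types, and the `select_pool`/`cluster_spec` helpers are assumptions for illustration only, and the request body should be checked against the Clusters API reference rather than taken as verified:

```python
def select_pool(input_gb: float) -> dict:
    """Pick a cluster size tier from input data volume.
    Thresholds and node types here are illustrative, not recommendations."""
    if input_gb < 10:
        return {"pool": "small", "node_type": "Standard_DS3_v2", "workers": 2}
    if input_gb < 100:
        return {"pool": "medium", "node_type": "Standard_DS4_v2", "workers": 4}
    return {"pool": "large", "node_type": "Standard_DS5_v2", "workers": 8}

def cluster_spec(selected_pool: dict) -> dict:
    """Turn the selection into a payload for cluster creation.
    Field names follow the public create-cluster schema as an assumption;
    verify against the API docs before sending."""
    return {
        "cluster_name": f"etl-{selected_pool['pool']}",
        "node_type_id": selected_pool["node_type"],
        "num_workers": selected_pool["workers"],
    }
```

The payload would then be posted via the Databricks SDK or REST API instead of being printed, which is the gap the reply points out.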

3 More Replies
Yogesh_Verma_
by Contributor II
  • 888 Views
  • 2 replies
  • 6 kudos

Meta Ads is now a native data source in Databricks Lakeflow Connect

Meta Ads is now a native data source in Databricks. Databricks just announced a Meta Ads connector (Beta) powered by Lakeflow Connect, making it easy to ingest advertising data directly into Databricks: no custom APIs, no CSV exports, no brittle scripts...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 6 kudos

Have to test it. The Confluence connector in Lakeflow Connect is amazing, so I suspect this one will be too!

1 More Replies
Hubert-Dudek
by Databricks MVP
  • 387 Views
  • 1 reply
  • 2 kudos

Flexible Node Types

Recently, it has not only become difficult to get a quota in some regions, but even if you have one, it doesn't mean that there are available VMs. Even if you have a quota, you may need to move your bundles to a different subscription when different ...

Latest Reply
Advika
Community Manager
  • 2 kudos

Thanks for sharing, @Hubert-Dudek. This is a common challenge users face, and flexible node types can significantly improve compute launch reliability.

Hubert-Dudek
by Databricks MVP
  • 306 Views
  • 0 replies
  • 1 kudos

DABs: Referencing Your Resources

From hardcoded IDs, through lookups, to finally referencing resources. I think almost everyone, including me, wants to go through such a journey with Databricks Asset Bundles. #databricks In the article below, I am looking at how to reference a resou...

Hubert-Dudek
by Databricks MVP
  • 356 Views
  • 0 replies
  • 1 kudos

Confluence Lakeflow Connector

Incrementally upload data from Confluence. I remember there were a few times in my life when I spent weeks on it. Now, it is incredible how simple it is to implement with Lakeflow Connect. Additionally, I love the DABs-first approach for connectors,...

Lakshmipriya_Na
by New Contributor III
  • 1263 Views
  • 4 replies
  • 4 kudos

From Learning to Enablement: My 2025 Databricks Journey

Celebrating platform capabilities, community impact, and responsible adoption. In 2025, my Databricks journey evolved from mastering features to empowering outcomes. What became clear this year is that Databricks isn't just a powerful platform; it's a ...

Labels: Community Articles, CommunityEnablement, DatabricksChampion, DatabricksMVP, DataLeadership
Latest Reply
Louis_Frolio
Databricks Employee
  • 4 kudos

Nice write-up @Lakshmipriya_Na ,  I really like this framing. The “Builder” vs “Strategist” distinction maps almost perfectly to how Databricks shows up in the real world. You can move fast and iterate in notebooks, but the same Lakehouse naturally n...

3 More Replies
Hubert-Dudek
by Databricks MVP
  • 336 Views
  • 0 replies
  • 1 kudos

Databricks Advent Calendar 2025 #23

Our calendar is coming to an end. One of the most significant innovations of last year is Agent Bricks. We received a few ready-made solutions for deploying agents. As the Agents ecosystem becomes more complex, one of my favourites is the Multi-Agent...

Hubert-Dudek
by Databricks MVP
  • 309 Views
  • 0 replies
  • 1 kudos

Databricks Advent Calendar 2025 #22

During the last two weeks, five new Lakeflow Connect connectors were announced. They allow easy incremental ingestion of data. In the coming weeks, there will be more announcements about Lakeflow Connect, and we can expect Databricks to ...

Hubert-Dudek
by Databricks MVP
  • 304 Views
  • 0 replies
  • 1 kudos

Databricks Advent Calendar 2025 #21

Your stream can have a state, and now, with TransformWithStateInPandas, it’s easy to manage - you can handle things like initial state, deduplication, recovery, etc., with the 2025 improvements.
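A plain-Python toy can show the kind of keyed state (initial state plus deduplication) that the operator described above manages for you inside Structured Streaming; the `dedupe_with_state` helper here is invented for this sketch and is not the Spark API:

```python
def dedupe_with_state(events, initial_seen=None):
    """Toy model of stateful deduplication across a micro-batch.
    `events` is an iterable of (key, payload) pairs; `initial_seen` plays the
    role of bootstrapped initial state (e.g. keys loaded from a batch table).
    Returns the non-duplicate events plus the updated state, which the real
    engine would checkpoint for the next micro-batch."""
    seen = set(initial_seen or ())
    out = []
    for key, payload in events:
        if key in seen:
            continue          # duplicate key: drop the event
        seen.add(key)
        out.append((key, payload))
    return out, seen
```

In Spark, the per-key state store, initial-state handling, timers, and recovery from checkpoints replace the `seen` set and return value in this sketch.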
