Community Articles
Dive into a collaborative space where members like YOU can exchange knowledge, tips, and best practices. Join the conversation today and unlock a wealth of collective wisdom to enhance your experience and drive success.

Forum Posts

Senga98
by New Contributor
  • 129 Views
  • 1 reply
  • 3 kudos

Hadoop Walked So Databricks Could Run

Are you familiar with this scenario: Your data team spends 80% of their time fixing infrastructure issues instead of extracting insights. In today's data-driven world, organisations are drowning in data but starving for actionable insights. Traditiona...

Latest Reply
Khaja_Zaffer
Contributor III

Great one, @Senga98! All the best!

rathorer
by New Contributor III
  • 391 Views
  • 2 replies
  • 5 kudos

API Consumption on Databricks

In this blog, I will be talking about building an architecture to serve API consumption on the Databricks Platform. I will be using the Lakebase approach for this. It will be useful for this kind of API requirement. API Requirement: Performance: Curre...
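For readers who want a concrete starting point before clicking through, here is a minimal sketch of the serving side of such a setup: a low-latency lookup against a Lakebase (Postgres-compatible) instance that a Databricks pipeline keeps fresh. The host, credentials, and the api_results table are illustrative placeholders, not details from the post.

```python
# Minimal sketch: serve pre-computed results from a Lakebase (Postgres-compatible) table.
# Connection details and the api_results table are hypothetical placeholders.
import os
import psycopg2

def fetch_result(entity_id: str):
    """Low-latency point lookup against a table kept fresh by a Databricks pipeline."""
    conn = psycopg2.connect(
        host=os.environ["LAKEBASE_HOST"],        # Lakebase instance endpoint
        dbname=os.environ["LAKEBASE_DB"],
        user=os.environ["LAKEBASE_USER"],
        password=os.environ["LAKEBASE_PASSWORD"],
        sslmode="require",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT payload FROM api_results WHERE entity_id = %s",
                (entity_id,),
            )
            row = cur.fetchone()
            return {"entity_id": entity_id, "payload": row[0]} if row else None
    finally:
        conn.close()
```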

Latest Reply
Advika
Databricks Employee

Sharp design choices, @rathorer! Appreciate you sharing this detailed architecture.

1 More Replies
VamsiDatabricks
by New Contributor
  • 118 Views
  • 0 replies
  • 1 kudos

Validating pointer-based Delta comparison architecture using flatMapGroupsWithState in Structured Streaming

Hi everyone, I'm leading an implementation where we're comparing events from two real-time streams, a Source and a Target, in Databricks Structured Streaming (Scala). Our goal is to identify and emit "delta" differences between corresponding records ...
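The post targets Scala's flatMapGroupsWithState; for comparison, here is a minimal PySpark sketch of the same stateful delta-comparison idea using applyInPandasWithState, the Python analogue available since Spark 3.4. The stream names and the event_id/side/payload columns are illustrative assumptions, not details from the post.

```python
# Minimal sketch: compare Source and Target events per key and emit rows where they differ.
from typing import Iterator, Tuple
import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql.functions import lit
from pyspark.sql.streaming.state import GroupState, GroupStateTimeout

spark = SparkSession.builder.getOrCreate()

# Tag each stream with its side and union them on a common schema (names are placeholders).
source = spark.readStream.table("source_events").withColumn("side", lit("source"))
target = spark.readStream.table("target_events").withColumn("side", lit("target"))
events = source.unionByName(target).select("event_id", "side", "payload")

output_schema = "event_id STRING, source_payload STRING, target_payload STRING"
state_schema = "source_payload STRING, target_payload STRING"

def emit_deltas(
    key: Tuple[str], pdfs: Iterator[pd.DataFrame], state: GroupState
) -> Iterator[pd.DataFrame]:
    # Pull the last-seen payload for each side out of state, then fold in new events.
    src, tgt = state.get if state.exists else (None, None)
    for pdf in pdfs:
        for row in pdf.itertuples(index=False):
            if row.side == "source":
                src = row.payload
            else:
                tgt = row.payload
    state.update((src, tgt))
    # Emit a delta only once both sides are present and the payloads differ.
    if src is not None and tgt is not None and src != tgt:
        yield pd.DataFrame(
            {"event_id": [key[0]], "source_payload": [src], "target_payload": [tgt]}
        )

deltas = events.groupBy("event_id").applyInPandasWithState(
    emit_deltas, output_schema, state_schema, "update", GroupStateTimeout.NoTimeout
)
# deltas.writeStream... would then sink the differences, e.g. to a Delta table.
```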

DavidOBrien
by New Contributor
  • 10828 Views
  • 6 replies
  • 3 kudos

Editing value of widget parameter within notebook code

I have a notebook with a text widget where I want to be able to edit the value of the widget within the notebook and then reference it in SQL code. For example, assuming there is a text widget named Var1 that has input value "Hello", I would want to ...

Latest Reply
Ville_Leinonen
New Contributor II

It seems that the only way to use parameters in a SQL code block is through dbutils.widgets, and you cannot change those parameters without removing the widget and setting it up again in code.
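A minimal sketch of that remove-and-recreate pattern, run in a Databricks notebook where dbutils and spark are predefined; the new value and the query are illustrative.

```python
# A widget's value can't be reassigned in place: remove it, re-create it, then use it in SQL.
dbutils.widgets.remove("Var1")            # drop the existing widget (errors if it was never created)
dbutils.widgets.text("Var1", "Goodbye")   # re-create it with the new value as the default

new_value = dbutils.widgets.get("Var1")   # -> "Goodbye"

# Pass the value into SQL safely via a named parameter marker.
spark.sql("SELECT :v AS greeting", args={"v": new_value}).show()
```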

5 More Replies
TejeshS
by Contributor
  • 2088 Views
  • 6 replies
  • 9 kudos

Attribute-Based Access Control (ABAC) in Databricks Unity Catalog

What Is ABAC and Why Does It Matter? Attribute-Based Access Control (ABAC) is a data governance model now available in Databricks, designed to offer fine-grained, dynamic, and scalable access control for data, AI assets, and files managed through Data...
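For a concrete feel of fine-grained control in Unity Catalog, here is a minimal sketch using a column-mask function, one of the primitives that tag-based ABAC policies build on; the catalog, schema, table, and group names are placeholders.

```python
# Minimal sketch (Databricks notebook): mask a PII column for everyone outside a given
# account group. main.demo.customers and pii_readers are placeholders.
spark.sql("""
  CREATE OR REPLACE FUNCTION main.demo.mask_email(email STRING)
  RETURNS STRING
  RETURN CASE
    WHEN is_account_group_member('pii_readers') THEN email
    ELSE '***REDACTED***'
  END
""")

spark.sql("""
  ALTER TABLE main.demo.customers
  ALTER COLUMN email SET MASK main.demo.mask_email
""")
```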

Latest Reply
cevipu
New Contributor II

Is there already support for Volumes in the Beta?

5 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 411 Views
  • 1 reply
  • 2 kudos

Relationships in Databricks Genie

Now you can also define relationships directly in Genie, with options like "Many to One", "One to Many", "One to One", and "Many to Many". Read more: - https://databrickster.medium.com/relationship-in-databricks-genie-f8bf59a9b578 - https://www.su...

Latest Reply
Advika
Databricks Employee

Thank you for sharing with the Community, @Hubert-Dudek!

Yogesh_Verma_
by Contributor
  • 375 Views
  • 1 reply
  • 3 kudos

Real-Time Mode in Apache Spark Structured Streaming

Real-Time Mode in Spark Streaming: Apache Spark™ Structured Streaming has been the backbone of mission-critical pipelines for years, from ETL to near real-time analytics and machine learning. Now, Databricks has introduced something game-changing: Real...

Latest Reply
Advika
Databricks Employee

And now in Public Preview! Thank you for writing this up, @Yogesh_Verma_.

BS_THE_ANALYST
by Esteemed Contributor II
  • 1464 Views
  • 16 replies
  • 26 kudos

(Episode 1: Getting Data In) - Learning Databricks one brick at a time, using the Free Edition

Episode 1: Getting Data In. Learning Databricks one brick at a time, using the Free Edition. Project Intro: Welcome to everyone reading. My name's Ben, a.k.a. BS_THE_ANALYST, and I'm going to share my experiences as I learn the world of Databricks. My obje...
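For anyone following along, a minimal sketch of the simplest "getting data in" path on the Free Edition: loading a CSV from a Unity Catalog volume into a managed table. The volume path and table name are placeholders, not the episode's actual example.

```python
# Minimal sketch: read a CSV from a Unity Catalog volume and save it as a managed Delta table.
df = (
    spark.read
    .option("header", True)         # first row holds column names
    .option("inferSchema", True)    # let Spark infer column types
    .csv("/Volumes/main/demo/raw/sales.csv")
)

df.write.mode("overwrite").saveAsTable("main.demo.sales")
display(spark.table("main.demo.sales").limit(5))   # display() is a notebook helper
```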

Latest Reply
szymon_dybczak
Esteemed Contributor III

Thank you, @BS_THE_ANALYST, for sharing this. I didn't have much time to read it last week since I was preparing for the Databricks Professional exam, but today I finally had the chance, and I have to say - it's a great article. I really appreciate t...

15 More Replies
BS_THE_ANALYST
by Esteemed Contributor II
  • 389 Views
  • 2 replies
  • 3 kudos

(Teaser CHALLENGE - Community Data Pull) - Upcoming Challenge For the Community

Hey everybody, I've been dying to share this with the community. Over the last few weeks, I've been thinking about how I can do a Data Pull from the Community to highlight some of the cool stuff we all do! Below is a snippet of a visual from the Dat...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor II

I'll aim to have the data for the challenge sorted and ready for next week! I want to strip out some of the columns and figure out where best to host the data. Potentially I could have it on the Databricks Marketplace or GitHub. All the best, BS

1 More Replies
savlahanish27
by New Contributor II
  • 676 Views
  • 1 reply
  • 2 kudos

9 Powerful 🚀 Spark Optimization Techniques in Databricks (With Real Examples)

Introduction: One of our ETL pipelines used to take 10 hours to complete. After tuning and scaling in Databricks, it finished in just about 1 hour, a 90% reduction in runtime. That's the power of Spark tuning. Databricks, built on Apache Spark, is a po...
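The full list is behind the link, but two of the most common quick wins look roughly like this; the table names are placeholders, and the article's own nine techniques may differ.

```python
# Illustrative sketch of two common Spark tuning levers on Databricks.
from pyspark.sql.functions import broadcast

# 1) Adaptive Query Execution: let Spark re-optimize joins and partitions at runtime.
spark.conf.set("spark.sql.adaptive.enabled", "true")
spark.conf.set("spark.sql.adaptive.coalescePartitions.enabled", "true")

# 2) Broadcast the small dimension table to avoid a shuffle-heavy sort-merge join.
facts = spark.table("main.demo.sales_facts")
dims = spark.table("main.demo.product_dim")        # small lookup table
joined = facts.join(broadcast(dims), "product_id")
joined.write.mode("overwrite").saveAsTable("main.demo.sales_enriched")
```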

Latest Reply
Advika
Databricks Employee

This is a fantastic breakdown of Spark optimization techniques, @savlahanish27! Definitely helpful for anyone working on performance tuning in Databricks.

BS_THE_ANALYST
by Esteemed Contributor II
  • 445 Views
  • 2 replies
  • 12 kudos

(Episode 2: Reading Excel Files) - Learning Databricks one brick at a time, using the Free Edition

Episode 2: Reading Excel Files. Learning Databricks one brick at a time, using the Free Edition. You can download the accompanying Notebook and Excel files used in the demonstration over on my GitHub: Excel Files & Notebook: https://github.com/BSanalyst...

Latest Reply
Pilsner
Contributor III

@BS_THE_ANALYST that final snippet of code looks very clean! I saw that sheet_name=None part and was a bit confused why you'd written that, as I assumed that was just the default. Turns out the default is sheet_name=0, which is simply the firs...
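For anyone who hit the same confusion, a minimal pandas sketch of the difference; the file path and sheet names are placeholders.

```python
# sheet_name=0 (the default) returns only the first sheet as a DataFrame;
# sheet_name=None returns every sheet as a dict of DataFrames keyed by sheet name.
# Reading .xlsx needs the openpyxl package.
import pandas as pd

first_sheet = pd.read_excel("/Volumes/main/demo/raw/sales.xlsx")                  # sheet_name=0
all_sheets = pd.read_excel("/Volumes/main/demo/raw/sales.xlsx", sheet_name=None)  # dict of sheets

print(type(first_sheet))        # <class 'pandas.core.frame.DataFrame'>
print(list(all_sheets.keys()))  # e.g. ['Jan', 'Feb', 'Mar']
```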

1 More Replies
prinkan_intugle
by New Contributor II
  • 343 Views
  • 0 replies
  • 4 kudos

Semantic Modelling and Data Products using Agent

Hey Community! We have built something cool for Data Engineers in Databricks! Raw Files -> Semantic Model -> Data Products, without writing ETL/ELT code. Demo/Guide - https://youtu.be/wjQYXrBwA-o Notebook - https://github.com/Intugle/data-tools/blob/main...

Brahmareddy
by Esteemed Contributor
  • 556 Views
  • 2 replies
  • 7 kudos

Data Quality at Scale: My Experience Using Databricks and AWS

Over the past few years working as a data engineer, I've seen how quickly companies are moving their platforms to Databricks and AWS. The flexibility and scale these platforms provide are amazing, but one challenge comes up again and again: ho...

Latest Reply
saurabh18cs
Honored Contributor II

Hi @Brahmareddy, very good insights. I can summarize this as follows:
  • Schema Management: Define schemas in JSON/YAML, enforce with Delta Lake
  • Governance: Use Unity Catalog for access, lineage, and ownership
  • Monitoring: Set up Lakeho...
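A minimal sketch of the schema-management point above, assuming the schema is kept in a JSON file and enforced when writing to Delta; the config path, source path, and table name are placeholders.

```python
# Minimal sketch: load the expected schema from JSON, apply it on read, and let
# Delta's schema enforcement protect the target table on write.
import json
from pyspark.sql.types import StructType

with open("/Volumes/main/demo/config/orders_schema.json") as f:
    schema = StructType.fromJson(json.load(f))

orders = spark.read.schema(schema).json("/Volumes/main/demo/raw/orders/")

# Delta rejects appends whose schema does not match the existing table.
orders.write.format("delta").mode("append").saveAsTable("main.demo.orders")
```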

1 More Replies
Purvansh
by New Contributor III
  • 4345 Views
  • 4 replies
  • 6 kudos

From Pilots to Production: Unlocking Enterprise AI Agents with Agent Bricks

Many enterprises today launch AI agents with high hopes, but more often than not, those pilots never reach production. The culprit? Complexity, poor evaluations, ballooning costs, and governance gaps. Why do so many AI agent pilots never make it to pro...

Latest Reply
Joneslara
New Contributor II

Agent Bricks looks solid for scaling AI, and I’ve seen platforms like Agentra.io also tackle the enterprise workflow side of this challenge.

3 More Replies

Join Us as a Local Community Builder!

Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!

Sign Up Now