Community Articles
Dive into a collaborative space where members like YOU can exchange knowledge, tips, and best practices. Join the conversation today and unlock a wealth of collective wisdom to enhance your experience and drive success.

Forum Posts

kanikvijay9
by New Contributor III
  • 431 Views
  • 2 replies
  • 10 kudos

Optimizing Delta Table Writes for Massive Datasets in Databricks

Problem Statement: In one of my recent projects, I faced a significant challenge: writing a huge dataset of 11,582,763,212 rows and 2,068 columns to a Databricks managed Delta table. The initial write operation took 22.4 hours using the following setup:...

Latest Reply
kanikvijay9
New Contributor III

Hey @Louis_Frolio, thank you for the thoughtful feedback and great suggestions! A few clarifications: AQE is already enabled in my setup, and it definitely helped reduce shuffle overhead during the write. Regarding Column Pruning, in this case, the fina...

1 More Replies
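The tuning discussion in this thread maps to a handful of concrete knobs. A minimal, hedged sketch follows: the option names below are real Delta table properties and Spark settings, but the values are illustrative only, and whether they help a 11.5B-row, 2,068-column write depends entirely on the workload.

```python
# Hedged sketch: settings commonly discussed for large Delta writes.
# Values are illustrative placeholders, not a recipe.
write_tuning = {
    # Delta table properties: coalesce small files during/after writes.
    "delta.autoOptimize.optimizeWrite": "true",
    "delta.autoOptimize.autoCompact": "true",
    # Spark conf: let AQE pick shuffle partition counts at runtime.
    "spark.sql.adaptive.enabled": "true",
}

def tbl_properties_clause(props: dict) -> str:
    """Render a TBLPROPERTIES (...) clause for a CREATE/ALTER statement."""
    body = ", ".join(f"'{k}' = '{v}'" for k, v in props.items())
    return f"TBLPROPERTIES ({body})"
```

On a cluster, the rendered clause would be appended to a `CREATE TABLE` or `ALTER TABLE` statement and run via `spark.sql(...)`.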
MandyR
by Community Manager
  • 250 Views
  • 0 replies
  • 2 kudos

Another BrickTalks! Let's talk about bringing data intelligence from your Lakehouse into every app!

You asked, we delivered! Another BrickTalk is scheduled for Thursday, Nov 13 @ 9 AM PT with Pranav Aurora on how to bring data intelligence from your Lakehouse into every app and user, seamlessly and in real time. What you’ll learn: Use Lakebase (Po...

MandyR
by Community Manager
  • 433 Views
  • 3 replies
  • 11 kudos

Community Fellows: Shout Out to our Bricksters!

At Databricks, our Community members deserve to get a great experience in our forums, with quality answers from the experts. Who better to help out our customers than Databricks employees aka Bricksters! To work towards this goal, we created the Comm...

Latest Reply
bidek56
Contributor

Kudos to the DB team for keeping up with the community, but can you please work on your product as well? We are experiencing a lot of issues with your paid product: failures, crashes, slow starts, slow performance, and the list goes on. Community wo...

2 More Replies
Coffee77
by Contributor III
  • 290 Views
  • 1 replies
  • 1 kudos

How to create clusters in Databricks step by step | All-Purpose, Jobs Compute, SQL Warehouses, and Pools

Recently, having some fun with Databricks, I created a series of videos in Spanish that I'd like to share here. I hope some of them are interesting for the Spanish or LATAM community. Not sure if this is the most appropriate board to share them, or there is ano...

Latest Reply
Coffee77
Contributor III

Added a new video on creating serverless clusters for notebooks, jobs, and DLT pipelines: https://youtu.be/RQvkssryjyQ?si=BkYI831mUK1vBE20

TejeshS
by Contributor
  • 2644 Views
  • 3 replies
  • 9 kudos

Building a Metadata Table-Driven Framework Using LakeFlow Declarative (Formerly DLT) Pipelines

Introduction: Scaling data pipelines across an organization can be challenging, particularly when data sources, requirements, and transformation rules are always changing. A metadata table-driven framework using LakeFlow Declarative (formerly DLT) enab...

Latest Reply
NageshPatil
New Contributor II

Helpful article, @TejeshS. I have a question: if I want to pass parameters from my workflow to the pipeline, is it possible? If yes, what would be the best approach?

2 More Replies
BS_THE_ANALYST
by Esteemed Contributor III
  • 3252 Views
  • 17 replies
  • 29 kudos

(Episode 1: Getting Data In) - Learning Databricks one brick at a time, using the Free Edition

Episode 1: Getting Data In. Learning Databricks one brick at a time, using the Free Edition. Project Intro: Welcome to everyone reading. My name’s Ben, a.k.a. BS_THE_ANALYST, and I’m going to share my experiences as I learn the world of Databricks. My obje...

Latest Reply
Coffee77
Contributor III

Really interesting post, @BS_THE_ANALYST. Catching up with Databricks stuff again!

16 More Replies
jsdmatrix
by Databricks Employee
  • 280 Views
  • 0 replies
  • 1 kudos

SQL Scripting in Apache Spark™ 4.0

Apache Spark™ 4.0 introduces a new feature for SQL developers and data engineers: SQL Scripting. This feature enhances the power and extends the flexibility of Spark SQL, enabling users to write procedural code within SQL queries, with t...

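The post above announces SQL Scripting in Spark 4.0. As a hedged illustration only, here is the kind of compound statement the feature enables, held in a Python string; the sketch follows the ANSI SQL/PSM style the feature is based on, and exact syntax details (e.g. variable assignment) should be verified against the Spark 4.0 documentation.

```python
# Hedged sketch of a Spark 4.0 SQL script: a BEGIN ... END compound
# statement with declared variables and a WHILE loop. Syntax follows the
# ANSI SQL/PSM style; verify details against the Spark 4.0 docs.
sql_script = """
BEGIN
  DECLARE total INT DEFAULT 0;
  DECLARE i INT DEFAULT 1;
  WHILE i <= 3 DO
    SET total = total + i;
    SET i = i + 1;
  END WHILE;
  SELECT total;
END
"""
```

On a Spark 4.0 cluster the whole script would be submitted as a single `spark.sql(sql_script)` call, with the loop accumulating 1 + 2 + 3 into `total`.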
BS_THE_ANALYST
by Esteemed Contributor III
  • 1520 Views
  • 6 replies
  • 14 kudos

(Episode 3: Hands-on API Project) - Learning Databricks one brick at a time, using the Free Edition

Episode 3: APIs. Learning Databricks one brick at a time, using the Free Edition. Project Intro: Welcome to everyone reading. My name’s Ben, a.k.a. BS_THE_ANALYST, and I’m going to share my experiences as I learn the world of Databricks. My objective is to...

Latest Reply
JoyO
New Contributor II

This is great, thanks for sharing Ben, will share with my data community.

5 More Replies
BS_THE_ANALYST
by Esteemed Contributor III
  • 1394 Views
  • 3 replies
  • 16 kudos

(Episode 2: Reading Excel Files) - Learning Databricks one brick at a time, using the Free Edition

Episode 2: Reading Excel Files. Learning Databricks one brick at a time, using the Free Edition. You can download the accompanying Notebook and Excel files used in the demonstration over on my GitHub. Excel Files & Notebook: https://github.com/BSanalyst...

Latest Reply
SHIFTY
Contributor II

Thanks for this, @BS_THE_ANALYST.  Hugely beneficial.

2 More Replies
Hubert-Dudek
by Databricks MVP
  • 356 Views
  • 0 replies
  • 1 kudos

Migrate External Tables to Managed

With managed tables, you can reduce your storage and compute costs thanks to predictive optimization or file list caching. It is now really time to migrate external tables to managed ones, thanks to the ALTER SET MANAGED functionality. Read more: - h...

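A minimal sketch of the statement the post refers to: the helper below just builds the SQL string, and the three-level table name is a hypothetical placeholder. Check the Databricks documentation for preconditions (Unity Catalog, supported table formats) before running it against a real table.

```python
# Hedged sketch: build the migration statement the post refers to.
# The three-level name used below is a hypothetical placeholder.
def alter_set_managed(table_name: str) -> str:
    """Return the SQL that converts an external UC table to managed."""
    return f"ALTER TABLE {table_name} SET MANAGED"

stmt = alter_set_managed("main.sales.orders_ext")
# On Databricks you would execute it with: spark.sql(stmt)
```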
Senga98
by Contributor
  • 505 Views
  • 1 replies
  • 4 kudos

Hadoop Walked So Databricks Could Run

Are you familiar with this scenario: your data team spends 80% of their time fixing infrastructure issues instead of extracting insights. In today’s data-driven world, organisations are drowning in data but starving for actionable insights. Traditiona...

Latest Reply
Khaja_Zaffer
Contributor III

Great one!! @Senga98  All the best!

VamsiDatabricks
by New Contributor II
  • 292 Views
  • 0 replies
  • 1 kudos

Validating pointer-based Delta comparison architecture using flatMapGroupsWithState in Structured St

Hi everyone, I’m leading an implementation where we’re comparing events from two real-time streams, a Source and a Target, in Databricks Structured Streaming (Scala). Our goal is to identify and emit “delta” differences between corresponding records ...

DavidOBrien
by New Contributor
  • 12203 Views
  • 6 replies
  • 3 kudos

Editing value of widget parameter within notebook code

I have a notebook with a text widget where I want to be able to edit the value of the widget within the notebook and then reference it in SQL code. For example, assuming there is a text widget named Var1 that has input value "Hello", I would want to ...

Latest Reply
Ville_Leinonen
New Contributor II

It seems that the only way to use parameters in a SQL code block is via dbutils.widgets, and you cannot change those parameters without removing the widget and setting it up again in code.

5 More Replies
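The remove-and-recreate pattern described in the latest reply can be sketched as follows. `dbutils` exists only inside a Databricks notebook, so it is passed in explicitly here to keep the helper plain Python; the widget name and values match the example in the question.

```python
# Hedged sketch of the remove-and-recreate pattern from the reply above.
# `dbutils` is only available in a Databricks notebook; it is passed in
# explicitly so this helper stays plain, testable Python.
def reset_text_widget(dbutils, name: str, new_value: str) -> None:
    """Change a text widget's value by removing and recreating it."""
    dbutils.widgets.remove(name)           # drop the existing widget
    dbutils.widgets.text(name, new_value)  # recreate with the new default

# In a notebook: reset_text_widget(dbutils, "Var1", "World"),
# then reference the widget from SQL in the usual way.
```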
TejeshS
by Contributor
  • 3610 Views
  • 6 replies
  • 9 kudos

Attribute-Based Access Control (ABAC) in Databricks Unity Catalog

What Is ABAC and Why Does It Matter? Attribute-Based Access Control (ABAC) is a data governance model now available in Databricks, designed to offer fine-grained, dynamic, and scalable access control for data, AI assets, and files managed through Data...

Latest Reply
cevipu
New Contributor II

Is there already support in Beta for Volumes?

5 More Replies

Join Us as a Local Community Builder!

Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!
