Level Up with Databricks Specialist Sessions

How to Register & Prepare: If you're interested in advancing your skills with Databricks through a Specialist Session, here's a clear guide on how to register and which free courses you can take to prepare effectively. How to Begin Your Learning Path S...

  • 1176 Views
  • 0 replies
  • 3 kudos
3 weeks ago
The next BrickTalks session, covering the latest and greatest in AI/BI, is scheduled for Oct 28!

Hello all! The next BrickTalks session has been scheduled! In case you missed the last one, BrickTalks are live, expert-led sessions that give Databricks Community members the opportunity to interact with our Bricksters, ask questions and get to ...

  • 198 Views
  • 1 reply
  • 4 kudos
Monday
BrickCon 2025 — Dec 3–5 | A Community Conference for Databricks Builders

  BrickCon 2025 | Dec 3–5, 2025: A community-driven conference built by and for Databricks users around the world. Get ready for an unforgettable experience at BrickCon 2025 — the first-ever community-led Databricks conference bringing together da...

  • 931 Views
  • 3 replies
  • 4 kudos
2 weeks ago
Solution Accelerator Series | #5 - Automating Product Review Summarization with LLMs

Stay on top of customer feedback at scale! The digital world makes it easier than ever for customers to share their opinions, and each review can shape your business decisions. With Large Language Models (LLMs), you can quickly extract and summarize ...

  • 115 Views
  • 1 reply
  • 2 kudos
Tuesday
🚀 Weekly Delta (8 - 14 October): A Look Back at This Week’s Top Community Highlights

We kept the October momentum going strong — with vibrant technical discussions, creative experiments, and deep dives into AI, streaming, and governance. Here’s a look at what stood out this week in the Databricks Community. Fresh Content & Articles...

  • 809 Views
  • 1 reply
  • 5 kudos
a week ago
🌟 Community Sparks of the Week | September 26 – October 2 🌟

As we close out September and step into October, it’s incredible to see how our top contributors continue to lead with consistency, clarity, and collaboration. Week after week, they’ve stayed at the forefront of discussions, setting the tone for know...

  • 1547 Views
  • 4 replies
  • 9 kudos
3 weeks ago
Virtual Learning Festival: 10 October - 31 October 2025

Learn, Level Up, and Get Rewarded! Save the dates: October 10 – October 31, 2025! We're celebrating learning, growth, and career advancement with an exciting opportunity to level up in data engineering, data analysis, machine learning, and genera...

  • 168336 Views
  • 390 replies
  • 116 kudos
08-07-2025

Community Activity

spearitchmeta
by Contributor
  • 0 Views
  • 0 replies
  • 0 kudos

How does Databricks AutoML handle null imputation for categorical features by default?

Hi everyone, I’m using Databricks AutoML (classification workflow) on Databricks Runtime 10.4 LTS ML+, and I’d like to clarify how missing (null) values are handled for categorical (string) columns by default. From the AutoML documentation, I see that:...

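A library-free sketch of the pattern the question is about may help frame answers: one common strategy for categorical nulls is to treat "missing" as its own category before encoding. Whether AutoML applies exactly this by default (versus, say, most-frequent imputation) is precisely what the thread asks, so the code below is illustrative only, with hypothetical names:

```python
# Minimal sketch of one common strategy for categorical nulls: replace None
# with a sentinel value so that "missing" becomes its own category before
# encoding. Whether Databricks AutoML does exactly this by default is the
# open question in the thread; this code only illustrates the pattern.
def impute_categorical(values, placeholder="__missing__"):
    """Replace None entries with a sentinel category."""
    return [placeholder if v is None else v for v in values]

colors = ["red", None, "blue", None]
print(impute_categorical(colors))  # ['red', '__missing__', 'blue', '__missing__']
```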
nickv
by New Contributor II
  • 22 Views
  • 2 replies
  • 0 kudos

Multilingual embedding foundation model request

Are there any plans to offer a foundation model with multilingual support? Llama embed nemotron 8b or qwen3-embedding 8b would help a lot for users who process data in languages other than English.

Latest Reply
nickv
New Contributor II
  • 0 kudos

Sorry I wasn't clear enough in my question: I mean embedding models specifically. 

1 More Replies
gudurusreddy99
by Visitor
  • 8 Views
  • 0 replies
  • 0 kudos

Databricks DLT Joins: Streaming table join with Delta table is reading 2 Billion records per batch

Databricks DLT joins: a streaming table join with a Delta table is reading 2 billion records from the Delta table for each and every micro-batch. How can we avoid reading 2 billion records on every micro-batch? Your suggestions and feedback w...

Sujitha
by Databricks Employee
  • 115 Views
  • 1 reply
  • 3 kudos

Solution Accelerator Series | #5 - Automating Product Review Summarization with LLMs

Stay on top of customer feedback at scale! The digital world makes it easier than ever for customers to share their opinions, and each review can shape your business decisions. With Large Language Models (LLMs), you can quickly extract and summarize ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 3 kudos

Thanks for sharing, @Sujitha. It seems like a perfect use case for LLMs.

fjrodriguez
by New Contributor III
  • 372 Views
  • 2 replies
  • 1 kudos

Resolved! Ingestion Framework

I would like to update my ingestion framework, which is orchestrated by ADF, running a couple of Databricks notebooks and copying the data to the DB afterwards. I want to rely entirely on Databricks, and I thought this could be the design: Step 1. Expose target t...

Latest Reply
fjrodriguez
New Contributor III
  • 1 kudos

Hey @saurabh18cs, it is taking longer than expected to expose Azure SQL tables in UC. I can do that through a Foreign Catalog, but that is not what I want because it is read-only. As far as I can see, an external connection is for cloud object storage paths (ADLS...

1 More Replies
nodeb
by New Contributor
  • 81 Views
  • 2 replies
  • 0 kudos

Azure Databricks Control Plane connectivity issue after migrating to vWAN

Hello everyone, recently I received a client request to migrate our Azure Databricks environment from a Hub-and-Spoke architecture to a vWAN Hub architecture with an NVA (Network Virtual Appliance). Here’s a quick overview of the setup: The Databricks ...

Latest Reply
nodeb
New Contributor
  • 0 kudos

The problem is fixed.

1 More Replies
Barnita
by New Contributor III
  • 17 Views
  • 4 replies
  • 1 kudos

How to run Black code formatting on notebooks using custom configurations in the UI

Hi all, I’m currently exploring how we can format notebook code using Black (installed via libraries) with specific configurations. I understand that we can configure Black locally using a pyproject.toml file. However, I’d like to know if there’s a way...

Latest Reply
Barnita
New Contributor III
  • 1 kudos

Hi @szymon_dybczak, thanks for your response. My team has been using the same setup you mentioned. I’d like to know if there’s a way to override the default configuration that Black uses in a cluster environment — for example, adjusting the line-leng...

3 More Replies
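For context on the thread above: when run locally, Black reads its settings from a pyproject.toml at the project root; the unanswered part is how to make a cluster honor the same overrides. A minimal sketch of the local configuration (the 120 value is illustrative):

```toml
# pyproject.toml -- Black reads the [tool.black] section when invoked
# from within the project directory.
[tool.black]
line-length = 120
```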
saicharandeepb
by New Contributor III
  • 96 Views
  • 4 replies
  • 1 kudos

How to Retrieve DBU Count per Compute Type for Accurate Cost Calculation?

Hello everyone, we are currently working on a cost analysis initiative to gain deeper insights into our Databricks usage. As part of this effort, we are trying to calculate the hourly cost of each Databricks compute instance by utilizing the Azure Ret...

Latest Reply
saicharandeepb
New Contributor III
  • 1 kudos

Hi everyone, just to clarify my question — I’m looking for the DBU count per compute type (per instance type), not the total DBU consumption per workload. In other words, I want to know the fixed DBU rate assigned to each compute SKU (for example, DS3...

3 More Replies
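The distinction the question draws can be made concrete with a one-line calculation: each compute SKU carries a fixed DBU-per-hour rate, and hourly cost is that rate times the per-DBU price. All numbers below are placeholders, not real Azure Databricks rates:

```python
# Illustrative only: every compute SKU has a fixed DBU-per-hour rate; combine
# it with a contract price per DBU to get an hourly cost. The 0.75 rate and
# 0.55 price used here are placeholders, not real SKU values.
def hourly_cost(dbu_rate_per_hour, price_per_dbu):
    """Hourly compute cost = fixed DBU rate x negotiated price per DBU."""
    return dbu_rate_per_hour * price_per_dbu

print(round(hourly_cost(0.75, 0.55), 4))  # 0.4125
```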
Jpeterson
by New Contributor III
  • 5479 Views
  • 8 replies
  • 4 kudos

Databricks SQL Warehouse, Tableau and spark.driver.maxResultSize error

I'm attempting to create a Tableau extract on Tableau Server with a connection to a large Databricks SQL warehouse. The extract process fails due to a spark.driver.maxResultSize error. Using a Databricks interactive cluster in the Data Science & Engineer...

Latest Reply
Oliverarson
  • 4 kudos

It sounds like you're running into quite a frustrating issue with Databricks and Tableau! Adjusting the spark.driver.maxResultSize is a good idea, but if you're still facing challenges, consider streamlining your data selections or aggregating your r...

7 More Replies
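For readers hitting the same error: spark.driver.maxResultSize caps the total serialized size of results collected to the driver. On an interactive cluster it can be raised in the cluster's Spark config; the 8g value below is illustrative, and whether a managed SQL warehouse permits overriding it is a separate question:

```
spark.driver.maxResultSize 8g
```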
AlbertWang
by Valued Contributor
  • 2591 Views
  • 1 reply
  • 1 kudos

Can I Replicate Azure Document Intelligence's Custom Table Extraction in Databricks?

I am using Azure Document Intelligence to get data from a table in a PDF file. The table's headers do not visually align with the values; therefore, the standard and pre-built models cannot correctly read the data. I have built a custom-trained Azure ...

Latest Reply
dkushari
Databricks Employee
  • 1 kudos

Hi @AlbertWang, you can easily achieve this using Agent Bricks - Information Extraction. Your PDFs will be converted to text using the ai_parse_document function and saved in a Databricks table. You can then create the agent using that text table to ge...

Karthik_Karanm
by New Contributor III
  • 840 Views
  • 1 reply
  • 0 kudos

Permission Denied for Genie Auto-Generated Service Principal on SQL Endpoint in Playground

Hi community, see "Use Genie in multi-agent systems" in the Databricks documentation. I’ve developed a multi-agent Genie in Databricks and integrated it with vector indexes. The setup works fine during model logging and prediction. The system successfully register...

Latest Reply
dkushari
Databricks Employee
  • 0 kudos

Hi @Karthik_Karanm - can you make sure to add the Genie space to the resources, as mentioned in the TODO of the cell? To enable automatic authentication, specify the dependent Databricks resources when calling mlflow.pyfunc.log_model(). TODO: If your Unity Ca...

janglais
by Visitor
  • 25 Views
  • 0 replies
  • 0 kudos

DLT Pipeline with unknown deleted source data

Hello, I need help. The context: ERP data for a company in my group is stored in SQL tables. Currently, once per day we copy the last 2 months of data (by creation date) from each table into our data lake landing zone (we can, however, do full cop...

Mits11
by New Contributor
  • 28 Views
  • 0 replies
  • 0 kudos

Community edition cluster - UI shows incorrect cores

Hi, I am a Community Edition user, which gives me a cluster (as per the image below) with 15 GB of memory and 2 cores, with one driver node only. However, when I read a CSV file of 181 MB: 1) it generates 8 partitions. As per the default, maxPartitionBytes is set to 12...

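One possible explanation for the 8 partitions, sketched from Spark's file-split sizing logic (FilePartition.maxSplitBytes): the split size is capped not only by spark.sql.files.maxPartitionBytes (128 MB by default) but also by a bytes-per-core term that depends on the default parallelism. If the runtime actually exposes 8 task slots rather than the 2 cores the UI shows, the arithmetic yields exactly 8 partitions. This is a sketch under those assumptions, not a diagnosis:

```python
import math

def expected_partitions(total_mb, num_files, default_parallelism,
                        max_partition_mb=128, open_cost_mb=4):
    """Mirror of Spark's file-split sizing (in MB, simplified for one table):
    split size = min(maxPartitionBytes, max(openCostInBytes, bytesPerCore))."""
    bytes_per_core = (total_mb + num_files * open_cost_mb) / default_parallelism
    max_split = min(max_partition_mb, max(open_cost_mb, bytes_per_core))
    return math.ceil(total_mb / max_split)

# A single 181 MB CSV file:
print(expected_partitions(181, 1, default_parallelism=8))  # 8
print(expected_partitions(181, 1, default_parallelism=2))  # 2
```

Under these defaults, 8 slots give a ~23 MB split size (hence 8 partitions), while 2 slots give ~92 MB (hence 2), which is why the observed partition count hints at the true parallelism behind the UI.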
Rjdudley
by Honored Contributor
  • 273 Views
  • 3 replies
  • 0 kudos

Resolved! AUTO CDC API and sequence column

The docs for the AUTO CDC API state: "You must specify a column in the source data on which to sequence records, which Lakeflow Declarative Pipelines interprets as a monotonically increasing representation of the proper ordering of the source data." Can this ...

Latest Reply
Rjdudley
Honored Contributor
  • 0 kudos

Thanks Szymon, I'm familiar with the PostgreSQL implementation and was hoping Databricks would behave the same.

2 More Replies
ankit001mittal
by New Contributor III
  • 1852 Views
  • 1 reply
  • 2 kudos

DLT schema evolution/changes in the logs

Hi all, I want to figure out how to find when schema evolution/changes are happening in the objects in DLT pipelines, through the DLT logs. Could you please share some sample DLT logs that explain the schema changes? Thank you for your help.

Latest Reply
mark_ott
Databricks Employee
  • 2 kudos

To find when schema evolution or changes are happening in objects within DLT (Delta Live Tables) pipelines, you need to monitor certain entries within the DLT logs or Delta transaction logs that signal modifications to the underlying schema of a table...

Welcome to the Databricks Community!

Once you are logged in, you will be ready to post content, ask questions, participate in discussions, earn badges and more.

Spend a few minutes exploring Get Started Resources, Learning Paths, Certifications, and Platform Discussions.

Connect with peers through User Groups and stay updated by subscribing to Events. We are excited to see you engage!

Join Us as a Local Community Builder!

Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!

Sign Up Now
Top Kudoed Authors

Latest from our Blog

IoT Ingestion Using Append Flows

IoT (Internet-of-Things) devices are disrupting the manufacturing industry. While estimates vary, most sources agree that there are tens of billions of IoT devices currently operating in manufacturing...

  • 608 Views
  • 3 kudos