Community Articles
Dive into a collaborative space where members like YOU can exchange knowledge, tips, and best practices. Join the conversation today and unlock a wealth of collective wisdom to enhance your experience and drive success.

Forum Posts

Hubert-Dudek
by Databricks MVP
  • 352 Views
  • 0 replies
  • 0 kudos

Databricks Advent Calendar 2025 #14

Ingestion from SharePoint is now available directly in PySpark. Just define a connection and use spark.read or, even better, spark.readStream with Auto Loader, then specify the file type and the options for that file type (PDF, CSV, Excel, etc.).
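For illustration, here is a minimal sketch of what that read could look like, assuming a Unity Catalog connection to SharePoint already exists; the connection option key, the sharepoint:// path, and the table names below are placeholders, not confirmed API.

```python
# Hedged sketch: stream files from a SharePoint folder with Auto Loader.
# The connection option key and the sharepoint:// path scheme are assumptions
# for illustration only.
df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "csv")                   # or pdf, xlsx, json, ...
    .option("databricks.connection", "sharepoint_conn")   # assumed option key
    .option("header", "true")
    .load("sharepoint://sites/finance/Shared Documents/reports/")  # assumed path scheme
)

(
    df.writeStream
    .option("checkpointLocation", "/Volumes/main/raw/_checkpoints/sp_reports")
    .trigger(availableNow=True)
    .toTable("main.raw.sp_reports")
)
```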

Hubert-Dudek
by Databricks MVP
  • 384 Views
  • 0 replies
  • 0 kudos

Databricks Advent Calendar 2025 #13

ZeroBus changes the game: you can now push event data directly into Databricks, even from on-prem. No extra event layer needed. Every Unity Catalog table can act as an endpoint.

Hubert-Dudek
by Databricks MVP
  • 378 Views
  • 0 replies
  • 1 kudos

Databricks Advent Calendar 2025 #12

All leading LLMs are available natively in Databricks:
  • ChatGPT 5.2 from the day of its premiere!
  • The system catalog's AI schema in Unity Catalog has multiple LLMs ready to serve!
  • OpenAI, Gemini, and Anthropic are available side by side!
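For illustration, a minimal sketch of calling one of these served models from PySpark with the built-in ai_query() SQL function; the endpoint name and the source table are placeholders, and the actual model identifiers in your workspace may differ.

```python
# Hedged sketch: score rows with a served LLM via ai_query().
# 'databricks-claude-sonnet' and main.analytics.product_reviews are
# illustrative placeholders; check your serving endpoints for the real names.
scored = spark.sql("""
    SELECT
      review,
      ai_query(
        'databricks-claude-sonnet',
        CONCAT('Classify the sentiment of this review as positive, neutral, or negative: ', review)
      ) AS sentiment
    FROM main.analytics.product_reviews
    LIMIT 20
""")
scored.show(truncate=False)
```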

Hubert-Dudek
by Databricks MVP
  • 373 Views
  • 0 replies
  • 2 kudos

Databricks Advent Calendar 2025 #10

Databricks goes native on Excel. You can now ingest and query .xls/.xlsx files directly in Databricks (SQL and PySpark, batch and streaming), with automatic schema and type inference, sheet and cell-range targeting, and evaluated formulas; no extra libraries needed.
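A hedged sketch of what such a native Excel read could look like in PySpark; the "excel" format name and the sheet/range option keys are assumptions borrowed from common Excel readers, not confirmed option names.

```python
# Hedged sketch: read one sheet and cell range from an .xlsx file.
# The format name and the sheetName/dataAddress option keys are assumptions.
budget = (
    spark.read
    .format("excel")
    .option("sheetName", "Q4")          # target a specific sheet (assumed key)
    .option("dataAddress", "A1:F200")   # cell-range targeting (assumed key)
    .option("header", "true")
    .option("inferSchema", "true")
    .load("/Volumes/main/raw/finance/budget_2025.xlsx")
)
budget.printSchema()
```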

Hubert-Dudek
by Databricks MVP
  • 301 Views
  • 0 replies
  • 2 kudos

Databricks Advent Calendar 2025 #9

Tags, whether assigned manually or automatically by the “data classification” service, can be protected using policies. Column masking can automatically mask columns carrying a given tag for everyone except users with elevated access.
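For context, this is roughly what a single column mask looks like in Unity Catalog today; a tag-driven policy as described above would attach an equivalent mask automatically to every column carrying the tag. All catalog, table, and group names below are illustrative.

```python
# Hedged sketch: a masking function plus a manual column mask in Unity Catalog.
# A tag-based policy would apply the same kind of mask automatically to all
# columns tagged e.g. 'pii'. Object and group names are illustrative.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.governance.mask_pii(value STRING)
    RETURN CASE
      WHEN is_account_group_member('pii_readers') THEN value
      ELSE '***'
    END
""")

spark.sql("""
    ALTER TABLE main.sales.customers
    ALTER COLUMN email SET MASK main.governance.mask_pii
""")
```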

vinaygazula
by New Contributor II
  • 890 Views
  • 1 reply
  • 1 kudos

Building an AgenticLakehouse: Interacting with Databricks Workspace via LangGraph and MCP

This project, AgenticLakehouse, explores the cutting edge of "Agentic Data Analytics." I didn't just want a chatbot; I wanted a "living" interface for the Lakehouse. The result is a Multi-Agent System that intelligently orchestrates tasks, from query...
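For readers unfamiliar with LangGraph, here is a minimal, hypothetical sketch of the kind of agent graph such a system builds on; it is not the project's actual code, and the node logic is a placeholder for MCP tool calls into the Databricks workspace.

```python
# Hedged sketch: a two-node LangGraph graph standing in for a multi-agent setup.
# In the real project the nodes would invoke MCP tools against the workspace.
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

class AgentState(TypedDict):
    question: str
    answer: str

def planner(state: AgentState) -> AgentState:
    # Decide which specialised agent should handle the request (placeholder).
    return {"question": state["question"], "answer": ""}

def sql_agent(state: AgentState) -> AgentState:
    # Placeholder for an MCP tool call that runs a query in the Lakehouse.
    return {"question": state["question"], "answer": f"ran a query for: {state['question']}"}

graph = StateGraph(AgentState)
graph.add_node("planner", planner)
graph.add_node("sql_agent", sql_agent)
graph.add_edge(START, "planner")
graph.add_edge("planner", "sql_agent")
graph.add_edge("sql_agent", END)

app = graph.compile()
print(app.invoke({"question": "How many orders came in yesterday?", "answer": ""}))
```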

Latest Reply
Advika
Community Manager
  • 1 kudos

Looks great, solid LangGraph + MCP setup on Databricks Apps. Thanks for sharing, @vinaygazula!

AbhaySingh
by Databricks Employee
  • 2846 Views
  • 0 replies
  • 4 kudos

Semantic Bridge to Unity Catalog

How Ontos bridges the gap between technical metadata and business meaning Here's a scenario that might sound familiar. You've got Unity Catalog humming along—tables are registered, lineage is tracked, access controls are in place. Technically, every...

Raman_Unifeye
by Honored Contributor III
  • 1659 Views
  • 4 replies
  • 5 kudos

Spark Jobs View on a serverless cluster!! Naah.....

Have you ever noticed (and wondered) that the wonderful Spark Job UI is no longer available in the Databricks notebook if the cell is executed on a 'serverless' cluster? Traditionally, whenever we run Spark code (an action command), we used to see the...

Latest Reply
Senga98
Contributor
  • 5 kudos

Hi Raman, thank you for the amazing insights! I am trying to understand more about SQL Warehouses - are they managed by Unity Catalog? From what I could gather, a SQL Warehouse is a compute layer, not a data layer, and therefore not managed by Unity Catalog....

3 More Replies
Hubert-Dudek
by Databricks MVP
  • 361 Views
  • 0 replies
  • 2 kudos

Databricks Advent Calendar 2025 #7

Imagine that all a data engineer or analyst needs to do to read from a REST API is call spark.read(): no direct request calls, no manual JSON parsing, just spark.read. That’s the power of a custom Spark Data Source. Soon, we will see a surge of open-sour...
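A minimal sketch of such a data source using the pyspark.sql.datasource API available in recent runtimes; the REST endpoint, field names, and schema are made up for illustration.

```python
# Hedged sketch: a custom Python data source that wraps a REST API so users
# can simply spark.read.format("rest_api"). Endpoint, fields, and schema are
# illustrative only.
import requests
from pyspark.sql.datasource import DataSource, DataSourceReader

class RestApiReader(DataSourceReader):
    def __init__(self, options):
        self.url = options["url"]

    def read(self, partition):
        # One GET request; yield one row per JSON record.
        for item in requests.get(self.url, timeout=30).json():
            yield (item.get("id"), item.get("name"))

class RestApiDataSource(DataSource):
    @classmethod
    def name(cls):
        return "rest_api"

    def schema(self):
        return "id STRING, name STRING"

    def reader(self, schema):
        return RestApiReader(self.options)

spark.dataSource.register(RestApiDataSource)

df = (
    spark.read.format("rest_api")
    .option("url", "https://api.example.com/items")  # illustrative endpoint
    .load()
)
df.show()
```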

Hubert-Dudek
by Databricks MVP
  • 360 Views
  • 0 replies
  • 2 kudos

Databricks Advent Calendar 2025 #6

DQX is one of the most crucial Databricks Labs projects this year, and we can expect more and more of its checks to be supported natively in Databricks. More about DQX at https://databrickslabs.github.io/dqx/

Hubert-Dudek
by Databricks MVP
  • 424 Views
  • 1 reply
  • 2 kudos

Databricks Advent Calendar 2025 #5

When something goes wrong and your jobs follow a MERGE-per-day pattern, backfill jobs will help you reload many days in one shot.

Latest Reply
Raman_Unifeye
Honored Contributor III
  • 2 kudos

Full link to the actual blog for reference: https://www.databricks.com/blog/announcing-backfill-runs-lakeflow-jobs-higher-quality-downstream-data

Hubert-Dudek
by Databricks MVP
  • 619 Views
  • 1 reply
  • 4 kudos

Databricks Advent Calendar 2025

With the first day of December comes the first window of our Databricks Advent Calendar. It’s a perfect time to look back at this year’s biggest achievements and surprises — and to dream about the new “presents” the platform may bring us next year. ...

Latest Reply
Advika
Community Manager
  • 4 kudos

Fantastic kickoff to the Databricks Advent Calendar 2025, appreciate you steering the series, @Hubert-Dudek!

Hubert-Dudek
by Databricks MVP
  • 333 Views
  • 0 replies
  • 4 kudos

Databricks Advent Calendar 2025 #4

With the new ALTER SET, it is really easy to migrate (copy/move) tables. Quite awesome also when you need to do an initial load and the old system sits under Lakehouse Federation (foreign tables).
