- 6180 Views
- 8 replies
- 7 kudos
Cross-filtering for AI/BI dashboards
AI/BI dashboards now support cross-filtering, which lets you click an element in one chart to filter and update related data in other charts. Cross-filtering allows users to interactively explore relationships and patterns across multiple visu...
- 7 kudos
There does now appear to be a list of capsules along the top of Databricks AI/BI dashboards indicating the filters applied. The capsules appear to include filter selectors as well as cross-filters added by clicking charts. Also, there is now a "Reset t...
- 180 Views
- 0 replies
- 2 kudos
Databricks Advent Calendar 2025 #17
Replacing all records for a given date with newly arriving data for that date is a typical design pattern. Now, thanks to the simple REPLACE USING syntax in Databricks, it is easier than ever!
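The pattern above can be sketched in Databricks SQL. This is a minimal sketch, not the post's exact code: the `sales` and `sales_staging` table names are hypothetical, and the precise shape of the `REPLACE USING` clause should be verified against the current docs.

```sql
-- Atomically replace every target row whose sale_date appears in the
-- incoming batch, then insert the new rows -- one statement, no MERGE.
INSERT INTO sales
REPLACE USING (sale_date)
SELECT * FROM sales_staging;
```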
- 332 Views
- 2 replies
- 3 kudos
Databricks Advent Calendar 2025 #11
Real-time mode is a breakthrough that lets Spark utilize all available CPUs to process records with single-millisecond latency, while decoupling checkpointing from per-record processing.
- 229 Views
- 0 replies
- 0 kudos
Databricks Advent Calendar 2025 #16
For many data engineers who love PySpark, the most significant improvement of 2025 was the addition of merge to the DataFrame API, so the Delta library or SQL is no longer needed to perform a MERGE. P.S. I still prefer SQL MERGE inside spark.sql()
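Since the author still prefers SQL MERGE via spark.sql(), here is a standard Delta MERGE sketch; the `customers` and `updates` table names are hypothetical placeholders.

```sql
MERGE INTO customers AS t
USING updates AS s
  ON t.customer_id = s.customer_id
WHEN MATCHED THEN UPDATE SET *    -- overwrite existing rows
WHEN NOT MATCHED THEN INSERT *;   -- add genuinely new rows
```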
- 228 Views
- 0 replies
- 2 kudos
Databricks Advent Calendar 2025 #15
The new Lakebase experience is a game-changer for transactional databases. That functionality is fantastic, and autoscaling to zero makes it really cost-effective. Do you need to deploy to prod? Just branch the production database to the release branch, an...
- 824 Views
- 1 replies
- 3 kudos
Unity Catalog Lineage: Lineage That Just Works
I've been working with Unity Catalog's lineage capabilities for a while now, and I have to say—this is what lineage should have always been. Not a separate tool to configure. Not a manual process to maintain. Just automatic, real-time visibility into...
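The automatic lineage described above can be queried directly from system tables. A minimal sketch, assuming the `system.access.table_lineage` system table is enabled in your workspace; the target table name is a hypothetical example.

```sql
-- Upstream tables feeding a given target, most recent events first
SELECT source_table_full_name,
       entity_type,
       event_time
FROM system.access.table_lineage
WHERE target_table_full_name = 'main.sales.daily_revenue'
ORDER BY event_time DESC;
```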
- 3 kudos
I have been using and implementing UC in various workspaces across the industry, and BYOL is the one feature I am really looking forward to implementing next. Thanks @AbhaySingh for consolidating it here.
- 213 Views
- 0 replies
- 0 kudos
Databricks Advent Calendar 2025 #14
Ingestion from SharePoint is now available directly in PySpark. Just define a connection and use spark.read or, even better, spark.readStream with Auto Loader, then specify the file type and the options for that file (PDF, CSV, Excel, etc.)
- 865 Views
- 0 replies
- 1 kudos
Databricks News: Week 50: 8 December 2025 to 14 December 2025
Excel: The big news this week is native importing of Excel files. Write operations are also possible, and you can choose a specific data range. It also works with streaming Auto Loader, currently in beta. GPT 5.2: The same day...
- 229 Views
- 0 replies
- 0 kudos
Databricks Advent Calendar 2025 #13
ZeroBus changes the game: you can now push event data directly into Databricks, even from on-prem. No extra event layer needed. Every Unity Catalog table can act as an endpoint.
- 238 Views
- 0 replies
- 1 kudos
Databricks Advent Calendar 2025 #12
All leading LLMs are available natively in Databricks: - GPT 5.2 from launch day! - The system catalog's AI schema in Unity Catalog has multiple LLMs ready to serve! - OpenAI, Gemini, and Anthropic models are available side by side!
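Calling one of these natively hosted models from SQL can look like the sketch below, using the ai_query function; the endpoint name and the `reviews` table are placeholders, so check your workspace's serving endpoints for the actual model names.

```sql
SELECT review,
       ai_query(
         'databricks-claude-sonnet-4',            -- hypothetical endpoint name
         'Summarize in one sentence: ' || review
       ) AS summary
FROM reviews;
```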
- 229 Views
- 0 replies
- 2 kudos
Databricks Advent Calendar 2025 #10
Databricks goes native on Excel. You can now ingest and query .xls/.xlsx directly in Databricks (SQL + PySpark, batch and streaming), with automatic schema/type inference, sheet and cell-range targeting, and evaluated formulas; no extra libraries needed anymore.
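A rough sketch of what reading a sheet range might look like via read_files. The option names here (`format`, `sheetName`, `dataAddress`) are assumptions borrowed from common Excel readers, and the Volume path is hypothetical; verify both against the beta documentation.

```sql
SELECT *
FROM read_files(
  '/Volumes/main/raw/reports/q4.xlsx',
  format => 'excel',          -- assumed option names:
  sheetName => 'Summary',     -- target worksheet
  dataAddress => 'A1:F100'    -- cell range to read
);
```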
- 169 Views
- 0 replies
- 2 kudos
Databricks Advent Calendar 2025 #9
Tags, whether assigned manually or automatically by the data classification service, can be protected using policies. Column masking can automatically mask columns carrying a given tag for everyone except users with elevated access.
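The tag-driven automatic application described above is configured through policies; the mask function itself is a plain SQL UDF. A minimal sketch, assuming hypothetical catalog/schema, group, and column names:

```sql
-- Mask a column for everyone outside an elevated-access group
CREATE OR REPLACE FUNCTION main.governance.mask_pii(val STRING)
RETURNS STRING
RETURN CASE
  WHEN is_account_group_member('pii_readers') THEN val
  ELSE '****'
END;

-- Manual attachment to a single column (policies automate this by tag)
ALTER TABLE main.crm.customers
  ALTER COLUMN email SET MASK main.governance.mask_pii;
```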
- 322 Views
- 1 replies
- 1 kudos
Building an AgenticLakehouse: Interacting with Databricks Workspace via LangGraph and MCP
This project, AgenticLakehouse, explores the cutting edge of "Agentic Data Analytics." I didn't just want a chatbot; I wanted a "living" interface for the Lakehouse. The result is a Multi-Agent System that intelligently orchestrates tasks, from query...
- 1 kudos
Looks great, solid LangGraph + MCP setup on Databricks Apps. Thanks for sharing, @vinaygazula!
- 860 Views
- 0 replies
- 4 kudos
Semantic Bridge to Unity Catalog
How Ontos bridges the gap between technical metadata and business meaning Here's a scenario that might sound familiar. You've got Unity Catalog humming along—tables are registered, lineage is tracked, access controls are in place. Technically, every...
- 163 Views
- 0 replies
- 1 kudos
Databricks Advent Calendar 2025 #8
Data classification automatically tags Unity Catalog tables and is now available in system tables as well.
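The resulting tags can be inspected through the tag system tables. A sketch, assuming the `system.information_schema.column_tags` table and a classification-style tag name; verify the exact tag names your workspace emits.

```sql
-- Columns that the data classification service has tagged
SELECT catalog_name, schema_name, table_name, column_name,
       tag_name, tag_value
FROM system.information_schema.column_tags
WHERE tag_name LIKE '%classification%';
```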