- 282 Views
- 2 replies
- 1 kudos
Building a Large-Scale Supply Chain Simulation Platform on Databricks
A Data & AI–Driven Decision Engine for Modern Retail Networks. In modern retail, supply chains are no longer static networks — they are living, adaptive systems that must continuously respond to customer demand, fulfillment speed expectatio...
- 1 kudos
Great article! Would love to know more, as I have a very similar case with a concrete customer. Thanks, S.
- 28 Views
- 0 replies
- 1 kudos
It's time to treat AI as a peer, not a tool. What if your AI already knew Databricks?
We need to stop treating AI as a tool. It's time to treat it as a peer. I've been building a library of reusable skills for Claude — structured instructions that let AI agents handle complex, repetitive development workflows on Databricks and Azure AI...
- 57 Views
- 0 replies
- 0 kudos
I Tried Building an Agentic AI System for Construction. Here’s What Actually Worked.
Most construction teams don’t really have a data problem, at least not in the way we usually think about it. They already have dashboards everywhere. Finance has reports, project managers have schedule views, field teams have inspection logs. Everyon...
- 183 Views
- 0 replies
- 1 kudos
Databricks Transactions - Enforcing Business Validation Rules via SQL SIGNAL
Combining the SIGNAL statement with atomic transactions in Databricks saves us from managing commits and rollbacks while handling custom validations seamlessly — something that modern big data ETL frameworks struggle to deliver cleanly. They give the ...
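The pattern the teaser describes — a raised validation error aborting the whole unit of work — can be sketched in plain Python. This is a conceptual stand-in, not the Databricks API: on Databricks the rule would live in a SQL script using `SIGNAL SQLSTATE ... SET MESSAGE_TEXT` inside a transaction, and the table and rule here are hypothetical.

```python
# Conceptual sketch of SIGNAL-style validation inside an atomic unit of work.
# Raising the error rolls back every row written so far — all or nothing.

class ValidationError(Exception):
    """Mirrors SQL SIGNAL: abort the transaction with a message."""

def atomic_load(staging_rows, target):
    snapshot = list(target)              # remember state for rollback
    try:
        for row in staging_rows:
            if row["amount"] < 0:        # hypothetical business rule
                raise ValidationError("SQLSTATE 45000: negative amount rejected")
            target.append(row)
    except ValidationError:
        target[:] = snapshot             # rollback: undo the partial write
        raise

target = []
try:
    atomic_load([{"amount": 10}, {"amount": -5}], target)
except ValidationError:
    pass
# target is unchanged: the bad second row aborted the whole batch
```

The first row is appended and then rolled back when the second row fails validation, which is the behavior the article attributes to SIGNAL plus atomic transactions.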
- 284 Views
- 1 replies
- 3 kudos
Secure Credit Card Partner Enablement Using Databricks Clean Rooms
How Digital Payment Lending Platforms Can Collaborate with Banks Without Exposing Sensitive Data. 1. Business Context & Regulatory Reality: In 2020, large Indian fintech platforms faced a unique regulatory constraint: NBFC‑led digital platforms were not ...
- 3 kudos
This is a solid breakdown of how secure data collaboration can be done without exposing sensitive information. The Clean Room approach really stands out because it shifts the model from data sharing to controlled computation, which is exactly what re...
- 447 Views
- 2 replies
- 7 kudos
Databricks Multi-Table Transactions - Part 1
If you've ever worked on an insurance data warehouse, or really any warehouse where data arrives from different systems at different times, you know the pain of keeping things in sync. I spent years building data warehouses for a property and casual...
- 7 kudos
This is a great write-up, @Ashwin_DSA
- 209 Views
- 0 replies
- 1 kudos
SIEM is legacy. Here's why, and what becomes possible when you move security operations
I've spent years migrating SOC operations from traditional SIEM to Databricks. Not because it's trendy, but because SIEM has fundamental problems that no vendor update will fix: proprietary query languages that lock you in, no version control or test...
- 475 Views
- 9 replies
- 6 kudos
Streaming Failure Models: Why "It Didn't Crash" Is the Worst Outcome
Most Databricks streaming failures don't look dramatic. No cluster termination. No red wall of errors. The UI says RUNNING — and your customers start reporting nonsense. I wrote about the incident that changed how we think about streaming jobs on share...
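The "RUNNING but making no progress" failure mode can be caught with a staleness check on the last batch that actually did work. A minimal sketch, with an assumed threshold; on Databricks you would feed this from a Structured Streaming query's `lastProgress` rather than the raw timestamps used here.

```python
# Watchdog sketch: a stream can report RUNNING while processing nothing.
# Alert when the timestamp of the last productive batch goes stale.

STALL_THRESHOLD_S = 600  # assumed SLA: 10 minutes without progress is a stall

def is_stalled(last_progress_ts: float, now: float,
               threshold_s: float = STALL_THRESHOLD_S) -> bool:
    """True when no batch has processed rows within the threshold."""
    return (now - last_progress_ts) > threshold_s

# 900 seconds since the last productive batch -> stalled, even if "RUNNING"
stalled = is_stalled(last_progress_ts=100.0, now=1000.0)
```

The point of the check is exactly the article's thesis: liveness of the process is not liveness of the pipeline, so you alert on missing progress, not on crashes.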
- 6 kudos
Completely agree, production war stories are worth more than any documentation. I’ve cut my teeth on enough production data lake issues to write my own chapter on what can go wrong, whether that’s deploying Databricks in financial institutions or bein...
- 127 Views
- 0 replies
- 1 kudos
Built a NiFi processor for Zerobus Ingest - gotchas the docs won’t tell you
Zerobus went GA on February 23rd. Connector ecosystem: empty. I run NiFi for security telemetry, so I built the processor myself. Apache 2.0, source on GitHub. NiFi uses NAR packaging — each archive gets its own classloader. The Zerobus Java SDK is JNI...
- 350 Views
- 0 replies
- 2 kudos
Databricks Multi Table Transactions - All Data or Nothing
Databricks introduces multi-table transactions, allowing operations across multiple Delta tables to execute as a single atomic unit. Delta Lake has provided ACID guarantees at the table level, but ensuring atomicity across multiple tables previously ...
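The cross-table guarantee the teaser describes — writes to several tables land together or not at all — can be sketched as plain Python over two in-memory "tables". The names and rule are hypothetical; on Databricks this maps to a transaction spanning multiple Delta tables rather than the manual snapshot/restore used here.

```python
# Sketch of multi-table atomicity: an order and its lines commit together
# or not at all. A failure after the first write rolls back both tables.

def commit_order(orders, lines, order, order_lines):
    o_snap, l_snap = list(orders), list(lines)   # pre-transaction snapshots
    try:
        orders.append(order)
        if not order_lines:                      # hypothetical integrity rule
            raise ValueError("order must have at least one line")
        lines.extend(order_lines)
    except Exception:
        orders[:] = o_snap                       # restore both tables
        lines[:] = l_snap
        raise

orders, lines = [], []
try:
    commit_order(orders, lines, {"id": 1}, [])   # violates the rule
except ValueError:
    pass
# both tables unchanged: no orphaned header row in `orders`
```

Without the cross-table guarantee, a reader querying between the two writes could see an order with no lines — the inconsistency multi-table transactions are meant to eliminate.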
- 258 Views
- 1 replies
- 2 kudos
Multi-Task on a Shared Cluster — Why That's Also Not Enough
Part 2 of 3 — Databricks Streaming Architecture. The instinct after Part 1 was obvious. If running eight queries in one task means one failure can hide while others keep running — split them into multiple tasks. Separate concerns. Give each component it...
- 2 kudos
Part 1: Streaming Failure Models: Why "It Didn't Crash" Is the Worst Outcome
Part 3: One Cluster per Task — Proven, Ready, and Waiting
- 207 Views
- 0 replies
- 1 kudos
Enterprise Data Platform Architecture on Azure with Databricks
Hi everyone, I recently wrote an article on designing an enterprise-scale data platform architecture using Azure and Databricks. The article covers: • End-to-end architecture for enterprise data platforms • Data ingestion using Azure Data Factory and Kaf...
- 229 Views
- 0 replies
- 4 kudos
One Policy to Mask Them All: ABAC + VARIANT in Unity Catalog
Databricks ABAC lets you apply a single schema-level policy across columns of any data type — no more managing one mask function per type. Here's how to use the VARIANT data type to make it work. If you've implemented column masking in Unity Catalog,...
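The "one mask function for every type" idea can be illustrated outside Databricks: cast each column to a single VARIANT-like representation (here, a JSON string), and one policy function then covers strings, numbers, and anything else. This is a conceptual stand-in — the real mechanism is a Unity Catalog ABAC policy over VARIANT columns, and the column names here are hypothetical.

```python
import json

def to_variant(value):
    # Stand-in for casting a column to VARIANT: every column becomes
    # one uniform type (a JSON-encoded value).
    return json.dumps(value)

def mask_policy(variant_value, privileged=False):
    # A single mask function now covers every column, because all
    # columns share one representation.
    return variant_value if privileged else json.dumps("***MASKED***")

row = {"ssn": "123-45-6789", "balance": 1024.5}
masked = {col: mask_policy(to_variant(v)) for col, v in row.items()}
# one function masked a string and a float alike
```

The payoff mirrors the article's claim: instead of maintaining one mask UDF per data type, a single schema-level policy applies uniformly once everything is funneled through one type.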
- 155 Views
- 0 replies
- 1 kudos
One Cluster per Task — Proven, Ready, and Waiting
Part 3 of 3: Databricks Streaming Architecture. By the end of Part 1 & Part 2, we knew what the real answer was. We just hadn’t committed to it yet. Not because it wouldn’t work. We tested it. We documented it. The code was ready. The answer was one clu...
- 225 Views
- 0 replies
- 2 kudos
Building a Hybrid Lakehouse: Strategic Use of Apache Hudi and Delta Lake in Databricks
Apache Hudi and Delta Lake are built for different workloads. Hudi is optimised for high-frequency writes; Delta Lake is built for fast, reliable reads. Using one format across the entire data platform forces an unnecessary trade-off: high ingestion c...
Labels: Access Data (1), ADF Linked Service (1), ADF Pipeline (1), Advanced Data Engineering (3), agent bricks (1), Agentic AI (3), AI Agents (3), AI Readiness (1), Apache spark (3), Apache Spark 3.0 (1), ApacheSpark (1), Associate Certification (1), Auto-loader (1), Automation (1), AWSDatabricksCluster (1), Azure (1), Azure databricks (3), Azure Databricks Job (2), Azure Delta Lake (2), Azure devops integration (1), AzureDatabricks (2), BI Integrations (1), Big data (1), Billing and Cost Management (1), Blog (1), Caching (2), CDC (1), CICDForDatabricksWorkflows (1), Cluster (1), Cluster Policies (1), Cluster Pools (1), Collect (1), Community Event (1), CommunityArticle (2), Cost Optimization Effort (1), CostOptimization (1), custom compute policy (1), CustomLibrary (1), Data (1), Data Analysis with Databricks (1), Data Driven AI Roadmap (1), Data Engineering (6), Data Governance (1), Data Ingestion (1), Data Ingestion & connectivity (1), Data Mesh (1), Data Processing (1), Data Quality (1), databricks (1), Databricks Assistant (2), Databricks Community (1), Databricks Dashboard (2), Databricks Delta Table (1), Databricks Demo Center (1), Databricks Job (1), Databricks Lakehouse (1), Databricks Migration (3), Databricks Mlflow (1), Databricks Notebooks (1), Databricks Serverless (1), Databricks Support (1), Databricks Training (1), Databricks Unity Catalog (2), Databricks Workflows (1), DatabricksML (1), DBR Versions (1), Declartive Pipelines (1), DeepLearning (1), Delta Lake (4), Delta Live Table (1), Delta Live Tables (1), Delta Time Travel (1), Devops (1), DimensionTables (1), DLT (2), DLT Pipelines (3), DLT-Meta (1), Dns (1), Dynamic (1), Free Databricks (3), Free Edition (1), GenAI agent (2), GenAI and LLMs (2), GenAIGeneration AI (2), Generative AI (1), Genie (1), Governance (1), Governed Tag (1), Hive metastore (1), Hubert Dudek (43), Hybrid Lakehouse (1), LakeBase (1), Lakeflow Pipelines (1), Lakehouse (2), Lakehouse Migration (1), Lazy Evaluation (1), Learn Databricks (1), Learning (1), Library Installation (1), Llama (1), LLMs (1), mcp (1), Medallion Architecture (2), Metric Views (1), Migrations (1), MSExcel (3), Multiagent (3), Networking (2), NotMvpArticle (1), Partitioning (1), Partner (1), Performance (2), Performance Tuning (2), Private Link (1), Pyspark (2), Pyspark Code (1), Pyspark Databricks (1), Pytest (1), Python (1), Reading-excel (2), Scala Code (1), Scripting (1), SDK (1), Serverless (2), Spark (4), Spark Caching (1), Spark Performance (1), SparkSQL (1), SQL (1), Sql Scripts (1), SQL Serverless (1), Students (1), Support Ticket (1), Sync (1), Training (1), Tutorial (1), Unit Test (1), Unity Catalog (6), Unity Catlog (1), Variant (1), Warehousing (1), Workflow Jobs (1), Workflows (6), Zerobus (1)