- 291 Views
- 2 replies
- 2 kudos
Resolved! My First Month Learning Databricks - Key Takeaways So Far.
Hey everyone, I recently started my Databricks learning journey about a month ago, and I wanted to share what I’ve learned so far, from one beginner to another. Here are a few highlights: 1️⃣ Understanding the Lakehouse Concept - Realized how Databricks ...
I was planning to build an ETL pipeline, but I hadn’t considered using MLflow to predict sales and ratings. Thanks for the suggestion; I’ll work on creating this demo soon to test and enhance my skills.
- 487 Views
- 2 replies
- 5 kudos
I Tried Teaching Databricks About Itself — Here’s What Happened
Hi all, how are you doing today? I wanted to share something interesting from my recent Databricks work: I’ve been playing around with an idea I call “Real-Time Metadata Intelligence.” Most of us focus on optimizing data pipelines, query performance, ...
I like the core idea: you are mining signals the platform already emits. I would start rules-first: track the small-files ratio and the average-file-size trend, and watch skew per partition and shuffle bytes per input gigabyte. Compare job time to input size to c...
- 98 Views
- 0 replies
- 1 kudos
Last chance to register for our LIVE Lakebase BrickTalks session!
Join us tomorrow, Thursday, Nov 13, at 9 AM PT for the latest BrickTalks! We’ll talk about bringing data intelligence from your Lakehouse into every app. Register now. What you’ll learn: use Lakebase (PostgreSQL-compatible, serverless OLTP) to serve...
- 168 Views
- 0 replies
- 1 kudos
How Upgrading to Databricks Runtime 16.4 sped up our Python script by 10x
Wanted to share something that might save others time and money. We had a complex Databricks script that ran for over 1.5 hours, when the target was under 20 minutes. We initially tried scaling up the cluster, but the real progress came from simply upgrading th...
- 139 Views
- 0 replies
- 1 kudos
Control Databricks Costs with AI & BI Dashboards - Video Summary
In this video, I try to showcase, in a very simplified way, how to enable and set up AI/BI dashboards to control costs and take action. I hope it is useful. I think it is a superb feature for getting insights on costs while being straightforward to setu...
- 284 Views
- 2 replies
- 10 kudos
Optimizing Delta Table Writes for Massive Datasets in Databricks
Problem statement: In one of my recent projects, I faced a significant challenge: writing a huge dataset of 11,582,763,212 rows and 2,068 columns to a Databricks managed Delta table. The initial write operation took 22.4 hours using the following setup:...
Hey @Louis_Frolio, thank you for the thoughtful feedback and great suggestions! A few clarifications: AQE is already enabled in my setup, and it definitely helped reduce shuffle overhead during the write. Regarding column pruning, in this case, the fina...
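One common tuning step for writes at this scale is sizing the number of output partitions from the data volume instead of accepting the default parallelism, so each written file lands near a healthy size. A back-of-the-envelope helper, purely as a sketch (the 128 MiB target is a conventional assumption, not a value from the post; the actual repartitioning would then be done on the DataFrame before writing):

```python
import math

def target_partitions(total_bytes, target_file_bytes=128 * 1024 * 1024):
    """Rough output-partition count so each written file lands near target size."""
    return max(1, math.ceil(total_bytes / target_file_bytes))

# e.g. roughly 1.5 TiB of data at ~128 MiB per file
print(target_partitions(1.5 * 2**40))  # 12288
```

The resulting count would feed something like `df.repartition(n)` (or a target-file-size table property) ahead of the write; the right target depends on cluster cores and downstream read patterns.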
- 5268 Views
- 6 replies
- 5 kudos
Cross-filtering for AI/BI dashboards
AI/BI dashboards now support cross-filtering, which allows you to click on an element in one chart to filter and update related data in other charts. Cross-filtering allows users to interactively explore relationships and patterns across multiple visu...
There now appears to be a row of capsules along the top of Databricks AI/BI dashboards indicating the applied filters. The capsules include filter selectors as well as cross-filters added by clicking charts. Also, there is now a "Reset t...
- 213 Views
- 0 replies
- 2 kudos
Another BrickTalks! Let's talk about bringing data intelligence from your Lakehouse into every app!
You asked, we delivered! Another BrickTalk is scheduled for Thursday, Nov 13 at 9 AM PT with Pranav Aurora, on how to bring data intelligence from your Lakehouse into every app and user, seamlessly and in real time. What you’ll learn: use Lakebase (Po...
- 330 Views
- 3 replies
- 11 kudos
Community Fellows: Shout Out to our Bricksters!
At Databricks, our Community members deserve a great experience in our forums, with quality answers from the experts. Who better to help out our customers than Databricks employees, a.k.a. Bricksters! To work towards this goal, we created the Comm...
Kudos to the DB team for keeping up with the community, but can you please work on your product as well? We are experiencing a lot of issues with your paid product: failures, crashes, slow starts, slow performance, and the list goes on. Community wo...
- 222 Views
- 1 reply
- 1 kudos
How to create clusters in Databricks step by step | All-Purpose, Jobs Compute, SQL Warehouses, and Pools
Recently, having some fun with Databricks, I created a series of videos in Spanish that I’d like to share here. I hope some of them are interesting for the Spanish and LATAM community. Not sure if this is the most appropriate board to share on, or whether there is ano...
Added a new video on creating serverless clusters for notebooks, jobs, and DLTs: https://youtu.be/RQvkssryjyQ?si=BkYI831mUK1vBE20
- 2309 Views
- 3 replies
- 7 kudos
Building a Metadata Table-Driven Framework Using LakeFlow Declarative (Formerly DLT) Pipelines
Introduction: Scaling data pipelines across an organization can be challenging, particularly when data sources, requirements, and transformation rules are always changing. A metadata table-driven framework using LakeFlow Declarative (formerly DLT) enab...
Helpful article, @TejeshS. I have a question: if I want to pass parameters from my workflow to the pipeline, is that possible? If yes, what would be the best approach?
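On the parameter question above: one common pattern is to put key-value pairs in the pipeline settings’ `configuration` object and read them back from the Spark conf inside the pipeline code. A minimal sketch follows; the key name and path are hypothetical, and the `spark` object (which exists inside a Databricks pipeline) is stubbed out so the snippet runs anywhere:

```python
# In the pipeline settings JSON, parameters would live under "configuration", e.g.:
#   "configuration": { "mypipeline.source_path": "/Volumes/main/raw/landing" }
#
# Inside the pipeline notebook they are read via spark.conf.get(...).
# Stub standing in for the Databricks-provided `spark.conf` object:
class _ConfStub:
    def __init__(self, values):
        self._values = values

    def get(self, key, default=None):
        return self._values.get(key, default)

spark_conf = _ConfStub({"mypipeline.source_path": "/Volumes/main/raw/landing"})

# Pipeline code reads the parameter, with a fallback for local runs.
source_path = spark_conf.get("mypipeline.source_path", "/tmp/fallback")
print(source_path)  # /Volumes/main/raw/landing
```

In a real pipeline you would replace the stub with the built-in `spark.conf`, and the values can be set per environment in the pipeline settings.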
- 2646 Views
- 17 replies
- 29 kudos
(Episode 1: Getting Data In) - Learning Databricks one brick at a time, using the Free Edition
Episode 1: Getting Data In. Learning Databricks one brick at a time, using the Free Edition. Project intro: Welcome to everyone reading. My name’s Ben, a.k.a. BS_THE_ANALYST, and I’m going to share my experiences as I learn the world of Databricks. My obje...
Really interesting post, @BS_THE_ANALYST. Catching up with Databricks stuff again.
- 224 Views
- 0 replies
- 1 kudos
SQL Scripting in Apache Spark™ 4.0
Apache Spark™ 4.0 introduces a new feature for SQL developers and data engineers: SQL Scripting. This feature enhances the power and extends the flexibility of Spark SQL, enabling users to write procedural code within SQL queries, with t...
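To give a feel for what SQL Scripting adds, here is a minimal sketch of a compound statement with a local variable and a loop, in the SQL/PSM-style syntax Spark 4.0 adopts. Treat it as an illustration rather than a definitive reference, and check the Spark 4.0 docs for the exact grammar:

```sql
-- A compound statement: declare variables, loop, and return a result.
BEGIN
  DECLARE total INT DEFAULT 0;
  DECLARE i INT DEFAULT 1;
  WHILE i <= 3 DO
    SET total = total + i;
    SET i = i + 1;
  END WHILE;
  SELECT total;  -- 1 + 2 + 3
END
```

Constructs like IF/ELSE, labeled loops with ITERATE/LEAVE, and exception handling follow the same block style, which previously required a Python or Scala wrapper around plain Spark SQL.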
- 1232 Views
- 6 replies
- 14 kudos
(Episode 3: Hands-on API Project) - Learning Databricks one brick at a time, using the Free Edition
Episode 3: APIs. Learning Databricks one brick at a time, using the Free Edition. Project intro: Welcome to everyone reading. My name’s Ben, a.k.a. BS_THE_ANALYST, and I’m going to share my experiences as I learn the world of Databricks. My objective is to...
This is great, thanks for sharing, Ben. I’ll share it with my data community.
- 967 Views
- 3 replies
- 16 kudos
(Episode 2: Reading Excel Files) - Learning Databricks one brick at a time, using the Free Edition
Episode 2: Reading Excel Files. Learning Databricks one brick at a time, using the Free Edition. You can download the accompanying notebook and Excel files used in the demonstration over on my GitHub. Excel files & notebook: https://github.com/BSanalyst...
Thanks for this, @BS_THE_ANALYST. Hugely beneficial.
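For anyone who wants to try the episode’s topic before opening the notebook, a minimal sketch of the usual pandas route for Excel files follows. It assumes the `openpyxl` engine is installed, and the file name and columns are made up for the demo:

```python
import pandas as pd

# Write a tiny workbook, then read it back. The same read pattern works in a
# Databricks notebook once the file sits in a Volume or the workspace files.
df = pd.DataFrame({"region": ["North", "South"], "sales": [120, 95]})
df.to_excel("demo.xlsx", index=False, engine="openpyxl")

loaded = pd.read_excel("demo.xlsx", engine="openpyxl")
print(loaded.shape)  # (2, 2)
```

For large workbooks or many sheets, `read_excel`’s `sheet_name` and `usecols` parameters keep the load targeted; converting to a Spark DataFrame afterwards is a one-liner with `spark.createDataFrame(loaded)`.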
Labels: Access Data (1), ADF Linked Service (1), ADF Pipeline (1), Advanced Data Engineering (3), Agentic AI (1), AI Agents (2), AI Readiness (1), Apache spark (1), ApacheSpark (1), Associate Certification (1), Automation (1), AWSDatabricksCluster (1), Azure (1), Azure databricks (3), Azure devops integration (1), AzureDatabricks (2), Big data (1), Billing and Cost Management (1), Blog (1), Caching (2), CICDForDatabricksWorkflows (1), Cluster (1), Cluster Policies (1), Cluster Pools (1), Community Event (1), Cost Optimization Effort (1), CostOptimization (1), custom compute policy (1), CustomLibrary (1), Data (1), Data Analysis with Databricks (1), Data Engineering (5), Data Governance (1), Data Ingestion & connectivity (1), Data Mesh (1), Data Processing (1), Data Quality (1), Databricks Assistant (1), Databricks Community (1), Databricks Dashboard (2), Databricks Delta Table (1), Databricks Demo Center (1), Databricks Job (1), Databricks Migration (2), Databricks Mlflow (1), Databricks Notebooks (1), Databricks Support (1), Databricks Unity Catalog (2), Databricks Workflows (1), DatabricksML (1), DBR Versions (1), Declartive Pipelines (1), DeepLearning (1), Delta Lake (2), Delta Live Table (1), Delta Live Tables (1), Delta Time Travel (1), Devops (1), DimensionTables (1), DLT (2), DLT Pipelines (3), DLT-Meta (1), Dns (1), Dynamic (1), Free Databricks (3), GenAI agent (1), GenAI and LLMs (2), GenAIGeneration AI (1), Generative AI (1), Genie (1), Governance (1), Hive metastore (1), Hubert Dudek (1), Lakeflow Pipelines (1), Lakehouse (1), Lakehouse Migration (1), Lazy Evaluation (1), Learning (1), Library Installation (1), Llama (1), Medallion Architecture (1), Metric Views (1), Migrations (1), MSExcel (2), Multiagent (1), Networking (2), Partner (1), Performance (1), Performance Tuning (1), Private Link (1), Pyspark (2), Pyspark Code (1), Pyspark Databricks (1), Pytest (1), Python (1), Reading-excel (1), Scala Code (1), Scripting (1), SDK (1), Serverless (2), Spark (2), Spark Caching (1), SparkSQL (1), SQL (1), SQL Serverless (1), Support Ticket (1), Sync (1), Tutorial (1), Unit Test (1), Unity Catalog (4), Unity Catlog (1), Warehousing (1), Workflow Jobs (1), Workflows (3)