- 451 Views
- 7 replies
- 4 kudos
How to Optimize Data Pipeline Development on Databricks for Large-Scale Workloads?
Hi everyone, I'm working on building and optimizing data pipelines in Databricks, especially for large-scale workloads, and I want to learn from others who have hands-on experience with performance tuning, architecture decisions, and best practices. I'...
Optimizing Databricks pipelines for large-scale workloads mostly comes down to smart architecture and efficient Spark practices. Key tips from real-world users: use Delta Lake for ACID transactions, incremental updates, and schema enforcement; partition...
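The Delta Lake tips in the reply above can be sketched as a small helper that composes routine table maintenance statements. This is a minimal sketch: the table and column names are hypothetical, and on Databricks each statement would be run with `spark.sql(...)`.

```python
# Minimal sketch: compose routine Delta Lake maintenance statements.
# Table and column names are hypothetical placeholders.

def delta_maintenance_sql(table, zorder_cols=None, retain_hours=168):
    """Return OPTIMIZE (+ optional ZORDER) and VACUUM statements for a table."""
    optimize = f"OPTIMIZE {table}"
    if zorder_cols:
        optimize += f" ZORDER BY ({', '.join(zorder_cols)})"
    vacuum = f"VACUUM {table} RETAIN {retain_hours} HOURS"
    return [optimize, vacuum]

stmts = delta_maintenance_sql("sales.events", zorder_cols=["event_date", "user_id"])
# On a real cluster: for s in stmts: spark.sql(s)
```

Scheduling these statements as a periodic job keeps small files compacted and stale files cleaned up, which addresses two of the most common large-scale performance issues.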
- 1384 Views
- 2 replies
- 2 kudos
Databricks community group in Kerala
Calling All Data Enthusiasts in Kerala! Hey everyone, I'm excited about the idea of launching a Databricks Community Group here in Kerala! This group would be a hub for learning, sharing knowledge, and networking among data enthusiasts, analysts, a...
Great initiative! It's good to see the tech community growing here. I’m representing Fegno Technologies, a web and mobile app development company in Kochi. We are always keen to stay updated on the latest data engineering trends and cloud platforms.
- 117 Views
- 0 replies
- 4 kudos
Databricks Advent Calendar 2025 #4
With the new ALTER SET, migrating (copying or moving) tables is really easy. It is also quite awesome when you need to do an initial load and have an old system under Lakehouse Federation (foreign tables).
- 3476 Views
- 5 replies
- 4 kudos
Find value in any column in a table
Hi, I'm not sure if this is a possible scenario, but is there by any chance a way to query all the columns of a table to search for a value? Explanation: I want to search for a specific value in all the columns of a Databricks table. I don't know whi...
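One way to approach the question above is to generate a WHERE clause that covers every column. A minimal sketch, assuming a hypothetical table and column list (on Databricks the columns could come from `spark.table(name).columns`); note it does no quoting or escaping of the search value:

```python
def search_all_columns_sql(table, columns, value):
    """Build a query that matches `value` (cast to string) in any column.

    Sketch only: the search value is interpolated without escaping, so
    don't use it with untrusted input.
    """
    predicate = " OR ".join(f"CAST({c} AS STRING) = '{value}'" for c in columns)
    return f"SELECT * FROM {table} WHERE {predicate}"

sql = search_all_columns_sql("my_table", ["id", "name", "city"], "Berlin")
# On Databricks: spark.sql(sql).show()
```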
- 174 Views
- 2 replies
- 5 kudos
Databricks Advent Calendar 2025 #3
One of the biggest gifts is that we can finally move Genie to other environments by using the API. I hope DABS comes soon.
@Hubert-Dudek - sure, will look forward to this one.
- 115 Views
- 0 replies
- 1 kudos
Databricks Advent Calendar Edition
Santa is coming to Lakehouse Town! From now until Christmas, I’ll be your guide on a Databricks journey – sharing one powerful Databricks feature every single day. https://www.linkedin.com/posts/bianca-stratulat_databricks-lakehouse-dataengineering-a...
- 121 Views
- 1 replies
- 1 kudos
How to build a chatbot in Databricks for ad‑hoc analytics questions?
Hi everyone, I'm exploring the idea of creating a chatbot within Databricks that can handle ad-hoc business analytics queries. For example, I'd like users to be able to ask questions such as: "How many sales did we have in 2025?" "Which products had the...
Hi @laeforceable, if you have data in the Databricks platform, then you can use the Genie Room capability to ask questions related to your data.
- 337 Views
- 3 replies
- 2 kudos
Resolved! Azure Databricks learning tutorials: ADB+SQL, ADB+PySpark, ADB+Python
Can you suggest the best learning tutorials for Azure Databricks covering PySpark, Python, and SQL? Are there any web-based tutorials from Databricks? Please suggest the best one, from scratch to advanced.
Hi @vijaypodili, I can recommend the Data Engineering Learning Path on Databricks Academy: https://customer-academy.databricks.com/ On Udemy, there's an excellent course that covers all the important aspects of working with Databricks on a daily basis: Data...
- 248 Views
- 2 replies
- 4 kudos
Resolved! Unexpected Script Execution Differences on databricks.com vs Mobile-Triggered Runtimes
I’m noticing some unusual inconsistencies in how scripts execute on databricks.com compared to when the same workflow is triggered through a mobile-based API. On Databricks, the script runs perfectly when executed directly inside a cluster notebook. ...
Hi Ellie, what you're seeing is actually quite common; the same script can behave slightly differently when run interactively in a notebook on a cluster vs. run as a job / via an API trigger (or from a mobile wrapper hitting that API). It's usually not ...
- 3585 Views
- 9 replies
- 0 kudos
Resolved! Programmatic selection of the serverless compute environment version for notebooks
Hello, I have a case where I am executing notebooks from an external system using the Databricks API /api/2.2/jobs/runs/submit. This has always been unproblematic with job compute, but due to the quite recent serverless-for-notebooks support being i...
Not sure about other regions, but in eu-west-3 we could specify the serverless environment version with DAB using the `environments` block and `spec` params:

```yaml
resources:
  jobs:
    pipeline:
      name: "[${bundle.target}]pipeline"
      webhook_not...
```
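For the runs/submit route asked about in this thread, the environment can also be pinned per task in the request body. This is a hedged sketch: the `environments`/`spec` field names mirror the DAB-style configuration mentioned above and are an assumption to verify against the Jobs API documentation for your region.

```python
# Sketch of a /api/2.2/jobs/runs/submit request body that pins the serverless
# environment for a notebook task. The "environments" block's field names are
# an assumption -- verify against the Jobs API docs before relying on them.
payload = {
    "run_name": "notebook-on-serverless",
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Workspace/path/to/notebook"},
            "environment_key": "default",  # links this task to the spec below
        }
    ],
    "environments": [
        {"environment_key": "default", "spec": {"client": "2"}}
    ],
}
```

The `environment_key` on the task must match an entry in the top-level `environments` list, the same pairing the DAB `environments` block uses.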
- 1799 Views
- 2 replies
- 0 kudos
Informatica jobs from Databricks
Hi Team, how can we call Informatica jobs from Databricks? Could you please advise on this? Regards, Phanindra
Unsure how the above answer helps here, @Phani1. The only way I can think of is to call Informatica jobs (specifically Informatica Cloud Data Integration (CDI) mappings or tasks) from Databricks by leveraging REST APIs. Direct API: trigger an Informatic...
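As a sketch of the REST-API approach described above, the request below is only built, not sent. The host, endpoint path, body fields, and `icSessionId` header are hypothetical placeholders; the real values come from your Informatica Cloud login flow and its API documentation.

```python
import json
import urllib.request

# Build (but do not send) a hypothetical Informatica Cloud job-trigger request.
# Host, path, body fields, and session header are placeholders, not real API facts.
body = json.dumps({"@type": "job", "taskName": "my_cdi_task", "taskType": "MTT"}).encode()
req = urllib.request.Request(
    "https://example.informaticacloud.invalid/api/v2/job",
    data=body,
    headers={"Content-Type": "application/json", "icSessionId": "<session-id>"},
    method="POST",
)
# From a Databricks notebook, urllib.request.urlopen(req) would fire the trigger.
```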
- 209 Views
- 2 replies
- 0 kudos
Databricks Java SDK retrieving job task values
Greetings, I have a job that consists of notebook tasks running Python code. Some of the tasks set task values using dbutils.jobs.taskValues.set(key=key, value=value) as described here. How do I retrieve those task values using Databricks Java SDK v0.69.0...
Unfortunately you can’t read dbutils.jobs.taskValues directly via the Java SDK. But you could return a JSON payload via dbutils.notebook.exit and read it with getRunOutput. Databricks exposes the notebook “exit” result through Jobs GetRunOutput, then...
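On the notebook (Python) side, the workaround in the reply just packs the values into one JSON string for dbutils.notebook.exit. Since dbutils only exists on Databricks, this sketch builds and parses the payload the same way a GetRunOutput caller would see it:

```python
import json

# Notebook side (on Databricks): instead of dbutils.jobs.taskValues.set(...),
# return everything in one JSON string via the notebook's exit result:
#     dbutils.notebook.exit(exit_result)
task_values = {"row_count": 1234, "status": "ok"}
exit_result = json.dumps(task_values)

# Caller side: the Java SDK's getRunOutput surfaces this string as the
# notebook output's result; parse it back into a map.
parsed = json.loads(exit_result)
```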
- 284 Views
- 5 replies
- 6 kudos
Resolved! Synchronising metadata (e.g., tags) across schemas under Unity Catalog (Azure)
Hello all, I hope you are doing great! I want to synchronise metadata (e.g., description, comments, tags) across schemas under Unity Catalog (e.g., test.dev, test.uat). For example, under the schema test.dev, there is a sales table with multiple co...
It's completely fine, and I do understand. Thank you for your time and effort here!
- 221 Views
- 3 replies
- 1 kudos
Resolved! Databricks Scenarios
I’m a data engineer with some experience in Databricks. I’m looking for real-life scenarios that are commonly encountered by data engineers. Could you also provide details on how to implement these scenarios?
Generic topic. Here are a few recent articles to help you with this: https://community.databricks.com/t5/get-started-guides/getting-started-with-databricks-build-a-simple-lakehouse/tac-p/139492#M29 https://community.databricks.com/t5/announcements/big-book-o...
- 162 Views
- 2 replies
- 1 kudos
What are the best ways to implement transcription in podcast apps?
I am starting this discussion for everyone who can answer my query.
1. Use speech-to-text models via MLflow: integrate open-source models like OpenAI Whisper, Hugging Face Wav2Vec2, or the AssemblyAI API. Log the model in MLflow for versioning and reproducibility. Deploy it as a Databricks Model Serving endpoint for real-time t...