Community Discussions
Engage in vibrant discussions covering diverse learning topics within the Databricks Community. Explore learning resources, share insights, and collaborate with peers to enhance your skills in data engineering, machine learning, and more.

Browse the Community

Certifications

Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...

874 Posts

Training offerings

Explore discussions on Databricks training programs and offerings within the Community. Get insights...

198 Posts

Get Started Discussions

Start your journey with Databricks by joining discussions on getting started guides, tutorials, and ...

2715 Posts

Activity in Community Discussions

tarunnagar
by Contributor
  • 254 Views
  • 4 replies
  • 0 kudos

Best Development Strategies for Building Reusable Data Engineering Components in Databricks

I’m looking to gather insights from data engineers, architects, and developers who have experience building scalable pipelines in Databricks. Specifically, I want to understand how to design, implement, and manage reusable data engineering components...

Latest Reply
Davidwilliamkt
  • 0 kudos

The best strategy is to build modular, parameterized, Delta-optimized functions and package them into reusable Python modules, while keeping Databricks notebooks only for orchestration. This creates consistent, scalable, and easily maintainable data ...
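To make that concrete, here is a minimal sketch of such a module (the function names and the MERGE wrapper are illustrative, not from the thread):

```python
# Reusable, parameterized transformation components kept in a plain
# Python module, so they can be unit-tested outside any notebook.
from datetime import datetime, timezone


def add_audit_columns(row: dict, source: str) -> dict:
    """Attach standard audit metadata to a record."""
    return {
        **row,
        "_source": source,
        "_ingested_at": datetime.now(timezone.utc).isoformat(),
    }


def build_merge_sql(target: str, source_view: str, keys: list[str]) -> str:
    """Build a Delta MERGE statement from parameters instead of
    hard-coding one per pipeline."""
    on = " AND ".join(f"t.{k} = s.{k}" for k in keys)
    return (
        f"MERGE INTO {target} t USING {source_view} s ON {on} "
        "WHEN MATCHED THEN UPDATE SET * "
        "WHEN NOT MATCHED THEN INSERT *"
    )
```

A notebook then only orchestrates, e.g. `spark.sql(build_merge_sql("my_catalog.my_schema.orders", "staged_orders", ["order_id"]))`.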

3 More Replies
curious_rabbit
by Visitor
  • 19 Views
  • 1 reply
  • 0 kudos

Getting Genie to Generate SPC (Control) Charts Reliably

Hi everyone! I’m working on getting Genie to accurately generate Statistical Process Control (SPC) charts when prompted. I'm looking for suggestions on how best to approach this. So far, I’ve tried using pre-defined SQL queries to select the data, bu...

Latest Reply
curious_rabbit
  • 0 kudos

Or here is hopefully a more elegant way to phrase my question: to visualise a control diagram in Genie for an end user, should I (a) instruct Genie how to create an SPC chart with SQL on the fly, or (b) create a background job (pre-defined SQL query in ...
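For option (b), the control limits are cheap to pre-compute in the background job, so Genie or any dashboard only has to plot three flat lines plus the series. A rough sketch (note: a real individuals chart usually estimates sigma from the average moving range rather than the sample standard deviation used here):

```python
from statistics import mean, stdev


def spc_limits(values: list[float]) -> dict:
    """Shewhart-style limits: centre line at the mean, upper/lower
    control limits at mean +/- 3 standard deviations."""
    centre = mean(values)
    sigma = stdev(values)
    return {
        "cl": centre,          # centre line
        "ucl": centre + 3 * sigma,  # upper control limit
        "lcl": centre - 3 * sigma,  # lower control limit
    }
```

The job can write these three values alongside the measurements into a small table that the Genie space is pointed at.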

Hubert-Dudek
by Esteemed Contributor III
  • 56 Views
  • 1 reply
  • 2 kudos

Databricks Advent Calendar 2025 #11

Real-time mode is a breakthrough that lets Spark utilize all available CPUs to process records with single-millisecond latency, while decoupling checkpointing from per-record processing.

Latest Reply
Raman_Unifeye
Contributor III
  • 2 kudos

Useful one!

greengil
by New Contributor
  • 139 Views
  • 6 replies
  • 2 kudos

Create function issue

Hello - I am following some online code to create a function as follows:

CREATE OR REPLACE FUNCTION my_catalog.my_schema.insert_data_function(
  col1_value STRING,
  col2_value INT)
RETURNS BOOLEAN
COMMENT 'Inserts dat...

Latest Reply
iyashk-DB
Databricks Employee
  • 2 kudos

In UC, functions must be read-only; they cannot modify state (no INSERT, DELETE, MERGE, CREATE, VACUUM, etc.). So I tried creating a PROCEDURE and calling it instead, and I was able to insert data into the table successfully. Unity Catalog tools are really jus...
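If it helps, the difference can be sketched like this (table and parameter names are hypothetical, and SQL procedures are a fairly recent addition, so verify the exact syntax and runtime support in the docs):

```python
def insert_procedure_ddl(catalog: str, schema: str, table: str) -> str:
    """DDL for a SQL procedure that is allowed to write, unlike a UC SQL
    function, which must stay read-only."""
    return f"""\
CREATE OR REPLACE PROCEDURE {catalog}.{schema}.insert_data_proc(
    col1_value STRING, col2_value INT)
LANGUAGE SQL
AS BEGIN
  INSERT INTO {catalog}.{schema}.{table} VALUES (col1_value, col2_value);
END"""

# In a notebook you would then run, for example:
# spark.sql(insert_procedure_ddl("my_catalog", "my_schema", "my_table"))
# spark.sql("CALL my_catalog.my_schema.insert_data_proc('a', 1)")
```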

5 More Replies
Poorva21
by New Contributor
  • 160 Views
  • 2 replies
  • 3 kudos

How realistic is truly end-to-end LLMOps on Databricks?

Databricks is positioning the platform as a full stack for LLM development — from data ingestion → feature/embedding pipelines → fine-tuning (Mosaic AI) → evaluation → deployment (Model Serving) → monitoring (Lakehouse Monitoring). I’m curious about r...

Latest Reply
Poorva21
New Contributor
  • 3 kudos

Thank You @Gecofer for taking the time to share such a clear, experience-backed breakdown of where Databricks shines and where real-world LLM Ops architectures still need supporting components. Your explanation was incredibly practical and resonates ...

1 More Replies
DBXDeveloper111
by New Contributor II
  • 106 Views
  • 3 replies
  • 3 kudos

Resolved! Software engineering in Databricks

I'm a software engineer and a bit new to Databricks. My goal is to create a model serving endpoint that interfaces with several ML models. Traditionally this would look like: API --> Service --> Data. Now using Databricks, my understanding is that it w...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

Just register the model and then deploy a serving endpoint to serve it.
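In code, that two-step flow looks roughly like this (endpoint and entity names are placeholders, and the serving call is sketched from the databricks-sdk, so check the exact classes against your SDK version):

```python
def full_model_name(catalog: str, schema: str, model: str) -> str:
    """Unity Catalog registered models are addressed as catalog.schema.model."""
    return f"{catalog}.{schema}.{model}"


def register_and_serve(run_id: str, name: str, endpoint: str) -> None:
    """Register a logged model, then create a serving endpoint for it.
    Imports are local so this sketch stays importable without MLflow."""
    import mlflow
    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service.serving import (
        EndpointCoreConfigInput, ServedEntityInput)

    mv = mlflow.register_model(f"runs:/{run_id}/model", name)
    WorkspaceClient().serving_endpoints.create(
        name=endpoint,
        config=EndpointCoreConfigInput(served_entities=[
            ServedEntityInput(
                entity_name=name,
                entity_version=mv.version,
                workload_size="Small",
                scale_to_zero_enabled=True,
            )
        ]),
    )
```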

2 More Replies
saurabh18cs
by Honored Contributor II
  • 85 Views
  • 2 replies
  • 0 kudos

Results of october badge challenge - Partner Learning

Hi Databricks, our learners want to know when we are disclosing the results of the October badge challenge (Partner Learning). Br

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @saurabh18cs! Have you checked with your company's Databricks Partner Admin or Databricks Support? They’re the right contacts to provide the official timeline for the Partner Learning Badge Challenge results.

1 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 72 Views
  • 0 replies
  • 2 kudos

Databricks Advent Calendar 2025 #10

Databricks goes native on Excel. You can now ingest and query .xls/.xlsx files directly in Databricks (SQL and PySpark, batch and streaming), with automatic schema/type inference, sheet and cell-range targeting, and evaluated formulas; no extra libraries needed anymore.

bianca_unifeye
by Contributor
  • 45 Views
  • 1 reply
  • 1 kudos

Webinar: From PoC to Production: Delivering with Confidence with Databricks

Our final webinar of December is here and we are closing the year with a powerhouse session! As many organisations still get stuck in the PoC phase, we’re bringing clarity, structure, and real delivery practices to help teams move from promi...

Latest Reply
Advika
Databricks Employee
  • 1 kudos

Appreciate you sharing this with the community, @bianca_unifeye!

lance-gliser
by New Contributor
  • 4363 Views
  • 8 replies
  • 0 kudos

Databricks apps - Volumes and Workspace - FileNotFound issues

I have a Databricks App I need to integrate with volumes using local Python os functions. I've set up a simple test:

def __init__(self, config: ObjectStoreConfig):
    self.config = config
    # Ensure our required paths are created
    ...

Latest Reply
ksboli
New Contributor II
  • 0 kudos

I also cannot read from a Volume from a Databricks app and would be interested in a solution.
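One possible workaround is to skip local `os` calls entirely and go through the SDK's Files API, which talks REST rather than relying on a filesystem mount inside the app container. A sketch, assuming the app's service principal has READ VOLUME on the volume (the path below is hypothetical):

```python
def is_volume_path(path: str) -> bool:
    """UC Volume paths always live under /Volumes/<catalog>/<schema>/<volume>."""
    return path.startswith("/Volumes/")


def read_volume_file(path: str) -> bytes:
    """Read a Volume file over REST via the Files API instead of the
    local filesystem. Import is local so the sketch stays importable."""
    from databricks.sdk import WorkspaceClient

    if not is_volume_path(path):
        raise ValueError(f"not a Volume path: {path}")
    w = WorkspaceClient()  # in an app, picks up the injected credentials
    return w.files.download(path).contents.read()

# read_volume_file("/Volumes/my_catalog/my_schema/my_volume/config.json")
```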

7 More Replies
tarunnagar
by Contributor
  • 133 Views
  • 3 replies
  • 2 kudos

Using Databricks for Real-Time App Data

I’m exploring how to handle real-time data for an application and I keep seeing Databricks recommended as a strong option — especially with its support for streaming pipelines, Delta Live Tables, and integrations with various event sources. That said...

Latest Reply
Suheb
New Contributor III
  • 2 kudos

Databricks is very effective for real-time app data because it supports streaming data processing using Apache Spark and Delta Lake. It helps handle large data volumes, provides low-latency analytics, and makes it easier to build scalable event-drive...
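As a starting point, the shape of such a pipeline is quite small. This sketch assumes a Kafka source; the broker address and column handling are placeholders, and the function is only defined here, not run:

```python
def start_events_stream(spark, topic: str, target_table: str, checkpoint: str):
    """Kafka -> Delta streaming job: read events, cast the payload,
    and continuously append to a Delta table."""
    return (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder
        .option("subscribe", topic)
        .load()
        .selectExpr("CAST(value AS STRING) AS body", "timestamp")
        .writeStream
        .option("checkpointLocation", checkpoint)  # enables recovery on restart
        .toTable(target_table)
    )
```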

2 More Replies
bharathjs
by New Contributor II
  • 11278 Views
  • 7 replies
  • 2 kudos

Alter table to add/update multiple column comments

I was wondering if there's a way to alter a table and add/update comments for multiple columns at once using SQL or API calls. For instance:

ALTER TABLE <table_name>
  CHANGE COLUMN <col1> COMMENT '<comment1>',
  CHANGE COLUMN <col2> COMMENT 'comment2';

Latest Reply
dxwell
New Contributor II
  • 2 kudos

The correct SQL syntax for this is:

ALTER TABLE your_table_name
  ALTER COLUMN col1 COMMENT 'comment1',
  col2 COMMENT 'comment2',
  col3 COMMENT 'comment3';
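If there are many columns, the statement from the reply can also be generated rather than hand-written (a quick sketch; it assumes the comment text contains no single quotes):

```python
def comment_columns_sql(table: str, comments: dict[str, str]) -> str:
    """Build one ALTER TABLE ... ALTER COLUMN statement that sets
    comments on several columns at once."""
    pairs = ", ".join(f"{col} COMMENT '{text}'" for col, text in comments.items())
    return f"ALTER TABLE {table} ALTER COLUMN {pairs};"

# e.g. spark.sql(comment_columns_sql("my_table", {"col1": "comment1", "col2": "comment2"}))
```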

6 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 76 Views
  • 0 replies
  • 2 kudos

Databricks Advent Calendar 2025 #9

Tags, whether manually assigned or automatically assigned by the “data classification” service, can be protected using policies. Column masking can automatically mask columns with a given tag for all except some with elevated access.

Renounce3295
by New Contributor II
  • 6813 Views
  • 8 replies
  • 4 kudos

Not able to serve or interact with LLMs in Databricks free

Hi there, just testing the new Databricks Free Edition. I was trying to play around with LLMs, but I'm not able to create serving endpoints with foundational model entities, interact with pay-per-token foundational model APIs, or use them in Databricks a...

Latest Reply
chaowu2016
New Contributor
  • 4 kudos

I got the same error using the free version.

7 More Replies
tarunnagar
by Contributor
  • 145 Views
  • 3 replies
  • 0 kudos

How to Connect Databricks with Web and Mobile Apps

Hi everyone, I’m exploring ways to leverage Databricks for building data-driven web and mobile applications and wanted to get some insights from this community. Databricks is great for processing large datasets, running analytics, and building machine...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 0 kudos

Check out Databricks Apps: you pass Databricks resources to the app and then use the databricks-sdk to interact with them.

2 More Replies