Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

greengil
by New Contributor II
  • 317 Views
  • 8 replies
  • 2 kudos

Create function issue

Hello - I am following some online code to create a function as follows: CREATE OR REPLACE FUNCTION my_catalog.my_schema.insert_data_function(col1_value STRING, col2_value INT) RETURNS BOOLEAN COMMENT 'Inserts dat...

Latest Reply
iyashk-DB
Databricks Employee
  • 2 kudos

In UC, the functions must be read-only; they cannot modify state (no INSERT, DELETE, MERGE, CREATE, VACUUM, etc). So I tried to create a PROCEDURE and call it; I was able to insert data into the table successfully. Unity Catalog tools are really jus...
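The workaround described in this reply can be sketched in SQL; the catalog, schema, procedure, and table names below are placeholders, and the exact scripting syntax may differ by runtime version:

```sql
-- Hypothetical names: a Unity Catalog FUNCTION cannot run this INSERT,
-- but a SQL PROCEDURE can modify state.
CREATE OR REPLACE PROCEDURE my_catalog.my_schema.insert_data_proc(
  col1_value STRING,
  col2_value INT
)
LANGUAGE SQL
AS BEGIN
  INSERT INTO my_catalog.my_schema.my_table VALUES (col1_value, col2_value);
END;

-- Invoke it instead of calling a function:
CALL my_catalog.my_schema.insert_data_proc('abc', 42);
```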

7 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 30 Views
  • 0 replies
  • 0 kudos

Databricks Advent Calendar 2025 #16

For many data engineers who love PySpark, the most significant improvement of 2025 was the addition of merge to the DataFrame API, so the Delta library or SQL is no longer needed to perform MERGE. P.S. I still prefer SQL MERGE inside spark.sql()
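For reference, the SQL form the post mentions looks roughly like this (table and column names are placeholders):

```sql
-- Upsert rows from a staging view into a Delta target.
MERGE INTO my_catalog.my_schema.target AS t
USING updates AS s
  ON t.id = s.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;
```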

DataYoga
by New Contributor
  • 5018 Views
  • 4 replies
  • 0 kudos

Informatica ETLs

I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Dat...

Latest Reply
zalmane
New Contributor
  • 0 kudos

We ended up using the tool from datayoga.io that converts these in a multi-stage approach. It converted to an intermediate representation. Then, from there it gets optimized (a lot of the Informatica actions can be optimized out or compacted) and fin...

3 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 48 Views
  • 0 replies
  • 2 kudos

Databricks Advent Calendar 2025 #15

The new Lakebase experience is a game-changer for transactional databases. That functionality is fantastic. Autoscaling to zero makes it really cost-effective. Do you need to deploy to prod? Just branch the production database to the release branch, an...

Peter_Theil
by New Contributor
  • 115 Views
  • 3 replies
  • 3 kudos

Databricks partner journey for small firms

Hello, we are a team of 5 (DE/Architects) exploring the idea of starting a small consulting company focused on Databricks as an SI partner, and we wanted to learn from others who have gone through the partnership journey. I would love to understand how t...

Latest Reply
Louis_Frolio
Databricks Employee
  • 3 kudos

If I’m being completely honest, I haven’t seen any. As you can imagine, partner organizations tend to keep things pretty close to the vest for a variety of reasons. That said, once a new partner is officially enrolled, they are granted access to an e...

2 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 69 Views
  • 0 replies
  • 0 kudos

Databricks Advent Calendar 2025 #14

Ingestion from SharePoint is now available directly in PySpark. Just define a connection and use spark.read or, even better, spark.readStream with Auto Loader. Just specify the file type and options for that file (PDF, CSV, Excel, etc.)

Hubert-Dudek
by Esteemed Contributor III
  • 101 Views
  • 0 replies
  • 1 kudos

Databricks Advent Calendar 2025 #12

All leading LLMs are available natively in Databricks: - ChatGPT 5.2 from launch day! - The system catalog's AI schema in Unity Catalog has multiple LLMs ready to serve! - OpenAI, Gemini, and Anthropic are available side by side!

tarunnagar
by Contributor
  • 346 Views
  • 4 replies
  • 0 kudos

Best Development Strategies for Building Reusable Data Engineering Components in Databricks

I’m looking to gather insights from data engineers, architects, and developers who have experience building scalable pipelines in Databricks. Specifically, I want to understand how to design, implement, and manage reusable data engineering components...

Latest Reply
Davidwilliamkt
New Contributor
  • 0 kudos

The best strategy is to build modular, parameterized, Delta-optimized functions and package them into reusable Python modules, while keeping Databricks notebooks only for orchestration. This creates consistent, scalable, and easily maintainable data ...
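A minimal plain-Python illustration of the "modular, parameterized" pattern this reply describes; the names are hypothetical, and a real component would typically operate on Spark DataFrames rather than lists of dicts:

```python
# A reusable, parameterized transform packaged as an ordinary function
# that a module can export; the notebook layer only wires parameters.
def add_audit_columns(rows, source, load_date):
    """Return rows enriched with standard audit columns.

    Parameterizing `source` and `load_date` lets one function serve
    every pipeline instead of copy-pasting per-notebook logic.
    """
    return [{**row, "_source": source, "_load_date": load_date} for row in rows]

# Orchestration layer (e.g. a notebook) just calls the shared component:
enriched = add_audit_columns([{"id": 1}], source="crm", load_date="2025-12-16")
```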

3 More Replies
curious_rabbit
by New Contributor
  • 92 Views
  • 1 reply
  • 0 kudos

Getting Genie to Generate SPC (Control) Charts Reliably

Hi everyone! I'm working on getting Genie to accurately generate Statistical Process Control (SPC) charts when prompted. I'm looking for suggestions on how best to approach this. So far, I've tried using pre-defined SQL queries to select the data, bu...

Latest Reply
curious_rabbit
New Contributor
  • 0 kudos

Or, here is hopefully a more elegant way to phrase my question: to visualise a control chart in Genie for an end-user, should I a) instruct Genie how to create an SPC chart with SQL on the fly, or b) create a background job (pre-defined SQL query in ...
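Whichever option is chosen, the core computation behind an SPC chart is small; a minimal sketch of 3-sigma control limits on made-up sample data (real individuals charts often estimate sigma from moving ranges instead of the raw standard deviation):

```python
import statistics

def control_limits(values, sigmas=3):
    """Return (lcl, center, ucl) for a simple 3-sigma control chart."""
    center = statistics.mean(values)
    spread = statistics.pstdev(values)  # population standard deviation
    return center - sigmas * spread, center, center + sigmas * spread

lcl, center, ucl = control_limits([10, 12, 11, 13, 12, 11, 12])
```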

Poorva21
by New Contributor II
  • 227 Views
  • 2 replies
  • 4 kudos

How realistic is truly end-to-end LLMOps on Databricks?

Databricks is positioning the platform as a full stack for LLM development — from data ingestion → feature/embedding pipelines → fine-tuning (Mosaic AI) → evaluation → deployment (Model Serving) → monitoring (Lakehouse Monitoring).I’m curious about r...

Latest Reply
Poorva21
New Contributor II
  • 4 kudos

Thank You @Gecofer for taking the time to share such a clear, experience-backed breakdown of where Databricks shines and where real-world LLM Ops architectures still need supporting components. Your explanation was incredibly practical and resonates ...

1 More Replies

Join Us as a Local Community Builder!

Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!

Sign Up Now