Self-Paced Learning Festival: 09 January - 30 January 2026

Grow Your Skills and Earn Rewards! Mark your calendar: January 09 – January 30, 2026 Join us for a three-week event dedicated to learning, upskilling, and advancing your career in data engineering, analytics, machine learning, and generative AI. ...

  • 56569 Views
  • 154 replies
  • 87 kudos
12-09-2025
🎤 Call for Presentations: Data + AI Summit 2026 is Open!

June 15–18, 2026 Are you building the future with data and AI? Then this is your moment. The Call for Proposals for Data + AI Summit 2026 is officially open, and we want to hear from builders, practitioners, and innovators across the data and AI com...

  • 4801 Views
  • 4 replies
  • 6 kudos
4 weeks ago
Level Up with Databricks Specialist Sessions

How to Register & Prepare If you're interested in advancing your skills with Databricks through a Specialist Session, here's a clear guide on how to register and what free courses you can take to prepare effectively. How to Begin Your Learning Path S...

  • 3748 Views
  • 2 replies
  • 9 kudos
10-02-2025
Solution Accelerator Series | Scale cybersecurity analytics with Splunk and Databricks

Strengthening cybersecurity isn’t a one-time project — it’s an ongoing process of adapting to new and complex threats. The focus is on modernizing your data infrastructure and analytics capabilities to make security operations smarter, faster, and mo...

  • 418 Views
  • 1 reply
  • 2 kudos
Tuesday
🎬 Databricks Community 2025 Highlights | A Year, Built Together

2025 wasn’t just another year on the Databricks Community. It was a year of showing up, helping each other, and building trust. Watch the 2025 Community Wrap-Up. This video is a small reflection of that journey. It highlights the key contributors of 2...

  • 615 Views
  • 12 replies
  • 17 kudos
Wednesday
🌟 Community Pulse: Your Weekly Roundup! December 22, 2025 – January 04, 2026

The calendar changed. The conversations didn’t. The year turned somewhere between a solved thread and a new question. No pause. No reset – just learning, continuing right where it left off. Here’s how it unfolded   Voices That Moved the Conversatio...

  • 175 Views
  • 2 replies
  • 5 kudos
Tuesday

Community Activity

Dhruv-22
by Contributor II
  • 40 Views
  • 2 replies
  • 0 kudos

Feature request: Allow to set value as null when not present in schema evolution

I want to raise a feature request as follows. Currently, in the automatic schema evolution for merge, when a column is not present in the source dataset it is not changed in the target dataset. For e.g. %sql CREATE OR REPLACE TABLE edw_nprd_aen.bronze.t...

Latest Reply
ManojkMohan
Honored Contributor II
  • 0 kudos

@Dhruv-22 Problem: When using MERGE INTO ... WITH SCHEMA EVOLUTION, if a column exists in the target table but is not present in the source dataset, that column is left unchanged on matched rows. Solution thinking: This can be emulated by introspecting th...

1 More Replies
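Until such a feature lands, the workaround the reply hints at can be sketched by generating the MERGE statement's UPDATE SET clause yourself: update the shared columns from the source and explicitly null the target-only ones. A minimal plain-Python sketch (the table, column, and key names are hypothetical):

```python
def build_merge_sql(target, source, target_cols, source_cols, key):
    """Build a MERGE that updates shared columns from the source and
    explicitly sets target-only columns to NULL (the gap automatic
    schema evolution leaves on matched rows)."""
    shared = [c for c in target_cols if c in source_cols and c != key]
    missing = [c for c in target_cols if c not in source_cols]
    set_clauses = [f"t.{c} = s.{c}" for c in shared]
    set_clauses += [f"t.{c} = NULL" for c in missing]  # the requested behavior
    return (
        f"MERGE INTO {target} t USING {source} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {', '.join(set_clauses)} "
        "WHEN NOT MATCHED THEN INSERT *"
    )

# Hypothetical schemas: the target has a `phone` column the source lacks.
sql = build_merge_sql(
    "edw.bronze.customers", "updates",
    target_cols=["id", "name", "phone"], source_cols=["id", "name"], key="id",
)
print(sql)
```

In a notebook the generated string would be handed to `spark.sql(sql)`; the target schema itself can be introspected with `DESCRIBE TABLE` rather than hard-coded.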
Hubert-Dudek
by Databricks MVP
  • 9 Views
  • 0 replies
  • 0 kudos

Secret magic commands

Secret magic commands — there are a lot of them. Check my blogs to see which ones can simplify your daily work. The first one is %%writefile, which can be used to write a new file, for example, another notebook. #databricks more magic commands: https://dat...

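For context, %%writefile is an IPython cell magic that simply writes the cell body to a file; roughly the same effect can be had with the standard library. A small sketch (the file name and contents are made up):

```python
from pathlib import Path
import importlib.util

# Roughly what `%%writefile helper.py` does with the cell body below it:
cell_body = "def greet(name):\n    return f'hello {name}'\n"
Path("helper.py").write_text(cell_body)

# Load the freshly written file as a module (a notebook would just import
# it or %run it).
spec = importlib.util.spec_from_file_location("helper", "helper.py")
helper = importlib.util.module_from_spec(spec)
spec.loader.exec_module(helper)
print(helper.greet("databricks"))  # hello databricks
```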
Nivethan_Venkat
by Valued Contributor
  • 810 Views
  • 3 replies
  • 6 kudos

[PARTNER BLOG] Building an Agent-Native Data Quality Manager with Databricks Apps and DQX

Introduction: The Data Quality Paradox · What We Built · The Technology Stack · Architecture Deep Dive · The Four-Layer Architecture · Layer 1: Users · Layer 2: Databricks Apps Platform · Layer 3: Compute Layer · Layer 4: Data Layer · Why This Design? · The User Experience: A Com...

Latest Reply
Shivika
Databricks Employee
  • 6 kudos

Thanks for this. Can you please share the correct Git repo link?

2 More Replies
Jim_Anderson
by Databricks Employee
  • 575 Views
  • 2 replies
  • 2 kudos

FAQ for Databricks Learning Festival (Virtual): 09 January - 30 January 2026

General Q: How can I check whether I have completed the required modules? Log in to your Databricks Customer Academy account and select ‘My Activities’ within the user menu in the top left. Under the Courses tab, you can verify whether all courses wit...

Latest Reply
willmusgrave
New Contributor II
  • 2 kudos

Hi Jim, I believe this post might have a typo. "The voucher will be valid for approximately 90 days and will expire before 30 January 2026. You must schedule and sit for the exam before this date. We strongly recommend booking the slots early, since ...

1 More Replies
Siladitya
by New Contributor II
  • 28 Views
  • 1 reply
  • 0 kudos

Looking for a Databricks Data Engineer Associate

Hi everyone, I’m currently preparing for a Databricks certification and wanted to check if there are any exam vouchers, discounts, or community programs available. Paying the full exam fee is a bit difficult for me at the moment. If anyone has informat...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Siladitya, you're in luck! Currently, there is a Databricks Learning Festival. You can get a 50% discount on any exam if you complete one learning path on Databricks Academy. Self-Paced Learning Festival: 09 January - 30 Janu... - Databricks Communi...

RevanthV
by New Contributor III
  • 39 Views
  • 3 replies
  • 2 kudos

Data validation with df writes using append mode

Hi Team, recently I came across a situation where I had to write a huge amount of data, and it took 6 hrs to complete. Later, when I checked the target data, I saw 20% of the total records were written incorrectly or corrupted because the source data itself was corr...

Latest Reply
RevanthV
New Contributor III
  • 2 kudos

Hey @K_Anudeep, thanks a lot for tagging me into the GitHub issue. This is exactly what I want, a "validate and commit" feature, and I see you have already raised a PR for the same with a new option called . I will try this out and check if it satisfie...

2 More Replies
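Until a built-in validate-and-commit option is available, the pattern can be emulated by staging the batch, validating every record, and appending only when the whole batch passes. A toy stdlib sketch (the records, rules, and commit callback are hypothetical stand-ins for DataFrame writes):

```python
def validate_and_commit(records, rules, commit):
    """Stage incoming records, validate all of them, and only append
    (commit) when every record passes -- otherwise nothing is written."""
    bad = [r for r in records if not all(rule(r) for rule in rules)]
    if bad:
        raise ValueError(f"{len(bad)} corrupt record(s); aborting append")
    commit(records)  # all-or-nothing append

# A list stands in for the target table; the rules are made-up checks.
target = []
rules = [
    lambda r: r.get("id") is not None,
    lambda r: r.get("amount", 0) >= 0,
]

validate_and_commit([{"id": 1, "amount": 5}], rules, target.extend)
print(len(target))  # 1
```

With Spark the equivalent staging step is usually a write to a scratch table or path, validated before a final append, so a corrupt source never reaches the target.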
ramsai
by New Contributor
  • 51 Views
  • 5 replies
  • 2 kudos

Updating Job Creator to Service Principal

Regarding data governance best practices: I have jobs created by a user who has left the organization, and I need to change the job creator to a service principal. Currently, it seems the only option is to clone the job and update it. Is this the rec...

Latest Reply
Sanjeeb2024
Contributor III
  • 2 kudos

I agree with @nayan_wylde; for auditing, the creator is important and it should be immutable by nature.

4 More Replies
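If the underlying goal is governance rather than the audit label, note that while the creator field is immutable, the job's run-as identity can be reassigned to a service principal without cloning. A hedged sketch of a Jobs API 2.1 update payload (the job ID and application ID are placeholders):

```python
# Hypothetical IDs. The Jobs 2.1 API accepts a `run_as` block naming either
# a user or a service principal; the creator field itself stays immutable.
payload = {
    "job_id": 123,
    "new_settings": {
        "run_as": {
            "service_principal_name": "00000000-0000-0000-0000-000000000000",
        },
    },
}
# POSTing this to /api/2.1/jobs/update (authentication omitted) would
# reassign execution to the service principal without cloning the job.
print(payload["new_settings"]["run_as"])
```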
Sangamithra_pra
by New Contributor II
  • 5391 Views
  • 15 replies
  • 3 kudos

Resolved! Exam got suspended

I am writing regarding the suspension of my Databricks Data Engineer Associate exam on 21.03.2024 due to suspected behavior flagged by my proctor. During the session, the proctor requested a full room scan, which I complied with, and everything was c...

Latest Reply
mukul27wno
Visitor
  • 3 kudos

Hi Databricks team @cert-ops @Advika My exam got suspended due to 'The system automatically flags and suspends exams when certain technical, behavioral, or environmental'. Everything was correct and to the point. Can you please check and reschedule my ex...

14 More Replies
Rose_15
by New Contributor
  • 67 Views
  • 3 replies
  • 0 kudos

Databricks SQL Warehouse fails when streaming ~53M rows via Python (token/session expiry)

Hello Team, I am facing a consistent issue when streaming a large table (~53 million rows) from a Databricks SQL Warehouse using Python (databricks-sql-connector) with OAuth authentication. I execute a single long-running query and fetch data in batche...

Latest Reply
Sanjeeb2024
Contributor III
  • 0 kudos

Hi @Rose_15 - Thanks for the details. It is better to plan around the number of tables, their size, and the number of records, and to extract the files to cloud storage and reload the data using any mechanism. Once your extraction is complete, you wi...

2 More Replies
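One way to avoid a single six-hour query outliving its OAuth token is keyset pagination: many short queries, each ordered by a key and resuming after the last row seen, optionally with a fresh connection per page. A toy sketch with an in-memory stand-in for the warehouse (the fetch function and `id` column are hypothetical):

```python
def extract_in_batches(fetch_batch, batch_size=100_000):
    """Pull rows in key-ordered pages so each page is a short query that
    finishes well within a single token/session lifetime."""
    last_key, total = None, 0
    while True:
        # In real code: SELECT ... WHERE id > :last ORDER BY id LIMIT :n,
        # ideally on a fresh connection so the token is always current.
        rows = fetch_batch(last_key, batch_size)
        if not rows:
            return total
        total += len(rows)
        last_key = rows[-1]["id"]  # resume after the last id seen

# A toy in-memory source standing in for the SQL warehouse:
data = [{"id": i} for i in range(250)]

def fake_fetch(last, n):
    start = 0 if last is None else last + 1
    return data[start:start + n]

print(extract_in_batches(fake_fetch, batch_size=100))  # 250
```

Each page could also be written straight to cloud storage, which matches the extract-then-reload approach suggested in the reply.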
joseph_in_sf
by New Contributor III
  • 42 Views
  • 0 replies
  • 0 kudos

Turn PRDs into Databricks PySpark pipelines with Agentic AI using Docker, Ollama, and QWEN3:4b

I’ve seen many examples of AI that can help you code individual routines and such, mostly junior-level coding help. The goal of this POC is to give the AI a general PRD containing coding examples and have it generate a Databricks pipeline from such.R...

Mavvy
by New Contributor
  • 19 Views
  • 0 replies
  • 0 kudos

Request for more Vocareum lab time

I didn't realize I had drained my lab hours by not closing my screen or ending my labs when I paused for the day. Is there a way to get more lab time on Vocareum?

jfvizoso
by New Contributor II
  • 12908 Views
  • 6 replies
  • 0 kudos

Can I pass parameters to a Delta Live Table pipeline at running time?

I need to execute a DLT pipeline from a Job, and I would like to know if there is any way of passing a parameter. I know you can have settings in the pipeline that you use in the DLT notebook, but it seems you can only assign values to them when crea...

Latest Reply
Sudharsan
New Contributor II
  • 0 kudos

@DeepakAI: May I know how you resolved it?

5 More Replies
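DLT pipelines don't accept per-run parameters directly; a common workaround is to store key/value pairs in the pipeline's configuration map, update that map via the Pipelines API before triggering, and read the values in the notebook with spark.conf.get. A sketch of the pattern with a plain dict standing in for the Spark conf (the keys and paths are hypothetical):

```python
# Pipeline settings carry a free-form `configuration` map of strings:
pipeline_settings = {
    "configuration": {
        "source_path": "/mnt/raw/2026-01-09",
        "run_mode": "append",
    },
}

def get_param(conf, key, default=None):
    """Inside the DLT notebook this would be spark.conf.get(key, default);
    here a plain dict stands in for the Spark conf."""
    return conf.get(key, default)

# A job step can push updated settings to the Pipelines API before
# triggering the run, which is how per-run "parameters" are usually emulated.
print(get_param(pipeline_settings["configuration"], "run_mode"))  # append
```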
Phani1
by Databricks MVP
  • 3135 Views
  • 8 replies
  • 0 kudos

Triggering DLT Pipelines with Dynamic Parameters

Hi Team, we have a scenario where we need to pass a dynamic parameter to a Spark job that will trigger a DLT pipeline in append mode. Can you please suggest an approach for this? Regards, Phani

Latest Reply
Sudharsan
New Contributor II
  • 0 kudos

@koji_kawamura: I have more or less the same scenario; say I have 3 tables. The sources and targets are different, but I would like to use a generic pipeline, pass in the source and target as parameters, and run them in parallel. @sas30: can you be m...

7 More Replies
Saad_Recruiter
by Visitor
  • 18 Views
  • 0 replies
  • 0 kudos

Databricks Developer Atlanta GA

Hello, this is Saad from DS Technologies. I’m a Tech Recruiter connecting with professionals for current and upcoming technology opportunities. Job Title: Databricks Developer. Location: Atlanta, GA. Duration: 18 to 24 months contract. Job Description: L...


Latest from our Blog

[PARTNER BLOG] Zerobus Ingest on Databricks

Introduction TL;DR ZeroBus Ingest is a serverless, Kafka-free ingestion service in Databricks that allows applications and IoT devices to stream data directly into Delta Lake with low latency and mini...

  • 814 Views
  • 3 kudos