PSA: Community Edition retires on January 1, 2026. Move to the Free Edition today to keep your work.

Databricks Free Edition is the new home for personal learning and exploration on Databricks. It’s perpetually free and built on modern Databricks - the same Data Intelligence Platform used by professionals. Free Edition lets you learn professional da...

  • 1865 Views
  • 1 reply
  • 4 kudos
3 weeks ago
🎤 Call for Presentations: Data + AI Summit 2026 is Open!

June 15–18, 2026. Are you building the future with data and AI? Then this is your moment. The Call for Proposals for Data + AI Summit 2026 is officially open, and we want to hear from builders, practitioners, and innovators across the data and AI com...

  • 2261 Views
  • 4 replies
  • 6 kudos
3 weeks ago
Level Up with Databricks Specialist Sessions

How to Register & Prepare: If you're interested in advancing your skills with Databricks through a Specialist Session, here's a clear guide on how to register and what free courses you can take to prepare effectively. How to Begin Your Learning Path S...

  • 3402 Views
  • 2 replies
  • 9 kudos
10-02-2025
Celebrating Our First Brickster Champion: Louis Frolio

Our Champion program has always celebrated the customers who go above and beyond to engage, help others, and uplift the Community. Recently, we have seen remarkable participation from Bricksters as well—and their impact deserves recognition too. Begi...

  • 1542 Views
  • 7 replies
  • 14 kudos
11-21-2025
🌟 Community Pulse: Your Weekly Roundup! December 12 – 21, 2025

Learning doesn’t pause, and neither does the impact this Community continues to create! Across threads and time zones, the knowledge kept moving. Catch up on the highlights. Voices Shaping the Week: Featuring the voices that brought clarity, ...

  • 519 Views
  • 2 replies
  • 1 kudos
a week ago

Community Activity

Cert-Team
by Databricks Employee
  • 102592 Views
  • 217 replies
  • 17 kudos

Suspended Exam? Don't worry! Our support team can help!

Taking exams is so stressful already, and if your exam gets suspended it is extra stressful. Don't worry! Our support team is awesome and will help you get back on track. The most important step is to file a ticket in our Help Center. Be sure to inc...

Latest Reply
bhupesh952
New Contributor II
  • 17 kudos

My exam was suspended during the system check, before I was able to join the test. While verifying my camera and microphone, the exam was auto-suspended without any warning or explanation. The check-in process took approximately 10 minutes, and shortl...

216 More Replies
Gaurav_784295
by New Contributor III
  • 3476 Views
  • 4 replies
  • 1 kudos

pyspark.sql.utils.AnalysisException: Non-time-based windows are not supported on streaming DataFrames/Datasets

pyspark.sql.utils.AnalysisException: Non-time-based windows are not supported on streaming DataFrames/Datasets. I am getting this error while writing; can anyone please tell me how to resolve it?

Latest Reply
siva-anantha
Contributor
  • 1 kudos

I share the same perspective as @preetmdata on this

3 More Replies
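
For readers hitting the same error, here is a minimal sketch (assuming a streaming DataFrame stream_df with event_time and user_id columns, and a hypothetical target table) of the two usual workarounds: aggregate over a time-based window, or apply the row-based Window inside foreachBatch, where each micro-batch is a regular static DataFrame.

from pyspark.sql import functions as F
from pyspark.sql.window import Window

# Option 1: time-based window aggregation, which streaming supports natively.
agg = (
    stream_df
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), "user_id")
    .count()
)

# Option 2: apply the row-based Window per micro-batch via foreachBatch,
# where batch_df is a regular (non-streaming) DataFrame.
def process_batch(batch_df, batch_id):
    w = Window.partitionBy("user_id").orderBy("event_time")
    ranked = batch_df.withColumn("rn", F.row_number().over(w))
    ranked.write.mode("append").saveAsTable("my_catalog.my_schema.ranked_events")  # hypothetical table

query = (
    stream_df.writeStream
    .foreachBatch(process_batch)
    .option("checkpointLocation", "/Volumes/my_catalog/my_schema/checkpoints/ranked")  # hypothetical path
    .start()
)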
nkrish
by Visitor
  • 27 Views
  • 4 replies
  • 0 kudos

Calling exe from notebook

How can I call an exe (C#-based) from a Databricks notebook? #csharp #exe

Latest Reply
mukul1409
New Contributor
  • 0 kudos

Hi @nkrish, Databricks notebooks cannot run a Windows-based C# executable. Databricks compute runs on Linux and does not support executing native Windows .exe files. Because of this, a C# exe cannot be called directly from a Databricks notebo...

3 More Replies
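
One possible workaround, sketched here under the assumption that the C# project can be republished as a self-contained Linux binary (for example with `dotnet publish -r linux-x64 --self-contained`) and uploaded to a Unity Catalog volume (the path and arguments below are hypothetical): the Linux build can then be invoked from Python with subprocess.

import os
import subprocess

exe_path = "/Volumes/my_catalog/my_schema/tools/myapp"  # hypothetical Linux build of the C# app
os.chmod(exe_path, 0o755)                               # ensure the binary is executable

result = subprocess.run(
    [exe_path, "--input", "some-arg"],                  # hypothetical arguments
    capture_output=True, text=True, check=True,
)
print(result.stdout)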
SrikanthData_07
by Visitor
  • 4 Views
  • 0 replies
  • 0 kudos

Data Engineer Associate Certification in 2026

Hi Community, I’m planning to take the Data Engineer Associate Certification in 2026 to advance my skills and career. Are there any ongoing promotions or opportunities to get a 50% discount voucher for the exam? It would be incredibly helpful if you c...

kasinathanrames
by New Contributor II
  • 13364 Views
  • 23 replies
  • 21 kudos

Free Certification voucher

Looking for a free certification voucher.

Latest Reply
DavidClamp
New Contributor II
  • 21 kudos

Yes please, I'd like a free certification voucher.

22 More Replies
siva_pusarla
by New Contributor II
  • 176 Views
  • 4 replies
  • 0 kudos

workspace notebook path not recognized by dbutils.notebook.run() when running from a workflow/job

result = dbutils.notebook.run("/Workspace/YourFolder/NotebookA", timeout_seconds=600, arguments={"param1": "value1"})
print(result)
I was able to execute the above code manually from a notebook, but when I run the same notebook as a job, it fails stat...

Latest Reply
siva-anantha
Contributor
  • 0 kudos

@siva_pusarla: We use the following pattern and it works:
1) Calling notebook - constant location used by the Job.
   + src/framework
     + notebook_executor.py
2) Callee notebooks - dynamic
   + src/app/notebooks
   ...

3 More Replies
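
As a sketch of the pattern discussed above (assuming the caller and callee notebooks sit in the same Workspace folder and the callee is named NotebookA), note that the API is dbutils.notebook.run (singular), and that a relative path, or a path derived from the caller's own location at runtime, tends to be more robust than a hard-coded /Workspace path when the notebook runs as a job.

# Option 1: relative path, resolved against the calling notebook's folder.
result = dbutils.notebook.run("./NotebookA", timeout_seconds=600,
                              arguments={"param1": "value1"})

# Option 2: build the path from the caller's own location at runtime.
current_path = (dbutils.notebook.entry_point.getDbutils().notebook()
                .getContext().notebookPath().get())
base_dir = "/".join(current_path.split("/")[:-1])
result = dbutils.notebook.run(f"{base_dir}/NotebookA", timeout_seconds=600,
                              arguments={"param1": "value1"})
print(result)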
dj4
by New Contributor II
  • 218 Views
  • 4 replies
  • 2 kudos

Azure Databricks UI consuming way too much memory & laggy

This especially happens when the notebook is large, with many cells. Even if I clear all the outputs, scrolling the notebook is way too laggy. When I start running the code, the memory consumption is 3-4 GB minimum even if I am not displaying any data/ta...

Latest Reply
siva-anantha
Contributor
  • 2 kudos

@dj4: Are you in a corporate proxy environment? The Databricks browser UI uses WebSockets, and sometimes performance issues happen due to security checks on the traffic.

3 More Replies
jeremy98
by Honored Contributor
  • 2985 Views
  • 12 replies
  • 0 kudos

restarting the cluster always running doesn't free the memory?

Hello community, I was working on optimising the driver memory, since there is code that is not optimised for Spark, and I was planning temporarily to restart the cluster to free up the memory. That could be a potential solution, since if the cluster i...

Latest Reply
siva-anantha
Contributor
  • 0 kudos

@jeremy98: Please review the cluster's event logs to understand the trend of the GC-related issues (example in the snapshot below). Typically, production jobs are executed using job clusters, and they stop as soon as the work is completed. Could you pleas...

11 More Replies
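
As a side note to the advice above, and only as an illustrative sketch (not a substitute for reviewing GC trends in the event log or moving work to job clusters): cached data and large driver-side objects can sometimes be released without restarting an all-purpose cluster. All names below are hypothetical.

import gc

# Drop all cached/persisted DataFrames and tables held by Spark.
spark.catalog.clearCache()

# If a specific DataFrame was persisted earlier (hypothetical name):
# big_df.unpersist(blocking=True)

# Large driver-side objects (e.g. results of toPandas()) can be deleted
# and garbage-collected; the variable name here is hypothetical:
# del big_pandas_df
gc.collect()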
Aravind17
by New Contributor III
  • 36 Views
  • 3 replies
  • 0 kudos

Not received free voucher after completing Data Engineer Associate learning path

I have completed the Data Engineer Associate learning path, but I haven’t received the free certification voucher yet. I’ve already sent multiple emails to the concerned support team regarding this issue, but unfortunately, I haven’t received any resp...

Latest Reply
Aravind17
New Contributor III
  • 0 kudos

I have sent the mail to the Databricks team and they replied.

2 More Replies
Jim_Anderson
by Databricks Employee
  • 47 Views
  • 0 replies
  • 1 kudos

FAQ for Databricks Learning Festival (Virtual): 09 January - 30 January 2026

General Q: How can I check whether I have completed the required modules? Log in to your Databricks Customer Academy account and select ‘My Activities’ within the user menu in the top left. Under the Courses tab, you can verify whether all courses wit...

chalabit
by Visitor
  • 13 Views
  • 0 replies
  • 0 kudos

Import .py files module does not work on VNET injected workspace

We have a problem importing any Python files as modules on a VNET-injected workspace. For the same folder structure (see below), the imports work on serverless clusters or in a Databricks-managed workspace (i.e. create a new Azure Databricks workspace without ...

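
While this one waits for a reply, a common workaround sketch (folder and module names below are hypothetical) is to add the folder containing the .py files to sys.path explicitly, since automatic path resolution can differ between serverless and classic compute.

import sys

module_dir = "/Workspace/Repos/my_user/my_project/src"  # hypothetical location of the .py files
if module_dir not in sys.path:
    sys.path.append(module_dir)

import my_module  # hypothetical module, i.e. src/my_module.py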
Advika
by Community Manager
  • 519 Views
  • 2 replies
  • 2 kudos

🌟 Community Pulse: Your Weekly Roundup! December 12 – 21, 2025

Learning doesn’t pause, and neither does the impact this Community continues to create! Across threads and time zones, the knowledge kept moving. Catch up on the highlights. Voices Shaping the Week: Featuring the voices that brought clarity, ...

Latest Reply
Raman_Unifeye
Contributor III
  • 2 kudos

Thanks for the mention @Advika 

1 More Replies
Brahmareddy
by Esteemed Contributor
  • 144 Views
  • 1 reply
  • 1 kudos

Happy New Year 2026 : Building, Learning, and Growing Together in the Year of Data + AI

Happy New Year to the Austin Databricks Community! A new year always feels like a fresh notebook. Clean pages, big ideas, and the excitement of building something better than before. As we step into this year, one thing is clear: Data + AI is no longe...

Latest Reply
Advika
Community Manager
  • 1 kudos

Happy New Year, @Brahmareddy! You truly boosted the day with this post. Excited to bring plans to life, learn together, and keep strengthening the Databricks Community.

Aravind17
by New Contributor III
  • 19 Views
  • 1 reply
  • 0 kudos

Not received free voucher after completing Data Engineer Associate learning path

I have completed the Data Engineer Associate learning path, but I haven’t received the free certification voucher yet. I’ve already sent multiple emails to the concerned support team regarding this issue, but unfortunately, I haven’t received any resp...

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @Aravind17! This post appears to duplicate the one you recently posted. A response has already been provided to your recent post. I recommend continuing the discussion in that thread to keep the conversation focused and organised.

yzhang
by Contributor
  • 2242 Views
  • 6 replies
  • 2 kudos

iceberg with partitionedBy option

I am able to create a UnityCatalog iceberg format table: df.writeTo(full_table_name).using("iceberg").create(). However, if I add the option partitionedBy, I get an error: df.writeTo(full_table_name).using("iceberg").partitionedBy("ingest_dat...

Latest Reply
LazyGenius
New Contributor II
  • 2 kudos

I found weird behavior here while creating a table using SQL. If you create a new table and add the partition column at the end of the column mapping, it won't work, but if you add it at the beginning, it will! For example, the query below will wo...

5 More Replies
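
For reference, a minimal sketch based on the thread's example (the table name is hypothetical; the column name follows the post): DataFrameWriterV2.partitionedBy expects Column expressions or partition transforms rather than plain strings, which may be what triggers the error.

from pyspark.sql import functions as F

full_table_name = "my_catalog.my_schema.my_iceberg_table"  # hypothetical name

(df.writeTo(full_table_name)
   .using("iceberg")
   .partitionedBy(F.col("ingest_date"))  # pass a Column, not the string "ingest_date"
   .create())

# Or partition by a transform of a timestamp column, e.g. daily partitions:
# df.writeTo(full_table_name).using("iceberg").partitionedBy(F.days("event_ts")).create()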
Welcome to the Databricks Community!

Once you are logged in, you will be ready to post content, ask questions, participate in discussions, earn badges and more.

Spend a few minutes exploring Get Started Resources, Learning Paths, Certifications, and Platform Discussions.

Connect with peers through User Groups and stay updated by subscribing to Events. We are excited to see you engage!

Join Us as a Local Community Builder!

Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!

Sign Up Now

Latest from our Blog

[PARTNER BLOG] Zerobus Ingest on Databricks

Introduction / TL;DR: ZeroBus Ingest is a serverless, Kafka-free ingestion service in Databricks that allows applications and IoT devices to stream data directly into Delta Lake with low latency and mini...

  • 421 Views
  • 1 kudos