Certifications
Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...
Explore discussions on Databricks training programs and offerings within the Community. Get insights...
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and ...
Engage in discussions about the Databricks Free Edition within the Databricks Community. Share insig...
Hello - I am following some online code to create a function as follows:

CREATE OR REPLACE FUNCTION my_catalog.my_schema.insert_data_function(col1_value STRING, col2_value INT)
RETURNS BOOLEAN
COMMENT 'Inserts dat...
In UC, functions must be read-only; they cannot modify state (no INSERT, DELETE, MERGE, CREATE, VACUUM, etc.). So I created a PROCEDURE and called it instead, and I was able to insert data into the table successfully. Unity Catalog tools are really jus...
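For reference, a minimal sketch of that workaround, run from a notebook via spark.sql(); the catalog, schema, table, and procedure names are made up for illustration, and the exact CREATE PROCEDURE syntax is worth confirming against the current Databricks SQL docs:

```python
# Sketch of the procedure-based workaround: since UC SQL functions are read-only,
# wrap the INSERT in a SQL procedure and CALL it. All object names are hypothetical.
spark.sql("""
CREATE OR REPLACE PROCEDURE my_catalog.my_schema.insert_data_procedure(
  col1_value STRING,
  col2_value INT
)
LANGUAGE SQL
AS BEGIN
  INSERT INTO my_catalog.my_schema.my_table (col1, col2)
  VALUES (col1_value, col2_value);
END
""")

# Unlike a UC function, calling the procedure is allowed to modify table state.
spark.sql("CALL my_catalog.my_schema.insert_data_procedure('some value', 42)")
```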
I have completed the Databricks Fundamentals Accreditation, but I haven't received the certification badge. The certificate is attached below, and my email address is deelakawalagama@outlook.com. Also, I would like to add this badge to my LinkedIn pro...
Hi, I’m preparing for the Databricks Certified Data Engineer Associate exam. Could you please let me know how I can get a voucher or discount code for the exam? Are there any ongoing events, trainings, or promotions that provide vouchers? Thanks in adva...
Real-time mode is a breakthrough that lets Spark utilize all available CPUs to process records with single-millisecond latency, while decoupling checkpointing from per-record processing.
For many data engineers who love PySpark, the most significant improvement of 2025 was the addition of merge to the DataFrame API, so the Delta library or SQL is no longer needed to perform a MERGE. P.S. I still prefer SQL MERGE inside spark.sql().
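A quick sketch of what both options look like, with made-up catalog, schema, and table names; how the join-condition columns need to be qualified in mergeInto is worth double-checking against the PySpark 4.x docs:

```python
from pyspark.sql.functions import expr

# Hypothetical source of updates; all table names below are placeholders.
updates = spark.table("my_catalog.my_schema.updates")

# DataFrame-native merge (PySpark mergeInto API), no Delta library import needed.
(
    updates.mergeInto(
        "my_catalog.my_schema.target",
        expr("updates.id = target.id"),  # join condition; qualification rules per the docs
    )
    .whenMatched().updateAll()
    .whenNotMatched().insertAll()
    .merge()
)

# Equivalent SQL MERGE inside spark.sql(), as mentioned above.
spark.sql("""
MERGE INTO my_catalog.my_schema.target AS t
USING my_catalog.my_schema.updates AS s
  ON t.id = s.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *
""")
```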
Hi, all. I'm getting stuck on the second step of the "Get Started with Databricks Free Edition" course. It's obviously instructing me to download a zip file from a GitHub repository, but there is no link provided for the repository. What am I missing...
I am also stuck at the same step. There is no link to the dataset mentioned, or any way to download it.
I want to take the Databricks Certified Data Engineer Associate certification. Is there any discount, and what will it cost in India?
Hello @Ritesh-Dhumne! Databricks offers 50% certification discount vouchers during its Learning Festival events. The upcoming Learning Festival is scheduled for January 9 – 30, 2026, where you can earn a discount voucher. You can find more details here: Self-P...
I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Dat...
We ended up using the tool from datayoga.io, which converts these in a multi-stage approach: it first converts to an intermediate representation, then from there it gets optimized (a lot of the Informatica actions can be optimized out or compacted) and fin...
The new Lakebase experience is a game-changer for transactional databases. That functionality is fantastic. Autoscaling to zero makes it really cost-effective. Do you need to deploy to prod? Just branch the production database to the release branch, an...
Hi, I am planning to take the Databricks Certified Data Engineer Professional exam. The fee is highly unaffordable for me. I am looking for discount vouchers. Please let me know if there is a way to get a discount on the exam. Thank you! Regards, Krishna
Another learning festival is coming with 50% discount vouchers!
Hello, we are a team of 5 (DEs/Architects) exploring the idea of starting a small consulting company focused on Databricks as an SI partner, and we wanted to learn from others who have gone through the partnership journey. I would love to understand how t...
If I’m being completely honest, I haven’t seen any. As you can imagine, partner organizations tend to keep things pretty close to the vest for a variety of reasons. That said, once a new partner is officially enrolled, they are granted access to an e...
Ingestion from SharePoint is now available directly in PySpark. Just define a connection and use spark.read or, even better, spark.readStream with Auto Loader, specifying the file type and the options for that file (PDF, CSV, Excel, etc.).
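The streaming half of that looks roughly like a standard Auto Loader read, as in the sketch below; the volume path, checkpoint path, and table name are placeholders, and the step that attaches the SharePoint connection to the read is omitted here because its exact option names should be taken from the connector docs:

```python
# Generic Auto Loader streaming read over files landed from SharePoint.
# All paths and table names are placeholders for illustration.
files = (
    spark.readStream
    .format("cloudFiles")                  # Auto Loader
    .option("cloudFiles.format", "csv")    # file type: csv, pdf, excel, ...
    .option("header", "true")
    .load("/Volumes/my_catalog/my_schema/sharepoint_landing/")
)

(
    files.writeStream
    .option("checkpointLocation", "/Volumes/my_catalog/my_schema/checkpoints/sharepoint_bronze")
    .toTable("my_catalog.my_schema.sharepoint_bronze")
)
```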
Excel: The big news this week is the ability to natively import Excel files. Write operations are also possible, and you can choose a data range. It also works with the streaming Auto Loader, currently in beta. GPT 5.2: The same day...
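A hedged sketch of what a native Excel read might look like; the format name "excel", the range option, and all paths and table names here are assumptions for illustration, not verified option names, so check the release notes before relying on them:

```python
# Assumed shape of the new native Excel reader; the format name and range option
# are guesses, and the volume path / table name are placeholders.
report = (
    spark.read
    .format("excel")                             # assumed format name
    .option("dataAddress", "'Sheet1'!A1:D100")   # assumed cell-range option
    .option("header", "true")
    .load("/Volumes/my_catalog/my_schema/files/report.xlsx")
)

report.write.mode("overwrite").saveAsTable("my_catalog.my_schema.report_bronze")
```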
Hi guys, has anyone sat the Databricks Certified Data Analyst Associate exam so far? I’m planning to take this exam next month and would really appreciate it if you could share your experience. How was the difficulty level overall? Were the questions m...
Hi @Sadie_james, I cleared the Data Analyst Associate exam a month ago. I would suggest that the only guide you need is already provided by Databricks: https://www.databricks.com/learn/certification/data-analyst-associate
Do take the related Data Analysis with Databri...
ZeroBus changes the game: you can now push event data directly into Databricks, even from on-prem. No extra event layer needed. Every Unity Catalog table can act as an endpoint.