Certifications
Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...
I received a certification voucher that must be used before May 2nd, 2026. I am scheduled to take the exam on-site in my country, but nationwide internet degradation and slow speeds are expected to last for several weeks, as announced in the news. My ...
Hello @rababid, please file a ticket with our support team and allow them 24-48 hours for a resolution. Please note that we cannot provide support via the community. Thanks & Regards, @cert-ops
We are implementing an incremental load for semi-structured data (complex nested JSON files) using Auto Loader. To handle schema drift, such as new fields, changes in column order, or data type and precision modifications (e.g., Decimal and Integer), ...
Could you please help us with this? We are currently blocked in development. When new fields are added to the source JSON, the nested field names are captured in the _rescued_data (rescue) column. However, we do not get any indication of the type of ...
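Not an official Auto Loader feature, but one pragmatic way to get a type hint for rescued fields is to parse the `_rescued_data` JSON string yourself and inspect the JSON types of the values it holds; a minimal sketch in plain Python (the field names and the sample record below are hypothetical):

```python
import json

def infer_rescued_types(rescued_json):
    """Parse an Auto Loader _rescued_data value and report the JSON type of each rescued field."""
    if rescued_json is None:
        return {}
    rescued = json.loads(rescued_json)
    # Auto Loader also stores the source file path in _rescued_data under _file_path;
    # skip it so only the genuinely new/mismatched fields remain.
    return {field: type(value).__name__
            for field, value in rescued.items()
            if field != "_file_path"}

# Hypothetical rescued value: a new numeric field plus a new nested object
sample = '{"new_price": 12.5, "new_meta": {"source": "api"}, "_file_path": "s3://bucket/x.json"}'
print(infer_rescued_types(sample))  # {'new_price': 'float', 'new_meta': 'dict'}
```

Applied per row (e.g., via a UDF or after collecting distinct `_rescued_data` values), this gives at least the JSON-level type (number, string, object, array) of each drifted field, which can drive a decision on how to evolve the target schema.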
Based on the documentation, the Agents option should appear in the left sidebar under the AI/ML section. I don't see it. Is it because of the region? I'm in us-east-2. What should I do to see it? Thank you
@jose_moreira If you're using the Free Edition, there are limitations that exist; give this a read: https://docs.databricks.com/aws/en/getting-started/free-edition-limitations?spm=a2ty_o01.29997173.0.0.5bd055fbzevP3M#unsupported-features
We are extracting data from SQL Server/Oracle using ADF and storing it in Parquet format. When reading the files in Databricks using spark.read.parquet, decimal values are getting truncated—for example, 1245.1111111189979 becomes 1245.111111118. This...
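The reported value is consistent with the Parquet column having been written with a decimal scale of 9, in which case the extra digits are lost at write time, before Databricks ever reads the file. A quick illustration with Python's `decimal` module (the scale of 9 is an assumption inferred from the truncated value):

```python
from decimal import Decimal, ROUND_DOWN

source = Decimal("1245.1111111189979")   # value as stored in SQL Server/Oracle

# If ADF wrote the Parquet column as decimal(38, 9), only 9 fractional
# digits survive -- quantizing reproduces exactly the reported result:
truncated = source.quantize(Decimal("1.000000000"), rounding=ROUND_DOWN)
print(truncated)  # 1245.111111118
```

If that matches what is in the files, the fix is upstream: widen the precision/scale in the ADF copy-activity type mapping (or cast the source column to a wider decimal before export) so the full fractional part reaches Parquet; inspecting the Parquet schema of one file will confirm the declared scale.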
Thanks for your response.
Calling all Chennai residents! Join the Chennai User Group on Community! Are you passionate about our vibrant city of Chennai? Do you love connecting with like-minded individuals, expanding your knowledge, and contributing to a thriving Community...
Thank you. Will that be happening in the future, or is any WhatsApp group available now?
Hi @Cert-Team and @Cert-TeamOPS, I am writing to express my concern regarding the issues I faced during my Databricks Associate certification exam scheduled on 8th April 2026. Unfortunately, my exam was suspended due to eye-blinking detection, which ...
@Abarna_13 Please raise a ticket at https://help.databricks.com/s/contact-us. Do not share personal details on the community; it is not recommended.
I am on a Premium AWS trial workspace (dbc-30503d28-2210). I have two issues: (1) personal access tokens are grayed out and I cannot generate them; (2) my cluster cannot make outbound HTTP requests to external APIs (getting NameResolutionError when calling api...
Hi @Cert-Team, @Sujitha, I was attempting my Databricks Certified Data Engineer Professional exam on 12th April 2026 at 5:45 PM IST. The exam was going smoothly, and I had only 17 questions remaining when it was abruptly suspended. There was no warning...
Hello @Chhibber43724, thank you for filing a ticket with our support team; the support team will respond shortly. Please note that we cannot provide support or handle exam suspensions via the community. Thanks & Regards, @cert-ops
Hi Team, I am currently preparing for the Data Engineer Professional Certification exam planned for May 2026, and I had a question regarding the recent updates to the course content. Earlier, the course Data Engineer Learning Plan included detailed mod...
Hi Sumit, thank you for your response, and apologies for my long mail. I noticed that the "Advanced techniques with Spark Declarative Pipeline" content has been significantly revised and now includes newer concepts such as flows and Iceberg reads, whic...
Hello, I have created a connection to a SQL Server, and I have created a foreign catalog using this connection. When I show the catalog in Catalog Explorer, I can see the schemas, and in one schema I can also see the tables and views. In another schema,...
@benno, did you get this solved? The documentation makes no clear mention of support for views.
Hey Databricks Community members, if you're on the Databricks Community asking questions about slow Spark jobs, Delta Lake issues, broken pipelines, or just trying to learn Databricks in a practical way, we created bricksnotes.com for y...
Hi @Cert-Team @Cert-TeamOPS, thank you for reaching out regarding my suspended exam (Ticket #00891706). I have already replied to the support team with my preferred rescheduling dates and times: 15th April 2026, 8:00 PM IST; 16th April 2026, 8:00 PM IST. H...
@Chhibber43724 Patience is the key. Do not panic; the issue will be resolved. Posting here again and again won't make much of a difference. Teams are working continuously to fix everyone's issues.
After completing all the relevant courses for the certification, I haven’t received the coupon code yet.
@YaminiPazhani Please raise a ticket along with screenshots of the completed modules at https://help.databricks.com/s/contact-us
I am looking for some help on getting Databricks cluster metrics such as memory utilization, CPU utilization, memory swap utilization, and free filesystem space using the REST API. I am trying it in Postman using a Databricks token and with my Service Principal bear...
@Walter_C Could you please share if there is any latest update on this ?
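For what it's worth, the Clusters API endpoint `GET /api/2.0/clusters/get` does exist, but it returns cluster configuration and state rather than hardware utilization; to my knowledge, utilization metrics are surfaced in the compute UI and in system tables rather than through a dedicated REST endpoint. A minimal sketch of building the authenticated call (the host, token, and cluster ID below are hypothetical; a personal access token and a service principal's OAuth token both go in the same `Authorization: Bearer` header):

```python
API_VERSION = "2.0"  # Clusters API version; check your workspace's docs

def cluster_get_request(host, token, cluster_id):
    """Build the URL, headers, and query params for a Clusters API 'get' call."""
    url = f"https://{host}/api/{API_VERSION}/clusters/get"
    headers = {"Authorization": f"Bearer {token}"}
    params = {"cluster_id": cluster_id}
    return url, headers, params

# Hypothetical workspace host, token, and cluster ID:
url, headers, params = cluster_get_request(
    "adb-1234567890123456.7.azuredatabricks.net", "dapiXXXX", "0401-123456-abcd123")
# With the requests library installed, the call would be:
#   resp = requests.get(url, headers=headers, params=params, timeout=30)
#   resp.raise_for_status()
print(url)
```

If the goal really is memory/CPU/swap over time, querying the compute system tables from a notebook or SQL warehouse is likely the more reliable route than the REST API.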
Hi Team, could you please help with ticket #00891919? Thanks!
Hello @Kunal55, thank you for filing a ticket with our support team; the support team will respond shortly. Please note that we cannot provide support or handle exam suspensions via the community. Thanks & Regards, @cert-ops