Data Engineering

Forum Posts

Smitha1
by Valued Contributor II
  • 3903 Views
  • 10 replies
  • 9 kudos

Resolved! Request for reattempt voucher. Databricks Certified Associate Developer for Apache Spark 3.0 exam

Hi, I took the Databricks Certified Associate Developer for Apache Spark 3.0 exam today but missed by one percent. I got 68.33% and the pass mark is 70%. I am planning to reattempt the exam; could you kindly give me another opportunity and provide a reattempt voucher...

Latest Reply
shriya
New Contributor II
  • 9 kudos

Hi, I took the Databricks Certified Associate Developer for Apache Spark 3.0 Python exam yesterday but missed by three percent. I got 66.66% and the pass mark is 70%. I am planning to reattempt the exam; could you kindly give me another opportunity and provide a reat...

9 More Replies
LiliL
by New Contributor
  • 566 Views
  • 1 reply
  • 1 kudos
Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Lili Levin, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.

Aviral-Bhardwaj
by Esteemed Contributor III
  • 4868 Views
  • 2 replies
  • 2 kudos

Resolved! can anyone help with Spill Question

Spill occurs as a result of executing various wide transformations. However, diagnosing a spill requires one to proactively look for key indicators. Where in the Spark UI are two of the primary indicators that a partition is spilling to disk? a- Exec...

Latest Reply
pvignesh92
Honored Contributor
  • 2 kudos

@Aviral Bhardwaj I feel it is option e: stage and executor log files. Consolidated details at the stage level; details at the task and executor level. Please let me know if you feel any other option is better.

1 More Replies
Smitha1
by Valued Contributor II
  • 908 Views
  • 3 replies
  • 1 kudos

December exam voucher for Databricks Certified Associate Developer for Apache Spark 3.0 exam

Dear @Jose Gonzalez, hope you're having a great day. This is of HIGH priority for me; I have to schedule the exam in December before the slots are full. I took the Databricks Certified Associate Developer for Apache Spark 3.0 exam on 30th Nov but missed by one perc...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Smitha Nelapati, thank you for reaching out! Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training and our team will get back to you shortly.

2 More Replies
venkat-bodempud
by New Contributor III
  • 1148 Views
  • 3 replies
  • 0 kudos

Databricks-PowerBI-Architecture-Help

Hello Community, we are currently designing Power BI reports; the data source is Databricks. We have all our reporting data in the bronze/silver layers of Databricks. We want to create summarized/aggregated tables in the gold layer and we want to conne...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @bodempudi venkat, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us...

2 More Replies
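For context on the thread above: a gold-layer aggregate for Power BI is typically materialized as a summary table over silver data. A minimal sketch, assuming hypothetical schema and table names (silver.sales, gold.daily_sales_summary) and columns that are purely illustrative:

```sql
-- Minimal sketch (hypothetical names): materialize an aggregated
-- gold table that Power BI can then read directly via the
-- Databricks connector.
CREATE OR REPLACE TABLE gold.daily_sales_summary AS
SELECT
  order_date,
  region,
  SUM(amount) AS total_amount,
  COUNT(*)    AS order_count
FROM silver.sales
GROUP BY order_date, region;
```

Pre-aggregating in the gold layer keeps the Power BI queries small instead of scanning the full silver tables on every report refresh.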
Simmy
by New Contributor II
  • 1376 Views
  • 3 replies
  • 1 kudos

Databricks Repos are pushing all changes to GitHub

When I make changes within a repo and go to commit and push to GitHub, any changes that I uncheck because I don't want them pushed still get pushed to GitHub. Any help would be appreciated.

Latest Reply
Simmy
New Contributor II
  • 1 kudos

Hi @Vidula Khanna, the problem is now resolved, thanks. I didn't have to do anything different; the functionality just started working as expected.

2 More Replies
self-employed
by Contributor
  • 1604 Views
  • 3 replies
  • 6 kudos

Resolved! Can anyone help me to understand one question in PracticeExam-DataEngineerAssociate?

It is from the practice exam for Data Engineer Associate. The question is: A data engineering team has created a series of tables using Parquet data stored in an external system. The team is noticing that after appending new rows to the data in the external ...

Latest Reply
suny
New Contributor II
  • 6 kudos

Not an answer, just asking the Databricks folks to clarify: I would also like to understand this. If there is no event emitted from the external Parquet table (push), and no active pulling or refreshing from the Delta table side (pull), how is the un...

2 More Replies
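Not stated explicitly in the thread, but the behavior being asked about usually comes down to Spark's cached metadata and file listing for tables backed by external Parquet files: rows appended outside Spark may not appear until that cache is invalidated with REFRESH TABLE (a standard Spark SQL command). The table name below is hypothetical:

```sql
-- Invalidate Spark's cached metadata/file listing so newly appended
-- Parquet files become visible to subsequent queries
-- (table name is illustrative).
REFRESH TABLE my_schema.parquet_backed_table;
```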
Smitha1
by Valued Contributor II
  • 1038 Views
  • 3 replies
  • 2 kudos

December exam free voucher for Databricks Certified Associate Developer for Apache Spark 3.0 exam.

Dear @Vidula Khanna, hope you're having a great day. This is of HIGH priority for me; I have to schedule the exam in December before the slots are full. I took the Databricks Certified Associate Developer for Apache Spark 3.0 exam on 30th Nov but missed by one perc...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 2 kudos

Hey @Smitha Nelapati, you can attend the webinars below and get 75% off in Jan.

2 More Replies
Smitha1
by Valued Contributor II
  • 3932 Views
  • 10 replies
  • 6 kudos

Resolved! onsite exam center registration Databricks Certified Associate Developer for Apache Spark 3

Dear all, @Nadia Elsayed @Vidula Khanna @Harshjot Singh @Jose Gonzalez @Joseph Kambourakis, hope you are well and had a good weekend. I am still waiting to receive the voucher after redeeming points, which is due this week. My issue is that slots are full to ...

Latest Reply
nphau
Valued Contributor
  • 6 kudos

I have the same problem as you. I submitted a ticket to Databricks ("Help to re-schedule assessment day in Webassessor"), but they responded as below: "Please accept my apologies for the inconvenience caused and the delay in responding. I'm sorry to i...

9 More Replies
THIAM_HUATTAN
by Valued Contributor
  • 2660 Views
  • 6 replies
  • 5 kudos

Error in Databricks code?

https://www.databricks.com/notebooks/recitibikenycdraft/data-preparation.html
Could someone help look at Step 3: Prepare Calendar Info in that notebook?
# derive complete list of dates between first and last dates
dates = ( spark.range(0, days_between).withCol...

Latest Reply
UmaMahesh1
Honored Contributor III
  • 5 kudos

Hi @THIAM HUAT TAN, in your notebook you are creating an integer column days_between with the code days_between = (last_date - first_date).days + 10. Logically speaking, what the notebook is trying to do is fetch all the dates between the two dates to do a foreca...

5 More Replies
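The logic under discussion, deriving the complete list of dates between a first and a last date (plus a few extra days for forecasting ahead), can be sketched in plain Python; the notebook does the equivalent with spark.range(0, days_between) and date_add. The function and parameter names here are illustrative, not from the notebook:

```python
from datetime import date, timedelta

def complete_date_list(first_date, last_date, horizon=0):
    # +1 so both endpoints are included; `horizon` adds extra days
    # beyond last_date (e.g. to leave room for a forecast).
    days_between = (last_date - first_date).days + 1 + horizon
    return [first_date + timedelta(days=i) for i in range(days_between)]

print(len(complete_date_list(date(2020, 1, 1), date(2020, 1, 3))))  # 3
```

The off-by-one is the usual pitfall: without the +1, the last date is dropped from the generated range.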
Shalabh007
by Honored Contributor
  • 2762 Views
  • 5 replies
  • 19 kudos

Practice Exams for Databricks Certified Data Engineer Professional exam

Can anyone help with an official practice exam set for the Databricks Certified Data Engineer Professional exam, like we have below for the Databricks Certified Data Engineer Associate: Practice exam for the Databricks Certified Data Engineer Associate exam.

Latest Reply
Nayan7276
Valued Contributor II
  • 19 kudos

Hi @Shalabh Agarwal, I am not able to find any official practice paper; it is still not available.

4 More Replies
Anonymous
by Not applicable
  • 1103 Views
  • 4 replies
  • 19 kudos

Resolved! How to prepare for Databricks Data Engineer Professional

Hi all, could you please suggest some resources to prepare for the "Databricks Data Engineer Professional" exam? I have also taken the course in Databricks Academy, but it seems not enough for this exam. Thank you so much! Best regards, Nhan Nguyen

Latest Reply
Unforgiven
Valued Contributor III
  • 19 kudos

Waiting for someone to post more details and an experience roadmap for taking the exam.

3 More Replies
ClaudeR
by New Contributor III
  • 2808 Views
  • 4 replies
  • 1 kudos

Resolved! Can someone help me understand how compute pricing works.

I'm looking at using Databricks internally for some data science projects. I am, however, very confused about how the pricing works and would obviously like to avoid high spending right now. Internal documentation and within Databricks All-Purpose Compute...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Claude Repono, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Than...

3 More Replies
sunil_smile
by Contributor
  • 3777 Views
  • 5 replies
  • 6 kudos

Apart from notebook , is it possible to deploy an application (Pyspark , or R+spark) as a package or file and execute them in Databricks ?

Hi, with the help of databricks-connect I was able to connect the cluster to my local IDEs (PyCharm and the RStudio desktop version), develop the application, and commit the code in Git. When I try to add that repo to the Databricks workspac...

Latest Reply
Atanu
Esteemed Contributor
  • 6 kudos

Maybe you will be interested in our Databricks Connect; not sure if that resolves your issue of connecting with a third-party tool and setting up your supported IDE/notebook server: https://docs.databricks.com/dev-tools/databricks-connect.html

4 More Replies
afshinR
by New Contributor III
  • 518 Views
  • 1 reply
  • 1 kudos

Hi, could you please help me with my question? I have not gotten any answers.

Hi, could you please help me with my question? I have not gotten any answers.

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @afshin riahi, yes, definitely I can help you with it. Please wait while I or someone from the community gets back with a response. Thank you for your patience.
