- 450 Views
- 1 replies
- 1 kudos
Hi, I have a similar issue: https://community.databricks.com/s/question/0D58Y00008oTUQPSA4/voucher-code-error
Latest Reply
Hi @Rishabh Jain​ , Thank you for reaching out! Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training and our team will get back to you shortly.
- 1147 Views
- 3 replies
- 1 kudos
I would like to register for the Data Engineer Associate exam. I have also passed the Databricks Lakehouse Fundamentals accreditation (id: 61076337), but the registration site https://www.webassessor.com/ does not allow me to purchase the exam, bu...
Latest Reply
Hi @Antonio Rodilosso​, as far as I know a voucher is not compulsory. You can register for an exam by simply paying the fee; it is absolutely possible to register without a voucher. To earn a voucher, you can check the URL below: http://msdatalab.net/h...
2 More Replies
- 3015 Views
- 3 replies
- 2 kudos
I am attempting to use Auto Loader to add a number of CSV files to a Delta table. The underlying CSV files have spaces in the attribute names, though (i.e. 'Account Number' instead of 'AccountNumber'). When I run my autoload, I get the following error ...
Latest Reply
@Hubert Dudek​ thanks for your response! I was able to use what you proposed above to generate the schema. The issue is that the schema sets all attributes to STRING values and renames them numerically ('_c0', '_c1', etc.). Although this allows us to...
2 More Replies
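Since Delta rejects column names containing spaces and a few other special characters, one common workaround is to rename the CSV headers before writing. A minimal sketch (the helper name and character set are illustrative, not from the thread):

```python
import re

# Characters Delta does not allow in column names: space , ; { } ( ) \n \t =
_INVALID = re.compile(r"[ ,;{}()\n\t=]")

def sanitize_columns(columns):
    """Replace characters Delta rejects in column names with underscores."""
    return [_INVALID.sub("_", c) for c in columns]

# In an Auto Loader stream you might apply it as (not executed here):
# df = df.toDF(*sanitize_columns(df.columns))
```

Applied after reading, this keeps the original header order while producing names such as 'Account_Number' that Delta will accept.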
- 1063 Views
- 3 replies
- 6 kudos
I am updating the Delta table in Databricks as follows: segments_data.alias('segments_old').merge(
segments_data_new.alias("updates"),
"segments_old.source_url = updates.source_url",
).whenMatchedUpdate(
set={"segments"...
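The snippet above is truncated, but the whenMatchedUpdate semantics it uses can be emulated in plain Python for reference (column names 'source_url' and 'segments' are taken from the snippet; this is a sketch of the merge's behavior, not the Delta API):

```python
def merge_segments(old_rows, new_rows):
    """Emulate MERGE ... whenMatchedUpdate keyed on source_url:
    rows sharing a source_url get their 'segments' value overwritten
    by the update; unmatched old rows are kept unchanged."""
    updates = {r["source_url"]: r for r in new_rows}
    merged = []
    for row in old_rows:
        if row["source_url"] in updates:
            row = {**row, "segments": updates[row["source_url"]]["segments"]}
        merged.append(row)
    return merged
```

Note that a real Delta MERGE can also insert unmatched update rows via whenNotMatchedInsert, which the post's truncated snippet may or may not include.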
- 1211 Views
- 2 replies
- 5 kudos
We have multiple environments where the same tables are added, so it's really hard to manually update the schema of the table across all the environments. We know that it's not ideal to update a table's schema frequently, but our product is still evolving and s...
Latest Reply
Thanks for the reply @Pat Sienkiewicz​ .
1 More Replies
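One way to avoid hand-editing the schema in every environment is Delta's automatic schema evolution, enabled per session or per write. A sketch (not executed against a cluster here; the Spark call is shown commented):

```python
# Session-wide: appends/merges add new columns automatically.
conf_sql = "SET spark.databricks.delta.schema.autoMerge.enabled = true"
# spark.sql(conf_sql)  # run inside a Spark session

# Per-write alternative (not executed here):
# df.write.format("delta").option("mergeSchema", "true").mode("append").save(path)
```

With either setting, columns present in the incoming DataFrame but missing from the table are appended to the table schema, so each environment converges without a manual ALTER TABLE.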
by
elgeo
• Valued Contributor II
- 934 Views
- 1 replies
- 1 kudos
Latest Reply
elgeo
Valued Contributor II
Hello. Any update on this, please? Thank you in advance.
- 1223 Views
- 0 replies
- 3 kudos
Using the Azure Databricks R Shiny example, I am easily able to instantiate an R Shiny web front-end session. The URL that Databricks provides proxies the session, allowing my targeted audience to directly access my `R Shiny` app using tha...
- 838 Views
- 4 replies
- 1 kudos
I completed my certification for the Databricks Data Analyst Associate on 27th October but haven't received my certificate/badge for it.
Latest Reply
Hello Priyanshu, your certificate was issued today. Please check your email, as I resent the information. Thank you.
3 More Replies
by
Aritra
• New Contributor II
- 997 Views
- 4 replies
- 0 kudos
I am running into issues importing the scalable-machine-learning-with-apache-spark library into Databricks; specifically, cloning from the Git library or running %pip install from the Git library directly in Databricks. Any help is appreciated.
Latest Reply
Hi @Aritra Guha​, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks...
3 More Replies
by
talha
• New Contributor III
- 2089 Views
- 5 replies
- 0 kudos
I executed a spark-submit job through the Databricks CLI with the following job configuration: {
"job_id": 123,
"creator_user_name": "******",
"run_as_user_name": "******",
"run_as_owner": true,
"settings": {
"name": "44aa-8447-c123aad310",
...
Latest Reply
talha
New Contributor III
Not really sure if it was running Spark in local mode, but I used the alternate property spark.executor.memory, passed it as --conf, and now it works.
4 More Replies
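For reference, the workaround in the reply can also be expressed in the job's cluster settings rather than on the spark-submit command line. A sketch of the relevant fragment (runtime version and worker count are illustrative assumptions):

```python
# Hypothetical job-cluster fragment: set executor memory as a Spark conf,
# the equivalent of passing --conf spark.executor.memory=4g to spark-submit.
new_cluster = {
    "spark_version": "10.4.x-scala2.12",  # assumption: any supported runtime
    "num_workers": 2,
    "spark_conf": {
        "spark.executor.memory": "4g",
    },
}
```

Putting the value in spark_conf keeps it versioned with the job definition instead of buried in the submit parameters.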
- 248 Views
- 0 replies
- 1 kudos
I haven't received my Databricks Certified Data Engineer Associate certificate. I passed my certification exam, Databricks Certified Data Engineer Associate, on 27 October 2022, but have yet to receive a certificate or badge. Any help is much appreciated. I have a ...
by
elgeo
• Valued Contributor II
- 1385 Views
- 0 replies
- 4 kudos
Hello experts. We are trying to clarify how to clean up the large number of files accumulating in the _delta_log folder (JSON, CRC and checkpoint files). We went through the related posts in the forum and followed the below: SET spark.da...
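Log retention is normally controlled via table properties rather than VACUUM: expired JSON/CRC/checkpoint entries are removed automatically when a new checkpoint is written, once they are older than the retention window. A sketch (the table name is hypothetical; the Spark call is shown commented):

```python
# delta.logRetentionDuration defaults to 'interval 30 days'; shortening it
# lets old _delta_log entries be cleaned up at the next checkpoint.
table = "my_table"  # hypothetical table name
sql = (
    f"ALTER TABLE {table} SET TBLPROPERTIES ("
    "'delta.logRetentionDuration' = 'interval 7 days')"
)
# spark.sql(sql)  # run inside a Spark session
```

Note that shortening log retention also limits how far back time travel can go on that table.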
by
327753
• New Contributor III
- 1265 Views
- 4 replies
- 6 kudos
When developing locally, I can write %debug in a new cell after encountering an error and jump into the function the error originated from. In Databricks, this freezes the notebook indefinitely. For example: In [1]: def query_data(): df_full = qu...
Latest Reply
I just upgraded my personal node and %debug worked! I appreciate the reminder to use pdb() itself when appropriate too. I'm still interested in whether we should have any concerns about upgrading our main cluster - please do let me know, and then I'l...
3 More Replies
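What %debug (and pdb.post_mortem) jumps into is the innermost frame of the last exception's traceback. A small non-interactive sketch of that mechanic, loosely reconstructing the post's example (the function body is a hypothetical stand-in):

```python
import sys

def query_data():
    df_full = {}               # stand-in for the real query result
    return df_full["missing"]  # raises KeyError, as in the post

try:
    query_data()
except Exception:
    tb = sys.exc_info()[2]
    while tb.tb_next:          # walk to the innermost (originating) frame
        tb = tb.tb_next
    print(tb.tb_frame.f_code.co_name)  # prints query_data
    # import pdb; pdb.post_mortem(tb)  # interactive equivalent of %debug
```

pdb.post_mortem(tb) would drop you into that frame interactively; on a shared notebook backend that prompt is also what can appear to hang the cell if stdin is never serviced.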
by
ebyhr
• New Contributor II
- 3514 Views
- 5 replies
- 3 kudos
I have recently been getting the below error sometimes in version 10.4 LTS. Is there any solution for the intermittent failure? I added retry logic to our code, but the Databricks query succeeded (even though it threw an exception), and that leads to an unexpected table statu...
Latest Reply
Hi @Yuya Ebihara​, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thank...
4 More Replies
- 409 Views
- 0 replies
- 2 kudos
Hi, I have several clusters, some with a 45% max spot price and some more important ones with a higher value. I want to know the best way to configure this but cannot find anything (a value showing how many nodes of the last run were on-demand would do the t...
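For AWS workspaces, the spot/on-demand mix is set per cluster through its AWS attributes. A sketch of the relevant fragment (the specific values are illustrative, matching the 45% figure from the question):

```python
# Hypothetical cluster fragment: control the spot/on-demand mix.
aws_attributes = {
    "first_on_demand": 1,                  # this many nodes (driver first) stay on-demand
    "availability": "SPOT_WITH_FALLBACK",  # fall back to on-demand if spot is unavailable
    "spot_bid_price_percent": 45,          # max spot price as % of the on-demand price
}
```

With SPOT_WITH_FALLBACK, nodes that could not be acquired as spot are launched on-demand, which is why a given run's actual on-demand count can vary.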