by
744291
• New Contributor III
- 744 Views
- 4 replies
- 0 kudos
Anyone having extra voucher who are not plannning to give exams
Latest Reply
Hi @Rituparna Das​ Hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so...
3 More Replies
- 5450 Views
- 2 replies
- 0 kudos
I want to define a column with null values in my DataFrame using PySpark. This column will later be used for other calculations. What is the difference between creating it in these two different ways?
df.withColumn("New_Column", lit(None))
df.withColumn...
Latest Reply
Hi @Sara Corral​ Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback ...
1 More Replies
- 800 Views
- 4 replies
- 0 kudos
Dear DB community! As far as I know, when the resource creator leaves the project, all resources they created will be deleted as well. So the question is: can I assign the owner role to a group, and will that protect the resource from deletion or not...
Latest Reply
Hi @trung nguyen​ Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback...
3 More Replies
by
kll
• New Contributor III
- 1129 Views
- 2 replies
- 0 kudos
I am attempting to render a map within a Jupyter notebook and keep bumping into the output limit. Below is my code:
import pydeck as pdk
import pandas as pd
COLOR_BREWER_BLUE_SCALE = [
[240, 249, 232],
[204, 235, 197],
[168, 221, 181],
...
Latest Reply
Hi @Keval Shah​ Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback w...
1 More Replies
by
17780
• New Contributor II
- 2986 Views
- 3 replies
- 0 kudos
I created and used a Databricks Account for testing purposes. I want to delete that account. In the Databricks Account Web UI, there is no menu to delete an account. How should I delete it?
Latest Reply
Hi @??? ???????​ Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback ...
2 More Replies
by
KVNARK
• Honored Contributor II
- 1039 Views
- 2 replies
- 6 kudos
Can anyone let me know how we can load a database file into Azure Databricks from Azure Blob Storage?
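A common pattern is to configure the storage account key and read the file by URI. A rough sketch, assuming a Databricks notebook where `spark` and `dbutils` are predefined; the account, container, secret scope, and file path are all placeholders:

```python
# Configure access to the storage account (placeholder names throughout);
# reading the key from a secret scope avoids hard-coding credentials.
spark.conf.set(
    "fs.azure.account.key.<storage-account>.blob.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="storage-account-key"),
)

# Read the file directly from blob storage into a DataFrame
df = spark.read.format("csv").option("header", "true").load(
    "wasbs://<container>@<storage-account>.blob.core.windows.net/path/to/file.csv"
)
```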
Latest Reply
Hi @KVNARK .​ Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback wil...
1 More Replies
- 4175 Views
- 3 replies
- 0 kudos
I have a basic two-task job. The first notebook (task) checks whether the source file has changed and, if so, refreshes a corresponding materialized view. If there are no changes, I use dbutils.jobs.taskValues.set(key = "skip_job", value = 1) &...
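The pattern can be sketched as follows. `dbutils` exists only inside Databricks, so a minimal stand-in is defined here to make the sketch runnable; the upstream task name `check_source` and the change-detection result are assumptions:

```python
# Minimal stand-in for Databricks' dbutils.jobs.taskValues, so the pattern
# below runs outside Databricks. Inside a real job, dbutils is predefined.
class _TaskValues:
    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def get(self, taskKey, key, default=None):
        # the real API scopes values by the upstream task name (taskKey)
        return self._store.get(key, default)

class _Jobs:
    def __init__(self):
        self.taskValues = _TaskValues()

class _DBUtils:
    def __init__(self):
        self.jobs = _Jobs()

dbutils = _DBUtils()

# Task 1 ("check_source", a placeholder name): flag that downstream work can be skipped
source_changed = False  # assumption: result of the file-change check
dbutils.jobs.taskValues.set(key="skip_job", value=0 if source_changed else 1)

# Task 2: read the flag and bail out early instead of refreshing the view
skip = dbutils.jobs.taskValues.get(taskKey="check_source", key="skip_job", default=0)
if skip == 1:
    print("source unchanged - skipping refresh")
```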
Latest Reply
@Michael Papadopoulos​ Usually that should not be the case, I think. At the task level we have three notification types (success, failure, start), whereas at the whole-job level a skip option is available to discard notifications. Will see if someone from the commu...
2 More Replies
- 1655 Views
- 5 replies
- 0 kudos
Hi, which Databricks Runtime versions support OPTIMIZE and compaction?
Latest Reply
Optimize and compaction are operations commonly used in Apache Spark for optimizing and improving the performance of data storage and processing. Databricks, which is a cloud-based platform for Apache Spark, provides support for these operations on v...
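For illustration, on a Delta table the commands look roughly like this. This is a sketch for a Databricks notebook, where `spark` is predefined; the table and column names are placeholders:

```python
# Bin-packing compaction: coalesce many small files into fewer large ones
spark.sql("OPTIMIZE my_db.events")

# Optionally co-locate related data on a column while compacting,
# to speed up queries that filter on it
spark.sql("OPTIMIZE my_db.events ZORDER BY (event_ts)")
```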
4 More Replies
- 7784 Views
- 5 replies
- 1 kudos
Hi! I was following the guide outlined here: https://kb.databricks.com/en_US/python/import-custom-ca-cert (also tried this: https://stackoverflow.com/questions/73043589/configuring-tls-ca-on-databricks) to add a CA root certificate to a Databricks cluster, but...
Latest Reply
In the end it turned out that I had tried to add the wrong certificate. To check a certificate's Distinguished Name (DN), which helps identify the organization the certificate was issued to, run %sh openssl s_client -connect <hostname>:<port> -showcerts -CAf...
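As a runnable illustration of inspecting a certificate's DN, here is a sketch that generates a throwaway self-signed certificate and prints its subject and issuer fields (the organization and CN values are made up):

```shell
# Create a throwaway self-signed certificate just for demonstration
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout /tmp/demo-ca.key -out /tmp/demo-ca.pem \
  -subj "/O=ExampleOrg/CN=example-ca"

# Print the Distinguished Name fields to confirm which organization it belongs to
openssl x509 -in /tmp/demo-ca.pem -noout -subject -issuer
```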
4 More Replies
by
beer
• New Contributor II
- 846 Views
- 3 replies
- 0 kudos
I took the exam yesterday and passed the test. I haven't received any email from Databricks Academy. How long would it take to receive the certification?
by
Rik
• New Contributor III
- 722 Views
- 2 replies
- 0 kudos
I have disabled the IP Access List on my workspace and am trying to add an IP list through the IP Access List API. However, when adding a list, I get the INVALID_STATE response. The docs mention this is because: "If the new list would block the calling...
Latest Reply
"One possible workaround could be to (1) temporarily enable the IP Access List feature, (2) add the necessary IP addresses to the list, and then (3) disable the feature again. This way, you can add the IP addresses you need without blocking the curre...
1 More Replies
- 2243 Views
- 9 replies
- 0 kudos
Hello team, I attended the webinar Databricks Certification Overview Series - Data Engineer on Jan 17, completed the Databricks Lakehouse fundamentals accreditation, and completed the survey. As per the communication, it is expected that I will receive Dat...
Latest Reply
Hi @Indika Debnath​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Tha...
8 More Replies