- 1046 Views
- 2 replies
- 20 kudos
Understanding Unity Catalog in Databricks: In Databricks, Unity Catalog is a data catalog that allows you to store, access, and manage data within your Databricks workspace. It provides a unified interface for working with data across different s...
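A minimal sketch of the idea behind Unity Catalog's unified interface: every object lives in a three-level namespace (`catalog.schema.table`). The catalog, schema, and table names below are hypothetical, not from the post.

```python
# Hedged sketch: Unity Catalog addresses data through a three-level
# namespace -- catalog.schema.table. All names here are hypothetical.
catalog, schema, table = "main", "sales", "orders"
fq_name = f"{catalog}.{schema}.{table}"
print(fq_name)  # main.sales.orders

# In a Databricks notebook you could then query it, e.g.:
# spark.sql(f"SELECT * FROM {fq_name} LIMIT 10")
```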
Latest Reply
Nice one! Keep sharing such informative posts.
by tanjil • New Contributor III
- 1278 Views
- 2 replies
- 2 kudos
Hello, I have the following minimal working example using multiprocessing:

from multiprocessing import Pool
files_list = [('bla', 1, 3, 7), ('spam', 12, 4, 8), ('eggs', 17, 1, 3)]
def f(t):
    print('Hello from child process', flush = Tr...
Latest Reply
No errors are generated. The code executes successfully, but the print statement for "Hello from child process" does not produce any output.
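A common workaround, sketched below under the assumption that child-process stdout is simply not surfaced in the notebook: have the workers return their messages and print them from the driver process instead.

```python
from multiprocessing import Pool

files_list = [('bla', 1, 3, 7), ('spam', 12, 4, 8), ('eggs', 17, 1, 3)]

def f(t):
    # Return the message instead of printing it inside the child process,
    # whose stdout may not be wired to the notebook output.
    return f"Hello from child process {t[0]}"

if __name__ == '__main__':
    with Pool(2) as pool:
        messages = pool.map(f, files_list)
    for m in messages:
        # Printed from the driver, so it shows up in the notebook.
        print(m)
```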
by Optum • New Contributor III
- 5302 Views
- 10 replies
- 4 kudos
Hello, I'm trying to write to a Delta Table in my Databricks instance from a remote Spark session on a different cluster with the Simba Spark driver. I can do reads, but when I attempt a write, I get the following error: { df.write.format("jdbc...
Latest Reply
Atanu • Esteemed Contributor
Could you try setting the flag to ignore transactions? I'm not sure of the exact flag name, but the JDBC driver manual should have more details on how to set it.
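As a sketch only: the thread does not name the exact flag, so `IgnoreTransactions=1` below is a placeholder property appended to the JDBC URL; consult the Simba Spark JDBC driver manual for the real property name before using it.

```python
def build_jdbc_url(base_url, **props):
    """Append driver properties to a JDBC URL as ;key=value pairs."""
    suffix = "".join(f";{k}={v}" for k, v in props.items())
    return base_url + suffix

# "IgnoreTransactions" is a hypothetical property name, not confirmed
# by the thread -- check the driver manual.
url = build_jdbc_url("jdbc:spark://host:443/default", IgnoreTransactions=1)
print(url)  # jdbc:spark://host:443/default;IgnoreTransactions=1

# The write itself (requires a live SparkSession and DataFrame `df`):
# df.write.format("jdbc").option("url", url) \
#   .option("dbtable", "my_table").mode("append").save()
```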
- 613 Views
- 2 replies
- 0 kudos
My DB portal says the platform version is v3.86 and provides a link to all the releases, but none of those releases state the "platform version", and I can't find "v3.86" by searching the Databricks docs. So, how does one find the documentation fo...
Latest Reply
Hey @Samuel Yang, you will find all the details here: https://docs.databricks.com/release-notes/index.html Thanks, Aviral Bhardwaj
- 1385 Views
- 3 replies
- 4 kudos
Is there a way to have Databricks pull a diagram directly from Visio? I've tried to use the embed links from Visio, but the image won't render. I'm trying to get around loading the image to DBFS, as there may be updates to the image that I want it to g...
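One workaround, sketched under the assumption that the diagram is reachable at a direct image URL (Visio share/embed links generally are not), is to render it with the notebook's `displayHTML` built-in. The URL below is a placeholder.

```python
# Hypothetical direct image URL -- a Visio embed link will not render
# this way; the URL must point at an actual image file (e.g. a .png).
image_url = "https://example.com/architecture-diagram.png"
html = f'<img src="{image_url}" width="800">'
print(html)

# In a Databricks notebook, render it with:
# displayHTML(html)
```

Because the image is fetched by URL at render time, later updates to the hosted file show up without re-uploading anything to DBFS.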
- 971 Views
- 7 replies
- 0 kudos
Hi Databricks Team, I passed the Associate Data Engineer exam the day before but still haven't received the certificate on Accredible or on Databricks Academy. My registered email id for the exam is sharath.koushik@gmail.com. Could you please help?
Latest Reply
Hi @Sharath K, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!
by shamly • New Contributor III
- 1742 Views
- 3 replies
- 2 kudos
Dear Friends, I have a CSV and it looks like this:
‡‡Id‡‡,‡‡Version‡‡,‡‡Questionnaire‡‡,‡‡Date‡‡
‡‡123456‡‡,‡‡Version2‡‡,‡‡All questions have been answered accurately and the guidance in the questionnaire was understood and followed‡‡,‡‡2010-12-16 00:01:...
Latest Reply
This is working fine:

from pyspark.sql.functions import regexp_replace
path = "dbfs:/FileStore/df/test.csv"
dff = spark.read.option("header", "true").option("inferSchema", "true").option('multiline', 'true').option('encoding', 'UTF-8').option("delimi...
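The core of that fix can be sketched in pure Python: strip every double-dagger quoting character after reading, then parse normally. The PySpark equivalent uses `regexp_replace`, as in the reply above; the file path and column loop below are illustrative.

```python
import re

def strip_daggers(line):
    # Remove every double-dagger quoting character from a raw CSV line.
    return re.sub("‡", "", line)

sample = "‡‡123456‡‡,‡‡Version2‡‡,‡‡2010-12-16 00:01‡‡"
print(strip_daggers(sample))  # 123456,Version2,2010-12-16 00:01

# PySpark equivalent (path and column handling are illustrative):
# from pyspark.sql.functions import regexp_replace
# df = spark.read.option("header", "true").csv("dbfs:/FileStore/df/test.csv")
# for c in df.columns:
#     df = df.withColumn(c, regexp_replace(c, "‡", ""))
```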
- 1112 Views
- 6 replies
- 8 kudos
Hello World! This is my first Databricks community post. Looking forward to contributing from my end. Peace out! @Dinesh Mergu
Latest Reply
Welcome to the community, @Parijat Dhar!
- 2965 Views
- 9 replies
- 4 kudos
Hi! I have a Delta table and a process that reads a stream from this table. I need to drop the NOT NULL constraint from some of the columns of this table. The first drop command does not affect the reading stream, but the second command results in erro...
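For reference, Delta Lake supports dropping a NOT NULL constraint via ALTER TABLE. The table and column names below are hypothetical, and note that an active stream reading the table may need a restart after a constraint/schema change.

```python
# Hypothetical table and column names -- substitute your own.
stmt = "ALTER TABLE events ALTER COLUMN user_id DROP NOT NULL"
print(stmt)

# In a Databricks notebook, execute it with:
# spark.sql(stmt)
```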
Latest Reply
Hi @Anatoly Tikhonov, hope everything is going great. Does @Kaniz Fatma's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!
by gj0904 • New Contributor III
- 1758 Views
- 8 replies
- 5 kudos
Hi there, I successfully passed the exam on 27th Oct 2022, but I haven't received the certificate yet.
Latest Reply
Hi @Gaurav Jhamb, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thank...
- 1656 Views
- 6 replies
- 0 kudos
I've successfully passed the Databricks Data Engineer Associate certification exam but still have not received the certificate. Could you help with it, please?
Latest Reply
Hi @Venkata Sai Anuroop Samudrala, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to he...
- 867 Views
- 2 replies
- 1 kudos
How can I configure a DLT pipeline to use an existing running cluster? I don't see where in the settings to point the pipeline at an existing cluster; instead, it always wants to stand up a new one.
Latest Reply
Hi @Justin Stuparitz, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. T...
- 1587 Views
- 3 replies
- 0 kudos
Latest Reply
Hi @Alejandro Martinez, hope everything is going great. Does @Sivaprasad C S's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!