08-28-2024 12:50 AM
I am trying to write logs to a Delta table, but after running for some time the job gets stuck at saveAsTable.
Traceback (most recent call last):
File "/databricks/spark/python/pyspark/errors/exceptions.py", line 228, in deco
return f(*a, **kw)
File "/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/protocol.py", line 326, in get_return_value
raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o3007.saveAsTable.
Any suggestions would be helpful.
08-28-2024 01:29 AM
@bhakti It looks like there may be a permission issue. Can you please share the code so we have more to go on?
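If it is a permission problem, one quick check is to list the grants on the target table. This is only a minimal sketch, assuming the table already exists under table access control or Unity Catalog; logs.app_logs is a hypothetical name, replace it with your actual table:

# Hypothetical table name; substitute the real target of saveAsTable
spark.sql("SHOW GRANTS ON TABLE logs.app_logs").show(truncate=False)
# Appending typically requires the MODIFY privilege on the table
# (plus USE CATALOG / USE SCHEMA when the table is in Unity Catalog).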
08-28-2024 03:17 AM
@bhakti @RohitKulkarni Either could be the cause: a permission issue or a schema mismatch.
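If it is a schema mismatch, one way to narrow it down is to compare the incoming DataFrame's schema with the existing table's schema before writing. A minimal sketch, assuming the target table already exists and using the same table_name placeholder as in the snippet below:

# Compare column names and types between the DataFrame and the Delta table
incoming = {f.name: f.dataType for f in df.schema.fields}
existing = {f.name: f.dataType for f in spark.table("table_name").schema.fields}
for name, dtype in incoming.items():
    if name not in existing:
        print(f"Column only in DataFrame: {name} ({dtype})")
    elif existing[name] != dtype:
        print(f"Type mismatch for {name}: DataFrame={dtype}, table={existing[name]}")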
08-28-2024 03:37 AM
@bhakti: Please run the script below:
try:
    df.write.format("delta").mode("append").saveAsTable("table_name")
except Exception as e:
    print(f"Error: {str(e)}")
and let us know the error.
Thanks
Rohit
08-28-2024 02:04 AM
@bhakti: This looks like a data type mismatch issue.
08-28-2024 04:12 AM
Hey @bhakti! Please provide the full stack trace / error message. Your log doesn't give a strong clue; a failure during the write can happen for many different reasons.
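If the notebook only shows the truncated Py4JJavaError, the standard traceback module can capture the complete message. A minimal sketch wrapping the same write call as in Rohit's snippet above:

import traceback

try:
    df.write.format("delta").mode("append").saveAsTable("table_name")
except Exception:
    # Print the full Python-side traceback, including the Java error text
    # embedded in the Py4JJavaError, so it can be pasted into the thread.
    print(traceback.format_exc())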