
saveAsTable sometimes works, sometimes doesn't

RobDineen
Contributor

I have the following Spark saveAsTable example. Sometimes it works fine, sometimes it fails.

Code is below, with the file listed in the "/temp" directory.

[Screenshot: RobDineen_0-1730992730937.png]

This has worked fine as it is, but since I am using the Community Edition I have to create a new cluster, and then it fails.

Is there a transaction log I have to VACUUM, or any other areas I need to clean up first?

Any help would be appreciated

Thank you all

 

4 REPLIES

-werners-
Esteemed Contributor III

If you use saveAsTable without any .format() option, it will be saved as a Delta Lake table.
You also do not specify a location, so it gets saved to the default location. I don't know what that is on Community Edition, but if something is already present with the same name, the write will fail.
You can try using .mode("overwrite") with the write.
Or clean up the location first (that is of course only an option for test purposes).
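
For example, a minimal sketch of that pattern (the DataFrame df, the source path, and the table name temp_table are placeholders, not taken from the screenshots; spark is the SparkSession that a Databricks notebook provides):

# Read the source file; path and options are illustrative only.
df = spark.read.csv("/temp/some_file.csv", header=True, inferSchema=True)

# With no .format(), saveAsTable writes a managed Delta Lake table at the
# default location; mode("overwrite") replaces whatever is already there
# instead of failing because the table or its location already exists.
df.write.mode("overwrite").saveAsTable("temp_table")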

RobDineen
Contributor

I have updated the location so it is explicit.

[Screenshot: RobDineen_0-1730993930451.png]

But if I use .mode("overwrite"), that fails as well.

[Screenshot: RobDineen_1-1730994110107.png]

 

 

-werners-
Esteemed Contributor III

The .mode("overwrite") goes right after .write.
What you can do is this:
First write to a new table, e.g. testTable1 (Delta Lake).
Then run the write again using write.mode("overwrite")... on the same table; that should work.
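
Sketched out (assuming a DataFrame named df, which is not shown in the thread's screenshots):

# First run: create a brand-new Delta Lake table.
df.write.saveAsTable("testTable1")

# Later runs: overwrite the existing table instead of failing because it already exists.
df.write.mode("overwrite").saveAsTable("testTable1")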

RobDineen
Contributor

 

This seems to work, along with explicitly dropping the database and re-running all the code in the notebook.
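
For reference, the explicit cleanup step can look like this (the database name my_db is a placeholder, not taken from the thread):

# Drop the database before re-running the notebook; CASCADE also removes the tables it contains.
spark.sql("DROP DATABASE IF EXISTS my_db CASCADE")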

Thank you
