I have a table A that is used in a spark.sql query and joined with multiple other tables to get data. The resulting dataframe is then overwritten back into the same table A.
When I tried this, I consistently got the following error:
ERROR: An error occurred while calling o382.saveAsTable. Trace:
py4j.Py4JException: Method saveAsTable([class java.util.HashSet]) does not exist
    at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:344)
    at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:352)
    at py4j.Gateway.invoke(Gateway.java:297)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
    at java.lang.Thread.run(Thread.java:750)
Writing the same dataframe into a new table works fine.
Are there any considerations to keep in mind when a dataframe derived from a table is overwritten back into that same table?