I have a set of Spark DataFrames that I convert into temp views so I can run Spark SQL against them. Once my logic is complete, I delete the views. The delete step throws an odd error that I'm not sure how to fix, and I'm looking for tips. As a note, the cluster is a Unity Catalog cluster in shared access mode.
# Register both DataFrames as temp views; the view names are held in prefix_updates / prefix_main
df.createOrReplaceTempView(prefix_updates)
target.toDF().createOrReplaceTempView(prefix_main)

# Build the join (merge_match_ins holds the ON clause) and append the newer rows to the Delta table
sql_begin = f"SELECT {prefix_updates}.* FROM {prefix_updates} INNER JOIN {prefix_main} ON "
merged_inserts = spark.sql(sql_begin + merge_match_ins + f" WHERE {prefix_updates}.ReplicationUTCDateTime > {prefix_main}.ReplicationUTCDateTime")
merged_inserts.write.format("delta").mode("append").saveAsTable(f"{catalog}.{schema}.{table_name}")

# Clean up the temp views -- this is the step that fails
spark.catalog.dropTempView(prefix_updates)
spark.catalog.dropTempView(prefix_main)
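Stripped of my merge logic, the pattern is just create view, query, drop view. A minimal sketch of the same flow (view name is illustrative; assuming the whitelist blocks dropTempView regardless of which view it targets, this should hit the same error):

df = spark.range(10)
df.createOrReplaceTempView("tmp_updates")
spark.sql("SELECT * FROM tmp_updates").show()
spark.catalog.dropTempView("tmp_updates")  # the catalog API call that the shared cluster rejects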
Everything up to the spark.catalog.dropTempView calls runs without error. As soon as I add the two drop statements, I get this:
Py4JError: An error occurred while calling o1620.dropTempView. Trace:
py4j.security.Py4JSecurityException: Method public boolean org.apache.spark.sql.internal.CatalogImpl.dropTempView(java.lang.String) is not whitelisted on class class org.apache.spark.sql.internal.CatalogImpl
at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
at py4j.Gateway.invoke(Gateway.java:305)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:195)
at py4j.ClientServerConnection.run(ClientServerConnection.java:115)
at java.lang.Thread.run(Thread.java:750)
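Is the expected approach on shared-access clusters to drop the views through SQL instead of the catalog API? Something like this sketch (an assumption on my part; DROP VIEW is standard Spark SQL, but I don't know whether spark.sql goes through the same whitelist):

# Possible workaround: drop the temp views via SQL rather than spark.catalog
spark.sql(f"DROP VIEW IF EXISTS {prefix_updates}")
spark.sql(f"DROP VIEW IF EXISTS {prefix_main}")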