Data Engineering
Error Using spark.catalog.dropTempView()

aicd_de
New Contributor III

I have a set of Spark DataFrames that I convert into temp views so I can run Spark SQL against them. Once my logic is complete, I drop the views, but the drop step throws an odd error that I am not sure how to fix. Looking for some tips on fixing it. As a note, the cluster runs Unity Catalog in Shared access mode.

df.createOrReplaceTempView(prefix_updates)
target.toDF().createOrReplaceTempView(prefix_main)

sql_begin = 'SELECT '+prefix_updates+'.* FROM '+prefix_updates+' INNER JOIN '+prefix_main+' ON '

merged_inserts = spark.sql(sql_begin+merge_match_ins+" WHERE "+prefix_updates+".ReplicationUTCDateTime > "+prefix_main+".ReplicationUTCDateTime")

merged_inserts.write.format("delta").mode("append").saveAsTable(catalog+'.'+schema+'.'+table_name)

spark.catalog.dropTempView(prefix_updates)
spark.catalog.dropTempView(prefix_main)
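As an aside, the concatenated query above is easier to read (and to debug from the error message) when built with f-strings. This is only a sketch of the string construction; the view names and the join condition below are placeholders standing in for the post's prefix_updates, prefix_main, and merge_match_ins variables.

```python
# Placeholder values; in the post these come from surrounding code.
prefix_updates = "updates_v"                                 # hypothetical temp view name
prefix_main = "main_v"                                       # hypothetical temp view name
merge_match_ins = f"{prefix_updates}.Id = {prefix_main}.Id"  # assumed join condition

# Same query as the concatenated version, built with one f-string.
sql = (
    f"SELECT {prefix_updates}.* "
    f"FROM {prefix_updates} "
    f"INNER JOIN {prefix_main} ON {merge_match_ins} "
    f"WHERE {prefix_updates}.ReplicationUTCDateTime > {prefix_main}.ReplicationUTCDateTime"
)
print(sql)  # inspect the statement before passing it to spark.sql(sql)
```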

The code runs fine up to the spark.catalog.dropTempView calls, but once I add the view-drop statements I get this error:

Py4JError: An error occurred while calling o1620.dropTempView. Trace:
py4j.security.Py4JSecurityException: Method public boolean org.apache.spark.sql.internal.CatalogImpl.dropTempView(java.lang.String) is not whitelisted on class class org.apache.spark.sql.internal.CatalogImpl
	at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
	at py4j.Gateway.invoke(Gateway.java:305)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:195)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:115)
	at java.lang.Thread.run(Thread.java:750)

1 ACCEPTED SOLUTION

Accepted Solutions

aicd_de
New Contributor III

spark.sql("DROP TABLE "+prefix_updates)
spark.sql("DROP TABLE "+prefix_main)

Fixed it for me.


