from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName('TEST')
         .config('spark.ui.port', '4098')
         .enableHiveSupport()
         .getOrCreate())

df4 = spark.sql('select * from hive_schema.table_name limit 1')
print("query completed")

df4.unpersist()  # attempt to release the DataFrame's memory
df4.count()
df4.show()       # still prints the data
I executed the code above to clear the DataFrame and release its memory. However, df4.show() still works and displays the data. Could you please point me to the right method for freeing the memory occupied by a Spark DataFrame?