@ppatel:
If you are using insertInto with overwrite=True on a Hive external table in PySpark, it may not behave the way you expect. An external table's data lives in a location that Hive does not manage, so overwriting it through Hive can fail or produce surprising results (the exact behavior also depends on your Spark/Hive versions and settings such as spark.sql.sources.partitionOverwriteMode). One workaround is to call insertInto with overwrite=False, which appends the new rows to the existing data in the external table. If you need to replace the data entirely, delete the files at the external location first and then use insertInto with overwrite=False to load the new data.
Alternatively, you can create a managed table in Hive and use insertInto with overwrite=True to overwrite its data. Keep in mind that this creates a new directory in HDFS and copies the data into it, which may be undesirable if you have a large amount of data.