PySpark insertInto on a Hive external table doesn't work if overwrite is true

03-21-2023 03:07 PM
Getting a Hive exception.
03-22-2023 01:31 AM
Can you share the command you used and a screenshot of the exception, please?

04-01-2023 10:23 PM
@ppatel:
If you use insertInto with overwrite=True on a Hive external table in PySpark, it may not work as expected. An external table's data lives outside the Hive warehouse and is not managed by Hive, so when you pass overwrite=True, Spark tries to overwrite data that Hive does not control. Instead, try insertInto with overwrite=False, which appends the new rows to the existing data in the external table (see the sketch below). If you need to fully replace the data, delete the files at the external location first and then load the new data with insertInto and overwrite=False.
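A minimal sketch of the append approach, assuming a Hive-enabled SparkSession; the database and table name mydb.ext_table and the sample DataFrame are hypothetical placeholders:

```python
from pyspark.sql import SparkSession

# Hypothetical setup: "mydb.ext_table" stands in for your existing
# external table.
spark = (
    SparkSession.builder
    .appName("external-table-append")
    .enableHiveSupport()
    .getOrCreate()
)

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])

# insertInto resolves columns by position, not by name, so the
# DataFrame's column order must match the table schema.
df.write.insertInto("mydb.ext_table", overwrite=False)
```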
Alternatively, you can create a managed table in Hive and use insertInto with overwrite=True to replace its contents (a sketch follows). Note, however, that this creates a new directory in HDFS and copies the data into it, which may not be desirable if you have a large amount of data.
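A minimal sketch of the managed-table alternative, again with hypothetical names (mydb.managed_table):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("managed-table-overwrite")
    .enableHiveSupport()
    .getOrCreate()
)

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])

# saveAsTable creates a managed table whose data lives in the
# Hive warehouse directory.
df.write.mode("errorifexists").saveAsTable("mydb.managed_table")

# Later loads can fully replace the contents; as above, insertInto
# matches columns by position, so schemas must line up.
new_df = spark.createDataFrame([(3, "c"), (4, "d")], ["id", "val"])
new_df.write.insertInto("mydb.managed_table", overwrite=True)
```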

