Hello all,
I am saving a DataFrame as a Delta table to S3, registered in AWS Glue, using PySpark and `saveAsTable`. So far this works, but something curious happens when I try to set the `path` (either as a write option or as an argument to `saveAsTable`).
The location in my Glue table is not updated to the correct path; instead it gets the suffix `__PLACEHOLDER__`. For example, if I save the DataFrame as `my_table` pointing at the bucket path `s3://my-bucket/data/my_table`, the Glue table shows the location as `s3://my-bucket/my_table-__PLACEHOLDER__`. Oddly, I can still query the table through SQL or PySpark.
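For reference, the write looks roughly like this (the table name, database, and bucket path are placeholders for my real ones):

```python
def save_as_delta_table(df, table_name, path):
    """Write a PySpark DataFrame as a Delta table registered in Glue.

    `df` is a pyspark DataFrame; `df.write` returns a DataFrameWriter
    whose builder methods are chained below.
    """
    (df.write
       .format("delta")
       .mode("overwrite")
       .option("path", path)   # also tried passing path=... to saveAsTable instead
       .saveAsTable(table_name))

# usage (assumed names):
# save_as_delta_table(df, "my_db.my_table", "s3://my-bucket/data/my_table")
```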
My current workaround is to save the table and then update the location in Glue using boto3.
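In case it helps others, here is a sketch of that workaround (names are placeholders). One gotcha: `get_table` returns read-only fields such as `DatabaseName` and `CreateTime` that `update_table` rejects, so only the mutable fields are copied into `TableInput`:

```python
def fix_glue_table_location(glue, database, table, location):
    """Rewrite the Glue table's S3 location after saveAsTable leaves
    the -__PLACEHOLDER__ suffix. `glue` is a boto3 Glue client."""
    current = glue.get_table(DatabaseName=database, Name=table)["Table"]
    # update_table rejects the read-only fields that get_table returns,
    # so copy only the mutable ones into TableInput.
    mutable = ("Name", "Description", "Owner", "Retention", "StorageDescriptor",
               "PartitionKeys", "TableType", "Parameters")
    table_input = {k: current[k] for k in mutable if k in current}
    table_input["StorageDescriptor"]["Location"] = location
    glue.update_table(DatabaseName=database, TableInput=table_input)

# usage (assumed names):
# import boto3
# fix_glue_table_location(boto3.client("glue"), "my_db", "my_table",
#                         "s3://my-bucket/data/my_table")
```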
Do you know if it is possible to make `saveAsTable` work as expected? Or do you have a better workaround?