I'm trying to write a table to AWS Redshift using the following code:
try:
    (df_source.write
        .format("redshift")
        .option("dbtable", f"{redshift_schema}.{table_name}")
        .option("tempdir", tempdir)
        .option("url", url)
        .option("user", user)
        .option("password", password)
        .option("forward_spark_s3_credentials", True)
        .option("tempformat", "CSV GZIP")
        .mode("overwrite")
        .save())
except Exception as e:
    raise Exception(f'ERRO: {str(e)}')
However, I'm getting the following error:
"Exception: ERRO: An error occurred while calling o450.save. : org.apache.spark.SparkSecurityException: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission SELECT on any file."
I've seen a suggested solution of having an admin run a SQL command like GRANT SELECT ON ANY FILE TO `user1` (see the sketch just below of how I understand it would be run).
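In a Databricks notebook I assume that would look roughly like this, with `user1` just a placeholder for the actual user or group name:

# Rough sketch of the suggested workaround, run by a workspace admin.
# `user1` is a placeholder for the actual user or group name.
spark.sql("GRANT SELECT ON ANY FILE TO `user1`")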
But it's not working, and I'd rather not give this grant to every user/group anyway. So why is this happening, and how can I solve it in a more proper way? I also have Unity Catalog enabled. Thank you!