spark config not working in job cluster
09-09-2024 08:04 PM
I am trying to rename a delta table like this:
spark.conf.set("spark.databricks.delta.alterTable.rename.enabledOnAWS", "true")
spark.sql("ALTER TABLE db1.rz_test5 RENAME TO db1.rz_test6")
The data is on AWS S3, which is why I have to set this Spark config in the notebook. It works fine on the interactive cluster.
But when I run it in a job using a job cluster, it fails with this error:
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table.
The interactive cluster runs Databricks Runtime 13.3 LTS, with no Spark configuration set on the cluster.
The job cluster runs Databricks Runtime 14.3 LTS, also with no Spark configuration set on the cluster.
Both clusters use the same instance profile.
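As a sanity check, here is a minimal sketch (assuming the same notebook is attached to the job cluster) that confirms the flag is actually visible at runtime before issuing the rename; the printed value is purely for debugging:
# Set the flag in the notebook session, then read it back to confirm it took effect
spark.conf.set("spark.databricks.delta.alterTable.rename.enabledOnAWS", "true")
print(spark.conf.get("spark.databricks.delta.alterTable.rename.enabledOnAWS", "not set"))
# Only then attempt the rename
spark.sql("ALTER TABLE db1.rz_test5 RENAME TO db1.rz_test6")
If the value prints as "true" on the job cluster and the rename still fails, the problem is likely not the notebook-level config itself.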
2 REPLIES
09-09-2024 11:44 PM
Hello,
It would be interesting to test with the same runtime version.
Does it work with a job running on 13.3 LTS?
09-10-2024 12:17 AM
No, 13.3 is not working either; it gives the same result.

