Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

spark config not working in job cluster

thot
New Contributor II

I am trying to rename a Delta table like this:

spark.conf.set("spark.databricks.delta.alterTable.rename.enabledOnAWS", "true")
spark.sql("ALTER TABLE db1.rz_test5 RENAME TO db1.rz_test6")
 
The data is on AWS S3, which is why I have to set the Spark config in the notebook. This works fine on the interactive cluster.
But when I run it in a job using a job cluster, it fails with this error:
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table.
 
The runtime for the interactive cluster is 13.3 LTS, with no Spark configuration set on the cluster.
 
The runtime for the job cluster in the job is 14.3 LTS, also with no Spark configuration set on the cluster.
 
Both clusters use the same instance profile.
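One thing that may be worth checking (a hedged suggestion, not a confirmed fix): `spark.conf.set` in a notebook changes the running session, and some flags are only picked up reliably when set in the cluster's Spark config at startup. A job cluster accepts the same setting in the `spark_conf` block of its cluster spec. A minimal sketch of a Jobs API `new_cluster` definition, with the node type, runtime version, and worker count as placeholder values:

```json
{
  "new_cluster": {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 1,
    "spark_conf": {
      "spark.databricks.delta.alterTable.rename.enabledOnAWS": "true"
    }
  }
}
```

The same key/value pair can also be entered in the job cluster's "Spark config" field in the UI; that removes the dependency on the notebook call running before the `ALTER TABLE`.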
 
2 REPLIES

MaximeGendre
New Contributor III

Hello,
It would be interesting to test with the same runtime version.
Does it work with a job running on 13.3 LTS?

thot
New Contributor II

No, 13.3 doesn't work either; it fails with the same error.
