Hello @vishva-fivetran how are you?
You can set spark.driver.maxResultSize in the cluster's Spark config. Choose a value higher than the one reported in the exception message. For example, to set it to 17g, add the line `spark.driver.maxResultSize 17g` to your Spark config.
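For reference, the entry as it would appear in the cluster's Spark config box looks like this (17g is just the example value from above; pick a value suited to your workload and driver memory):

```
spark.driver.maxResultSize 17g
```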
After that, please save and restart your cluster.
Be aware that setting a high limit can lead to out-of-memory errors in the driver, since collected results must fit within spark.driver.memory along with the JVM overhead of the deserialized objects. It's therefore better to choose the smallest limit that resolves the error rather than an arbitrarily large one.
Best,
Alessandro