Hi @SachinJanani ,
1 - Databricks Serverless SQL does not support custom Spark extensions (advanced Spark configs, custom libraries, etc.). The serverless environment is fully managed and optimized by Databricks, which precludes adding custom extensions.
2 - Databricks Serverless SQL does not support a custom metastore that isn't Hive-compatible. Per the documentation, serverless SQL warehouses cannot be deployed if Hive metastore credentials are defined at the workspace level. However, AWS Glue can be used as the workspace's legacy metastore. https://docs.databricks.com/en/admin/sql/serverless.html
3 - It is not possible to apply custom Spark configurations in Databricks Serverless SQL. Because Databricks manages the serverless environment end to end, custom configurations are not supported, which helps guarantee stability and performance.
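If you do need custom Spark configs, the usual workaround is a classic (non-serverless) cluster, where they can be set in the cluster definition via the `spark_conf` field. A minimal sketch of a Clusters API request body, assuming illustrative values for the cluster name, runtime version, node type, and config keys:

```json
{
  "cluster_name": "classic-cluster-with-custom-conf",
  "spark_version": "14.3.x-scala2.12",
  "node_type_id": "i3.xlarge",
  "num_workers": 2,
  "spark_conf": {
    "spark.sql.shuffle.partitions": "200"
  }
}
```

This option only exists for classic compute; serverless SQL warehouses expose no equivalent field.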
Thanks!