Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Databricks SQL as alternative to Spark thrift server

SachinJanani
New Contributor II

We currently use Spark Thrift Server as our SQL engine and are evaluating Databricks Serverless SQL as a potential alternative. We have a few specific questions:

  1. Does Databricks Serverless SQL support custom Spark extensions?
  2. Can we configure Databricks Serverless SQL to use a custom metastore, especially one that isn't Hive-compatible?
  3. Is it possible to apply custom Spark configurations in Databricks Serverless SQL?

Any insights into these questions would be greatly appreciated!

1 ACCEPTED SOLUTION

Accepted Solutions

NandiniN
Databricks Employee

Hi @SachinJanani ,

1 - Databricks Serverless SQL does not support custom Spark extensions (advanced Spark configs, custom libraries, etc.). The serverless environment is fully managed and optimized by Databricks, which precludes adding custom extensions.

2 - Databricks Serverless SQL does not support the use of a custom metastore that isn't Hive-compatible. According to the documentation, serverless SQL warehouses cannot be deployed if Hive metastore credentials are defined at the workspace level. However, AWS Glue can be used as the workspace legacy metastore. https://docs.databricks.com/en/admin/sql/serverless.html

3 - It is not possible to apply custom Spark configurations in Databricks Serverless SQL. The serverless environment is managed by Databricks, and custom configurations are not supported to ensure stability and performance.
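Since custom Spark configs are out of scope, migrating from Thrift Server mostly means swapping the connection string. As a minimal sketch, the helper below builds a Databricks SQL warehouse JDBC URL following the format in the Databricks JDBC driver docs (the hostname and warehouse ID are placeholder values; verify the exact property names against your driver version):

```python
def warehouse_jdbc_url(host: str, warehouse_id: str) -> str:
    """Build a JDBC URL for a Databricks SQL warehouse.

    Uses personal-access-token auth (AuthMech=3, UID=token).
    The PWD placeholder must be replaced with a real token.
    """
    return (
        f"jdbc:databricks://{host}:443/default;"
        "transportMode=http;ssl=1;AuthMech=3;"
        f"httpPath=/sql/1.0/warehouses/{warehouse_id};"
        "UID=token;PWD=<personal-access-token>"
    )


# Example with placeholder workspace values:
url = warehouse_jdbc_url(
    "adb-1234567890123456.7.azuredatabricks.net", "abc123def456"
)
print(url)

# From Python, the databricks-sql-connector package offers the same
# connectivity without JDBC (connection shown as comments since it
# needs a live workspace):
# from databricks import sql
# with sql.connect(
#     server_hostname="adb-1234567890123456.7.azuredatabricks.net",
#     http_path="/sql/1.0/warehouses/abc123def456",
#     access_token="<personal-access-token>",
# ) as conn:
#     with conn.cursor() as cur:
#         cur.execute("SELECT 1")
```

Existing Thrift-Server clients that already speak JDBC/ODBC typically only need this URL change plus token auth, rather than code changes.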

Thanks!


