Databricks SQL as alternative to Spark thrift server

SachinJanani
New Contributor II

We are currently using Spark as our SQL engine with Thrift Server but are evaluating Databricks Serverless SQL as a potential alternative. We have a few specific questions:

  1. Does Databricks Serverless SQL support custom Spark extensions?
  2. Can we configure Databricks Serverless SQL to use a custom metastore, especially one that isn’t Hive-compatible?
  3. Is it possible to apply custom Spark configurations in Databricks Serverless SQL?

Any insights into these questions would be greatly appreciated!
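For context on the migration path: a common serverless replacement for the Thrift Server JDBC route is the Databricks SQL Statement Execution REST API (`POST /api/2.0/sql/statements/`). A minimal sketch using only the standard library, with placeholder hostname, token, and warehouse ID (the request is built but not sent, since sending requires real credentials):

```python
import json
import urllib.request

def build_statement_request(host, token, warehouse_id, statement):
    """Build an (unsent) request that submits a SQL statement to a SQL warehouse
    via the Statement Execution API. All identifiers below are placeholders."""
    body = json.dumps({
        "warehouse_id": warehouse_id,
        "statement": statement,
        "wait_timeout": "30s",  # wait up to 30s for the result inline
    }).encode()
    return urllib.request.Request(
        url=f"https://{host}/api/2.0/sql/statements/",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_statement_request(
    "dbc-example.cloud.databricks.com",  # placeholder workspace host
    "dapi-placeholder",                  # placeholder personal access token
    "warehouse-id",                      # placeholder warehouse ID
    "SELECT 1",
)
```

The official `databricks-sql-connector` Python package offers the same capability with a DB-API-style interface, if a higher-level client is preferred.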

1 ACCEPTED SOLUTION

NandiniN
Databricks Employee

Hi @SachinJanani ,

1 - Databricks Serverless SQL does not support custom Spark extensions (e.g., advanced Spark configs or custom extension libraries). The serverless environment is highly optimized and fully managed by Databricks, which leaves no supported way to load custom extensions.
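For comparison, on a classic (non-serverless) cluster an extension is registered through the cluster's Spark config; serverless SQL warehouses expose no equivalent setting. The class name below is a hypothetical placeholder:

```
spark.sql.extensions com.example.MySparkSessionExtension
```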

2 - Databricks Serverless SQL does not support the use of a custom metastore that isn’t Hive-compatible. According to the documentation, serverless SQL warehouses cannot be deployed if Hive metastore credentials are defined at the workspace level. However, AWS Glue can be used as the workspace legacy metastore. https://docs.databricks.com/en/admin/sql/serverless.html
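As a point of contrast, on a classic cluster Glue can be enabled per-cluster through Spark config (a knob that serverless warehouses do not expose); on serverless, Glue is instead configured at the workspace level as the legacy metastore:

```
spark.databricks.hive.metastore.glueCatalog.enabled true
```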

3 - It is not possible to apply custom Spark configurations in Databricks Serverless SQL. Because the serverless environment is fully managed by Databricks, arbitrary configurations are disallowed to ensure stability and performance.
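One caveat worth noting: SQL warehouses do accept a small, fixed list of SQL configuration parameters through `SET` statements (not arbitrary Spark configs). For example:

```
SET TIMEZONE = 'America/Los_Angeles';
SET ANSI_MODE = true;
```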

Thanks!
