Serverless Compute
2 weeks ago
Hi Team,
We are using Azure Databricks serverless compute to execute workflows and notebooks. My question is:
Does serverless compute support Maven library installations?
I appreciate any insights or suggestions you might have. Thanks in advance for your help.
2 weeks ago
Support for Maven libraries:
- You can install libraries from a public Maven repository when working with serverless compute, for example by specifying Maven coordinates such as `{ "coordinates": "org.jsoup:jsoup:1.7.2" }`.
- Custom Maven repositories are also supported, provided they allow unauthenticated access. Libraries are resolved in the Databricks control plane, so the repository must be reachable from there.
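For a job task on serverless compute, Maven coordinates are typically passed through the `libraries` field of the task definition. A minimal sketch of that JSON (the `repo` URL is an illustrative assumption for a custom repository; omit it to resolve from the default Maven Central):

```json
{
  "libraries": [
    {
      "maven": {
        "coordinates": "org.jsoup:jsoup:1.7.2",
        "repo": "https://mycompany.example.com/maven2"
      }
    }
  ]
}
```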
Limitations:
- While Maven library installation is supported, serverless compute introduces certain constraints. For instance:
  - Classic Databricks features such as init scripts and cluster-scoped libraries are not supported; Databricks recommends using environments for dependency management instead.
  - Installed libraries may not persist across executions, because serverless compute environments are optimized for fast start-up and may be stateless.
  - Other features, such as Spark-level observability (e.g., the Spark UI or detailed driver/executor logs) and fine-grained control over Spark configurations, may also be limited.
Recommendations:
- Use environments to cache dependencies where possible, as this reduces the need to reinstall libraries with every execution.
- Specify Maven repositories clearly if using custom ones, ensuring they are accessible from the Databricks Control Plane.
- Test libraries in a similar environment before deploying them in serverless compute to ensure compatibility.
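As a sketch of the environments recommendation above, a serverless job environment can pin its dependencies declaratively; the fragment below shows the shape of such a spec (the library names and versions are illustrative assumptions, and this style of spec covers pip-style Python dependencies rather than Maven coordinates):

```yaml
environments:
  - environment_key: default
    spec:
      client: "1"
      dependencies:
        - "requests==2.31.0"
        - "pandas==2.1.4"
```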
Tuesday
Thanks for the response. When we went through the official documentation, it was mentioned that Serverless compute does not support JAR file installation. Please find below the reference.
https://docs.databricks.com/aws/en/compute/serverless/best-practices
Could you please suggest how to install the JDBC driver to write data to an external source? Thanks in advance.
Wednesday
So it appears that there is conflicting documentation on this topic. I checked our internal documentation, and what I found is that you CANNOT install JDBC or ODBC drivers on serverless compute. See the limitations here: https://docs.databricks.com/aws/en/compute/serverless/limitations
Apologies for the confusion.
Regards, Lou.
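Since custom JARs cannot be installed on serverless, writing over JDBC would rely on a driver that is already bundled in the runtime, which is an assumption you should verify for your target database. A minimal sketch of the standard Spark JDBC write path, with hypothetical host, table, and credential placeholders:

```python
# Hypothetical connection options; replace with your own values,
# and read the password from a Databricks secret rather than plain text.
options = {
    "url": "jdbc:postgresql://db.example.com:5432/sales",  # assumed target
    "dbtable": "public.orders",
    "user": "svc_writer",
    "password": "<from-secret-scope>",
    # The driver class must already be on the runtime classpath,
    # since serverless does not allow installing your own JAR.
    "driver": "org.postgresql.Driver",
}

# With an active Spark session and a DataFrame `df` to write:
# df.write.format("jdbc").options(**options).mode("append").save()
```

If no bundled driver exists for the target system, alternatives such as Lakehouse Federation or a classic cluster may be worth evaluating.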

