Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Can the default Serverless compute in Databricks install Scala packages?

liu
New Contributor III


I need to use the spark-sftp package, but it seems that serverless compute differs from all-purpose compute, and I can only install Python packages?
There is another question. On all-purpose compute I can put SFTP files into DBFS through the paramiko package. Why do I get an error saying the folder cannot be found when using serverless to put SFTP files into external data (S3)?
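For reference, a minimal Python sketch of the paramiko approach described above. The host, credentials, and paths are placeholders (paramiko itself would need to be installed, e.g. via %pip install paramiko); the destination here is assumed to be a Unity Catalog volume path, since serverless compute generally cannot write to the DBFS root the way all-purpose clusters can.

```python
# Sketch: download a file over SFTP with paramiko, then land it under a
# Unity Catalog volume path. All names below are placeholders, not a
# definitive implementation.
import posixpath


def volume_destination(volume_root: str, remote_path: str) -> str:
    """Map a remote SFTP path to a destination file under a volume root."""
    return posixpath.join(volume_root, posixpath.basename(remote_path))


def fetch_sftp_file(host: str, user: str, password: str,
                    remote_path: str, local_path: str) -> None:
    """Download remote_path from an SFTP server to local_path."""
    import paramiko  # imported lazily; install with %pip install paramiko

    transport = paramiko.Transport((host, 22))
    try:
        transport.connect(username=user, password=password)
        sftp = paramiko.SFTPClient.from_transport(transport)
        try:
            sftp.get(remote_path, local_path)
        finally:
            sftp.close()
    finally:
        transport.close()


# Example (placeholder values):
# dest = volume_destination("/Volumes/main/default/landing", "/incoming/data.csv")
# fetch_sftp_file("sftp.example.com", "user", "password", "/incoming/data.csv", dest)
```

Writing to a volume path rather than a `dbfs:/` or `s3://` URI is one way to sidestep the "folder cannot be found" error on serverless, since volumes are mounted as regular POSIX paths there.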

2 REPLIES

-werners-
Esteemed Contributor III

No Scala; you can't even run Scala notebooks on serverless.

About the SFTP: serverless compute is far more limited than general-purpose clusters.
Which folder can't be found: DBFS or S3?


liu
New Contributor III

@-werners- 
Thank you for your reply.

Both DBFS and S3.

Although I found a new method that meets my needs, unfortunately there are still some issues.