Can Databricks serverless compute install Scala packages?
I need to use the spark-sftp package, but serverless compute seems to behave differently from all-purpose compute: as far as I can tell, I can only install Python packages on it. Is that correct, or is there a way to attach a Scala/Maven library?
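For context, this is roughly what I want to run (a sketch of typical spark-sftp usage on all-purpose compute with the com.springml:spark-sftp Maven library installed; the host, credentials, and file path below are placeholders):

```python
# Sketch only: host, credentials, and remote path are placeholders.
df = (
    spark.read.format("com.springml.spark.sftp")
    .option("host", "sftp.example.com")    # placeholder SFTP host
    .option("username", "my_user")         # placeholder credentials
    .option("password", "my_password")
    .option("fileType", "csv")
    .option("inferSchema", "true")
    .load("/remote/path/data.csv")         # placeholder remote file
)
df.display()
```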
A related question: on all-purpose compute I can download SFTP files into DBFS with the paramiko package. Why do I get an error saying the folder cannot be found when I use serverless compute to write the same SFTP files to an external location (S3)?
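Here is a minimal sketch of the paramiko approach (host, credentials, and the destination paths are placeholders; the Unity Catalog volume path is my assumption for how the S3 external location is exposed):

```python
import paramiko

# Placeholder connection details.
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="my_user", password="my_password")
sftp = paramiko.SFTPClient.from_transport(transport)

# Works on all-purpose compute: download to a DBFS FUSE path.
# sftp.get("/remote/path/data.csv", "/dbfs/tmp/data.csv")

# Fails on serverless with a "folder cannot be found" error when
# targeting the S3-backed location (placeholder volume path).
sftp.get(
    "/remote/path/data.csv",
    "/Volumes/my_catalog/my_schema/my_volume/data.csv",
)

sftp.close()
transport.close()
```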