Hello @HASSAN_UPPAL123,
The class name is correct. For the jar, please try downloading the latest version from here.
This may also be a classpath issue where the jar is not exported correctly in your client setup; I have seen similar issues and suggested solutions ...
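As a quick sanity check for the classpath theory, a small Java program can try to load the driver class directly. This is only a sketch: the class name below is a placeholder, so substitute the exact driver class from your own connection setup.

```java
// Minimal sketch: check whether a JDBC driver class can be loaded from the
// current classpath. The driver class name used in main() is an assumed
// placeholder -- replace it with the one from your setup.
public class DriverCheck {

    // Returns true if the named class is visible on the current classpath.
    public static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Hypothetical driver class name, for illustration only.
        String driver = "com.databricks.client.jdbc.Driver";
        System.out.println(driver + " on classpath: " + isOnClasspath(driver));
    }
}
```

If this prints `false` even though the jar is supposedly configured, the jar is not actually being exported to the client process, which would explain the error.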
Hello @arpitaj111 ,
I think a good starting point would be to check the resources section on the Databricks website (https://www.databricks.com/resources).
I recommend starting with the ebook "Big Book of Data Engineering: 2nd Edition" (https://www.databr...
Hello @sensanjoy ,
Based on this stack trace, the cluster used in this case appears to be a shared access mode cluster with Unity Catalog enabled.
Shared access mode clusters do not allow such connections.
This use case can be addressed by:
Using a sin...
Hello @runninsavvy ,
The following code sample can be used in such a case:
val argArray = Array(1, 2, 3)
// Join the array into a comma-separated string and bind it as a named parameter
val argMap = Map("param" -> argArray.mkString(","))
// Named parameter markers such as :param require Spark 3.4 or later
spark.sql("SELECT 1 IN (SELECT explode(split(:param, ',')))", argMap).show()
Hello @brendanc19,
Currently, the only available integration for this case is using serverless or pro SQL warehouses; the current job clusters can't be used.
Regards