@ashutosh0710 you can inspect the libraries available on a cluster at /databricks/jars to see which jars and versions are present, or simply inspect the Classpath list on the Spark UI's Environment tab.
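As a minimal sketch of the first option (here a temp directory with fake jar names stands in for /databricks/jars so the snippet runs anywhere; on an actual cluster you would point it at that path):

```python
import os
import tempfile

# Stand-in for /databricks/jars so this runs outside Databricks;
# the jar names below are illustrative, not the real cluster contents.
jars_dir = tempfile.mkdtemp()
for name in ("spark-core_2.12-3.3.0.jar", "scala-library-2.12.15.jar"):
    open(os.path.join(jars_dir, name), "w").close()

# jars_dir = "/databricks/jars"  # on an actual cluster
jars = sorted(f for f in os.listdir(jars_dir) if f.endswith(".jar"))
print(jars)
```

Each jar filename usually embeds its version, so a quick listing like this is often enough to answer "which version of X is on the cluster?".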
@Brad never mind my previous comment; after further research, I can tell this feature has been in Databricks for years, using https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectsV2.html, so the listing will not be the problem in your case. I...
@Dp15
My apologies for the confusion; I've edited my previous comment to correct a wrong statement[1] and make it clear:
[1] So, you can run code inside your UDF, but you can't initialize a SparkSession there or reuse an existing one.
Executing "spark.sql" statem...
Hi @Dp15
The error stack trace doesn't point to a problem with the SQL statement itself, but to the attempt to use the SparkSession from inside the UDF; this is not permitted. Furthermore, the UDF runs in a separate e...
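The restriction above can be illustrated with a plain-Python sketch (names are illustrative and no Spark is required here): instead of calling spark.sql() inside the UDF, which fails because the SparkSession only exists on the driver, run the query on the driver first and close over plain data in the UDF.

```python
# Anti-pattern (would fail on executors -- no SparkSession there):
#   @udf("int")
#   def lookup(key):
#       return spark.sql(f"SELECT v FROM t WHERE k = '{key}'").first()[0]

# Works: materialize the lookup on the driver first, e.g.
#   lookup_map = {r.k: r.v for r in spark.sql("SELECT k, v FROM t").collect()}
lookup_map = {"a": 1, "b": 2}  # stand-in for the collected query result

def lookup(key):
    # Plain function closing over driver-side data; this is what you
    # would then wrap with udf() and apply to a column.
    return lookup_map.get(key)

print(lookup("a"))  # -> 1
```

For larger lookup tables, the same pattern applies with a broadcast variable, or better, a join instead of a UDF.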
Hi @bhanuteja_1
scala.Product is a core class in the Scala standard library, used by tuples, case classes, etc. There seems to be a classloading problem or, more likely, a jar conflict. Are you deploying a job using custom jars, uber jars, and having de...
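One quick way to check for that kind of conflict is to scan the classpath for multiple versions of the same artifact. A minimal sketch, using an illustrative hard-coded jar list in place of a real classpath listing:

```python
import re
from collections import defaultdict

# Illustrative jar names only -- in practice you would build this list
# from /databricks/jars plus the jars bundled in your deployment.
jars = [
    "scala-library-2.12.15.jar",  # e.g. cluster-provided
    "scala-library-2.13.8.jar",   # e.g. pulled in by an uber jar
    "spark-core_2.12-3.3.0.jar",
]

# Group jar versions by artifact name (pattern: <artifact>-<version>.jar).
versions = defaultdict(set)
for jar in jars:
    m = re.match(r"(.+?)-(\d[\d.]*)\.jar$", jar)
    if m:
        versions[m.group(1)].add(m.group(2))

conflicts = {artifact: sorted(v) for artifact, v in versions.items() if len(v) > 1}
print(conflicts)  # -> {'scala-library': ['2.12.15', '2.13.8']}
```

Two scala-library versions on the same classpath, as in this example, is a classic cause of errors around scala.Product and friends; shading or excluding the conflicting dependency is the usual fix.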