- 1069 Views
- 3 replies
- 1 kudos
I'm using a Python Wheel task in a Databricks job with WHEEL dependencies. However, the cluster installed the dependencies as JARs instead of wheels. Is this expected behavior or a bug?
Latest Reply
There you can find a complete template project with a Python wheel task and Databricks Asset Bundles; please follow the deployment instructions: https://github.com/andre-salvati/databricks-template
2 More Replies
- 631 Views
- 1 reply
- 0 kudos
I have created a job which runs a JAR file, but I get this error: NoClassDefFoundError: com/google/cloud/hadoop/gcsio/GoogleCloudStorageFileSystemOptions$TimestampUpdatePredicate. Caused by: ClassNotFoundException: com.google.cloud.hadoop.gcsio.GoogleC...
Latest Reply
Hey Aubert, it seems you are missing a dependent class in your JAR. Either package the dependent classes into the JAR, or add them to the classpath.
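Concretely, the missing class here ships in the GCS connector. A hedged sketch of both options (class, artifact, and file names below are illustrative, not taken from the thread):

```shell
# Option 1: build a fat/uber JAR so dependent classes travel inside your
# application JAR (e.g. with sbt-assembly or the Maven Shade plugin):
sbt assembly

# Option 2: attach the dependent JAR explicitly at submit time;
# the connector file name/version is a placeholder.
spark-submit \
  --class com.example.Main \
  --jars gcs-connector-hadoop3-<version>-shaded.jar \
  my-app.jar
```

On Databricks, attaching the connector JAR to the cluster as a library achieves the same effect as the --jars flag.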
- 13901 Views
- 2 replies
- 4 kudos
We have a Databricks job running with a main class and a JAR file in it. Our JAR code base is in Scala. When our job starts running, we need to log the Job ID and Run ID into the database for future reference. How can we achieve this?
Latest Reply
Here is a blog with code and examples on how to achieve this: https://medium.com/@canadiandataguy/how-to-get-the-job-id-and-run-id-for-a-databricks-job-b0da484e66f5
1 More Replies
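One hedged sketch of an approach for JAR tasks: Databricks job task parameters support dynamic value references such as {{job.id}} and {{job.run_id}} (older syntax: {{job_id}}, {{run_id}}), which are resolved at run time and passed into the JAR's main method, where the Scala code can write them to the database. Task key and class name below are placeholders:

```json
{
  "tasks": [
    {
      "task_key": "main",
      "spark_jar_task": {
        "main_class_name": "com.example.Main",
        "parameters": ["{{job.id}}", "{{job.run_id}}"]
      }
    }
  ]
}
```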
by Chanu • New Contributor II
- 990 Views
- 2 replies
- 2 kudos
Hi, I would like to understand Databricks JAR-based workflow tasks. Can I interpret JAR-based runs as something like a spark-submit on a cluster? In the logs, I was expecting to see the spark-submit --class com.xyz --num-executors 4 etc. And, the...
Latest Reply
Hi, I tried using Workflows > Jobs > Create Task > JAR task type, uploaded my JAR and class, created a job cluster, and tested the task. This JAR reads some tables as input, does some transformations, and writes some other tables as output. I would like t...
1 More Replies
- 2878 Views
- 3 replies
- 1 kudos
I have a JAR I want to be installed as a library on all clusters. I have tried both 'wget /databricks/jars/ some_repo' and 'cp /dbfs/FileStore/jars/name_of_jar.jar /databricks/jars/' at cluster start-up, but the JAR is not installed as a library. I am aware th...
Latest Reply
Found a solution: echo /databricks/databricks-hive /databricks/jars /databricks/glue | xargs -n 1 cp /dbfs/FileStore/jars/NAME_OF_THE_JAR.jar. I had to first add the JAR as a library through the GUI via Create -> Library, then upload the downloaded JAR. ...
2 More Replies
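The one-liner in that reply can also be captured as a cluster-scoped init script, so the copy happens automatically at every start-up. The paths and JAR name mirror the reply; treat this as a sketch, not an officially supported mechanism:

```shell
#!/bin/bash
# Init script sketch: copy a JAR previously uploaded to DBFS into the
# directories Databricks puts on the default classpath at start-up.
SRC=/dbfs/FileStore/jars/NAME_OF_THE_JAR.jar
for dir in /databricks/databricks-hive /databricks/jars /databricks/glue; do
  cp "$SRC" "$dir/"
done
```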
- 23933 Views
- 4 replies
- 0 kudos
I am trying to create a JAR for an Azure Databricks job, but some code that works when using the notebook interface does not work when calling the library through a job. The weird part is that the job completes the first run successfully, but on an...
Latest Reply
I am facing a similar issue when trying to use the from_utc_timestamp function. I am able to call the function from a Databricks notebook, but when I use the same function inside my Java JAR and run it as a job in Databricks, it gives the error below. Analys...
3 More Replies
- 1434 Views
- 2 replies
- 1 kudos
We tried moving our Scala script from a standalone cluster to the Databricks platform. Our script is compatible with Spark 2.4.8 and Scala 2.11.12, while the Databricks cluster has Spark 3.2.1 and Scala 2.12. 1: we ...
Latest Reply
Hi @Monika Samant, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Than...
1 More Replies
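Moving from Spark 2.4.8 / Scala 2.11 to a Databricks runtime on Spark 3.2.1 / Scala 2.12 generally requires recompiling the JAR against the new versions, since Scala minor versions are not binary compatible. A minimal sketch (version numbers mirror the thread; everything else is illustrative):

```scala
// build.sbt fragment: target the cluster's Scala/Spark versions and mark
// Spark as Provided, since the Databricks runtime supplies it at run time.
scalaVersion := "2.12.14"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.1" % Provided
```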
- 3765 Views
- 8 replies
- 14 kudos
Hello, we have some Scala code which is compiled and published to an Azure DevOps Artifacts feed. The issue is we're now trying to add this JAR to a Databricks job (through Terraform) to automate the creation. To do this I'm trying to authenticate using...
Latest Reply
As of right now, Databricks can't use non-public Maven repositories, as resolution of the Maven coordinates happens in the control plane. That's different from the R and Python libraries. As a workaround, you may try to install libraries via an init script or ...
7 More Replies
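A hedged sketch of that init-script workaround, assuming an Azure DevOps Artifacts Maven feed and a personal access token stored in an environment variable (organization, feed, artifact coordinates, and variable name are all placeholders):

```shell
#!/bin/bash
# Download a JAR from a private feed at cluster start, since Maven
# coordinate resolution for cluster libraries cannot reach private repos.
FEED_URL="https://pkgs.dev.azure.com/MY_ORG/_packaging/MY_FEED/maven/v1"
ARTIFACT_PATH="com/example/mylib/1.0.0/mylib-1.0.0.jar"
curl -sSf -u "build:${AZDO_PAT}" \
     -o "/databricks/jars/mylib-1.0.0.jar" \
     "${FEED_URL}/${ARTIFACT_PATH}"
```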
- 1700 Views
- 2 replies
- 3 kudos
I mounted the ADLS to my Azure Databricks resource, and I keep getting this error when I try to install a JAR from a container: Library installation attempted on the driver node of cluster 0331-121709-buk0nvsq and failed. Please refer to the followi...
Latest Reply
Hi @Oussama KIASSI, the error message says: "Failure to initialize configuration: Invalid configuration value detected for fs.azure.account.key". You can't use the storage account access key to access data using the abfss protocol. You need to provide ...
1 More Replies
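For reference, a sketch of the service-principal (OAuth) Spark configuration for abfss access; <storage-account>, <application-id>, <secret>, and <tenant-id> are placeholders to fill in (the secret is best kept in a secret scope rather than in plain text):

```
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net <secret>
fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token
```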
by lily1 • New Contributor III
- 2347 Views
- 3 replies
- 2 kudos
When I execute a function in the google-cloud-bigquery:2.7.0 JAR, it calls a function in the gax:2.12.2 JAR, which in turn calls a function in the Guava JAR. This Guava JAR is a Databricks default library located at /databrick...
Latest Reply
Hey there @Lily Kim, hope you are doing well! Thank you for posting your question. We are happy that you were able to find the solution. Would you please mark the answer as best? We'd love to hear from you.
2 More Replies
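When an application needs a different Guava than the one bundled under /databricks/jars, one common approach (a sketch, assuming an sbt-assembly build; the shaded package prefix is a placeholder) is to shade/relocate the application's copy so the two can coexist:

```scala
// build.sbt fragment (sbt-assembly): rename Guava's packages inside the
// fat JAR so they no longer collide with the cluster's default Guava.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.common.**" -> "shaded.com.google.common.@1").inAll
)
```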
- 2177 Views
- 1 reply
- 1 kudos
How do I identify the JAR used to load a particular class? I am sure I packed the classes correctly in my application JAR; however, it looks like the class is loaded from a different JAR. I want to understand the details so that I can ensure to use the r...
Latest Reply
Adding the configurations below at the cluster level makes the JVM print class-loading logs, which show the JAR each class is loaded from:
spark.executor.extraJavaOptions=-verbose:class
spark.driver.extraJavaOptions=-verbose:class
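A complementary offline check, sketched as a small shell helper (function name and example paths are illustrative): since a JAR is just a ZIP archive, list each archive's entries and report the JAR(s) that contain the class file in question.

```shell
# Print every JAR under a directory that contains the given class file path.
find_class_jar() {
  local dir="$1" class="$2" jar
  for jar in "$dir"/*.jar; do
    # A JAR is a ZIP archive; list its entries and look for the class path.
    if python3 -m zipfile -l "$jar" 2>/dev/null | grep -q "$class"; then
      echo "$jar"
    fi
  done
}

# Example (paths are assumptions about a Databricks driver node):
# find_class_jar /databricks/jars com/google/common/collect/ImmutableList.class
```

Running this over /databricks/jars and over your own application JAR quickly shows whether the same class ships in both places.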