06-08-2022 10:55 AM
I need to retrieve the job ID and run ID of a job from a JAR file in Scala.
When I try to compile the code below in IntelliJ, the following error is shown:
import com.databricks.dbutils_v1.DBUtilsHolder.dbutils

object MainSNL {
  @throws(classOf[Exception])
  def main(args: Array[String]): Unit = {
    dbutils.notebook.getContext.tags("jobId").toString()
    dbutils.notebook.getContext.tags("runId").toString()
  }
}
error: Symbol 'type com.databricks.backend.common.rpc.CommandContext' is missing from the classpath.
[ERROR] This symbol is required by 'method com.databricks.dbutils_v1.NotebookUtils.getContext'.
[ERROR] Make sure that type CommandContext is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[ERROR] A full rebuild may help if 'NotebookUtils.class' was compiled against an incompatible version of com.databricks.backend.common.rpc.
[ERROR] dbutils.notebook.getContext.tags("jobId").toString()
[ERROR] ^
[ERROR] one error found
06-30-2022 05:03 AM
This is resolved by passing {{job_id}} and {{run_id}} as parameters to the JAR.
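A minimal sketch of that approach: it assumes the job's JAR task parameters are configured as ["{{job_id}}", "{{run_id}}"], so Databricks substitutes the real values before main is invoked. The `parseIds` helper and the argument order are illustrative, not part of any Databricks API.

```scala
object MainSNL {
  // Hypothetical helper: turn the positional arguments into named values.
  // Assumes the task parameters were configured as ["{{job_id}}", "{{run_id}}"].
  def parseIds(args: Array[String]): (String, String) = (args(0), args(1))

  def main(args: Array[String]): Unit = {
    val (jobId, runId) = parseIds(args)
    println(s"jobId=$jobId runId=$runId")
  }
}
```

Since the IDs arrive as plain program arguments, this compiles without any Databricks dependency, which avoids the classpath error above entirely.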
06-09-2022 03:41 AM
Hi @Sundeep P, we have a similar discussion on this community thread. Please let us know if it helps.
06-09-2022 06:17 AM
Do you use databricks-connect with Intellij?
06-09-2022 07:34 AM
No
06-17-2022 05:09 AM
Hi @Kaniz Fatma, are there any alternatives to this? Please advise.
06-29-2022 11:22 AM
Another way to access this data is through the Spark context, which does not require the dbutils library.
If you check the Environment tab of the Spark cluster UI, you'll see that Databricks adds a number of properties to the Spark config that can be retrieved with the native Spark APIs.
An example value of spark.databricks.clusterUsageTags.clusterName is "job-12345678901-run-987654-default", and you can retrieve it using spark.conf.get().
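A small sketch of that idea. The property name comes from the cluster's Environment tab as described above; the regex that pulls the IDs out of the cluster name is an assumption based on the single example value "job-12345678901-run-987654-default", so treat it as illustrative rather than a guaranteed format.

```scala
object ClusterNameIds {
  // Assumed format, based on the observed example value:
  //   job-<jobId>-run-<runId>-<suffix>
  private val JobRun = """job-(\d+)-run-(\d+).*""".r

  // Extract (jobId, runId) from the cluster name, if it matches the pattern.
  def parse(clusterName: String): Option[(String, String)] = clusterName match {
    case JobRun(jobId, runId) => Some((jobId, runId))
    case _                    => None // e.g. an interactive (all-purpose) cluster
  }
}

// Usage on a Databricks job cluster, where `spark` is the active SparkSession:
// val name = spark.conf.get("spark.databricks.clusterUsageTags.clusterName")
// val ids  = ClusterNameIds.parse(name)
```

Returning an Option keeps the code safe on interactive clusters, where the name does not follow the job-run pattern.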
07-05-2022 03:08 AM
It may be worth going through the Task parameter variables section of this doc:
https://docs.databricks.com/data-engineering/jobs/jobs.html#task-parameter-variables
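For a JAR task, those variables go in the task's parameters list in the job configuration. A sketch of the relevant fragment (field names follow the Jobs API; the main class name is taken from the question and the rest is illustrative):

```json
{
  "spark_jar_task": {
    "main_class_name": "MainSNL",
    "parameters": ["{{job_id}}", "{{run_id}}"]
  }
}
```

At run time Databricks replaces {{job_id}} and {{run_id}} with the actual values, which arrive in main as args(0) and args(1).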