I've created a workflow job (let's say job A) and set up a job cluster configuration for it. Now I want to create another workflow job (job B) that uses almost the same job cluster settings. I can see the cluster settings in JSON (for both jobs) but I can't ed...
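For reference, in the Jobs API 2.1 JSON the shared cluster settings live under a top-level job_clusters array, so one option is to copy that block from job A's JSON into job B's definition. A minimal sketch (the node type and worker count here are illustrative):

  "job_clusters": [
    {
      "job_cluster_key": "My_job_cluster",
      "new_cluster": {
        "spark_version": "10.4.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2
      }
    }
  ]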
I have a workflow that runs on a job cluster and contains a task that requires the prophet library from PyPI:

  {
    "task_key": "my_task",
    "depends_on": [
      {
        "task_key": "<...>...
Using the Terraform exporter I got a .tf file for the job, but the only mention of the cluster is job_cluster_key = "My_job_cluster", and there are no settings for this specific cluster. If I export clusters with the Terraform exporter I can get only all-purpose but not...
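For comparison, the Databricks Terraform provider defines job cluster settings inline on the databricks_job resource itself, so this is roughly what the exporter would need to emit. A hedged sketch (the resource name, node type, and worker count are illustrative):

  resource "databricks_job" "job_a" {
    name = "job A"

    job_cluster {
      job_cluster_key = "My_job_cluster"
      new_cluster {
        spark_version = "10.4.x-scala2.12"
        node_type_id  = "Standard_DS3_v2"
        num_workers   = 2
      }
    }
  }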
Interesting approach, thank you. But when cloning this job compute, it is cloned as an all-purpose compute, and what I need is to clone it as another job compute.
Hi @Vartika Nain, I finally used a different solution: Databricks Container Services, running the job cluster from my custom image with the library preinstalled.
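For anyone taking the same route: with Databricks Container Services enabled on the workspace, the custom image is referenced from the job cluster spec via a docker_image block. A minimal sketch (the registry URL is a placeholder):

  "new_cluster": {
    "spark_version": "10.4.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2,
    "docker_image": {
      "url": "<registry>/<repository>/my-custom-image:latest"
    }
  }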
Hi! I am facing a similar issue. I tried to use this one:

FROM databricksruntime/standard:10.4-LTS
ENV DEBIAN_FRONTEND=noninteractive
RUN apt update && apt install -y maven && rm -rf /var/lib/apt/lists/*
RUN /databricks/python3/bin/pip install datab...
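In case it helps, here is a minimal untruncated variant of that Dockerfile, assuming the goal is to preinstall prophet (the library from the original question) into the Databricks-managed Python; the build-essential package is an assumption, only needed if prophet has to compile native extensions:

# Sketch: custom Databricks Container Services image with prophet preinstalled
FROM databricksruntime/standard:10.4-LTS
ENV DEBIAN_FRONTEND=noninteractive
# build tools (an assumption) in case prophet's dependencies need compiling
RUN apt-get update && apt-get install -y --no-install-recommends build-essential \
  && rm -rf /var/lib/apt/lists/*
# install into the runtime's Python so jobs and notebooks see the package
RUN /databricks/python3/bin/pip install --no-cache-dir prophet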