@Sulfikkar Basheer Shylaja Is your cluster using a custom Docker image? The problem here is that the local filesystem path (file:/) for init scripts is only supported if you use a custom Docker image. On regular DBR runtimes, the init script can only be pulled from DBFS.
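If you go the DBFS route, the cluster definition just needs to point at the dbfs:/ path. A minimal sketch with the pulumi-databricks Python provider, assuming its Cluster resource and an init_scripts argument with a dbfs destination (argument names and shapes may vary by provider version; the runtime, node type, and script path are placeholders):

import pulumi_databricks as databricks

cluster = databricks.Cluster(
    "my-cluster",
    spark_version="11.3.x-scala2.12",  # placeholder runtime
    node_type_id="i3.xlarge",          # placeholder node type
    num_workers=2,
    # Point the init script at the DBFS path the script was uploaded to.
    init_scripts=[{"dbfs": {"destination": "dbfs:/databricks/init-scripts/<your-script>.sh"}}],
)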
Hi @Sulfikkar Basheer Shylaja , why don't you store the init script on DBFS and just pass the dbfs:/ path of the init script in Pulumi? You could just run this code in a notebook:
%python
dbutils.fs.put("/databricks/init-scripts/set-private-pip-repos...
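That snippet is cut off above; a complete sketch of what the cell could look like (the full script name and the private index URL are assumptions for illustration, not from the original answer):

%python
dbutils.fs.put(
    "/databricks/init-scripts/set-private-pip-repos.sh",  # assumed full name
    """#!/bin/bash
# Point pip at a private package index (placeholder URL).
cat > /etc/pip.conf <<EOF
[global]
index-url = https://<your-private-repo>/simple
EOF
""",
    True,  # overwrite if the script already exists
)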
Hi @data engineer , can you elaborate on this issue? If you are running a Python UDF on a shared access mode cluster, it might not work, since shared access mode restricts Python UDFs on some DBR versions. Can you try this on a personal compute cluster / single-user access mode cluster?
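A minimal Python UDF to test with (a hypothetical repro, not from the original thread); if it fails on the shared access mode cluster but runs on a single-user cluster, the access mode is the likely cause:

%python
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

# Trivial UDF that upper-cases a string column.
upper_udf = F.udf(lambda s: s.upper() if s else s, StringType())

df = spark.createDataFrame([("alice",), ("bob",)], ["name"])
df.select(upper_udf(F.col("name")).alias("upper_name")).show()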
Hi @Lawrence Chen , verify that multiple repos are not working off the same branch in Databricks, and make sure no changes were made to these notebooks from outside Databricks.
Hi @Akihiko Nagata , have you checked the Jobs API? You can run a job on the existing cluster that uses the notebook in question; I believe this is the only way. https://docs.databricks.com/dev-tools/api/latest/jobs.html#operation/JobsRunsSubmit
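A hedged sketch of a one-off runs/submit call against an existing cluster (workspace URL, token, cluster ID, and notebook path are placeholders; the payload follows the Jobs 2.1 runs/submit shape from the doc linked above):

import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                       # placeholder

payload = {
    "run_name": "one-off-notebook-run",
    "tasks": [
        {
            "task_key": "run_notebook",
            "existing_cluster_id": "<existing-cluster-id>",
            # Workspace path of the notebook you want to run.
            "notebook_task": {"notebook_path": "/Users/<you>/my-notebook"},
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/runs/submit",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # contains a run_id you can poll with /api/2.1/jobs/runs/get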