An init script won't work if you mean exporting the PYTHONPATH environment variable: the Databricks shell overwrites it when it starts the Python interpreter. One approach that works for us: if the code lives under /dbfs, do an editable install from the init script, e.g. pip insta...
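A minimal sketch of such an init script (the package directory and the pip path are assumptions; adjust them to where your setup.py/pyproject.toml actually lives under /dbfs):

```shell
#!/bin/bash
# Hypothetical cluster init script.
# An editable install writes a link to the source directory into
# site-packages, so the code under /dbfs becomes importable on every
# node without having to fight the PYTHONPATH that Databricks sets.
set -euo pipefail

PKG_DIR=/dbfs/my-team/my-package   # hypothetical: directory containing setup.py or pyproject.toml

# /databricks/python/bin/pip targets the cluster's Python environment;
# a plain `pip` may resolve to a different interpreter.
/databricks/python/bin/pip install --editable "$PKG_DIR"
```

Because the install is editable, re-uploading changed source files to /dbfs is picked up on the next import without re-running the script.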
Thanks @Ohad Raviv. I will try your approach. spark.executorEnv.PYTHONPATH works only for worker nodes, not the driver node. And it needs to be set at cluster initialization (under the Spark tab). After the cluster is initialized, Databricks overwrites it even...
For worker nodes, you can set a Spark config in the cluster settings: spark.executorEnv.PYTHONPATH. However, you need to make sure you append your Workspace path after the existing entries, because the worker nodes still need the other system Python paths. This seems like a hack to me. I hope da...
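A sketch of what that cluster Spark config line might look like (the system entry and the Workspace path below are illustrative; check `sys.path` on a worker of your own cluster and reproduce its entries before appending your path):

```
spark.executorEnv.PYTHONPATH /databricks/spark/python:/Workspace/Repos/me/my-repo
```

Because this sets the whole variable rather than extending it, leaving out the system entries breaks imports of Spark's own Python modules on the workers, which is why appending rather than replacing matters.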