I'm trying to set the PYTHONPATH environment variable in the cluster configuration: `PYTHONPATH=/dbfs/user/blah`. But it is apparently being overridden in the driver and executor environments, and I don't see it.
`%sh echo $PYTHONPATH` outputs:
`PYTHONPATH=/databricks/spark/python:/databricks/spark/python/lib/py4j-0.10.9.5-src.zip:/databricks/jars/spark--driver--driver-spark_3.3_2.12_deploy.jar:/WSFS_NOTEBOOK_DIR:/databricks/spark/python:/databricks/python_shell`
and `import sys; print(sys.path)` gives:
```
['/databricks/python_shell/scripts', '/local_disk0/spark-c87ff3f0-1b67-4ec4-9054-079bba1860a1/userFiles-ea2f1344-51c6-4363-9112-a0dcdff663d0', '/databricks/spark/python', '/databricks/spark/python/lib/py4j-0.10.9.5-src.zip', '/databricks/jars/spark--driver--driver-spark_3.3_2.12_deploy.jar', '/databricks/python_shell', '/usr/lib/python39.zip', '/usr/lib/python3.9', '/usr/lib/python3.9/lib-dynload', '', '/local_disk0/.ephemeral_nfs/envs/pythonEnv-267a0576-e6bd-4505-b257-37a4560e4756/lib/python3.9/site-packages', '/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages', '/databricks/python/lib/python3.9/site-packages', '/usr/local/lib/python3.9/dist-packages', '/usr/lib/python3/dist-packages', '/databricks/python/lib/python3.9/site-packages/IPython/extensions', '/root/.ipython']
```
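For context, the only driver-side workaround I know is mutating `sys.path` in the notebook itself, roughly like the sketch below (the path is just my example), but that never reaches the executors:

```
import sys

# Driver-only workaround: prepend the DBFS FUSE path so notebook imports
# resolve. Executors are unaffected, which is exactly the problem.
module_root = "/dbfs/user/blah"  # assumption: my modules live under this folder
if module_root not in sys.path:
    sys.path.insert(0, module_root)
```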
If I work from Repos, the repo path (`/Workspace/Repos/user@domain.com/my_repo`) does get added everywhere, but then all my modules have to live directly in the repo, which is not convenient.
Please let me know if there's a workaround to set a `/dbfs/` path on all nodes without the ugly ***** UDF trick: ideally straight from a cluster init script, or best of all via a dynamic `spark.conf` property.
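For reference, this is the kind of thing I'm after. In plain Spark I would expect `spark.executorEnv.*` to cover the executor side at session build time, as in the sketch below (untested on Databricks, where the session already exists before any notebook code runs), with `addPyFile` as the runtime fallback I'd rather not use:

```
from pyspark.sql import SparkSession

# Sketch under plain-Spark assumptions: spark.executorEnv.* sets environment
# variables on executors, but only before they launch. On Databricks the
# equivalent would presumably be a line in the cluster's Spark config, e.g.
#   spark.executorEnv.PYTHONPATH /dbfs/user/blah
spark = (
    SparkSession.builder
    .config("spark.executorEnv.PYTHONPATH", "/dbfs/user/blah")
    .getOrCreate()
)

# Runtime fallback (standard PySpark API; the zip path is hypothetical):
# ship a zipped package to every executor and put it on their import path.
spark.sparkContext.addPyFile("/dbfs/user/blah/my_modules.zip")
```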