Using init scripts on UC-enabled shared access mode clusters
06-13-2023 11:29 AM
I know that UC-enabled shared access mode clusters do not allow init scripts. I have tried multiple workarounds to get the effect of the init script my cluster needs (pyodbc-install.sh, in my case), including installing the pyodbc package as a workspace library and attaching that library to the cluster, and using magic commands to install pyodbc directly in notebooks, but both workarounds produced errors. Is there any other way to use init scripts on a UC-enabled shared access mode cluster?
I have attached the error I receive when I try to install the pyodbc library. I am using an admin account as well.
Library installation attempted on the driver node of cluster 0516-171623-3i9on4it and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: org.apache.spark.SparkException: Process List(/bin/su, libraries, -c, bash /local_disk0/.ephemeral_nfs/cluster_libraries/python/python_start_clusterwide.sh /local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/pip install 'pyodbc 4.0.39' --disable-pip-version-check) exited with code 1. WARNING: The directory '/home/libraries/.cache/pip' or its parent directory is not owned or is not writable by the current user. The cache has been disabled. Check the permissions and owner of that directory. If executing pip with sudo, you should use sudo's -H flag.
ERROR: Invalid requirement: 'pyodbc 4.0.39'
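For what it's worth, the log itself points at the immediate cause: pip was invoked with 'pyodbc 4.0.39' (a space between the package name and the version), which pip rejects as an invalid requirement. The standard pin syntax uses ==. A minimal sketch of the corrected specifier, using the version from the log:

```bash
# Correct pip requirement specifier: name==version, with no space.
pip install 'pyodbc==4.0.39'

# Equivalent notebook-scoped install via a magic command:
# %pip install pyodbc==4.0.39
```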
06-30-2023 11:50 AM
@ah0896 It worked when using a workspace-level shell script file and pointing the cluster to its path with the Workspace source type.
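For anyone reproducing this, here is a minimal sketch of what such a pyodbc-install.sh stored as a workspace file might contain, assuming the goal is the Microsoft ODBC driver plus the pyodbc package (the package names and the Ubuntu release in the repository URL are assumptions; match them to your Databricks Runtime's base image):

```bash
#!/bin/bash
# Sketch of an init script that installs the Microsoft ODBC driver and
# pyodbc cluster-wide. Assumes an Ubuntu 22.04 base image; adjust the
# prod.list URL to match your runtime.
set -euo pipefail

curl -fsSL https://packages.microsoft.com/keys/microsoft.asc \
  > /etc/apt/trusted.gpg.d/microsoft.asc
curl -fsSL https://packages.microsoft.com/config/ubuntu/22.04/prod.list \
  > /etc/apt/sources.list.d/mssql-release.list

apt-get update
ACCEPT_EULA=Y apt-get install -y msodbcsql18 unixodbc-dev

# Install into the cluster-wide Python environment.
/databricks/python/bin/pip install pyodbc
```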
07-01-2023 01:23 PM
@Anonymous @Retired_mod Can anyone from Databricks please confirm the above? There seems to be some conflicting information about support for custom init scripts on shared access mode clusters with Unity Catalog enabled.
07-06-2023 01:41 AM
I'm very interested as well! If it is not possible, then I am very curious what the best practice is for installing cluster-wide custom libraries on shared access mode clusters with Unity Catalog.
03-02-2025 08:41 PM
Hello all,
The workaround below worked for me:
1) pyodbc-install.sh is uploaded to a Volume.
2) The shared cluster is able to navigate to the Volume to select the init script.
3) The Databricks Runtime is 15.4 LTS.
4) The allowlist has been updated to allow the init script (see the sketch after this list).
5) The cluster spins up with no problem.
6) The pyodbc==5.2.0 library was also installed as a cluster library, although that made no difference.
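In case it helps others with step 4: the allowlist can be edited in Catalog Explorer or, to the best of my knowledge, through the artifact-allowlists REST API. A sketch, assuming a workspace URL placeholder, a token in $DATABRICKS_TOKEN, and an illustrative volume path:

```bash
# Sketch: allowlist an init script path for the metastore.
# Note: PUT replaces the whole allowlist, so include any existing
# entries you want to keep alongside the new one.
curl -X PUT \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  "https://<workspace-url>/api/2.1/unity-catalog/artifact-allowlists/INIT_SCRIPT" \
  -d '{
        "artifact_matchers": [
          {
            "artifact": "/Volumes/main/default/scripts/pyodbc-install.sh",
            "match_type": "PREFIX_MATCH"
          }
        ]
      }'
```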
If you are launching this from ADF, use the Volumes option shown below and grant the ADF service principal access to the volume path.
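For reference, the Volumes init script entry in a cluster spec looks like the sketch below (the catalog, schema, and volume names are placeholders, as are the workspace URL and cluster settings); in an ADF Databricks linked service the equivalent, as far as I know, is the new-cluster init scripts setting pointed at the same /Volumes path:

```bash
# Sketch: attach a Volumes init script to a shared access mode cluster
# via the Clusters API. The essential part is the "init_scripts" entry
# with a "volumes" destination; everything else is placeholder config.
curl -X POST \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  "https://<workspace-url>/api/2.1/clusters/edit" \
  -d '{
        "cluster_id": "<cluster-id>",
        "cluster_name": "shared-uc-cluster",
        "spark_version": "15.4.x-scala2.12",
        "node_type_id": "<node-type>",
        "num_workers": 2,
        "data_security_mode": "USER_ISOLATION",
        "init_scripts": [
          {"volumes": {"destination": "/Volumes/main/default/scripts/pyodbc-install.sh"}}
        ]
      }'
```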

