I know that UC-enabled shared access mode clusters do not allow init scripts, and I have tried multiple workarounds for the init script my cluster requires (pyodbc-install.sh, in my case): installing the pyodbc package as a workspace library and attaching that library to the cluster, and installing the package directly in notebooks with magic commands. Both workarounds produced errors. Are there any other ways to use init scripts on a UC-enabled shared access mode cluster?
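For context, the notebook-scoped install I attempted looked roughly like this (the version pin matches the cluster library; the exact cell may have differed):

```
%pip install pyodbc==4.0.39
```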
I have attached the error received when I try to install the pyodbc library. I am using an admin account as well.
Library installation attempted on the driver node of cluster 0516-171623-3i9on4it and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: org.apache.spark.SparkException: Process List(/bin/su, libraries, -c, bash /local_disk0/.ephemeral_nfs/cluster_libraries/python/python_start_clusterwide.sh /local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/pip install 'pyodbc 4.0.39' --disable-pip-version-check) exited with code 1. WARNING: The directory '/home/libraries/.cache/pip' or its parent directory is not owned or is not writable by the current user. The cache has been disabled. Check the permissions and owner of that directory. If executing pip with sudo, you should use sudo's -H flag.
ERROR: Invalid requirement: 'pyodbc 4.0.39'
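If it helps diagnose: the last line of the log suggests the failure may be the requirement string itself rather than UC permissions. pip expects a PEP 508 specifier such as `pyodbc==4.0.39`, but the install was invoked with `pyodbc 4.0.39` (a space instead of `==`). A quick check with the `packaging` library (assumed to be available; it is the same parser pip vendors internally) illustrates the difference:

```python
from packaging.requirements import Requirement, InvalidRequirement

def is_valid_requirement(spec: str) -> bool:
    """Return True if `spec` parses as a valid PEP 508 requirement."""
    try:
        Requirement(spec)
        return True
    except InvalidRequirement:
        return False

# The log shows pip was given 'pyodbc 4.0.39', which is not a valid specifier:
print(is_valid_requirement("pyodbc 4.0.39"))   # → False
print(is_valid_requirement("pyodbc==4.0.39"))  # → True
```

If the library was entered in the cluster UI as `pyodbc 4.0.39`, re-entering it as `pyodbc==4.0.39` (or just `pyodbc`) may resolve this particular installation error independently of the init-script question.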