Install Oracle Instant Client
06-13-2024 12:47 AM
Hello all,
I want to install Oracle Instant Client to be able to use python-oracledb in Thick-Mode because one of our databases is old and cannot be reached in Thin-Mode. I have tried the solution from this post, but it doesn't help me. It seems that the init script does not create the folder with the Instant Client. Is it still possible to install the Instant Client using DBFS?
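For reference, this is roughly how I plan to enable Thick mode from a notebook once the client libraries are in place; the install path and connection details below are just placeholders, not something I have working yet:

import oracledb

# Placeholder path: wherever the init script ends up unpacking the Instant Client.
instant_client_dir = "/opt/oracle/instantclient"

# Thick mode is enabled by pointing python-oracledb at the client libraries
# before the first connection is made.
oracledb.init_oracle_client(lib_dir=instant_client_dir)

# Placeholder credentials/DSN for the old database that Thin mode cannot reach.
conn = oracledb.connect(user="my_user", password="my_password",
                        dsn="dbhost.example.com:1521/MYSERVICE")
print(conn.version)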
07-03-2024 06:28 AM
Hi @VKe,
I'm reaching out in case you're still working on connecting to your Oracle database from Databricks. Here are some key points to consider:
1. Initialization Script Location:
- Older examples stored init scripts on DBFS, but DBFS-based cluster-scoped init scripts are deprecated. On Unity Catalog workspaces, store them in a Unity Catalog volume, in workspace files, or in cloud storage (ABFSS) instead (a sketch of how the script is referenced in the cluster spec follows after point 4).
2. Shared Environment Permissions:
- Shared environments no longer provide root access. The spark-users group lacks permissions for folders created by root in the init script.
3. Single User Environment Solution (Script):
This script installs the Oracle Instant Client and sets environment variables. It assumes a single-user environment. Create a shell script file with the content below and upload it to a Volumes folder of your choice (do not forget to convert the line endings to Unix if you create the file on Windows); a notebook-side usage sketch follows after point 4:
#!/bin/bash
# Install libaio (required by the Instant Client)
sudo apt-get update && sudo apt-get install -y libaio1
# Download the Oracle Instant Client (Basic Light package)
wget --quiet -O /tmp/instantclient-basiclite-linuxx64.zip https://download.oracle.com/otn_software/linux/instantclient/instantclient-basiclite-linuxx64.zip
# Create the target directory
mkdir -p /databricks/driver/oracle_ctl
# Unzip and rename the Instant Client directory to a stable path
yes | unzip /tmp/instantclient-basiclite-linuxx64.zip -d /databricks/driver/oracle_ctl/
mv /databricks/driver/oracle_ctl/instantclient* /databricks/driver/oracle_ctl/instantclient
# Set environment variables (these apply only within this script's shell)
export LD_LIBRARY_PATH=/databricks/driver/oracle_ctl/instantclient:$LD_LIBRARY_PATH
export ORACLE_HOME=/databricks/driver/oracle_ctl/instantclient
export PATH=$ORACLE_HOME:$PATH
# Register the library directory with the dynamic linker
echo "/databricks/driver/oracle_ctl/instantclient" > /etc/ld.so.conf.d/oracle-instantclient.conf
ldconfig
4. Unrestricted Environment without Unity Catalog (Code):
For unrestricted environments without Unity Catalog, use this code in your cluster notebook:
dbutils.fs.put("/databricks/init/init_oc.sh", """#!/bin/bash
# Install libaio and unpack the Instant Client under /opt/oracle
sudo apt-get update && sudo apt-get install -y libaio1
mkdir -p /opt/oracle
wget -q -O /opt/oracle/instantclient-basic.zip https://download.oracle.com/otn_software/linux/instantclient/instantclient-basiclite-linuxx64.zip
cd /opt/oracle
unzip -o instantclient-basic.zip
# The extracted directory is versioned (e.g. instantclient_21_x); rename it to a stable path
mv instantclient_* instantclient
# These exports apply only within this script's shell
export LD_LIBRARY_PATH=/opt/oracle/instantclient:$LD_LIBRARY_PATH
export ORACLE_HOME=/opt/oracle/instantclient
export PATH=$ORACLE_HOME:$PATH
# Register the library directory with the dynamic linker
echo "/opt/oracle/instantclient" > /etc/ld.so.conf.d/oracle-instantclient.conf
ldconfig
""", True)
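One caveat on both scripts: the export lines only affect the shell in which the init script runs, so those variables will not be visible in the notebook's Python process. A safer pattern is to pass the library path to python-oracledb explicitly, or to set the variables as cluster environment variables in the cluster configuration. A rough notebook-side sketch, assuming the paths used above:

import os
import oracledb

# Path from the single-user script above; use /opt/oracle/instantclient if you used
# the DBFS-based variant instead.
lib_dir = "/databricks/driver/oracle_ctl/instantclient"

# Optional: make the path visible to this process and any child processes it starts.
os.environ["ORACLE_HOME"] = lib_dir
os.environ["LD_LIBRARY_PATH"] = lib_dir + ":" + os.environ.get("LD_LIBRARY_PATH", "")

# Passing lib_dir explicitly avoids depending on environment variables at all.
oracledb.init_oracle_client(lib_dir=lib_dir)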
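On point 1, attaching the script to the cluster: the init script reference in the cluster configuration would look roughly like the fragment below. The volume path is a placeholder; adjust it to wherever you uploaded the file.

# Rough shape of the init_scripts entry in a cluster spec when the script lives in a
# Unity Catalog volume; the destination path below is a placeholder.
cluster_spec_fragment = {
    "init_scripts": [
        {"volumes": {"destination": "/Volumes/my_catalog/my_schema/my_volume/oracle_init.sh"}}
    ]
}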
Let me know if you have any questions or if this works for you.
12-04-2024 04:39 AM
Hi @Zair, I used your code but it's still not working for me. My environment variables are still empty. This is what I did:
1. I ran the script to download the Instant Client into a workspace directory, since DBFS is no longer supported for init files.
2. I added the script path to my cluster's init scripts and restarted the cluster, but nothing shows up.
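For reference, a quick check from a notebook, using the paths from the scripts above, would be something like this:

import os

# Check both install locations used in the scripts earlier in this thread.
for path in ("/databricks/driver/oracle_ctl/instantclient", "/opt/oracle/instantclient"):
    print(path, "->", sorted(os.listdir(path))[:5] if os.path.isdir(path) else "missing")

# Variables exported inside the init script's shell do not propagate to this process,
# so empty values here are expected unless they were set as cluster environment variables.
print("LD_LIBRARY_PATH:", os.environ.get("LD_LIBRARY_PATH"))
print("ORACLE_HOME:", os.environ.get("ORACLE_HOME"))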
12-13-2024 06:27 AM
What about shared compute, is it possible there? Neither of those scripts is really working for me...
01-08-2025 06:29 AM
I used a personal compute cluster and it worked.

