Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

How do I install a non-Python/R/Maven/JAR library into a cluster?

Bradley
New Contributor III

I'm trying to install a non-standard package onto the cluster using an init script. The package needs to be downloaded with wget and uncompressed with tar, then added to the PATH; at minimum, I need to know where the downloaded files live.

Here are the contents of my init scripts:

wget https://github.com/COMBINE-lab/salmon/releases/download/v1.8.0/salmon-1.8.0_linux_x86_64.tar.gz
tar xzvf salmon-1.8.0_linux_x86_64.tar.gz
SALMON_PATH=$(readlink -f ./salmon-1.8.0_linux_x86_64/bin/)
export PATH="$SALMON_PATH:$PATH"

But this isn't working, at least in a notebook.

7 REPLIES

Prabakar
Esteemed Contributor III

@Bradley (Customer), you can pass a target directory when extracting the tar file (tar's -C option), so you know exactly where the files land.

Or you can download the file to a DBFS location and then use the init script to copy it to the path where it should live on the cluster.
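A runnable sketch of that suggestion. On a real cluster the source would be something like /dbfs/FileStore/salmon and the destination /usr/local/salmon; those paths are assumptions, not from the thread, and here /tmp stand-ins are used so the same copy-and-link steps can actually execute:

```shell
#!/bin/bash
set -e
SRC=/tmp/demo_dbfs/salmon     # stand-in for a DBFS upload location, e.g. /dbfs/FileStore/salmon
DEST=/tmp/demo_local/salmon   # stand-in for a local path on the node, e.g. /usr/local/salmon
BIN=/tmp/demo_local/bin       # stand-in for a directory already on PATH, e.g. /usr/local/bin

# Simulate the build that was uploaded to DBFS beforehand.
mkdir -p "$SRC/bin"
printf '#!/bin/sh\necho "salmon 1.8.0"\n' > "$SRC/bin/salmon"
chmod +x "$SRC/bin/salmon"

# The init-script part: copy from "DBFS" onto local disk, link onto PATH.
mkdir -p "$DEST" "$BIN"
cp -r "$SRC/." "$DEST/"
ln -sf "$DEST/bin/salmon" "$BIN/salmon"
"$BIN/salmon"
```

The point of the copy is that an init script runs on every node, so each node ends up with the binary at the same known local path.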

Anonymous
Not applicable

@Bradley (Customer), wget will download the file onto the driver only, so it then needs to be moved to the distributed filesystem. You can inspect the distributed file system with %sh ls /dbfs/
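A sketch of that move. The paths here are illustrative stand-ins (a /tmp directory plays the role of /dbfs so the snippet is runnable); in a notebook the file would come from wget and the destination would be under /dbfs/:

```shell
#!/bin/bash
set -e
DRIVER_TMP=/tmp/demo_driver   # driver-local disk, where wget lands
DBFS_ROOT=/tmp/demo_fs        # stand-in for /dbfs on a real cluster

mkdir -p "$DRIVER_TMP" "$DBFS_ROOT/FileStore"
# In a notebook this file would come from: wget ... -O /tmp/salmon.tar.gz
echo "archive-bytes" > "$DRIVER_TMP/salmon.tar.gz"

# Move the driver-local file onto the shared filesystem so all nodes see it.
mv "$DRIVER_TMP/salmon.tar.gz" "$DBFS_ROOT/FileStore/salmon.tar.gz"
ls "$DBFS_ROOT/FileStore"
```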

User16764241763
Honored Contributor

@Bradley Wright, could you please share the error message you are receiving when using the init script?

Kaniz
Community Manager

Hi @Bradley Wright, we haven't heard from you since the last response from @Arvind Ravish, and I was checking back to see whether you have a resolution yet. If you do, please share it with the community, as it can be helpful to others. Otherwise, we will respond with more details and try to help.

Bradley
New Contributor III

Thank you for the support. Yes, I was able to find a working solution.

  1. I placed the files into the distributed file system, DBFS. For others, this can be done manually using the Databricks CLI or from an init script; in this case I found it easier to use the Databricks CLI.
  2. I appended the directory containing the binary to the OS environment PATH. I'm using Python, so for me it looks like this:
os.environ['PATH'] += ':/dbfs/FileStore/salmon/bin/'

And that was it! I stored the file paths to the input data in a dataframe, then used Spark to iterate over the rows of the dataframe, calling a custom function that invokes the binary.
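A runnable sketch of that pattern. A fake salmon script in a temp directory stands in for /dbfs/FileStore/salmon/bin/, and a plain function stands in for the custom function Spark would map over the dataframe rows; the PATH append is exactly the line from the solution above. The function name run_tool and the input path are hypothetical:

```python
import os
import stat
import subprocess
import tempfile

# Stand-in for step 1: a directory holding the external binary.
# On Databricks this would be /dbfs/FileStore/salmon/bin/.
bin_dir = tempfile.mkdtemp()
tool = os.path.join(bin_dir, 'salmon')
with open(tool, 'w') as f:
    f.write('#!/bin/sh\necho "salmon 1.8.0"\n')
os.chmod(tool, os.stat(tool).st_mode | stat.S_IEXEC)

# Step 2: append the binary's directory to PATH for this process,
# as in the notebook: os.environ['PATH'] += ':/dbfs/FileStore/salmon/bin/'
os.environ['PATH'] += ':' + bin_dir

def run_tool(input_path):
    # Hypothetical per-row function: shell out to the binary, which is
    # now found via PATH. Spark would call this for each dataframe row.
    out = subprocess.run(['salmon'], capture_output=True, text=True)
    return out.stdout.strip()

print(run_tool('/dbfs/inputs/sample.fq'))  # prints: salmon 1.8.0
```

Note that PATH changes made this way apply to the Python process on the driver; workers calling the binary need the files at a path visible to every node, which is why the files live under /dbfs.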

Kaniz
Community Manager

Hi @Bradley Wright, thank you for sharing the solution with the community; please accept my deepest thanks. Would you mind marking the best answer for us?

Prabakar
Esteemed Contributor III

I am happy that my suggestion added some pointers for you to resolve the issue.
