
How do I install a non-Python/R/Maven/JAR library into a cluster?

Bradley
New Contributor III

I'm trying to install a non-standard package on the cluster using an init script. The package I'm trying to install needs to be downloaded with wget and uncompressed with tar, then added to the PATH; or at the least I need to know where the downloaded files live.

Here are the contents of my init script:

wget https://github.com/COMBINE-lab/salmon/releases/download/v1.8.0/salmon-1.8.0_linux_x86_64.tar.gz
tar xzvf salmon-1.8.0_linux_x86_64.tar.gz
SALMON_PATH=$(readlink -f ./salmon-1.8.0_linux_x86_64/bin/)
export PATH="$SALMON_PATH:$PATH"

But this isn't working, at least not from a notebook.

5 REPLIES

Prabakar
Databricks Employee

@Bradley (Customer), you can try providing tar's directory option (-C) when extracting the tar file, so the files land in a known location.
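
For example, a minimal sketch of that approach in an init script (the /usr/local/salmon target directory is an assumption, not something from the thread):

#!/bin/bash
# Download the release tarball (URL from the original post)
wget https://github.com/COMBINE-lab/salmon/releases/download/v1.8.0/salmon-1.8.0_linux_x86_64.tar.gz
# Extract into a known directory with -C so the binaries land at a
# predictable path; --strip-components=1 drops the versioned top folder
mkdir -p /usr/local/salmon
tar -xzvf salmon-1.8.0_linux_x86_64.tar.gz -C /usr/local/salmon --strip-components=1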

Or you can download the file to a DBFS location and then use the init script to copy it to the path where it should live on the cluster.
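
A rough sketch of that second approach, assuming the tarball was uploaded to dbfs:/FileStore beforehand (that path is hypothetical):

#!/bin/bash
# DBFS is mounted at /dbfs on each node, so the init script can copy the
# pre-uploaded tarball to local disk and extract it to a cluster path
cp /dbfs/FileStore/salmon-1.8.0_linux_x86_64.tar.gz /tmp/
mkdir -p /usr/local/salmon
tar -xzvf /tmp/salmon-1.8.0_linux_x86_64.tar.gz -C /usr/local/salmon --strip-components=1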

Anonymous
Not applicable

@Bradley (Customer), wget will download the file onto the driver's local disk, so it will then need to be moved to the distributed filesystem. You can inspect the distributed file system from a notebook with %sh ls /dbfs/
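
For instance, from a notebook cell, something along these lines (the target folder under /dbfs/FileStore is an assumption):

%sh
# List the DBFS root to confirm the mount is visible
ls /dbfs/
# Move the directory that wget/tar produced on the driver into DBFS so it
# persists and is reachable from every node
mv ./salmon-1.8.0_linux_x86_64 /dbfs/FileStore/salmon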

User16764241763
Honored Contributor

@Bradley Wright, could you please share the error message you are receiving when using the init script?

Bradley
New Contributor III

Thank you for the support. Yes, I was able to find a working solution.

  1. I placed the files into the distributed file system, DBFS. For others, this can be done manually with the Databricks CLI or from an init script; in this case I found it easier to use the Databricks CLI.
  2. I appended the path to the binary to the OS environment PATH. I'm using Python, so for me it looks like this:
os.environ['PATH'] += ':/dbfs/FileStore/salmon/bin/'

And that was it! I stored the file paths to the input data in a DataFrame, then used Spark to iterate across the rows of the DataFrame, calling a custom function that invokes the binary.
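
For anyone following along, here is a minimal sketch of that flow in a notebook. The /dbfs/FileStore/salmon/bin/ path is the one above; the run_salmon helper, the 'path' column name, and the sample row are hypothetical:

import os
import subprocess

# Binary location on DBFS (uploaded beforehand, e.g. with
# databricks fs cp --recursive)
SALMON = '/dbfs/FileStore/salmon/bin/salmon'

# Make the binary visible to subprocesses started from the driver
os.environ['PATH'] += ':/dbfs/FileStore/salmon/bin/'

def run_salmon(input_path):
    # Hypothetical helper: call the binary by its full /dbfs path so it
    # also resolves on executors, where the driver's PATH edit is not set.
    # A real invocation would pass salmon's actual arguments for input_path.
    result = subprocess.run([SALMON, '--version'],
                            capture_output=True, text=True)
    return result.stdout.strip()

# DataFrame of input file paths ('spark' is the notebook's SparkSession;
# the column name and sample row are made up for illustration)
df = spark.createDataFrame([('/dbfs/FileStore/data/sample1.fq',)], ['path'])

# Iterate across the rows, calling the custom function on each path
outputs = df.rdd.map(lambda row: run_salmon(row['path'])).collect()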

Prabakar
Databricks Employee

I am happy that my suggestion gave you some pointers to resolve the issue.
