
Help installing R package "sf" on an Azure Databricks cluster

data_scientist3
New Contributor II

I need help installing the R package "sf" on an Azure Databricks cluster. Any help will be greatly appreciated.

I'm getting the error below when I try installing the "sf" package on a cluster.

"Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: org.apache.spark.SparkException: Process List(bash, -c, ${DB_HOME:-/home/ubuntu/databricks}/spark/scripts/install_r_package.py "$(echo = | base64 --decode)" /local_disk0/cluster_libraries/r) exited with code 2. Error installing R package: Could not install package with error: installation of package ‘units’ had non-zero exit status
installation of package ‘sf’ had non-zero exit status"

1 ACCEPTED SOLUTION


Kaniz_Fatma
Community Manager

Hi @data_scientist3,

The "sf" R package depends on several other packages, including "units". The error message indicates that the installation of the "units" package exited with a non-zero status; in this case the underlying cause is a missing system library that "units" needs.

To install the "sf" package and its dependencies on a Databricks cluster, you can follow these steps:

  1. Install the "units" package: The "sf" package requires the "units" package, which has system dependencies that need to be installed. You can install the system dependencies by running the following command in a Databricks Notebook:

 

%r
system("sudo apt-get install libudunits2-dev -y")

 

 Then install the units Package using the following R command:

 

%r
install.packages("units")

 

  1. Install the "sf" package: Once the "units" package is installed, you can install the "sf" package using the following R command:

 

%r
install.packages("sf")

 

These steps should install the "sf" package and its dependencies on your Databricks cluster. If you encounter any further dependencies issues, you can install the necessary packages using the install.packages() function in R.
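
For convenience, here is a minimal sketch of the full sequence in a single notebook cell. It assumes an Ubuntu-based Databricks Runtime where sudo and apt-get are available; the final library(sf) and packageVersion("sf") calls are only a quick check that the package actually loads.

%r
# Install the system library that the "units" package (and therefore "sf") depends on.
# Assumes an Ubuntu-based Databricks Runtime where sudo and apt-get are available.
system("sudo apt-get update && sudo apt-get install -y libudunits2-dev")

# Install the R packages in dependency order.
install.packages("units")
install.packages("sf")

# Quick check: load "sf" and print its version to confirm the installation worked.
library(sf)
packageVersion("sf")

If install.packages("sf") still fails with messages about missing system libraries, note that "sf" also relies on GDAL, GEOS, and PROJ; installing libgdal-dev, libgeos-dev, and libproj-dev in the same way is a common next step.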


