I need help installing the R package "sf" on an Azure Databricks cluster. Any help would be greatly appreciated.
I get the error below when I try to install the "sf" package on the cluster:
```
Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: org.apache.spark.SparkException: Process List(bash, -c, ${DB_HOME:-/home/ubuntu/databricks}/spark/scripts/install_r_package.py "$(echo = | base64 --decode)" /local_disk0/cluster_libraries/r) exited with code 2. Error installing R package: Could not install package with error: installation of package 'units' had non-zero exit status
installation of package 'sf' had non-zero exit status
```
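From what I can tell, the real failure is in the 'units' dependency, which needs the udunits2 system library, and 'sf' itself needs GDAL, GEOS, and PROJ. Would a cluster-scoped init script along these lines be the right fix? The apt package names are my guess for an Ubuntu-based Databricks runtime; I haven't confirmed them:

```shell
#!/bin/bash
# Hypothetical cluster init script: install the system libraries that the
# R packages 'units' and 'sf' compile against. Package names assume an
# Ubuntu-based Databricks runtime image.
set -e
apt-get update
apt-get install -y \
  libudunits2-dev \
  libgdal-dev \
  libgeos-dev \
  libproj-dev
```

If that is the right approach, I'd attach it as a cluster-scoped init script and then retry installing "sf" from the cluster's Libraries tab.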