Help installing R package "sf" on an Azure Databricks cluster
10-16-2023 06:11 AM - edited 10-16-2023 06:21 AM
I need help installing the R package "sf" on an Azure Databricks cluster. Any help will be greatly appreciated.
I get the error below when I try to install the "sf" package on a cluster.
"Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: org.apache.spark.SparkException: Process List(bash, -c, ${DB_HOME:-/home/ubuntu/databricks}/spark/scripts/install_r_package.py "$(echo = | base64 --decode)" /local_disk0/cluster_libraries/r) exited with code 2. Error installing R package: Could not install package with error: installation of package ‘units’ had non-zero exit status
installation of package ‘sf’ had non-zero exit status"
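From the error, the failure appears to start with the 'units' package, which compiles against the udunits2 system library; 'sf' additionally needs GDAL, GEOS, and PROJ. If those development headers are missing on the cluster nodes, the build exits with a non-zero status like the one above. Below is a minimal cluster init-script sketch, assuming an Ubuntu-based Databricks Runtime and that missing system libraries are indeed the cause (package names are the standard Ubuntu ones, not confirmed against this specific cluster):

#!/bin/bash
# Hypothetical cluster-scoped init script: install the system libraries that
# the R packages 'units' and 'sf' compile against (Ubuntu-based runtime assumed).
set -e
apt-get update
# udunits2 headers for 'units'; GDAL/GEOS/PROJ headers for 'sf'
apt-get install -y libudunits2-dev libgdal-dev libgeos-dev libproj-dev

After attaching a script like this to the cluster and restarting, retrying the "sf" install from the cluster's Libraries tab (or with install.packages("sf")) should get past the compilation step, though the exact system packages required can vary with the sf release.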
10-17-2023 11:33 AM
Thanks!

