Time-out when using init script to install GDAL

meystingray
New Contributor II

I'm following this post to run an init script to install gdal.

https://community.databricks.com/t5/get-started-discussions/gdal-on-databricks-cluster-runtime-12-2-...

My script is simply:

--------------
#!/bin/bash
apt-get update && apt-get upgrade
apt-get install gdal*
--------------
This should be simple enough. However, when the cluster starts, the script runs for an hour and then times out. What gives? How do I fix it?

 

3 REPLIES

Kaniz
Community Manager

Hi @meystingray, the script is timing out because it runs longer than the cluster allows for init scripts.

The apt-get update and apt-get upgrade commands update and upgrade the system packages, which can take a significant amount of time depending on the size of the updates and the speed of the network connection.

To fix this issue, you can try the following steps:

1. Split the script into two separate commands:

   apt-get update
   apt-get upgrade

2. Run the commands individually to see which one is causing the timeout. If one of them is taking too long, you can try to optimize it or find an alternative solution.

3. If the apt-get upgrade command is causing the timeout, you can limit the number of packages being upgraded by specifying the package names explicitly.

For example, instead of running apt-get upgrade, you can run apt-get install package1 package2 package3 to upgrade only the specified packages.

4. You can also try increasing the timeout duration by adjusting the spark.databricks.cluster.profile.timeoutSeconds parameter in the cluster configuration.
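Putting the suggestions together: a common reason an apt-based init script hangs until the timeout is that apt-get waits for an interactive "Y/n" confirmation that never comes. A minimal non-interactive sketch of the script might look like this (the -y flags, the DEBIAN_FRONTEND setting, and the gdal-bin/libgdal-dev package names are assumptions, not something confirmed in this thread):

```shell
#!/bin/bash
# Sketch of a non-interactive GDAL init script (assumed fix, not verified here).
set -e                                  # fail fast so errors surface instead of hanging
export DEBIAN_FRONTEND=noninteractive   # suppress interactive apt prompts

apt-get update
apt-get upgrade -y                      # -y auto-confirms instead of waiting for input
apt-get install -y gdal-bin libgdal-dev # assumed package names instead of the gdal* glob
```

Without -y, apt-get install pauses at the confirmation prompt, which in a cluster-scoped init script looks exactly like the script "running for an hour and then timing out."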

meystingray
New Contributor II

Really appreciate the response, @Kaniz. Following your advice, I tried commenting out various parts of the script, but kept getting an error of the form: "Init script failure: Cluster scoped init script /Users/XYZ/gdal_install.sh failed: Script exit status is non-zero"

The only thing that worked was:

apt-get update

I'm trying to install gdal on a Databricks Runtime Version 13.2 ML.  Thanks, Sean

-werners-
Esteemed Contributor III

You could skip the upgrade part and just go for the gdal install instead.
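Following that suggestion, the script shrinks to an update plus the install itself. A minimal sketch (again assuming -y and the gdal-bin/libgdal-dev package names, which this thread does not confirm):

```shell
#!/bin/bash
# Sketch: skip `apt-get upgrade` entirely and install only GDAL.
set -e
export DEBIAN_FRONTEND=noninteractive
apt-get update
apt-get install -y gdal-bin libgdal-dev
```

Dropping the full system upgrade avoids pulling in a large, slow set of unrelated package updates, which is usually the longest-running part of the original script.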
