Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Executing Bash Scripts or Binaries Directly in Databricks Jobs on Single Node Cluster

jorperort
Contributor

Hi,

Is it possible to directly execute a Bash script or a binary executable from the operating system of a Databricks job compute node using a single node cluster?
I'm using Databricks Asset Bundles for job initialization and execution. When the job starts, I plan to clone a repository containing Bash scripts and binary executables so that these are available within the single-node cluster environment.

The goal is to run one of these scripts or binaries directly from the compute node's operating system, passing parameters at runtime, without intermediaries such as %sh notebook cells, Python scripts using subprocess, or init scripts.

2 REPLIES

eniwoke
New Contributor III

Hi @jorperort, as far as I know there is no dedicated task type like bash_task for jobs that lets you run Bash scripts without notebook %sh cells or Python's subprocess. Have you considered using init scripts for your cluster when setting up the job?

With an init script, you can execute Bash commands to download files or even run commands on the cluster. You can try it and tell me how it goes.
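To make the suggestion concrete, here is a minimal sketch of what a cluster-scoped init script could look like. The target directory and run.sh name are placeholders (on a real cluster you would likely clone somewhere like /local_disk0), and the git clone is stubbed out with a generated script so the flow is self-contained and visible end to end.

```shell
#!/bin/bash
# Hypothetical cluster-scoped init script: fetch the repo containing the
# Bash scripts/binaries and run one of them before the job starts.
set -euo pipefail

# Placeholder location -- a real init script might use /local_disk0 and
# take the repo URL from an environment variable set in the cluster spec.
TARGET_DIR="${TMPDIR:-/tmp}/job-tools"

# A real init script would do something like: git clone "$REPO_URL" "$TARGET_DIR"
# Stubbed here with a generated script so the example is runnable anywhere.
mkdir -p "$TARGET_DIR"
printf '#!/bin/bash\necho "hello from $1"\n' > "$TARGET_DIR/run.sh"
chmod +x "$TARGET_DIR/run.sh"

# The script is now on the node's filesystem and can be executed directly
# by the OS, with an argument passed at invocation time.
"$TARGET_DIR/run.sh" "init-script"
```

The same pattern works for compiled binaries: once they are on the node's local disk and marked executable, the OS can run them like any other program.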

Eni

Good afternoon @eniwoke, and thank you very much for your comment.

Yes, I had already considered using init scripts, but I was hoping to find a different solution, since the idea is to pass parameters to these Bash scripts or compiled executables. I assume that if I approach it with init scripts, I would have to retrieve those parameters through environment variables.
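For reference, the environment-variable route described above could look roughly like this. SCRIPT_PARAM is a made-up name standing in for whatever variable the cluster configuration would define (for example via spark_env_vars), so treat it as an assumption rather than anything Databricks provides.

```shell
#!/bin/bash
# Sketch: an init script receives no positional arguments from the job, so
# a runtime parameter would have to arrive as an environment variable.
# SCRIPT_PARAM is a hypothetical variable name, not a Databricks built-in.

run_with_param() {
  if [ -z "${SCRIPT_PARAM:-}" ]; then
    echo "SCRIPT_PARAM not set" >&2
    return 1
  fi
  # Hand the value to the script/binary as an ordinary CLI argument.
  echo "would run: ./my-binary --param ${SCRIPT_PARAM}"
}

SCRIPT_PARAM="batch-42" run_with_param
```

The guard against an unset variable matters here, because an init script that fails silently on a missing parameter is hard to debug from the cluster logs.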


I'm not sure that approach works, though, because I'm not certain whether the init script runs before the environment variables have been set.


I would prefer another solution. I'm not sure whether spark-submit can be used for this: while it's possible to spark-submit a compiled JAR, I don't know whether a compiled file of another type can be executed that way.


That's another point: if anyone has encountered this issue, it would be really helpful if they could share their experience. It would be greatly appreciated.
