How to use a spark-submit Python task with the --archives parameter to pass a .tar.gz conda env?
We've been trying to launch a spark-submit Python task using the "archives" parameter, similar to the one used in YARN. However, we've not been able to make it work successfully in Databricks. We know that for our on-prem installation we can use som...
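For context, the YARN-style pattern the question refers to typically looks something like the sketch below; the archive name pyspark_conda_env.tar.gz, the #environment alias, and app.py are placeholders rather than anything from this thread:

    # Point PySpark at the interpreter inside the unpacked archive
    export PYSPARK_PYTHON=./environment/bin/python
    # Ship the packed conda env; "#environment" is the directory name it unpacks to
    spark-submit \
      --master yarn \
      --archives pyspark_conda_env.tar.gz#environment \
      app.py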
Latest Reply
@Ryoji Kuwae Neto: To use the --archives parameter with a conda environment in Databricks, you can follow these steps:
1) Create a conda environment for your project and export it as a .tar.gz file:
conda create --name myenv
conda activate myenv
conda...
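The reply is truncated above; a rough sketch of the full workflow it outlines (create the environment, pack it with conda-pack, then submit with --archives) might look like the following, where the package list, myenv.tar.gz, and etl_job.py are illustrative placeholders:

    # 1) Create and populate the conda environment
    conda create -y --name myenv python=3.10
    conda activate myenv
    conda install -y -c conda-forge numpy pandas conda-pack   # project deps plus conda-pack

    # 2) Pack the environment into a relocatable .tar.gz archive
    conda pack -n myenv -o myenv.tar.gz

    # 3) Submit the job, shipping the archive; "#environment" sets the
    #    directory name the archive is unpacked into on the workers
    export PYSPARK_PYTHON=./environment/bin/python
    spark-submit \
      --archives myenv.tar.gz#environment \
      etl_job.py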