04-14-2015 02:58 PM
04-14-2015 03:05 PM
@kidexp
From the workspace dropdown, you can select New Library, and then choose a Python Egg or specify individual packages. Please see the attached screenshots.
04-14-2015 03:22 PM
Thanks very much @Arsalan Tavakoli-Shiraji
05-02-2017 06:37 PM
@Arsalan Tavakoli-Shiraji how do we attach it to a specific cluster programmatically (and not just to all clusters by checking that box)?
08-01-2018 12:25 PM
You can use the Databricks Libraries API to programmatically attach libraries to specific clusters. For more information: https://docs.databricks.com/api/latest/libraries.html#install
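A hedged sketch of what that API call's request body might look like from Python; the cluster ID, DBFS path, and package pin below are hypothetical placeholders, not values from this thread:

```python
import json

# Hypothetical values -- replace with your own cluster ID and library paths.
CLUSTER_ID = "1234-567890-abcde123"

# Request body for POST /api/2.0/libraries/install on the Databricks
# Libraries API; the same shape accepts egg, whl, jar, and pypi entries.
payload = {
    "cluster_id": CLUSTER_ID,
    "libraries": [
        {"egg": "dbfs:/path/to/your_library.egg"},
        {"pypi": {"package": "simplejson==3.8.0"}},
    ],
}

body = json.dumps(payload)
print(body)
```

You would send `body` as the POST data to `<your-workspace-host>/api/2.0/libraries/install` with a bearer token in the `Authorization` header, e.g. via `requests.post`.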
06-22-2023 10:04 AM
Install a Python package on a Spark cluster:
Make a virtualenv purely for your Spark nodes.
Each time you run a Spark job, run a fresh pip install of all your in-house Python libraries.
Zip up the site-packages dir of the virtualenv.
Pass the single .zip file, containing your libraries and their dependencies, as an argument to --py-files.
10-31-2024 11:49 PM
Use --py-files with Spark Submit: Zip the package and add it using --py-files when you run spark-submit. For example:
spark-submit --py-files path/to/your_package.zip your_script.py
10-31-2024 11:53 PM
If --py-files doesn't work (for example, because your dependencies include compiled extensions), try this alternative:
Create a Conda Environment: Install your packages.
conda create -n myenv python=3.x
conda activate myenv
pip install your-package
Package and Submit: Use conda-pack and spark-submit with --archives.
conda pack -n myenv -o myenv.tar.gz
spark-submit --archives myenv.tar.gz#myenv --conf spark.pyspark.python=myenv/bin/python your_script.py
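Putting the pieces together, the final submit line can be assembled programmatically; the archive, alias, and script names below are the placeholders from the post above:

```python
ARCHIVE = "myenv.tar.gz"   # produced by: conda pack -n myenv -o myenv.tar.gz
ALIAS = "myenv"            # name the archive is unpacked under on executors
SCRIPT = "your_script.py"  # hypothetical entry point

# The '#alias' suffix tells Spark what directory name to unpack the
# archive under, which is why the python path below starts with 'myenv/'.
cmd = [
    "spark-submit",
    "--archives", f"{ARCHIVE}#{ALIAS}",
    "--conf", f"spark.pyspark.python={ALIAS}/bin/python",
    SCRIPT,
]
print(" ".join(cmd))
```

Passing `cmd` as a list to `subprocess.run` avoids shell-quoting issues with the `#` in the archives argument.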