04-14-2015 02:58 PM
Accepted Solutions
04-14-2015 03:05 PM
@kidexp
From the Workspace dropdown, you can select New Library, and then either upload a Python egg or specify specific PyPI packages. Please see the attached screenshots.
04-14-2015 03:22 PM
Thanks very much, @Arsalan Tavakoli-Shiraji!
05-02-2017 06:37 PM
@Arsalan Tavakoli-Shiraji How do we attach it to a specific cluster programmatically (and not just to all clusters by checking that box)?
08-01-2018 12:25 PM
You can use the Databricks Libraries API to programmatically attach libraries to specific clusters. For more information: https://docs.databricks.com/api/latest/libraries.html#install
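As a sketch of that API call: the Libraries API installs libraries on one named cluster via `POST /api/2.0/libraries/install`. The host, token, and cluster ID below are placeholders you would substitute with your own values.

```python
import json
import urllib.request

# Placeholders -- substitute your own workspace URL, token, and cluster ID.
DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"
API_TOKEN = "<personal-access-token>"
CLUSTER_ID = "<cluster-id>"

# Payload for POST /api/2.0/libraries/install: attach a PyPI package
# (or an egg/wheel uploaded to DBFS) to one specific cluster.
payload = {
    "cluster_id": CLUSTER_ID,
    "libraries": [
        {"pypi": {"package": "simplejson"}},          # install from PyPI
        # {"egg": "dbfs:/path/to/your_library.egg"},  # or an uploaded egg
    ],
}

request = urllib.request.Request(
    url=f"{DATABRICKS_HOST}/api/2.0/libraries/install",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request)  # uncomment to actually send the call
```

Because the payload names a single `cluster_id`, this installs on just that cluster, unlike the "attach to all clusters" checkbox in the UI.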

06-22-2023 10:04 AM
Install a Python bundle on a Spark cluster:
Make a virtualenv just for your Spark nodes.
Each time you run a Spark job, run a fresh pip install of all your in-house Python libraries.
Zip up the site-packages dir of the virtualenv.
Pass the single .zip file, containing your libraries and their dependencies, as an argument to --py-files.
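The zipping step above can be sketched as a small helper, assuming a standard virtualenv layout (`<venv>/lib/pythonX.Y/site-packages`); the function name and paths are illustrative.

```python
import os
import zipfile

def zip_site_packages(venv_dir: str, out_zip: str) -> str:
    """Zip the site-packages dir of a virtualenv so the archive
    can be passed to spark-submit via --py-files."""
    # Locate site-packages (layout: <venv>/lib/pythonX.Y/site-packages).
    lib_dir = os.path.join(venv_dir, "lib")
    (py_dir,) = os.listdir(lib_dir)  # assumes exactly one pythonX.Y dir
    site_packages = os.path.join(lib_dir, py_dir, "site-packages")

    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(site_packages):
            for name in files:
                path = os.path.join(root, name)
                # Store paths relative to site-packages so imports resolve
                # when Spark adds the zip to each executor's sys.path.
                zf.write(path, os.path.relpath(path, site_packages))
    return out_zip

# Then submit with, e.g.:
#   spark-submit --py-files deps.zip your_script.py
```

Paths inside the archive are rooted at site-packages, so `import your_package` works unchanged on the executors.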
10-31-2024 11:49 PM
Use --py-files with Spark Submit: Zip the package and add it using --py-files when you run spark-submit. For example:
spark-submit --py-files path/to/your_package.zip your_script.py
10-31-2024 11:53 PM
If --py-files doesn't work, try this shorter method:
Create a Conda Environment: Install your packages.
conda create -n myenv python=3.x
conda activate myenv
pip install your-package
Package and Submit: Use conda-pack and spark-submit with --archives.
conda pack -n myenv -o myenv.tar.gz
spark-submit --archives myenv.tar.gz#myenv --conf spark.pyspark.python=myenv/bin/python your_script.py

