04-14-2015 02:58 PM
04-14-2015 03:05 PM
@kidexp
From the workspace dropdown, you can select New Library, and then either upload a Python egg or specify PyPI packages. Please see the attached screenshots.
04-14-2015 03:22 PM
Thanks very much @Arsalan Tavakoli-Shiraji
05-02-2017 06:37 PM
@Arsalan Tavakoli-Shiraji how do we attach it to a specific cluster programmatically (and not to all clusters by checking that box)?
08-01-2018 12:25 PM
You can use the Databricks Libraries API to programmatically attach libraries to specific clusters. For more information: https://docs.databricks.com/api/latest/libraries.html#install
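For example, here is a minimal curl sketch against the Libraries API install endpoint; the workspace URL, token, cluster ID, and package name are placeholders you would replace with your own values:
curl -X POST https://<databricks-instance>/api/2.0/libraries/install \
  -H "Authorization: Bearer <personal-access-token>" \
  -H "Content-Type: application/json" \
  -d '{
        "cluster_id": "<cluster-id>",
        "libraries": [
          { "pypi": { "package": "simplejson" } }
        ]
      }'
The same request body also accepts egg, whl, jar, maven, and cran library specifications if you are installing something other than a PyPI package.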
06-22-2023 10:04 AM
Install a Python package on a Spark cluster
Make a virtualenv just for your Spark nodes.
Each time you run a Spark job, run a fresh pip install of all your own in-house Python libraries. ...
Zip up the site-packages dir of the virtualenv. ...
Pass the single .zip file, containing your libraries and their dependencies, as an argument to --py-files (a sketch of these steps follows below).
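A rough sketch of those steps, assuming bash and placeholder names such as myenv, your-in-house-package, and deps.zip:
BASE="$PWD"
# create an isolated environment for the Spark nodes
python3 -m venv myenv
source myenv/bin/activate
# install your own in-house libraries (placeholder package name)
pip install your-in-house-package
# zip up the contents of the environment's site-packages directory
cd myenv/lib/python3.*/site-packages
zip -r "$BASE/deps.zip" .
cd "$BASE"
# pass the single zip, with your libraries and their dependencies, to --py-files
spark-submit --py-files deps.zip your_script.py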
3 weeks ago
Use --py-files with Spark Submit: Zip the package and add it using --py-files when you run spark-submit. For example:
spark-submit --py-files path/to/your_package.zip your_script.py
3 weeks ago
If --py-files doesn’t cover your case, an alternative is to package a full Conda environment:
Create a Conda Environment: Install your packages.
conda create -n myenv python=3.x
conda activate myenv
pip install your-package
Package and Submit: Use conda-pack and spark-submit with --archives.
conda pack -n myenv -o myenv.tar.gz
spark-submit --archives myenv.tar.gz#myenv --conf spark.pyspark.python=myenv/bin/python your_script.py
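The --archives approach ships the entire Python environment, interpreter included, to the executors, so it also handles packages with native or compiled dependencies, whereas --py-files only distributes pure-Python code.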