How to deploy a python script with dependencies by dbx?
08-08-2022 07:48 PM
Hi,
I'm quite new here. I'm trying to deploy a Python file to Databricks with the dbx command. The file imports libraries that need to be installed on the cluster. How can I deploy the file together with its dependencies?
Here are the commands I currently run:
`dbx deploy --job=Sample-Training-Job-File --no-rebuild`
`dbx launch --job=Sample-Training-Job-File --trace`
And my `Sample-Training-Job-File` job is defined in `deployment.yml` as follows. This is Databricks on Azure.
```
environments:
  default:
    strict_path_adjustment_policy: true
    jobs:
      - name: "Sample-Training-Job-File"
        new_cluster:
          spark_version: "10.4.x-scala2.12"
          num_workers: 1
          node_type_id: "Standard_DS3_v2"
        spark_python_task:
          python_file: "file://sample/training_file.py"
```
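For reference, dbx deployment files pass job settings through to the Databricks Jobs API, so one way to make packages available on the job cluster is to declare them under a `libraries` key on the job. A minimal sketch based on the config above, assuming the cluster can reach PyPI; the `mlflow` and `scikit-learn` entries are illustrative examples of dependencies, not a confirmed list:

```
environments:
  default:
    strict_path_adjustment_policy: true
    jobs:
      - name: "Sample-Training-Job-File"
        new_cluster:
          spark_version: "10.4.x-scala2.12"
          num_workers: 1
          node_type_id: "Standard_DS3_v2"
        # Libraries installed on the job cluster before the task runs,
        # following the Jobs API library spec.
        libraries:
          - pypi:
              package: "mlflow"
          - pypi:
              package: "scikit-learn"
        spark_python_task:
          python_file: "file://sample/training_file.py"
```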
Thank you.
- Labels: Azure, Deploy, New, Python, Python File, Python script, Standard
08-09-2022 08:30 AM
@Di Lin are you facing any errors while using this code?
08-09-2022 09:40 AM
Yes, I get an import error when I deploy the code this way, because the packages (e.g. mlflow, sklearn) are not installed on the cluster.
09-07-2022 05:57 AM
Hi @Di Lin
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!