08-22-2022 12:11 AM
Hi,
I'm very new to Databricks, so this might be a basic question.
I can't find a way to run my local Python file on Databricks successfully. When I run the following `execute` command, I get a FileNotFoundError:
`dbx execute --cluster-id=*** --job=Sample-Training-Job-File -e test --no-rebuild --no-package`
`FileNotFoundError: File notebooks/training_file.py is mentioned in the task or job definition, but is non-existent`
Here is my job definition in `deployment.yml`
```
environments:
  test:
    strict_path_adjustment_policy: true
    jobs:
      - name: "Sample-Training-Job-File"
        new_cluster:
          spark_version: "10.4.x-scala2.12"
          num_workers: 1
          node_type_id: "Standard_DS3_v2"
        spark_python_task:
          python_file: "file://notebooks/training_file.py"
```
Thanks in advance for any tips and ideas.
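For context, the error usually means dbx cannot find the file relative to the directory the command is run from. As a rough mental model (the function name and logic below are illustrative assumptions, not dbx's actual implementation), a `file://` path in the deployment file is resolved against the project root where `dbx execute` is invoked:

```python
from pathlib import Path

def resolve_task_path(python_file: str, project_root: str) -> Path:
    """Illustrative sketch: strip the file:// prefix and resolve the
    remainder against the project root, mimicking (as an assumption)
    how dbx locates local task files."""
    relative = python_file.removeprefix("file://")
    return Path(project_root) / relative

# e.g. running dbx from /home/user/myproject:
print(resolve_task_path("file://notebooks/training_file.py",
                        "/home/user/myproject"))
```

So the usual fix is to run `dbx execute` from the project root (the directory that contains the `notebooks/` folder), or to adjust `python_file` in `deployment.yml` so the relative path matches the actual file location.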
08-26-2022 08:29 AM
Hi, if you are using dbx, could you please make sure all the requirements are fulfilled? https://docs.databricks.com/dev-tools/dbx.html#requirements. Also, from which directory are you running the command against the yml?
09-20-2022 02:29 AM
Hi @Di Lin
Does @Debayan Mukherjee's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly?
We'd love to hear from you.
Thanks!
09-20-2022 04:46 AM
Hi @Di Lin
Thanks for the quick response.
Regards