Hi,
I'm very new to Databricks, so this might be a basic question.
I can't find a way to run my local Python file on Databricks successfully. When I run the following `dbx execute` command, I get a `FileNotFoundError`:
`dbx execute --cluster-id=*** --job=Sample-Training-Job-File -e test --no-rebuild --no-package`
`FileNotFoundError: File notebooks/training_file.py is mentioned in the task or job definition, but is non-existent`
Here is my job definition in `deployment.yml`:
```
environments:
  test:
    strict_path_adjustment_policy: true
    jobs:
      - name: "Sample-Training-Job-File"
        new_cluster:
          spark_version: "10.4.x-scala2.12"
          num_workers: 1
          node_type_id: "Standard_DS3_v2"
        spark_python_task:
          python_file: "file://notebooks/training_file.py"
```
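In case it helps, my understanding (which may be wrong) is that with `strict_path_adjustment_policy` enabled, a `file://` path is resolved relative to the project root where `dbx execute` is run, so I'd expect a layout roughly like the sketch below. The directory names here are just my assumption of how it should look, not necessarily how my repo is actually laid out:

```
.                            # project root, where I run `dbx execute`
├── conf/
│   └── deployment.yml       # the job definition shown above (assumed default location)
└── notebooks/
    └── training_file.py     # referenced as file://notebooks/training_file.py
```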
Thanks in advance for any tips and ideas.