Hey,
I'm trying to run a job on serverless compute that runs Python scripts. I need the paramiko package for my scripts to work. I managed to get it working by adding this to the job's YAML:
environments:
  - environment_key: default
    # Full documentation of this spec can be found at:
    spec:
      client: "1"
      dependencies:
        - paramiko==4.0.0
However, to centralize the project config and avoid specifying the package per job, I want to make use of a Python wheel. I've already set up my pyproject.toml with the dependencies.
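For context, the relevant part of my pyproject.toml looks roughly like this (the project name and versions are placeholders, not my real values):

```yaml
[tool.poetry]
name = "my_project"        # placeholder name
version = "0.1.0"

[tool.poetry.dependencies]
python = "^3.10"
paramiko = "4.0.0"
```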
I saw other scripts online adding the following to the databricks.yml:
# A set of artifacts to build before deploying
artifacts:
  x:
    type: whl
    build: poetry build
    path: .
I'm not sure if this is needed? Also, I'm not sure what to put in the environment block, since I read that it's required for serverless jobs. I assume I now have to reference the wheel there instead of listing the dependencies individually? How would this work?
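My current guess is something like the below, but it's untested and the wheel path under dist/ is just my assumption about where poetry build puts it:

```yaml
environments:
  - environment_key: default
    spec:
      client: "1"
      dependencies:
        # guessing I can point at the locally built wheel here
        - ./dist/*.whl
```

Is that the right way to wire the built wheel into the serverless environment, or does it need a workspace path after deployment?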
Thanks!