What they mention in the API docs is that you can create a job with a sql_task of type Alert. To make this easier, you can create the job in the UI first and download the JSON config. Here is an example with the main parameters that should ...
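As a rough sketch of what that downloaded JSON might look like, here is a minimal create-job payload with a sql_task pointing at an alert. The warehouse and alert IDs below are placeholders, and the exact set of fields should be checked against the Jobs API reference for your workspace:

```python
# Minimal sketch of a Jobs API create payload with a SQL alert task.
# "<your-warehouse-id>" and "<your-alert-id>" are placeholders.
payload = {
    "name": "alert-refresh-job",
    "tasks": [
        {
            "task_key": "refresh_alert",
            "sql_task": {
                "warehouse_id": "<your-warehouse-id>",
                "alert": {
                    "alert_id": "<your-alert-id>",
                    "pause_subscriptions": False,
                },
            },
        }
    ],
}
```

You would POST this body to the jobs/create endpoint; comparing it against a JSON config exported from a UI-created job is the safest way to confirm the field names.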
You can export the complete job configuration in different ways:
- You can use the REST API: https://docs.databricks.com/api/azure/workspace/jobs/get
- You can use the Python SDK: https://github.com/databricks/databricks-sdk-py
- You can use Terraform: http...
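For the REST API option, a small sketch of how the jobs/get request is assembled (using only the standard library; the host, job ID, and token below are placeholders you would replace with your own):

```python
import urllib.request

# Placeholder values -- substitute your workspace URL, job ID, and PAT.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
JOB_ID = 123


def build_get_job_request(host: str, job_id: int, token: str) -> urllib.request.Request:
    # jobs/get returns the full job settings as JSON, which you can
    # save and reuse as a job configuration export.
    url = f"{host}/api/2.1/jobs/get?job_id={job_id}"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"}
    )


req = build_get_job_request(HOST, JOB_ID, "TOKEN")
# With real credentials: urllib.request.urlopen(req).read() -> job JSON
```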
You could use something like flake8 and customize the rules in the .flake8 file, or suppress specific lines with # noqa.
https://flake8.pycqa.org/en/latest/user/configuration.html
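For reference, a small .flake8 config sketch using common options from the flake8 documentation (the specific values here are just illustrative defaults, not recommendations):

```ini
[flake8]
# Illustrative settings -- adjust to your project's conventions.
max-line-length = 120
extend-ignore = E203, W503
exclude = .git,__pycache__,build
```

Individual lines can still opt out with a trailing `# noqa` or, more narrowly, `# noqa: E501`.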
You don't need dbutils for this type of parameter. You can read the arguments with sys.argv. This example uses a Python wheel, but you can use it as a reference: https://docs.databricks.com/en/workflows/jobs/how-to/use-python-wheels-in-w...
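A minimal sketch of a wheel entry point that reads its parameters this way (the entry-point name and the sample arguments are hypothetical):

```python
import sys


def main(argv=None):
    # Databricks passes task parameters as ordinary command-line
    # arguments, so a plain sys.argv read works -- no dbutils needed.
    argv = sys.argv[1:] if argv is None else argv
    return argv


if __name__ == "__main__":
    # e.g. parameters ["--env", "prod"] in the task config arrive here
    print(main())
```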