4 weeks ago
Hi folks, I'm trying to set up a Databricks Asset Bundle for a job that loads some product data into Databricks. The job was created in the Databricks workspace and loads the data from a location hardcoded in the notebook (for now). It is supposed to run every 3 hours. I generated the local YAML files with the `databricks bundle generate job --existing-job-id <job-id>` command.
Then I tried deploying it back to the workspace with `databricks bundle deploy -t dev`, expecting a job like `[dev <your-username>] <project-name>_job` to show up. Instead I got the following error:
```
Deploying resources...
Error: terraform apply: exit status 1
Error: Insufficient file_arrival blocks
on bundle.tf.json line 69, in resource.databricks_job.product_autoloader.trigger:
69: },
At least 1 "file_arrival" blocks are required.
```
I'm not sure why this happens. It has something to do with the `file_arrival` key under `trigger` in the config, but I don't need that setting: the path is hardcoded in the notebook, and the job is scheduled anyway. This is my job YAML file:
```
# (YAML not shown)
```
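From the error, my guess is that the generator emitted a `trigger` block with no usable entry under it, something along these lines (a sketch of the shape, not my exact file):
```
trigger:
  pause_status: UNPAUSED
  # nothing else under trigger: Terraform requires at least one
  # file_arrival (or similar) block here, so an empty one fails to apply
```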
Can someone guide me on this? Thanks
Accepted Solutions
4 weeks ago - last edited 4 weeks ago
I think since it is a scheduled job, you have to explicitly specify a cron-based `schedule` instead of the `file_arrival` trigger in the YAML file.
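For a job that should run every 3 hours, the trigger section can be replaced with something like this (a sketch; adjust the cron expression and timezone to your setup):
```
schedule:
  # Quartz cron: seconds minutes hours day-of-month month day-of-week
  quartz_cron_expression: "0 0 0/3 * * ?"  # at the top of every 3rd hour
  timezone_id: "UTC"
  pause_status: UNPAUSED
```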
3 weeks ago
Yep, fixed it by dropping the `trigger` block and adding a cron-based `schedule` to the job YAML.
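Roughly like this, with the generated `trigger` block removed (cron and timezone shown as an example; job name taken from the generated config):
```
resources:
  jobs:
    product_autoloader:
      name: product_autoloader
      schedule:
        quartz_cron_expression: "0 0 0/3 * * ?"  # run every 3 hours
        timezone_id: "UTC"
      # trigger / file_arrival block deleted
```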