- 20480 Views
- 13 replies
- 2 kudos
I have created a number of workflows in the Databricks UI. I now need to deploy them to a different workspace. How can I do that? Code can be deployed via Git, but the job definitions are stored only in the workspace.
Latest Reply
@itacdonev provided a great option. @Dean_Lovelace, you can also select View JSON on the workflow and switch to the Create tab; with that JSON you can call the API https://docs.databricks.com/api/workspace/jobs/create and create the job in the target workspace.
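For illustration, a minimal sketch of that approach, assuming the Create JSON has been saved locally and a personal access token exists for the target workspace (the host, token, and file name below are placeholders):

```python
# Minimal sketch: recreate a job in a target workspace from its exported JSON.
# Assumes the "Create" JSON was copied from View JSON on the source workflow.
import json
import requests

TARGET_HOST = "https://<target-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder

# Job settings exported via View JSON -> Create on the source workflow
with open("job_settings.json") as f:
    job_settings = json.load(f)

resp = requests.post(
    f"{TARGET_HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_settings,
)
resp.raise_for_status()
print("Created job_id:", resp.json()["job_id"])
```

The create call returns the new job_id, which you can keep for later updates to the job definition in the target workspace.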
12 More Replies
- 3102 Views
- 2 replies
- 3 kudos
I have created a job that contains a notebook that reads a file from Azure Storage. The file name contains the date when the file was transferred to the storage. A new file arrives every Monday, and the read job is scheduled to run every Monday. I...
Latest Reply
Hi, I guess the files are in the same directory structure, so you can use the cloud files Auto Loader. It will incrementally read only new files: https://docs.microsoft.com/en-us/azure/databricks/spark/latest/structured-streaming/auto-loader So it will ...
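As a rough sketch of that suggestion, an Auto Loader stream that picks up only newly arrived files on each scheduled run (the storage path, file format, schema, checkpoint location, and target table are assumptions):

```python
# Minimal Auto Loader sketch: each run ingests only files not yet processed.
# Runs on Databricks, where `spark` is the provided SparkSession.
from pyspark.sql.types import StructType, StructField, StringType

# Hypothetical schema of the weekly files
schema = StructType([
    StructField("id", StringType()),
    StructField("value", StringType()),
])

df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "csv")  # incoming file format (assumption)
    .schema(schema)
    .load("abfss://container@account.dfs.core.windows.net/incoming/")  # placeholder path
)

# The checkpoint records which files were already ingested, so the Monday run
# only processes the newly arrived file regardless of its date-stamped name.
(
    df.writeStream
    .option("checkpointLocation", "/mnt/checkpoints/weekly_ingest")  # placeholder
    .trigger(once=True)  # process available files, then stop (fits a scheduled job)
    .toTable("bronze.weekly_files")  # placeholder target table
)
```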
1 More Replies
- 1415 Views
- 0 replies
- 0 kudos
I am trying to run an incremental data processing job using a Python wheel. The job is scheduled to run e.g. every hour. For my code to know which data increment to process, I inject the {{start_time}} parameter as part of the command line, like so: ["end_dat...
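For illustration only, a minimal sketch of a wheel entry point that picks up such injected values from its command-line parameters (the parameter names and their rendered format are assumptions, since the job definition above is truncated):

```python
# Minimal sketch: a Python wheel entry point reading an injected time window
# from command-line parameters. Names and value formats are assumptions.
import argparse


def main() -> None:
    parser = argparse.ArgumentParser(description="Incremental processing job")
    parser.add_argument("--start_time", required=True,
                        help="Window start as rendered by {{start_time}}")
    parser.add_argument("--end_time", required=True,
                        help="Window end (hypothetical parameter)")
    args = parser.parse_args()

    # The rendered format of {{start_time}} depends on the job configuration,
    # so treat the values as strings here and parse them as your job expects.
    print(f"Processing increment from {args.start_time} to {args.end_time}")
    # ... incremental processing logic would go here ...


if __name__ == "__main__":
    main()
```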