- 2489 Views
- 3 replies
- 3 kudos
I have created a job that contains a notebook that reads a file from Azure Storage. The file name contains the date on which the file was transferred to the storage account. A new file arrives every Monday, and the read job is scheduled to run every Monday. I...
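A minimal sketch of the pattern being described: building the date-stamped path from the schedule (the most recent Monday) and reading it in a notebook. The storage account, container, file prefix, and format below are placeholders, not the poster's actual values.

```python
from datetime import date, timedelta

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Find the most recent Monday (the day the file lands and the job runs).
today = date.today()
last_monday = today - timedelta(days=today.weekday())

# Hypothetical ADLS Gen2 location and naming convention, e.g. sales_2024-01-15.csv
path = (
    "abfss://landing@myaccount.dfs.core.windows.net/"
    f"weekly/sales_{last_monday.isoformat()}.csv"
)

df = spark.read.option("header", "true").csv(path)
df.show(5)
```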
Latest Reply
Hi @Karolin Albinsson, just a friendly follow-up. Do you still need help, or did @Hubert Dudek (Customer)'s response help you find the solution? Please let us know.
- 14483 Views
- 10 replies
- 2 kudos
I have created a number of workflows in the Databricks UI. I now need to deploy them to a different workspace. How can I do that? Code can be deployed via Git, but the job definitions are stored in the workspace only.
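One way to move a job definition between workspaces is to export its settings with the Jobs REST API (2.1) and re-create it in the target. A minimal sketch is below; the hostnames, tokens, and job ID are placeholders, and Databricks Asset Bundles or Terraform are usually the more maintainable route for ongoing CI/CD.

```python
import requests

SRC_HOST = "https://adb-1111111111111111.11.azuredatabricks.net"  # source workspace
DST_HOST = "https://adb-2222222222222222.22.azuredatabricks.net"  # target workspace
SRC_TOKEN = "<source-pat>"
DST_TOKEN = "<target-pat>"
JOB_ID = 123456789  # placeholder job ID in the source workspace

# Export the job settings from the source workspace.
resp = requests.get(
    f"{SRC_HOST}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {SRC_TOKEN}"},
    params={"job_id": JOB_ID},
)
resp.raise_for_status()
settings = resp.json()["settings"]

# Re-create the job in the target workspace from the same settings.
# Cluster policies, repo paths, and permissions may need adjusting per environment.
created = requests.post(
    f"{DST_HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {DST_TOKEN}"},
    json=settings,
)
created.raise_for_status()
print("New job id:", created.json()["job_id"])
```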
Latest Reply
Hello everyone, I need the same help from a Databricks expert. I have created a job 'Job1' with runtime 12.2 in the 'Datbricks1' workspace. I have integrated it with an Azure repo and tried deploying to 'ENV1' using a CI/CD pipeline. It is successfully deployed in...
- 1159 Views
- 0 replies
- 0 kudos
I am trying to run an incremental data processing job using a Python wheel. The job is scheduled to run, e.g., every hour. For my code to know which data increment to process, I inject the {{start_time}} value as part of the command line, like so: ["end_dat...
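A minimal sketch of a wheel entry point that reads an injected start time from the command line. The --start-time flag name and the epoch-milliseconds interpretation are assumptions for illustration; they are not taken from the truncated parameter list above, so verify the exact format your job actually passes.

```python
import argparse
from datetime import datetime, timedelta, timezone


def main() -> None:
    parser = argparse.ArgumentParser(description="Hourly incremental load")
    parser.add_argument(
        "--start-time",
        required=True,
        help="Value injected via the {{start_time}} placeholder (assumed epoch ms)",
    )
    args = parser.parse_args()

    # Interpret the injected value as milliseconds since the UNIX epoch, UTC.
    run_start = datetime.fromtimestamp(int(args.start_time) / 1000, tz=timezone.utc)

    # Process the hour preceding the scheduled run.
    window_end = run_start.replace(minute=0, second=0, microsecond=0)
    window_start = window_end - timedelta(hours=1)

    print(f"Processing increment [{window_start}, {window_end})")
    # ... read and process only the data that falls inside this window ...


if __name__ == "__main__":
    main()
```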