05-03-2022 02:18 AM
I have created a job that contains a notebook that reads a file from Azure Storage.
The file-name contains the date of when the file was transferred to the storage. A new file arrives every Monday, and the read-job is scheduled to run every Monday.
In my notebook, I want to use the schedule-date of the job to read the file from Azure Storage with the same date in the filename, something like this:
file_location = file_name + "_" + job_date + "_" + country_id + ".csv"
I have tried passing a date as a parameter, and I can access it from the notebook. But if the job fails and I re-run it the next day, I have to manually enter yesterday's date as the input parameter. I want to avoid this and just use the actual scheduling date of the job.
How do I access the job scheduling date from within the notebook?
Thanks in advance
Karolin
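One possible workaround (a sketch, not from the thread): since the job is scheduled for Mondays, the notebook can derive the most recent Monday itself, so a re-run on Tuesday still targets Monday's file. The names `file_name` and `country_id` below are illustrative placeholders, not real values from the job:

```python
from datetime import date, timedelta

def most_recent_monday(today=None):
    """Return the date of the most recent Monday (today itself, if it is a Monday)."""
    today = today or date.today()
    # weekday() is 0 for Monday, so subtracting it always lands on Monday
    return today - timedelta(days=today.weekday())

# Hypothetical names mirroring the pattern in the question:
file_name = "sales"   # assumption: base name of the weekly file
country_id = "SE"     # assumption: example country code
job_date = most_recent_monday().isoformat()
file_location = file_name + "_" + job_date + "_" + country_id + ".csv"
```

This keeps the notebook parameter-free for normal runs while still allowing an explicit date override if one is ever needed.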
05-03-2022 04:58 AM
Hi, I assume the files land in the same directory structure, so you can use the cloud files Auto Loader. It will incrementally read only new files: https://docs.microsoft.com/en-us/azure/databricks/spark/latest/structured-streaming/auto-loader
So it works the other way around: instead of building the filename from the date, you take the date from the input file, using:
.withColumn("filePath", input_file_name())
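To illustrate the suggestion above (a sketch only, assuming a hypothetical filename pattern like `sales_2022-05-02_SE.csv`): once `input_file_name()` is stored in a column, the date can be extracted with a regular expression, e.g. via `regexp_extract` from `pyspark.sql.functions`. The helper below shows the same pattern in plain Python so it can be checked in isolation:

```python
import re

# Assumed filename pattern: <name>_<YYYY-MM-DD>_<country>.csv (hypothetical)
DATE_PATTERN = r"_(\d{4}-\d{2}-\d{2})_"

def extract_date(file_path):
    """Pull the transfer date out of a full path such as
    'abfss://container@account.dfs.core.windows.net/in/sales_2022-05-02_SE.csv'."""
    match = re.search(DATE_PATTERN, file_path)
    return match.group(1) if match else None

# In the streaming DataFrame itself, the equivalent would be something like:
#   F.regexp_extract(F.col("filePath"), r"_(\d{4}-\d{2}-\d{2})_", 1)
# applied after .withColumn("filePath", F.input_file_name())
```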
05-13-2022 03:27 AM
Hi @Karolin Albinsson, just a friendly follow-up. Do you still need help, or did @Hubert Dudek's response help you find the solution? Please let us know.