
How to access the job scheduling date from within the notebook?

karolinalbinsso
New Contributor II

I have created a job that contains a notebook that reads a file from Azure Storage.

The file name contains the date when the file was transferred to the storage. A new file arrives every Monday, and the read job is scheduled to run every Monday.

In my notebook, I want to use the scheduling date of the job to read the file from Azure Storage that has the same date in its filename, something like this:

file_location = file_name + "_" + job_date + "_" + country_id + ".csv"

I have tried to pass a date as a parameter, and I am able to access it from the notebook. But if the job fails and I want to re-run it the next day, I would have to manually enter yesterday's date as the input parameter. I want to avoid this and just use the actual scheduling date of the job.
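For reference, a minimal sketch of the parameter approach described above (the function and variable names are hypothetical, and the fallback to today's date is only illustrative):

```python
from datetime import date

def build_file_location(file_name, job_date, country_id):
    # Assemble the blob name from its parts, e.g. "sales_2022-03-07_SE.csv".
    return file_name + "_" + job_date + "_" + country_id + ".csv"

# In the job, job_date would come from a job/notebook parameter; using
# today's date as a default is exactly what breaks on a next-day re-run.
job_date = date.today().isoformat()
print(build_file_location("sales", "2022-03-07", "SE"))  # sales_2022-03-07_SE.csv
```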

How do I access the job scheduling date from within the notebook?

Thanks in advance

Karolin

1 ACCEPTED SOLUTION

Accepted Solutions

Hubert-Dudek
Esteemed Contributor III

Hi, I guess the files are in the same directory structure, so you can use the cloud-files Auto Loader. It will incrementally read only new files: https://docs.microsoft.com/en-us/azure/databricks/spark/latest/structured-streaming/auto-loader

So it works the other way around: you take the date from the input file's path, for example:

.withColumn("filePath", input_file_name())
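A sketch of what extracting the date from that path could look like; the path layout and date format are assumptions. In PySpark you would typically combine `input_file_name()` with `regexp_extract`, and the regex itself can be checked in plain Python:

```python
import re

# After .withColumn("filePath", input_file_name()), each row carries its full
# source path. In Spark the date would be pulled out with something like
# regexp_extract(col("filePath"), r"_(\d{4}-\d{2}-\d{2})_", 1).
# The same pattern, checked in plain Python:
DATE_IN_NAME = re.compile(r"_(\d{4}-\d{2}-\d{2})_")

def date_from_path(file_path):
    # Return the yyyy-MM-dd part of a name like ".../sales_2022-03-07_SE.csv",
    # or None if the pattern is absent.
    match = DATE_IN_NAME.search(file_path)
    return match.group(1) if match else None

print(date_from_path("abfss://in@acct.dfs.core.windows.net/sales_2022-03-07_SE.csv"))
# 2022-03-07
```

This way a re-run the next day still picks up the date that is actually in the filename, instead of relying on when the job happened to run.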


2 REPLIES 2


Kaniz
Community Manager

Hi @Karolin Albinsson, just a friendly follow-up. Do you still need help, or did @Hubert Dudek's response help you find the solution? Please let us know.
