Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Input File Path from Autoloader in Delta Live Tables

Enzo_Bahrami
New Contributor III

Hello everyone!

I was wondering if there is any way to get the subdirectories in which a file resides while loading it with Auto Loader in DLT. For example:

import dlt
from pyspark.sql.functions import input_file_name

@dlt.table
def customer():
    return (
        spark.readStream.format('cloudFiles')
            .option('cloudFiles.format', 'json')
            .load('/Users/customer/dynamic_dlt/raw/customer/customer_*.json')
            .withColumn('input_file', input_file_name())
    )

But instead of input_file_name(), is there something like input_file_path() that would give me the directory the file came from?

1 ACCEPTED SOLUTION

pvignesh92
Honored Contributor

Hi @Parsa Bahraminejad, I'm not aware of a built-in function that returns just the subdirectory name, but you can easily get it by splitting the value of input_file_name() on '/', since it contains the complete file path.

Please let me know if this helps.
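
A minimal sketch of that approach, assuming the same JSON source path as in your snippet; the path_parts and parent_dir column names are just illustrative:

import dlt
from pyspark.sql.functions import input_file_name, split, element_at

@dlt.table
def customer():
    df = (
        spark.readStream.format('cloudFiles')
            .option('cloudFiles.format', 'json')
            .load('/Users/customer/dynamic_dlt/raw/customer/customer_*.json')
            .withColumn('input_file', input_file_name())
    )
    # input_file holds the full path; split it on '/' into an array, then
    # pick the pieces you need: element_at(..., -1) is the file name,
    # element_at(..., -2) is the immediate parent directory.
    return (
        df.withColumn('path_parts', split('input_file', '/'))
          .withColumn('parent_dir', element_at('path_parts', -2))
    )

From path_parts you can also pull out any higher-level subdirectory by index if you need more than the immediate parent.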


2 REPLIES


Anonymous
Not applicable

Hi @Parsa Bahraminejad,

We haven't heard from you since the last response from @Vigneshraja Palaniraj, and I was checking back to see if their suggestion helped you.

Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.
