Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Auto Loader bringing NULL Records

subhas
New Contributor II

Hi,

I am using Auto Loader to fetch records stored in two files; my code is below. It fetches the records from the two files correctly, but then it starts fetching NULL records. I attached option("cleanSource", "archive") to the readStream, but it is not working: the directory "/FileStore/archive" shown below is never created. Please help me.

 

schema = "user_id long, device_id long, mac_address string, registration_timestamp double"

(spark.readStream
    .format("cloudFiles")
    .schema(schema)
    .option("cloudFiles.format", "csv")
    .option("maxFilesPerTrigger", 1)
    .option("header", True)
    .option("cleanSource", "archive")
    .option("sourceArchiveDir", "/FileStore/archive")
    .option("mergeSchema", True)
    .load("/FileStore/auto_loader")
    .writeStream
    .format("delta")
    .option("checkpointLocation", "/FileStore/checkpoint")
    .outputMode("append")
    .toTable("tempdb.bzTable")
)

 

 

1 REPLY

Brahmareddy
Esteemed Contributor

Hi subhas,

How are you doing today? As I understand it, the issue is happening because you're using /FileStore, which isn't fully supported by Auto Loader's cleanSource option. Even though the code looks mostly fine, Auto Loader expects the source and archive paths to be in DBFS or cloud storage (such as S3 or ADLS), not /FileStore, which is more like a local scratch space. That's likely why it isn't moving files to the archive folder and ends up reading them again, leading to those NULL records.

I'd suggest switching your paths to something like /tmp/auto_loader and /tmp/archive, or using full DBFS paths like /dbfs/tmp/.... Once that's in place, Auto Loader should be able to archive the files properly and avoid reprocessing. Let me know if you want help updating those paths!
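To make that concrete, here is a minimal sketch of the same stream with the source and archive paths moved out of /FileStore. The /tmp/auto_loader, /tmp/archive, and /tmp/checkpoint paths are assumptions for illustration; the schema and table name come from the original post. This runs only on a Databricks runtime where spark and the cloudFiles source are available.

```python
# Sketch: Auto Loader stream with cleanSource archiving, using DBFS
# paths outside /FileStore. Paths below are placeholder assumptions.
schema = "user_id long, device_id long, mac_address string, registration_timestamp double"

(spark.readStream
    .format("cloudFiles")
    .schema(schema)
    .option("cloudFiles.format", "csv")
    .option("header", True)
    .option("cleanSource", "archive")
    .option("sourceArchiveDir", "/tmp/archive")       # archive dir outside /FileStore
    .load("/tmp/auto_loader")                         # source dir outside /FileStore
    .writeStream
    .format("delta")
    # use a fresh checkpoint after changing source paths, so old
    # file-discovery state does not carry over
    .option("checkpointLocation", "/tmp/checkpoint")
    .outputMode("append")
    .toTable("tempdb.bzTable")
)
```

Note that a new checkpoint location is used here: reusing the old checkpoint after moving the source directory can leave stale file-tracking state behind.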

Regards,

Brahma
