Hi @RameshChejarla
You can connect Databricks to Snowflake using the Snowflake Spark connector (the `snowflake` data source format). Once the connection is in place, you can use Auto Loader to capture file metadata as files arrive and write that metadata into a Snowflake table for tracking.
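As a starting point, here is a minimal sketch of the Auto Loader side: it ingests files from a source path and pulls tracking columns from the built-in `_metadata` column. The source path, schema location, file format, and column choices are placeholders you would adjust for your pipeline, and `spark` is assumed to be the ambient Databricks session.

```python
# Sketch: build a file-tracking DataFrame with Auto Loader.
# <source_path> and <schema_location> are placeholders -- substitute your own.
from pyspark.sql.functions import col, current_timestamp

tracking_df = (
    spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")  # adjust to your actual file format
        .option("cloudFiles.schemaLocation", "<schema_location>")
        .load("<source_path>")
        # _metadata exposes per-file details for each ingested record.
        .select(
            col("_metadata.file_path").alias("file_path"),
            col("_metadata.file_name").alias("file_name"),
            col("_metadata.file_size").alias("file_size"),
            col("_metadata.file_modification_time").alias("file_modification_time"),
        )
        .withColumn("ingested_at", current_timestamp())
)
```

Note that this produces a *streaming* DataFrame, which matters for how the Snowflake write below is wired up.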
- Set up the Snowflake connection

```python
# Connection options for the Snowflake Spark connector.
# Avoid hardcoding credentials in notebooks -- prefer a Databricks secret scope,
# e.g. dbutils.secrets.get(scope="<scope>", key="<key>").
sf_options = {
    "sfURL": "<your_snowflake_account>.snowflakecomputing.com",
    "sfWarehouse": "<warehouse>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfUser": "<username>",
    "sfPassword": "<password>"
}
```
- Write file tracking data to Snowflake

```python
# Append the captured file metadata to a tracking table in Snowflake.
tracking_df.write.format("snowflake") \
    .options(**sf_options) \
    .option("dbtable", "<file_tracking_table>") \
    .mode("append") \
    .save()
```
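One caveat: if `tracking_df` comes from Auto Loader it is a streaming DataFrame, and the Snowflake Spark connector only supports batch writes, so the `.write` call above cannot be used on it directly. The usual pattern is to wrap the batch write in `foreachBatch`. A sketch, where the table name and checkpoint path are placeholders:

```python
# Sketch: write each micro-batch of the streaming tracking_df to Snowflake.
def write_to_snowflake(batch_df, batch_id):
    # batch_df is a plain (non-streaming) DataFrame, so the batch writer works here.
    (batch_df.write.format("snowflake")
        .options(**sf_options)
        .option("dbtable", "<file_tracking_table>")
        .mode("append")
        .save())

(tracking_df.writeStream
    .foreachBatch(write_to_snowflake)
    .option("checkpointLocation", "<checkpoint_path>")
    .trigger(availableNow=True)  # process available files, then stop
    .start())
```

The checkpoint location lets Auto Loader track which files it has already processed, so reruns only pick up new arrivals.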