I am trying to migrate a Spark job from an on-premises Hadoop cluster to Databricks on Azure. Currently, we keep many values in a properties file. When executing spark-submit we pass `--properties-file /prop.file.txt`, and inside the Spark code we call `spark.conf.get("spark.param1")` to read individual parameter values. How can we implement a properties file in a Databricks notebook?
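For context, here is a minimal sketch of what I mean, assuming a simple Java-style `key=value` properties file (the helper name and file path are just placeholders for illustration):

```python
# Hypothetical sketch: parse a simple key=value .properties file in plain
# Python, so its values could be set on spark.conf inside a notebook.
def load_properties(path):
    """Return a dict of key -> value from a Java-style properties file."""
    props = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines and comments
            if not line or line.startswith(("#", "!")):
                continue
            key, sep, value = line.partition("=")
            if sep:
                props[key.strip()] = value.strip()
    return props

# In a notebook one could then do something like (sketch):
# for k, v in load_properties("/dbfs/prop.file.txt").items():
#     spark.conf.set(k, v)
```

Is something along these lines the recommended approach in Databricks, or is there a built-in equivalent of `--properties-file`?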