Data Engineering
spark properties files

dataguy73
New Contributor

I am trying to migrate a Spark job from an on-premises Hadoop cluster to Databricks on Azure. Currently we keep many values in a properties file. When executing spark-submit we pass the parameter --properties-file /prop.file.txt, and inside the Spark code we use spark.conf.get("spark.param1") to get individual parameter values. How can we implement a properties file in a Databricks notebook?
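For context, the properties file passed via --properties-file is plain key=value lines. A minimal sketch of a parser for that format (the keys and values below are illustrative, not from the original job):

```python
def parse_properties(text):
    """Parse key=value lines, skipping blanks and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

sample = """
# sample properties
spark.param1=hello
spark.executor.memory=4g
"""
props = parse_properties(sample)
param1 = props["spark.param1"]
```

This lets the same key=value file be reused in a notebook without spark-submit, looking values up in the dict instead of calling spark.conf.get.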

1 ACCEPTED SOLUTION

Accepted Solutions

-werners-
Esteemed Contributor III

I use JSON files and .conf files that reside on the data lake or in the FileStore on DBFS.

Then I read those files using Python/Scala.
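A minimal sketch of this approach with a JSON file. On Databricks the file would live on DBFS, e.g. a hypothetical path like /dbfs/FileStore/configs/app_config.json; here a sample file is created locally just to show the round trip:

```python
import json
import os
import tempfile

def load_config(path):
    """Read a JSON properties file into a dict."""
    with open(path) as f:
        return json.load(f)

# On Databricks this path would be something like
# "/dbfs/FileStore/configs/app_config.json" (assumed location).
path = os.path.join(tempfile.mkdtemp(), "app_config.json")
with open(path, "w") as f:
    json.dump({"spark.param1": "value1", "batch.size": 500}, f)

config = load_config(path)
param1 = config["spark.param1"]  # replaces spark.conf.get("spark.param1")
```

The same load_config call works unchanged in a notebook, since DBFS paths under /dbfs/ are readable with ordinary Python file I/O.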


2 REPLIES


Kaniz
Community Manager

Hi @vishal dutt, were you able to implement the properties file in the Databricks notebook?
