03-23-2023 11:10 AM
Hi All,
Could you please suggest the best way to write PySpark code for Databricks?
I don't want to write my code in a Databricks notebook; instead, I'd like to build a modular Python project in VS Code and call only the primary function from the notebook (the rest of the logic will live in Python files).
What is the best way to achieve this?
Thanks,
Deepak
03-24-2023 11:57 PM
@Deepak Bhatt:
Yes, you can write your PySpark code in modular Python files outside of Databricks and then call them from a Databricks notebook. Here are the steps you can follow:
1. Write your PySpark logic in one or more Python files (for example, my_pyspark_code.py) in VS Code.
2. Make the files available to your cluster, for example by syncing the project into a Databricks Repo or uploading the files to your workspace.
3. Import the module in your notebook:
import my_pyspark_code
4. Call the main function in your Python file from the Databricks notebook. For example, if your main function is named run_spark_job(), you can call it like this:
my_pyspark_code.run_spark_job()
By following these steps, you can write your PySpark code in a modular and maintainable way outside of Databricks, and then easily call it from a Databricks notebook.
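To make this concrete, here is a minimal sketch of what such a module could look like. The module name my_pyspark_code and the function run_spark_job() come from the answer above; everything else (the helper function, column names, and sample data) is a hypothetical example, not the poster's actual project. Keeping pure Python helpers separate from the Spark entry point also lets you unit-test most of the logic in VS Code without a Spark installation:

```python
# my_pyspark_code.py -- a minimal sketch of a modular PySpark file.

def build_flag_expr(threshold: int) -> str:
    """Pure helper: builds the SQL expression used by the Spark job.

    No Spark dependency, so it can be unit-tested locally.
    """
    return f"value > {threshold} AS over_threshold"


def run_spark_job(threshold: int = 10) -> None:
    """Entry point called from the Databricks notebook."""
    # Imported lazily so the pure helpers above stay importable
    # on a machine without PySpark installed.
    from pyspark.sql import SparkSession

    # On Databricks, getOrCreate() reuses the notebook's existing session.
    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([(1,), (15,)], ["value"])
    df.selectExpr("value", build_flag_expr(threshold)).show()
```

In the notebook you would then only need `import my_pyspark_code` followed by `my_pyspark_code.run_spark_job()`, as described above.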
03-25-2023 10:48 PM
Hi @Deepak Bhatt,
Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!