02-07-2022 04:53 AM
Hi all,
I have custom code (PySpark & Spark SQL) in notebooks which I want to deploy at a customer location, encapsulated so that end customers can't see the actual code. Currently all of our code lives in notebooks (PySpark/Spark SQL). Could you please let me know:
1) Is there any way to run the code as executables/jars?
2) How can we convert the code into executables?
Thanks a lot for the help.
02-08-2022 02:33 AM
With notebooks that is not possible.
You can write your code in Scala/Java and build a jar, which you then run with spark-submit.
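Since your code is already PySpark, the non-notebook route doesn't have to involve Scala at all: you can refactor the notebook logic into a plain Python entry point and run it with spark-submit. A minimal sketch (file, table, and column names are hypothetical placeholders for your real logic):

```python
# job.py -- hypothetical PySpark entry point, refactored out of a notebook
from pyspark.sql import SparkSession

def main():
    spark = SparkSession.builder.appName("customer-etl").getOrCreate()

    # Placeholder transformation standing in for the real notebook logic
    df = spark.read.table("input_table")
    result = df.groupBy("some_column").count()
    result.write.mode("overwrite").saveAsTable("output_table")

if __name__ == "__main__":
    main()
```

You would then launch it with something like `spark-submit job.py` instead of attaching a notebook.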
Or keep it in Python and deploy a wheel.
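A minimal packaging sketch for the wheel route, assuming setuptools (project and package names are placeholders):

```python
# setup.py -- minimal sketch; name and version are placeholders
from setuptools import find_packages, setup

setup(
    name="my-etl-job",
    version="0.1.0",
    packages=find_packages(),  # picks up e.g. my_etl_job/ with the PySpark code
    install_requires=[],       # list real runtime dependencies here
)

# Build the wheel, then submit the entry-point script along with it:
#   python setup.py bdist_wheel
#   spark-submit --py-files dist/my_etl_job-0.1.0-py3-none-any.whl job.py
```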
This can become quite complex when you have dependencies.
Also: a jar (or wheel) can be decompiled, which may render your use case useless.
02-08-2022 02:48 AM
Hi, thanks for the reply. Is it possible to restrict view access with some premium plan? Please let me know. Thank you.
02-08-2022 02:50 AM
Yes, certainly.
You can put permissions on notebooks and folders (see the sketch below).
But the customer will probably be an admin of the Databricks workspace, in which case they can change those permissions and see everything anyway.
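For completeness, here is a hypothetical sketch of setting a notebook ACL through the Databricks Permissions REST API (host, token, object ID, and group name are all placeholders; verify the exact endpoint against your workspace's API docs). Users with no entry in the ACL cannot open the notebook, assuming workspace access control is enabled on your plan:

```python
# Hypothetical: restrict a notebook so only a named group can run it.
import requests

DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                                  # placeholder
NOTEBOOK_ID = "<notebook-object-id>"  # numeric object_id of the notebook

resp = requests.patch(
    f"{DATABRICKS_HOST}/api/2.0/permissions/notebooks/{NOTEBOOK_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "access_control_list": [
            # CAN_RUN allows attaching and running; CAN_MANAGE is for admins
            {"group_name": "customer-users", "permission_level": "CAN_RUN"},
        ]
    },
)
resp.raise_for_status()
```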