
Encapsulate Databricks PySpark/Spark SQL code

Databricks_7045
New Contributor III

Hi all,

I have custom code (PySpark & Spark SQL) in notebooks which I want to deploy at a customer location, encapsulated so that end customers can't see the actual code. Currently all of our code lives in notebooks (PySpark/Spark SQL). Could you please let me know:

1) Is there any way to run the code as executables/JARs?

2) How can the code be converted into executables?

Thanks a lot for the help.


4 REPLIES

Kaniz_Fatma
Community Manager

Hi @Databricks_7045! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question first; otherwise I will get back to you soon. Thanks.

-werners-
Esteemed Contributor III

(Accepted Solution)

With notebooks, that is not possible.

You can write your code in Scala or Java and build a JAR, which you then run with spark-submit; a minimal sketch follows.
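For illustration, a minimal sketch of the JAR route; the object name, app name, and artifact path are all hypothetical:

// MainApp.scala: a minimal Spark entry point (hypothetical names)
import org.apache.spark.sql.SparkSession

object MainApp {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("encapsulated-job")
      .getOrCreate()

    // Port the notebook logic here, e.g. a Spark SQL statement:
    spark.sql("SELECT current_date() AS run_date").show()

    spark.stop()
  }
}

Build it with sbt (or Maven) and run the resulting JAR on the cluster:

sbt package
spark-submit --class MainApp target/scala-2.12/encapsulated-job_2.12-0.1.0.jar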

Or use Python and deploy a wheel; see the sketch below.
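A similar sketch for the wheel route, assuming a hypothetical package name and layout (encapsulated_job/__init__.py, encapsulated_job/main.py, setup.py):

# --- encapsulated_job/main.py ---
from pyspark.sql import SparkSession

def run():
    spark = SparkSession.builder.appName("encapsulated-job").getOrCreate()
    # Port the notebook logic here, e.g.:
    spark.sql("SELECT current_date() AS run_date").show()

# --- setup.py ---
from setuptools import setup, find_packages

setup(
    name="encapsulated-job",
    version="0.1.0",
    packages=find_packages(),
)

Build the wheel with python setup.py bdist_wheel (it lands in dist/), install it on the cluster, and call encapsulated_job.main.run() from a small stub notebook or a job.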

This can become quite complex when you have dependencies.

Also: a JAR (and likewise a wheel) can be decompiled, which may render your use case moot.

Databricks_7045
New Contributor III

Hi, thanks for the reply. Is it possible to restrict view access with some premium plan? Please let me know. Thank you.

-werners-
Esteemed Contributor III

Yes, certainly.

You can put permissions on notebooks and folders; see the sketch below.

But the customer will probably be the admin of the Databricks workspace, in which case permissions won't hide anything from them.
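For reference, a minimal sketch using the Databricks Permissions REST API; the workspace URL, notebook ID, and user are hypothetical, and PUT replaces the notebook's direct permissions:

curl -X PUT https://<workspace-url>/api/2.0/permissions/notebooks/<notebook-id> \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"access_control_list": [{"user_name": "dev@example.com", "permission_level": "CAN_MANAGE"}]}'

Users left out of the access control list can no longer open the notebook, but as noted above, workspace admins retain access regardless.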
