
Notebook exposure

ShankarM
Contributor

I have created a notebook as per the client's requirements. I have to migrate the notebook to the client environment for testing with live data, but I do not want to expose the Databricks notebook code to the testers in that environment.

Is there a way to package the notebooks so that they cannot be opened in the client workspace, and testers get only execute permission with no view/read/edit permission?

 

2 REPLIES

SP_6721
Contributor III

Hi @ShankarM,

There isn't a direct way to package a Databricks notebook so that it can only be executed without exposing the code. The suggested way is to move your sensitive logic into a Python/Scala/Java package (for example, a .whl or .jar), upload it to DBFS, Workspace Files, or a Unity Catalog volume, and install it on the cluster. You can then create a minimal notebook or job that simply calls this package and share only that with the testers. This lets them run the workflow in the client environment while keeping the actual code hidden inside the package.
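For illustration, here is a minimal sketch of that approach, assuming a hypothetical package called client_pipeline with a run_pipeline function (every name below is a placeholder, not anything from your project):

# client_pipeline/pipeline.py -- the sensitive logic ships inside the wheel,
# so it never appears as readable notebook code in the workspace.
from pyspark.sql import DataFrame, SparkSession


def run_pipeline(input_table: str, output_table: str) -> None:
    """Read the live data, apply the client transformations, write the result."""
    spark = SparkSession.builder.getOrCreate()
    df: DataFrame = spark.read.table(input_table)
    # ... proprietary transformations go here, hidden inside the package ...
    df.write.mode("overwrite").saveAsTable(output_table)


# --- Thin wrapper notebook shared with the testers ---------------------------
# The wheel is installed on the cluster (Libraries tab) or inline, e.g.
#   %pip install /Volumes/<catalog>/<schema>/<volume>/client_pipeline-0.1.0-py3-none-any.whl
# Testers only ever see this call, not the implementation.
from client_pipeline.pipeline import run_pipeline

run_pipeline(input_table="live_db.source_table", output_table="live_db.test_output")

Share only the wrapper notebook (or a job that runs it) with the testers; the actual logic lives in the installed package rather than in a workspace notebook they can open.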

WiliamRosa
New Contributor II

Hi @ShankarM,
I've had to do something similar: packaging a Python class as a wheel. This documentation might help: https://docs.databricks.com/aws/en/dev-tools/bundles/python-wheel
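For reference, a minimal packaging sketch for the same hypothetical client_pipeline package mentioned above; the build/deploy workflow itself is what the linked bundle docs cover, and the version and dependencies here are placeholders:

# setup.py -- minimal wheel packaging sketch for the hypothetical client_pipeline
# package; build it locally (e.g. python -m build, which needs the `build` package)
# or let a Databricks Asset Bundle artifact build it as described in the docs above.
from setuptools import find_packages, setup

setup(
    name="client_pipeline",
    version="0.1.0",
    packages=find_packages(),
    install_requires=[],  # add any runtime dependencies here
)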

Wiliam Rosa
Data Engineer | Machine Learning Engineer
LinkedIn: linkedin.com/in/wiliamrosa
