Notebook exposure
08-29-2025 02:45 AM
I have created a notebook per a client requirement. I need to migrate the notebook to the client environment for testing with live data, but I do not want to expose the Databricks notebook code to the testers in that environment.
Is there a way to package the notebook so that it cannot be opened in the client workspace, granting only execute permission and no view/read/edit permission?
08-29-2025 06:26 AM
Hi @ShankarM ,
There isn’t a direct way to make a Databricks notebook execute-only while hiding its code. The usual approach is to move your sensitive logic into a Python/Scala/Java package (for example, a .whl or .jar), upload it to DBFS, Workspace Files, or a Unity Catalog volume, and install it on the cluster. You then create a minimal wrapper notebook or job that simply calls into this package and share only that wrapper with the testers. They can run the workflow in the client environment, but the actual logic stays compiled inside the package.
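A minimal sketch of the split described above. The package name (`client_etl`), function name (`run_pipeline`), and environment string are hypothetical placeholders, not anything from the original notebook:

```python
# Hypothetical packaged logic. In practice this function would live in
# client_etl/pipeline.py, be built into a .whl, and be installed on the
# cluster, so testers never see this source.
def run_pipeline(env: str) -> str:
    # Sensitive transformation logic stays inside the wheel.
    return f"pipeline executed against {env}"


# The thin wrapper notebook shared with testers would contain only
# something like this (after `from client_etl.pipeline import run_pipeline`):
result = run_pipeline("client-test")
print(result)
```

Testers with run permission on the wrapper notebook see only the import and the call; the wheel itself is bytecode in a library, not a notebook they can open.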
08-29-2025 06:28 AM
Hi @ShankarM,
I’ve had to do something similar—packaging a Python class as a wheel. This documentation might help: https://docs.databricks.com/aws/en/dev-tools/bundles/python-wheel
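For reference, a Databricks Asset Bundle that builds and deploys a wheel (the approach in the linked docs) can look roughly like the sketch below. The bundle name, job name, and entry point are placeholder assumptions, and cluster/compute configuration is omitted for brevity:

```yaml
# databricks.yml (abbreviated sketch, placeholder names)
bundle:
  name: my_project

artifacts:
  default:
    type: whl
    path: .

resources:
  jobs:
    run_hidden_logic:
      name: run-hidden-logic
      tasks:
        - task_key: main
          python_wheel_task:
            package_name: my_project
            entry_point: main
          libraries:
            - whl: ./dist/*.whl
```

`databricks bundle deploy` then builds the wheel and wires it into the job, so testers interact with the job rather than the source.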
Data Engineer | Machine Learning Engineer
LinkedIn: linkedin.com/in/wiliamrosa