Deploying Databricks Asset Bundles entirely from a notebook, without using the CLI or VS Code, is not a standard workflow, but it can be orchestrated using newer features in the Databricks workspace UI and programmatic workspace operations. Databricks has recently enhanced its UI to support some bundle creation and deployment steps directly, which can replace much of the CLI-based process for simpler scenarios.
Deploying Asset Bundles in the Databricks Workspace
- The UI now enables you to create Databricks Asset Bundles, define jobs, add notebooks, and manage deployments from within the workspace, all without the CLI or external IDEs. You start by creating a bundle from the workspace menu, add your notebook, set up your job's YAML, and deploy it using built-in buttons in the UI.
- After creating the bundle and resources, jobs can be triggered directly from the workspace. The bundle YAML supports specifying tasks, notebooks, clusters, and deployment targets, which you can edit either through simple YAML edit windows or in the resource view.
- Advanced bundle configurations can be managed in the UI, including Python package builds and multi-environment resources.
Bundling and Packaging in a Notebook
You can script key operations in a Databricks notebook using Python. For advanced cases (such as auto-packaging a Python module or deploying via the REST API), the following are options:
- Programmatically generate the asset bundle YAML and job definitions from the notebook (see the first sketch after this list).
- Use the Databricks REST API from within the notebook to create jobs, upload notebooks, and configure clusters, bypassing the CLI. While the UI is the main path for bundles, REST calls are flexible for automation (see the second sketch after this list).
- For local code packaging, you can use Python code in the notebook to create a wheel file as part of your asset bundle, referencing official patterns for workspace-integrated builds (see the third sketch after this list).
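For the first option, the sketch below writes a minimal `databricks.yml` for a single-job bundle from notebook code. It is a sketch only: the bundle name, job name, notebook path, cluster settings, and output location are placeholder assumptions, and the exact schema your workspace accepts may vary by Databricks version.

```python
# Minimal sketch: generate a databricks.yml for a single-job bundle from a notebook.
# All names, paths, and cluster settings below are placeholders -- adjust for your workspace.
import yaml  # PyYAML ships with standard Databricks runtimes

bundle_config = {
    "bundle": {"name": "demo_bundle"},  # hypothetical bundle name
    "targets": {
        "dev": {
            "mode": "development",
            "workspace": {"host": "https://<your-workspace>.cloud.databricks.com"},
        }
    },
    "resources": {
        "jobs": {
            "demo_job": {
                "name": "demo_job",
                "tasks": [
                    {
                        "task_key": "main",
                        "notebook_task": {"notebook_path": "./src/demo_notebook.py"},
                        "new_cluster": {
                            "spark_version": "15.4.x-scala2.12",
                            "node_type_id": "i3.xlarge",
                            "num_workers": 1,
                        },
                    }
                ],
            }
        }
    },
}

# Writing to a workspace files path is an assumption; any location your notebook can
# write to (e.g. a repo folder or a volume) works the same way.
with open("/Workspace/Users/<you>/demo_bundle/databricks.yml", "w") as f:
    yaml.safe_dump(bundle_config, f, sort_keys=False)
```

From there, the generated file can be edited further in the workspace UI like any hand-written bundle configuration.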
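For the REST option, the second sketch creates a notebook job with the Jobs API 2.1 directly from a notebook. Pulling the host and token from the notebook context via `dbutils` is one common approach (a personal access token stored in a secret scope works as well); the job name, notebook path, and cluster settings are placeholders.

```python
# Minimal sketch: create a notebook job via the Jobs API 2.1 from inside a notebook,
# bypassing the CLI. Names, paths, and cluster settings are placeholders.
import requests

# Assumption: obtain the workspace URL and an API token from the notebook context.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
host = ctx.apiUrl().get()
token = ctx.apiToken().get()

job_spec = {
    "name": "demo_job_from_notebook",  # hypothetical job name
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Workspace/Users/<you>/demo_notebook"},
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 1,
            },
        }
    ],
}

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=job_spec,
)
resp.raise_for_status()
print("Created job_id:", resp.json()["job_id"])
```

Once the job exists, a POST to `/api/2.1/jobs/run-now` with the returned `job_id` triggers a run, mirroring what the "run" button does in the workspace UI.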
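For the packaging option, the third sketch lays out a tiny package on local disk and builds a wheel with `pip`. The package name, version, and build location are placeholders, and it assumes `pip` and `setuptools` are available on the cluster, as they are on standard Databricks runtimes.

```python
# Minimal sketch: build a wheel for a small package from within a notebook so it can
# later be referenced by the bundle. Paths and package name are placeholders.
import pathlib
import subprocess
import sys

pkg_root = pathlib.Path("/tmp/demo_pkg")  # hypothetical build location
(pkg_root / "demo_pkg").mkdir(parents=True, exist_ok=True)
(pkg_root / "demo_pkg" / "__init__.py").write_text("def hello():\n    return 'hello'\n")
(pkg_root / "setup.py").write_text(
    "from setuptools import setup, find_packages\n"
    "setup(name='demo_pkg', version='0.1.0', packages=find_packages())\n"
)

# Build the wheel with pip; --no-deps keeps the output to just this package.
subprocess.run(
    [sys.executable, "-m", "pip", "wheel", str(pkg_root),
     "-w", str(pkg_root / "dist"), "--no-deps"],
    check=True,
)
print(list((pkg_root / "dist").glob("*.whl")))
```

The resulting `.whl` can then be copied to a workspace path or volume and referenced from the bundle YAML so deployed jobs can install it.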
Key Limitations and Tips
- Full automation, especially for complex, multi-resource asset bundles, may still require manual bundle file configuration, even in the UI.
- The UI is best for straightforward deployments and configurations. For fully custom workflows, programmatic invocation (via the REST API or notebook scripting) is recommended.
- Some workspace actions (e.g., bundle validation, advanced packaging) might still require the Databricks CLI for now, depending on your Databricks version and workspace capabilities.
References and Further Reading
- Databricks Asset Bundles in the UI: a step-by-step tutorial covering the process within the workspace, including bundle creation, job definition, notebook addition, and deployment.
- Asset Bundle configuration syntax and advanced techniques: details on defining resources and deployments in YAML files.
- Building and deploying Python packages as part of the asset bundle: supported for workspace builds.
- For REST API automation, see the job creation and management guides for scripting deployments from notebooks.
If you need a specific code template or more detailed steps on automating bundle creation directly from a notebook, or on using the REST API for deploying jobs, please specify your Databricks version and workspace setup for more targeted instructions.