a week ago
Hi, is there a simple way to sync a local notebook with a Databricks notebook? For example, is it possible to just connect to the Databricks kernel or something similar?
I know there are IDE extensions for this, but unfortunately, they use the local development environment to interact with Databricks. As a result, I sometimes have to install additional libraries or adjust settings — for instance, I need to use a token to interact with MLflow.
Accepted Solutions
a week ago - last edited a week ago
Hi @Kabi, to my knowledge Databricks doesn't support connecting directly to a notebook's kernel from a local editor. However, here are some practical ways to keep your local notebook in sync with Databricks:
- You can use Git to version control your notebooks. Clone your repo into Databricks via Repos, edit locally or in Databricks, and push changes to keep both environments in sync.
- Install databricks-connect to execute local code on a Databricks cluster. Ensure your local Python version matches the cluster’s runtime.
- Use the Databricks VS Code extension to sync files between your local machine and your Databricks workspace. This makes it easy to develop locally while using Databricks clusters for execution.
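A common stumbling block with databricks-connect is the Python version mismatch mentioned above. A minimal pre-flight check you could run locally might look like this (the target version is an assumption; look up the actual Python version shipped with your cluster's Databricks Runtime):

```python
import sys

def python_matches_cluster(cluster_version):
    """Return True if the local interpreter's (major, minor) version
    matches the cluster runtime's Python version."""
    return sys.version_info[:2] == tuple(cluster_version)

# Example: some Databricks Runtime 13.x releases ship Python 3.10
# (verify against the release notes for your runtime).
if not python_matches_cluster((3, 10)):
    print("Local Python differs from the cluster; databricks-connect may fail.")
```

Once the versions line up, `DatabricksSession.builder.getOrCreate()` from `databricks.connect` gives you a Spark session backed by the remote cluster.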
And as you mentioned, for MLflow, set the DATABRICKS_TOKEN environment variable so your local client can securely connect to the tracking server hosted on Databricks.
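For the MLflow case, the token can be supplied via environment variables rather than hard-coded; a minimal sketch (the workspace URL and token below are placeholders):

```python
import os

# Placeholders: substitute your workspace URL and a personal access token.
os.environ["DATABRICKS_HOST"] = "https://<your-workspace>.cloud.databricks.com"
os.environ["DATABRICKS_TOKEN"] = "<personal-access-token>"

# With these set, mlflow.set_tracking_uri("databricks") authenticates the
# local MLflow client against the workspace-hosted tracking server.
```

In practice you would export these in your shell profile or a `.env` file instead of setting them in code, so the token never lands in version control.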
In my experience, the first two options are the simplest way to keep notebooks in sync.

