Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Simple notebook sync

Kabi
New Contributor II

Hi, is there a simple way to sync a local notebook with a Databricks notebook? For example, is it possible to just connect to the Databricks kernel or something similar?

I know there are IDE extensions for this, but unfortunately they use the local development environment to interact with Databricks. As a result, I sometimes have to install additional libraries or adjust settings; for instance, I need to use a token to interact with MLflow.

1 ACCEPTED SOLUTION


Renu_
Contributor

Hi @Kabi, to my knowledge Databricks doesn't support connecting directly to a Databricks notebook kernel from a local environment. However, here are practical ways to keep your local notebook in sync with Databricks:

  • You can use Git to version control your notebooks. Clone your repo into Databricks via Repos, edit locally or in Databricks, and push changes to keep both environments in sync.
  • Install databricks-connect to execute local code on a Databricks cluster. Ensure your local Python version matches the cluster’s runtime.
  • Use the Databricks VS Code extension to sync files between your local machine and your Databricks workspace, making it easy to develop locally while running code on Databricks clusters.
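On the databricks-connect point above, the Python version mismatch is worth checking up front, since databricks-connect fails with a confusing error otherwise. A minimal sketch, assuming the target cluster runs a runtime shipping Python 3.10 (the exact version is an assumption here; check your cluster's Databricks Runtime release notes):

```python
import sys

# Assumed Python version of the cluster's Databricks Runtime; verify this
# against your cluster's runtime release notes before relying on it.
EXPECTED = (3, 10)

def version_matches(expected=EXPECTED):
    """Return True if the local interpreter's major.minor version
    matches the cluster runtime's Python version."""
    return sys.version_info[:2] == expected

if not version_matches():
    print(f"Local Python {sys.version_info[:2]} differs from "
          f"cluster Python {EXPECTED}; databricks-connect may refuse to run.")
```

Running this once before installing databricks-connect saves a round-trip debugging a version error on the cluster side.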

And as you mentioned, for MLflow, set up your DATABRICKS_TOKEN so you can securely connect to tracking servers hosted on Databricks.
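The token setup amounts to a couple of environment variables; a config sketch with placeholder values (the workspace URL and token below are not real and must be replaced with your own):

```python
import os

# Placeholders, not real credentials: substitute your workspace URL and a
# personal access token generated in your Databricks user settings.
os.environ["DATABRICKS_HOST"] = "https://<your-workspace>.cloud.databricks.com"
os.environ["DATABRICKS_TOKEN"] = "<personal-access-token>"

# With these set, pointing MLflow at the Databricks-hosted tracking server
# is a one-liner: mlflow.set_tracking_uri("databricks")
```

Prefer exporting these in your shell profile or a secrets manager over hard-coding them in notebooks.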

From my experience, the first two options are the simplest way to keep notebooks in sync.


