08-02-2022 03:19 AM
Hi Team, we have a scenario where we need to connect to the Databricks SQL instance in workspace 1 from another Databricks workspace (workspace 2), using a notebook or Azure Data Factory. Can you please help?
- Labels: Databricks SQL, SQL
Accepted Solutions
08-02-2022 05:08 AM
Hello @Jayesh Mehta
You can use the Databricks SQL Connector for Python from a notebook:
https://docs.databricks.com/dev-tools/python-sql-connector.html
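A minimal sketch of that connector, assuming workspace 2 exposes a SQL warehouse (or cluster) whose "Connection details" tab gives you the server hostname and HTTP path, and that you have a personal access token for that workspace. All hostnames, paths, and tokens below are placeholders.

```python
# Requires: pip install databricks-sql-connector

def connection_args(server_hostname: str, http_path: str,
                    access_token: str) -> dict:
    """Bundle the three values databricks.sql.connect() expects."""
    return {
        "server_hostname": server_hostname,
        "http_path": http_path,
        "access_token": access_token,
    }

def fetch_rows(query: str, **conn_args):
    """Run one query against the remote SQL warehouse and return all rows."""
    from databricks import sql  # imported lazily; see pip note above
    with sql.connect(**conn_args) as connection:
        with connection.cursor() as cursor:
            cursor.execute(query)
            return cursor.fetchall()

# Usage from a notebook in workspace 1 (placeholder values):
# rows = fetch_rows(
#     "SELECT 1",
#     **connection_args("adb-1234567890123456.7.azuredatabricks.net",
#                       "/sql/1.0/warehouses/abcdef1234567890",
#                       "dapi-your-token"))
```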
08-02-2022 05:59 AM
Thanks a lot 🙂
08-02-2022 06:32 AM
- You can also mount the same storage folder in both Databricks workspaces and use COPY INTO, for example.
- In ADF you can use the Databricks Delta Lake connector to read from one Delta table and insert into another.
- If it is a one-time copy (e.g. a migration), you can use ADF to copy just the files between storage accounts; that is the fastest option.
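The first option above can be sketched as follows. This is an assumption-laden example: it presumes both workspaces mount the same storage folder, the source data has been written out as Parquet files under that folder, and the target Delta table already exists; all table and path names are placeholders.

```python
def copy_into_statement(target_table: str, source_path: str) -> str:
    """Build a COPY INTO statement that loads Parquet files from a
    shared mounted folder into an existing Delta table."""
    return (f"COPY INTO {target_table} "
            f"FROM '{source_path}' "
            "FILEFORMAT = PARQUET")

# In a notebook on the destination workspace:
# spark.sql(copy_into_statement("target_db.events", "/mnt/shared/events/"))
```

COPY INTO is idempotent: files already loaded are skipped on re-runs, which makes it convenient for incremental copies over a shared mount.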
08-03-2022 10:01 AM
Make sure that you actually need such an operation in principle: Databricks also offers table Clones as well as the Delta Sharing feature, and either might cover your use case.
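For the Clone suggestion, a minimal sketch, assuming the source table's Delta folder is reachable from the destination workspace (e.g. the same ADLS container mounted in both); table and path names are placeholders.

```python
def deep_clone_statement(target_table: str, source_path: str) -> str:
    """Build a DEEP CLONE statement that copies both data and metadata
    from a Delta folder into a table on the destination side."""
    return (f"CREATE OR REPLACE TABLE {target_table} "
            f"DEEP CLONE delta.`{source_path}`")

# In a notebook on the destination workspace:
# spark.sql(deep_clone_statement("analytics.events", "/mnt/shared/events"))
```

A deep clone re-copies only changed files on subsequent runs, so it can also serve as a simple incremental sync between the two workspaces.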

08-03-2022 11:07 AM
Thanks for jumping in to help, @Arvind Ravish, @Hubert Dudek, and @Artem Sheiko!

