09-06-2022 03:34 AM
Can I use compute instances from different providers i.e. AWS and Azure on the same Databricks workspace?
- Labels:
  - AWS
  - Azure
  - Compute Instances
Accepted Solutions
09-06-2022 04:38 AM
Hi @Liam ODonoghue, there are a few methods you can use to connect AWS and Azure resources directly, but those involve only your own accounts. With Databricks, you would need to manage two accounts, one per cloud provider. For example, if you create a workspace on AWS, the data plane is created in your AWS account and peered with the control plane in Databricks' account; the same applies on Azure. You could use a VPN tunnel, ExpressRoute, or a site-to-site VPN to connect your cloud providers and share resources between them, but making that setup work with Databricks is not possible, as it would have to go through extensive security compliance review, plus further research and testing, to ensure there is no breach from any source. The product and security teams may consider this in the future, since many enterprises adopt a hybrid cloud; you can raise a feature request via our ideas portal.
TL;DR: This is not possible in Databricks for now.
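To make the "one workspace per cloud" point concrete, here is a minimal Python sketch of the kind of per-cloud provisioning descriptors involved. The field names below are illustrative assumptions, not the official Databricks Accounts API schema; the point is only that each workspace is bound to exactly one provider, so multi-cloud means maintaining separate workspaces:

```python
# Illustrative sketch (assumed field names, not the real Accounts API):
# each workspace descriptor carries settings for exactly one cloud.

aws_workspace = {
    "workspace_name": "analytics",
    "cloud": "aws",
    "aws_region": "us-east-1",                 # AWS-specific settings
    "storage_configuration": "s3-root-bucket",
}

azure_workspace = {
    "workspace_name": "analytics",
    "cloud": "azure",
    "location": "westeurope",                  # Azure-specific settings
    "managed_resource_group": "databricks-rg",
}

# No single descriptor mixes providers: running compute in both clouds
# means one workspace (and one deployment account) per provider.
for ws in (aws_workspace, azure_workspace):
    assert ws["cloud"] in {"aws", "azure"}
```

Under this model, sharing data between the two environments would happen at the storage or networking layer (e.g. the VPN/ExpressRoute options mentioned above), not by attaching another cloud's instances to an existing workspace.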
09-06-2022 04:15 AM
Nice one. Does Databricks support multi-cloud? This is not possible, as the cloud vendors are different and each has a different configuration.
Databricks folks @Jose Gonzalez @Prabakar Ammeappin @Abishek Subramanian @Kaniz Fatma can give you some input on this.

