Hi @Kaviana, To use a VPC that is already connected to your Databricks deployment in order to extract information from a server, you have two options:
1. Use the default VPC provided by Databricks: By default, your workspace compute resources, such as clusters, are created in a GKE cluster inside a Virtual Private Cloud (VPC) network that Databricks creates and configures in your account.
2. Use a customer-managed VPC: Alternatively, you can create your workspaces in an existing customer-managed VPC that you create in your account. This gives you more control over your network configuration, which helps you comply with any cloud security and governance standards your organization requires. To use a customer-managed VPC, specify it when creating the workspace through the account console. You cannot move an existing workspace with a Databricks-managed VPC into your own VPC.
Also, you cannot change which customer-managed VPC the workspace uses after workspace creation.
You can, however, share one customer-managed VPC across multiple workspaces in a single account; you do not have to create a new VPC for each workspace. That said, you cannot reuse its subnets or security groups with other resources, including other workspaces or non-Databricks resources.
Please note that you would need to set up your VPC, subnets, and security groups following the instructions provided by Databricks. Then copy the IDs of each object for the next step, in which you register them with Databricks and receive a network ID representing your new network.
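As a rough sketch of that registration step (assuming an AWS-style deployment where the Account API accepts a VPC ID, subnet IDs, and security group IDs; all names and IDs below are placeholders, not real resources), the request body you POST to the Databricks Account API's network configurations endpoint looks like this:

```python
import json

def build_network_config(network_name, vpc_id, subnet_ids, security_group_ids):
    """Assemble the JSON body for registering a customer-managed VPC
    as a network configuration with the Databricks Account API."""
    return {
        "network_name": network_name,
        "vpc_id": vpc_id,
        "subnet_ids": subnet_ids,
        "security_group_ids": security_group_ids,
    }

# Placeholder IDs -- substitute the ones you copied from your cloud console.
payload = build_network_config(
    network_name="my-databricks-network",
    vpc_id="vpc-0123456789abcdef0",
    subnet_ids=["subnet-aaaa1111", "subnet-bbbb2222"],
    security_group_ids=["sg-cccc3333"],
)
print(json.dumps(payload, indent=2))

# You would then POST this payload (authenticated) to the account-level
# networks endpoint; the response includes a network ID that you reference
# when creating the workspace.
```

This is only an illustration of the payload shape; consult the Databricks documentation for your cloud provider for the exact endpoint, authentication, and required fields, since they differ between AWS and GCP deployments.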