12-06-2022 03:51 AM
Can anyone let me know whether there is any way to access Delta tables from a different workspace in the workspace where we run our pipelines, using Python?
12-06-2022 03:56 AM
Are the other workspace's Delta tables external tables, or are they stored in DBFS itself?
12-06-2022 04:00 AM
I think you can use Delta Sharing for this, or just mount the same storage in your workspace and access the Hive table from the current workspace.
12-06-2022 04:05 AM
Exactly. If that's an external table, he can simply mount the storage in the workspace and proceed.
12-06-2022 06:39 AM
Yes, mounting the storage is the best option.
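The mount-and-read approach above can be sketched in Python. This is a minimal sketch assuming an Azure Blob Storage container; the storage account, container, and key placeholders are assumptions, not values from this thread, and `dbutils`/`spark` only exist inside a Databricks notebook or job.

```python
# Sketch: mount the other workspace's storage, then read the Delta table.
# Storage account, container, and key names below are placeholders.

def build_mount_args(storage_account: str, container: str) -> dict:
    """Build the arguments dbutils.fs.mount expects for a wasbs source."""
    return {
        "source": f"wasbs://{container}@{storage_account}.blob.core.windows.net",
        "mount_point": f"/mnt/{container}",
        "extra_configs": {
            f"fs.azure.account.key.{storage_account}.blob.core.windows.net":
                "<storage-account-key>"  # fetch from a secret scope in practice
        },
    }

def mount_and_read(spark, dbutils, storage_account: str, container: str, table_path: str):
    """Mount the container (on a Databricks cluster) and read the Delta table."""
    args = build_mount_args(storage_account, container)
    dbutils.fs.mount(**args)  # only available in a Databricks runtime
    return spark.read.format("delta").load(f"{args['mount_point']}/{table_path}")
```

Once mounted, the table is also reachable from SQL by creating an external table over the mount path.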
12-06-2022 02:48 PM
Hi @Hemanth A,
there are some details missing.
If you are using Unity Catalog, you can create a new workspace and attach it to the existing Unity Catalog metastore, so the data is available in both workspaces (you still have to grant access to the data itself); this is one of the many benefits of using UC.
You can also, as was already mentioned, create external tables to access the data.
If you are in a different region, a different Databricks account, or a different cloud, you can also use Delta Sharing :).
thanks,
Pat.
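The Delta Sharing option mentioned above can be sketched with the open-source `delta-sharing` Python connector (`pip install delta-sharing`). The share, schema, and table names here are placeholders, and the `.share` profile file comes from the data provider.

```python
# Sketch: read a table shared via Delta Sharing as a pandas DataFrame.
# Share/schema/table names are placeholders.

def shared_table_url(profile_path: str, share: str, schema: str, table: str) -> str:
    """Build the table URL the connector expects: <profile>#<share>.<schema>.<table>."""
    return f"{profile_path}#{share}.{schema}.{table}"

def read_shared_table(profile_path: str, share: str, schema: str, table: str):
    """Load the shared table (requires `pip install delta-sharing`)."""
    import delta_sharing  # imported lazily so the URL helper works without it
    return delta_sharing.load_as_pandas(
        shared_table_url(profile_path, share, schema, table)
    )
```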
12-10-2022 02:07 AM
@Hemanth A
Go to the workspace you want data from. In the SQL Warehouses tab you will find the connection details: copy the server hostname and the HTTP path, and generate a personal access token. With these credentials you can access that workspace's data from any other workspace.
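The credentials described above can be used from Python with the Databricks SQL connector (`pip install databricks-sql-connector`). This is a hedged sketch: the hostname, HTTP path, token, and table names are placeholders you substitute with the values copied from the other workspace.

```python
# Sketch: query another workspace's data over its SQL warehouse endpoint.
# Hostname, HTTP path, and token are placeholders from the warehouse's
# connection details page.

def select_all(catalog: str, schema: str, table: str, limit: int = 10) -> str:
    """Build a simple SELECT over a fully qualified table name."""
    return f"SELECT * FROM {catalog}.{schema}.{table} LIMIT {limit}"

def fetch_rows(server_hostname: str, http_path: str, access_token: str, query: str):
    """Run a query against the remote workspace and return all rows."""
    from databricks import sql  # requires databricks-sql-connector
    with sql.connect(
        server_hostname=server_hostname,  # e.g. adb-<id>.<n>.azuredatabricks.net
        http_path=http_path,              # copied from the warehouse connection details
        access_token=access_token,        # personal access token generated there
    ) as conn:
        with conn.cursor() as cursor:
            cursor.execute(query)
            return cursor.fetchall()
```

Note this reads through the remote workspace's SQL warehouse rather than mounting its storage, so it works even when you cannot share the underlying storage account.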