Data Engineering

Heman2
Valued Contributor II

Can anyone let me know: is there any way we can access Delta tables from a different workspace in the workspace where we run our pipelines using Python?

6 REPLIES

UmaMahesh1
Honored Contributor III

Are the other workspace's Delta tables external tables, or are they stored in DBFS itself?

Ajay-Pandey
Esteemed Contributor III

I think you can use Delta Sharing for this, or just mount the same storage to your workspace and access the Hive table from the current workspace.
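A minimal sketch of the mount approach, assuming Azure storage with an account key kept in a secret scope; the container, storage account, secret scope, and key names are placeholders:

```python
# Mount the other workspace's storage container. All bracketed names
# are placeholders -- substitute your own values.
dbutils.fs.mount(
    source="wasbs://<container>@<storage-account>.blob.core.windows.net",
    mount_point="/mnt/shared-data",
    extra_configs={
        "fs.azure.account.key.<storage-account>.blob.core.windows.net":
            dbutils.secrets.get(scope="<secret-scope>", key="<storage-key>")
    },
)

# Once mounted, the other workspace's Delta table reads like a local path.
df = spark.read.format("delta").load("/mnt/shared-data/path/to/table")
```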

UmaMahesh1
Honored Contributor III

Exactly... if it's an external table, they can simply mount the storage in the workspace and proceed.

Hubert-Dudek
Esteemed Contributor III

Yes, mounting the storage is the best option.

Pat
Honored Contributor III

Hi @Hemanth A,

There are some details missing.

If you are using Unity Catalog, you can create a new workspace and attach it to the existing Unity Catalog metastore so the data is available in both workspaces (you still have to grant access to the data itself). This is one of the many benefits of using UC.
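For example, a minimal sketch of granting a group access, assuming UC is enabled; the catalog, schema, table, and group names are placeholders:

```python
# Grant a group access to a UC table; names below are placeholders.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data-engineers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.sales TO `data-engineers`")
spark.sql("GRANT SELECT ON TABLE main.sales.orders TO `data-engineers`")

# Any workspace attached to the same metastore can then read it directly.
df = spark.table("main.sales.orders")
```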

You can also, as was already mentioned, create external tables to access the data.
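A minimal sketch of registering an external table over the shared storage location; the path and table name are placeholders:

```python
# Register an external Delta table pointing at the shared location.
spark.sql("""
    CREATE TABLE IF NOT EXISTS shared_orders
    USING DELTA
    LOCATION 'abfss://<container>@<storage-account>.dfs.core.windows.net/path/to/table'
""")

df = spark.table("shared_orders")
```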

If you are in a different region, a different Databricks account, or a different cloud, you can also use Delta Sharing :).
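A minimal sketch with the open delta-sharing Python client (pip install delta-sharing); the profile file location and the share/schema/table names are issued by the data provider and are placeholders here:

```python
import delta_sharing

# The provider gives you a .share profile file; the share, schema, and
# table names below are placeholders.
profile = "dbfs:/FileStore/config.share"
df = delta_sharing.load_as_spark(f"{profile}#my_share.my_schema.orders")
df.show()
```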

thanks,

Pat.

Harish2122
Contributor

@Hemanth A

Go to the workspace you want data from. In the SQL Warehouses tab you will find the connection details: copy the host name and HTTP path, and generate a personal access token. With these credentials you can access that workspace's data from any other workspace.
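A minimal sketch using the databricks-sql-connector package (pip install databricks-sql-connector); the hostname, HTTP path, token, and table name are placeholders:

```python
from databricks import sql

# Values copied from the other workspace's SQL warehouse connection
# details; all of these are placeholders.
with sql.connect(
    server_hostname="<workspace-host>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT * FROM my_schema.my_table LIMIT 10")
        for row in cursor.fetchall():
            print(row)
```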