- 1724 Views
- 3 replies
- 0 kudos
Hi, we are looking for an option to copy tables of more than 50 TB from Synapse to Databricks on a weekly basis. Please suggest if there are any feasible ways to do this. We are using the connector, but it is taking too long to copy. https://learn.micro...
Latest Reply
There is no Databricks documentation on this, as Databricks is only involved for a very tiny bit: "CREATE TABLE catalog.schema.table USING PARQUET LOCATION 'url_to_the_parquet_files'". All the rest is done in Azure Data Factory, or you can even use the built-in ...
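A minimal sketch of the Databricks side of that approach, assuming the Parquet files have already been exported from Synapse to ADLS (for example via an Azure Data Factory copy pipeline); the abfss:// path and the catalog/schema/table names below are hypothetical, and `spark` is the SparkSession that Databricks notebooks provide by default:

```python
# Register Parquet files already exported from Synapse as an external table.
# The storage path and table names are hypothetical examples.
parquet_location = "abfss://exports@mystorage.dfs.core.windows.net/synapse/sales"

spark.sql(f"""
    CREATE TABLE IF NOT EXISTS main.analytics.sales
    USING PARQUET
    LOCATION '{parquet_location}'
""")
```

Because this registers the table over the existing files rather than copying them, the Databricks step is fast regardless of table size; the heavy 50 TB transfer happens in the export pipeline.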
2 More Replies
- 1774 Views
- 2 replies
- 0 kudos
So I did U2M authentication for account-level operations:

databricks auth login --host <account-login-url> --account-id <account-id>

Then I tried to run this code:

workspaces = account_client.workspaces.list()
workspace_obj = account_client.get_workspace_clien...
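For reference, a minimal sketch of that account-level flow with the Databricks Python SDK (databricks-sdk); the completed get_workspace_client call and the printed fields are assumptions based on the SDK's documented API, not taken from the truncated post:

```python
from databricks.sdk import AccountClient

# Assumes `databricks auth login --host <account-login-url> --account-id <account-id>`
# has already been run, so the SDK can resolve credentials from its default chain.
account_client = AccountClient(
    host="https://accounts.cloud.databricks.com",  # hypothetical account host
    account_id="<account-id>",
)

# List all workspaces in the account.
workspaces = list(account_client.workspaces.list())
for ws in workspaces:
    print(ws.workspace_id, ws.workspace_name)

# Get a workspace-scoped client for one workspace
# (the likely completion of the truncated get_workspace_clien... call).
workspace_obj = account_client.get_workspace_client(workspaces[0])
print(workspace_obj.current_user.me().user_name)
```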
Latest Reply
I am doing this using a service principal, so my .databrickscfg has:

[databricks-demo]
client_id = -----
client_secret = ----
host = https://accounts.cloud.databricks.com/
account_id = ----
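To pick that profile up from the Python SDK, you can pass its name to AccountClient; a minimal sketch, assuming the [databricks-demo] profile above lives in ~/.databrickscfg:

```python
from databricks.sdk import AccountClient

# Reads client_id, client_secret, host, and account_id from the
# [databricks-demo] profile in ~/.databrickscfg (service principal OAuth).
account_client = AccountClient(profile="databricks-demo")

for ws in account_client.workspaces.list():
    print(ws.workspace_name)
```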
1 More Reply