Hey, I think you'll need to use a Databricks activity instead of Copy
See:
https://learn.microsoft.com/en-us/azure/data-factory/connector-overview#integrate-with-more-data-stores
https://learn.microsoft.com/en-us/azure/data-factory/transform-data-dat...
Hi @databird ,
You can review the code of each demo by opening its content via "View the Notebooks" or by exploring the following repo: https://github.com/databricks-demos (you can search for "merge" to see all the occurrences, for example)
T...
Regardless of how you create the file (though do take a look at Volumes, I'm 100% sure you'll see the value), please try to proceed with the Excel approach I described above
Hi @VabethRamirez ,
Also, instead of calling the API directly, you can use the Databricks Python SDK:
%pip install databricks-sdk --upgrade
dbutils.library.restartPython()

from databricks.sdk import WorkspaceClient
w = WorkspaceClient()
job_list = w.jobs...
Hi,
According to When to partition tables on Databricks :
Databricks recommends you do not partition tables that contain less than a terabyte of data. If you proceed with partitions, please check that all partitions contain at least a gigabyte of data....
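As a rough illustration of that guidance, here is a minimal sketch (the helper name and thresholds-as-averages simplification are my own, not from the docs) that applies the two rules of thumb quoted above: skip partitioning under a terabyte, and aim for at least a gigabyte per partition otherwise:

```python
# Rule-of-thumb sizes
GB = 1024 ** 3
TB = 1024 ** 4

def should_partition(table_bytes: int, expected_partitions: int) -> bool:
    """Hypothetical helper applying the quoted Databricks guidance:
    don't partition tables under 1 TB, and if partitioning, each
    partition should average at least 1 GB of data."""
    if table_bytes < TB:
        # Under a terabyte: the docs recommend no partitioning at all
        return False
    # Check that partitions would average at least a gigabyte each
    return table_bytes / expected_partitions >= GB

# A 500 GB table: below the terabyte threshold, so don't partition
print(should_partition(500 * GB, 10))   # False
# A 2 TB table split into 100 partitions (~20 GB each): fine
print(should_partition(2 * TB, 100))    # True
```

This is only a sanity check on sizes; in practice you would also weigh query patterns (e.g. filtering by the partition column) before deciding.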