09-28-2022 08:33 AM
Hello Team,
I am using the df.write command and the table is getting created.
If you refer to the screenshot below, the table was created under the Tables folder in the dedicated SQL pool, but I need it under the External Tables folder.
Regards
RK
09-29-2022 12:46 AM
If you actually write into Synapse, it is not an external table; the data resides in Synapse.
If you want an external table, write the data to your data lake in Parquet/Delta Lake format and then create an external table on that location in Synapse.
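To illustrate the two steps described above, here is a minimal sketch. The storage paths, table name, and the pre-created `EXTERNAL DATA SOURCE` / `EXTERNAL FILE FORMAT` names are hypothetical placeholders; adjust them to your own storage account and dedicated SQL pool.

```python
# Step 1 (Databricks): write the DataFrame to the data lake as Parquet.
# Hypothetical path -- replace container/account with your own:
# df.write.mode("overwrite").parquet(
#     "abfss://mycontainer@myaccount.dfs.core.windows.net/sales/")

# Step 2 (Synapse dedicated SQL pool): create an external table over that
# folder. This helper just builds the T-SQL statement you would run in
# Synapse; the data source and file format objects must already exist.
def external_table_ddl(table: str, location: str,
                       data_source: str, file_format: str,
                       columns: str) -> str:
    """Build a CREATE EXTERNAL TABLE statement for a Synapse SQL pool."""
    return (
        f"CREATE EXTERNAL TABLE {table} ({columns})\n"
        f"WITH (LOCATION = '{location}',\n"
        f"      DATA_SOURCE = {data_source},\n"
        f"      FILE_FORMAT = {file_format});"
    )

print(external_table_ddl(
    "dbo.SalesExternal",            # hypothetical table name
    "/sales/",                      # folder written by Databricks in step 1
    "MyDataLake",                   # pre-created EXTERNAL DATA SOURCE
    "ParquetFormat",                # pre-created EXTERNAL FILE FORMAT
    "Id INT, Amount DECIMAL(10,2)"  # illustrative schema
))
```

Running the generated statement in Synapse makes the table appear under the External Tables folder, while the data itself stays in the lake.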
09-29-2022 12:54 AM
Could you please explain with examples?
09-29-2022 01:04 AM
External tables are not actual tables but a 'view' on top of data that resides somewhere else, NOT in the database.
If you use df.write.format(...sqldw), you will actually write into the database itself, which makes it a regular (internal) table.
If you use df.write.parquet(<path>) instead, you can create an external table on top of that data; see the link I provided.
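The contrast between the two write paths can be sketched like this. The JDBC URL, table name, and paths are illustrative placeholders; the `sqldw` connector options shown are the commonly used ones for the legacy Azure Synapse connector.

```python
def write_as_internal_table(df, jdbc_url, table, temp_dir):
    """Lands the data INSIDE the dedicated SQL pool -> a regular table
    under the Tables folder (this is what the original poster saw)."""
    (df.write
       .format("com.databricks.spark.sqldw")
       .option("url", jdbc_url)          # JDBC URL of the SQL pool
       .option("dbTable", table)         # target table in Synapse
       .option("tempDir", temp_dir)      # staging folder in the lake
       .option("forwardSparkAzureStorageCredentials", "true")
       .save())

def write_for_external_table(df, lake_path):
    """Leaves the data in the data lake as Parquet; Synapse later points
    an external table at this path (data never enters the SQL pool)."""
    df.write.mode("overwrite").parquet(lake_path)
```

Only the second path produces something you can surface under External Tables, because the external table in Synapse is just metadata over the Parquet files.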
09-29-2022 01:52 AM
Thanks for sharing the link. But this can be tested in Synapse Analytics, not in Databricks.
09-29-2022 01:54 AM
Yes, Databricks can send data to Synapse, but table management is done in Synapse, not Databricks.