External table issue
09-28-2022 08:33 AM
Hello Team,
I am using the df.write command and the table is getting created. If you look at the screenshot below, the table was created in the Tables folder of the dedicated SQL pool, but I need it in the External Tables folder.
Regards
RK
Labels: Command, External Tables
09-29-2022 12:46 AM
If you actually write into Synapse, it is not an external table; the data resides in Synapse.
If you want an external table, write the data to your data lake in Parquet/Delta Lake format and then create an external table on that location in Synapse.
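As a minimal sketch in PySpark: the storage account, container, path, and the Synapse data source / file format names below are all placeholders you would replace with your own.

# Step 1 (Databricks): write the DataFrame to the data lake as Parquet.
# "df" is the DataFrame you are already writing; the path is a placeholder.
lake_path = "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/data/sales"
df.write.mode("overwrite").parquet(lake_path)

# Step 2 (Synapse dedicated SQL pool): define an external table over that
# location. This T-SQL runs in Synapse, not Databricks; the schema, data
# source, and file format names are placeholders that must already exist:
#
#   CREATE EXTERNAL TABLE dbo.Sales (
#       Id INT,
#       Amount DECIMAL(18, 2)
#   )
#   WITH (
#       LOCATION = '/data/sales',
#       DATA_SOURCE = MyDataLake,      -- EXTERNAL DATA SOURCE pointing at the container
#       FILE_FORMAT = ParquetFormat    -- EXTERNAL FILE FORMAT of type PARQUET
#   );

The table then shows up under External Tables because only its definition lives in the pool; the data stays in the lake.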
09-29-2022 12:54 AM
Could you please explain with examples?
09-29-2022 01:04 AM
External tables are not actual tables but a 'view' on top of data that resides somewhere else, NOT in the database.
If you use df.write.format(...sqldw), you will actually write into the database itself, which makes it a regular table.
If you instead use df.write.parquet(<path>), you can create an external table on top of the data; see the link I provided.
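To make the contrast concrete, here is a sketch of the two write paths; the JDBC URL, staging directory, and table names are placeholders, and the connector options shown assume storage credential forwarding is acceptable for your setup.

# Path A: the Synapse connector writes INTO the dedicated SQL pool itself,
# which produces a regular table under "Tables".
(df.write
   .format("com.databricks.spark.sqldw")
   .option("url", "jdbc:sqlserver://myserver.sql.azuresynapse.net:1433;database=mypool")  # placeholder
   .option("tempDir", "abfss://staging@mystorageaccount.dfs.core.windows.net/tmp")        # placeholder staging dir
   .option("forwardSparkAzureStorageCredentials", "true")
   .option("dbTable", "dbo.Sales")
   .mode("overwrite")
   .save())

# Path B: writing Parquet leaves the data in the lake; an external table can
# then be defined over it in Synapse (as in the sketch in the earlier reply).
df.write.mode("overwrite").parquet(
    "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/data/sales")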
09-29-2022 01:52 AM
Thanks for sharing the link. But this can be tested in Synapse Analytics, not in Databricks.
09-29-2022 01:54 AM
Yes, Databricks can send data to Synapse, but table management is done in Synapse, not Databricks.

