04-28-2023 12:25 PM
I have a PySpark DataFrame that contains information about the tables in my SQL database (creation date, number of rows, etc.).
Sample data:
{
  "Day": "2023-04-28",
  "Environment": "dev",
  "DatabaseName": "default",
  "TableName": "discount",
  "CountRows": 31253
}

I want to write this DataFrame to a custom log table that I created in a Log Analytics workspace. Is it possible?
Thank you!
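For reference, one way to do what is asked above is the Azure Monitor HTTP Data Collector API, which ingests JSON rows into a custom `<LogType>_CL` table. This is a hedged sketch, not an official answer: the workspace ID and shared key below are placeholders, `TableStats` is a hypothetical log type, and a small stats DataFrame is assumed so that `collect()` is safe.

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

# Placeholders -- substitute your workspace ID and primary key from the portal.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
SHARED_KEY = base64.b64encode(b"example-key").decode()

def build_signature(workspace_id, shared_key, date, content_length):
    """Build the SharedKey authorization header required by the Data Collector API."""
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    return f"SharedKey {workspace_id}:{encoded_hash}"

def post_rows(rows, log_type="TableStats"):
    """POST a list of dicts to Log Analytics; log_type becomes the <log_type>_CL table."""
    import requests  # third-party; pip install requests

    body = json.dumps(rows)
    rfc1123date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(WORKSPACE_ID, SHARED_KEY, rfc1123date, len(body)),
        "Log-Type": log_type,
        "x-ms-date": rfc1123date,
    }
    uri = f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    resp = requests.post(uri, data=body, headers=headers)
    resp.raise_for_status()
```

On the Spark side the small stats DataFrame would be collected to plain dicts first, e.g. `post_rows([r.asDict() for r in df.collect()], log_type="TableStats")`. Note that this API is the legacy ingestion path; newer workspaces may prefer the Logs Ingestion API with a data collection rule.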
Labels:
- Azure
- Log Analytics
- Pyspark
- Synapse