Is it possible to write a PySpark DataFrame to a custom log table in a Log Analytics workspace?

frank7
Databricks Partner

I have a PySpark DataFrame that contains information about the tables in my SQL database (creation date, number of rows, etc.).

Sample data:

{
  "Day": "2023-04-28",
  "Environment": "dev",
  "DatabaseName": "default",
  "TableName": "discount",
  "CountRows": 31253
}

I want to write this DataFrame to a custom log table that I created in my Log Analytics workspace. Is that possible?
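One idea I had was to collect the rows on the driver and POST them to the Azure Monitor HTTP Data Collector API, which writes JSON records into a custom log table. Below is a rough sketch of what I mean; the workspace ID, shared key, and log type name are placeholders, and I'm not sure this is the recommended approach:

```python
import base64
import datetime
import hashlib
import hmac
import json
import urllib.request


def build_signature(workspace_id: str, shared_key: str, date: str, content_length: int) -> str:
    """Build the SharedKey authorization header for the Data Collector API."""
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    )
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    return f"SharedKey {workspace_id}:{encoded_hash}"


def post_rows(workspace_id: str, shared_key: str, log_type: str, rows: list) -> None:
    """POST a list of dicts to a Log Analytics custom table named <log_type>_CL."""
    body = json.dumps(rows).encode("utf-8")
    # RFC 1123 date, required by the API's signature scheme
    date = datetime.datetime.now(datetime.timezone.utc).strftime(
        "%a, %d %b %Y %H:%M:%S GMT"
    )
    req = urllib.request.Request(
        f"https://{workspace_id}.ods.opinsights.azure.com/api/logs"
        "?api-version=2016-04-01",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": build_signature(workspace_id, shared_key, date, len(body)),
            "Log-Type": log_type,  # table appears as e.g. TableStats_CL
            "x-ms-date": date,
        },
    )
    urllib.request.urlopen(req)  # raises on non-2xx responses


# On the driver, for a small DataFrame like mine (placeholder credentials):
# rows = [row.asDict() for row in df.collect()]
# post_rows(WORKSPACE_ID, SHARED_KEY, "TableStats", rows)
```

This collects the whole DataFrame to the driver, so it only seems workable for small summary tables like this one, not large datasets.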

Thank you !