I have a PySpark DataFrame that contains information about the tables I have in a SQL database (creation date, number of rows, etc.).

Sample data:

{
"Day": "2023-04-28",
"Environment": "dev",
"DatabaseName": "default",
"TableName": "discount"
...
Thanks a lot @Suteja Kanuri. And the opposite: do you know how I can read those tables back and use them as PySpark DataFrames? Once again, thank you very much!
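
To make the question concrete, something like this is roughly what I have in mind, but I am not sure it is the right approach. The JDBC URL, credentials, and table name below are placeholders from my side, not values confirmed anywhere above:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# If the table is registered in the metastore, it can be read by name
# (DatabaseName and TableName as in the metadata sample above).
discount_df = spark.read.table("default.discount")

# If it lives in an external SQL database, a JDBC read would be used instead;
# host, port, user, and password here are placeholders.
jdbc_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<host>:1433;databaseName=default")
    .option("dbtable", "discount")
    .option("user", "<user>")
    .option("password", "<password>")
    .load()
)
jdbc_df.show(truncate=False)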