Thank you, @DE_K. I see your point. I believe you are using @dlt.table instead of @dlt.create_table to begin with, since you want the table to be created and not to define an existing one. (https://community.databricks.com/t5/data-engineering/differenc...
Hey @DE_K, it's hard to follow without more context, but shouldn't that be live instead of dlt, i.e. "select * from live.{0}".format(model_name), considering you have already created the table?
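A minimal sketch of what the suggestion above means: tables defined earlier in the same Delta Live Tables pipeline are referenced through the live virtual schema, so the query string is built against live.&lt;table&gt;. The model_name value here is a placeholder, not one from the thread.

```python
# Placeholder table name; in the pipeline this would be the table
# defined earlier with @dlt.table.
model_name = "churn_features"

# Reference the pipeline-managed table via the `live` schema, not `dlt`.
query = "select * from live.{0}".format(model_name)
print(query)  # select * from live.churn_features

# Inside the pipeline you would then run, e.g.: spark.sql(query)
```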
Hi @Kingston, make sure that you have the proper permissions on the SQL Server for the user you authenticate with over JDBC, i.e. database reader / database writer (db_datareader / db_datawriter). Then your approach can go in two directions: push the data from Databrick...
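As a hedged sketch of the JDBC side, these are the connection options a Databricks-to-SQL-Server read or write typically needs; the host, database, user, and table names below are placeholders, not values from the thread.

```python
# Placeholder SQL Server JDBC URL; substitute your own host and database.
jdbc_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=mydb;encrypt=true"
)

# Options passed to the Spark JDBC data source. The user must hold the
# db_datareader / db_datawriter roles mentioned above.
connection_options = {
    "url": jdbc_url,
    "dbtable": "dbo.my_table",
    "user": "jdbc_user",
    "password": "***",
}

# On a Databricks cluster you would then run, e.g.:
# df = spark.read.format("jdbc").options(**connection_options).load()
# df.write.format("jdbc").options(**connection_options).mode("append").save()
```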
Assuming dbutils.fs.ls works without the "dbfs:/" prefix, try using the path directly, i.e. df1 = pd.read_csv("/FileStore/shared_uploads/shiv/Dea.csv"). Alternatively, adjust the path as needed if using a local file path: df1 = pd.read_csv("dbfs:/FileStore...
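One detail worth checking: pandas does not understand the "dbfs:/" URI scheme, but on Databricks the same files are exposed through the FUSE mount under "/dbfs". A small sketch of rewriting the path (the file path is the one from the thread; adjust for your workspace):

```python
# DBFS URI as Spark/dbutils sees it.
dbfs_path = "dbfs:/FileStore/shared_uploads/shiv/Dea.csv"

# Rewrite to the local FUSE-mount path that pandas can open.
local_path = dbfs_path.replace("dbfs:/", "/dbfs/", 1)
print(local_path)  # /dbfs/FileStore/shared_uploads/shiv/Dea.csv

# On a Databricks cluster you would then read it with pandas:
# import pandas as pd
# df1 = pd.read_csv(local_path)
```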
Seems like the answer is in the error. Can you try to follow this part of the documentation and confirm that you have the storage credential, external location, and external table set up correctly: https://docs.databricks.com/en/sql/language-manual/sql-ref-externa...
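For reference, a hedged sketch of the objects the linked docs describe, in the order they must exist (storage credential, then external location, then external table); every name and URL below is a placeholder, not one from the thread.

```sql
-- Requires a Unity Catalog-enabled workspace and an existing
-- storage credential named my_storage_cred (placeholder).
CREATE EXTERNAL LOCATION IF NOT EXISTS my_ext_loc
  URL 'abfss://container@storageaccount.dfs.core.windows.net/path'
  WITH (STORAGE CREDENTIAL my_storage_cred);

-- External table whose LOCATION falls under the external location above.
CREATE TABLE my_catalog.my_schema.my_ext_table
  LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/path/table';
```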