I have a general question about how a table in Databricks changes when the underlying data files in S3 change. For example, say I have a single S3 prefix containing multiple date-named folders of Parquet files, and I convert it to Delta with:

```sql
CONVERT TO DELTA parquet.`s3://example_bucket/test_1/`
PARTITIONED BY (report_date DATE);
```

I then create a table over that external location:

```sql
CREATE TABLE test_catalog.test_schema.test_1
USING DELTA
LOCATION 's3://example_bucket/test_1/';
```

If I later add a new date folder directly to that same S3 location, how do we make the table in Databricks update as well?
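For what it's worth, my first guess was that a metadata refresh might be enough, but I haven't verified this, and I suspect it isn't, since the new folder was never written through the Delta transaction log:

```sql
-- My guess (unverified): refresh the cached metadata for the table.
-- I suspect this does NOT register Parquet files dropped directly into S3,
-- because Delta reads only what is recorded in its transaction log.
REFRESH TABLE test_catalog.test_schema.test_1;
```

Is something like this the right idea, or does the new data need to be appended through a Delta write instead?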