Hi @rkand, you can update the pattern to target only files with a 2023-02 prefix in their names. This will match all files from February, regardless of the specific date and timestamp. Try PATTERN = '2023-02-*.csv.gz'. This pattern matches any files ...
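For reference, here is a minimal sketch of how that pattern might sit inside a COPY INTO statement; the target table, source path, and options are placeholders rather than your actual setup:

```sql
-- Hypothetical table and path; only the PATTERN value comes from the answer above.
COPY INTO sales_raw
FROM 'abfss://landing@mystorageaccount.dfs.core.windows.net/exports/'
FILEFORMAT = CSV
PATTERN = '2023-02-*.csv.gz'            -- matches every February 2023 file, any day or timestamp
FORMAT_OPTIONS ('header' = 'true')
COPY_OPTIONS ('mergeSchema' = 'true');
```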
It is not generated with AI; I am simply trying to give you an alternative since, as we have previously discussed, you cannot collect statistics on more than 32 columns, and as you already know, that is currently the only limitation.
Hi @Mani2105, "if I create a table in the sales catalog without specifying any external location, will the tables created be managed and will they go to the Sales storage account?" Yes, if you create a table in the sales catalog without specifying any exte...
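As a quick illustration of that default behavior, here is a sketch with placeholder schema, table, and column names:

```sql
-- Placeholders: the schema, table, and columns are illustrative.
USE CATALOG sales;
CREATE SCHEMA IF NOT EXISTS demo;

-- No LOCATION clause, so Unity Catalog creates a managed table whose data
-- lands in the managed storage location configured for the sales catalog.
CREATE TABLE demo.orders (
  order_id   BIGINT,
  order_date DATE,
  amount     DECIMAL(10, 2)
);

-- Type shows MANAGED and Location shows the catalog's managed storage path.
DESCRIBE TABLE EXTENDED demo.orders;
```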
Hi @zmsoft, although it is a very generic question that is hard to answer without knowing more about the data solution you need, I will leave you with some characteristics of both services. As always, the final decision you make will depend on the...
Hi @Kguy, good question! In Databricks, Delta Lake automatically collects statistics only for the first 32 columns in a table, for performance optimization. When the __start_at and __end_at columns are beyond this limit, they are excluded from automatic st...
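If it helps, below is a minimal sketch of one way to point statistics collection at those columns explicitly, using the Delta table property delta.dataSkippingStatsColumns; the table name is a placeholder, and whether adjusting properties on a pipeline-managed table is appropriate depends on your setup:

```sql
-- Hypothetical table name. Note that listing columns here means statistics
-- are collected only for the listed columns, so include every column you
-- still want covered for data skipping.
ALTER TABLE my_scd2_table
SET TBLPROPERTIES (
  'delta.dataSkippingStatsColumns' = '__start_at,__end_at'
);

-- On recent Databricks Runtime versions, statistics for existing files can be
-- recomputed after the property change.
ANALYZE TABLE my_scd2_table COMPUTE DELTA STATISTICS;
```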