In Data Science & Engineering -> Data -> Data Explorer, if I expand the hive_metastore catalog, expand a schema, select a table, and then view the "Sample Data" tab, I receive this error:
[DEFAULT_FILE_NOT_FOUND] It is possible the underlying files have been updated. You can explicitly invalidate the cache in Spark by running 'REFRESH TABLE tableName' command in SQL or by recreating the Dataset/DataFrame involved.
In the old UI, I would click the "Refresh" button to clear the error. That button does not seem to exist anymore. Is there a new method to REFRESH a table within the Data Explorer?
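For context, my understanding is that the error message is pointing me toward running something like the following in a notebook SQL cell or the SQL editor (the catalog, schema, and table names below are placeholders for my own), but I was hoping there is still a way to do this directly from the Data Explorer UI:

-- Placeholder identifiers; substitute your own catalog, schema, and table.
-- Invalidates Spark's cached file listing for the table so Sample Data re-reads the current files.
REFRESH TABLE hive_metastore.my_schema.my_table;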