Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.
Hubert-Dudek
Esteemed Contributor III

Databricks Runtime 10.5 (Beta)

👉 Auto Loader: new SQL function CLOUD_FILES_STATE

You can use the new CLOUD_FILES_STATE function to query the internal state of an Auto Loader stream.
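A minimal sketch of querying it from a notebook (the checkpoint path below is a placeholder; this assumes a cluster on DBR 10.5+ where `spark` is predefined):

```python
# Query Auto Loader's internal file-discovery state for a stream,
# identified by its checkpoint location (placeholder path below).
files = spark.sql(
    "SELECT * FROM cloud_files_state('/tmp/autoloader/checkpoint')"
)
files.show()
```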

👉 Delta Lake: new maxRecordsPerFile option for maximum records written to a single file

When you use the DataFrame APIs to write to a Delta table, you can use the maxRecordsPerFile option to specify the maximum number of records to write out to a single file. A value of zero or a negative value means no limit.
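A hedged sketch of how the option might look in a DataFrame write (the DataFrame `df` and the output path are placeholders):

```python
# Cap each output file at roughly 10,000 records; zero or a
# negative value would mean no per-file limit.
(df.write
   .format("delta")
   .mode("append")
   .option("maxRecordsPerFile", 10000)
   .save("/tmp/delta/events"))
```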

👉 Deprecation of Koalas
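Koalas users are pointed at the pandas API on Spark, which ships with recent runtimes. A minimal before/after sketch (assuming a Databricks cluster where pyspark is available):

```python
# Before (deprecated):
# import databricks.koalas as ks
# kdf = ks.DataFrame({"a": [1, 2, 3]})

# After: pandas API on Spark, bundled with the runtime.
import pyspark.pandas as ps

psdf = ps.DataFrame({"a": [1, 2, 3]})
print(psdf.sum())
```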

👉 Unity Catalog: SQL LIST output removes "is_directory" column and adds trailing / for directory paths...

👉 more info https://docs.databricks.com/release-notes/runtime/10.5.html

2 REPLIES

Kaniz_Fatma
Community Manager

Hi @Hubert Dudek, Databricks has the best customers! Thank you so much for your support!

SørenRavn
New Contributor II

Has anyone gotten CLOUD_FILES_STATE to work on 10.5 Beta/Azure?

Syntax:

%sql

SELECT * FROM cloud_files_state('abfss://test@test.dfs.core.windows.net/test/checkpoint/');

I get this error:

com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: org.apache.spark.sql.AnalysisException: could not resolve `cloud_files_state` to a table-valued function
