Sujitha
Community Manager

Weekly Release Notes Recap

Here’s a quick recap of the latest release notes updates from the past week.

Databricks platform release notes

April 20 - 27, 2023

Configure the Python formatter

For files and notebooks in Databricks Repos, you can now configure the Python formatter based on a Black specification. See Format Python cells.
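
Black reads its settings from the `[tool.black]` table of a `pyproject.toml` file; a minimal sketch is shown below (the line length is an illustrative value, not a Databricks default):

```toml
# Example Black configuration in pyproject.toml (values are illustrative).
[tool.black]
line-length = 100
target-version = ["py310"]
```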

Workspace files are GA

You can now work with non-notebook files in Databricks. Workspace files are enabled by default in all workspaces. See What are workspace files?.

Cluster-scoped init scripts can now be stored in workspace files

You can now store cluster-scoped init scripts in workspace files, regardless of the Databricks Runtime version used by your compute. Databricks recommends storing all cluster-scoped init scripts in workspace files. See Store init scripts in workspace files.
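
For illustration, a minimal init script of the kind you might store at a workspace path such as `/Workspace/Shared/init/install-deps.sh` (the path, package, and pip location are example assumptions, not prescribed values):

```shell
#!/bin/bash
# Example cluster-scoped init script (illustrative sketch).
# Store it in workspace files and reference it in the cluster's
# init-script settings.
set -euo pipefail

# Path to the cluster's Python environment pip (assumed location on
# Databricks nodes); guarded so the script is a no-op elsewhere.
PIP=/databricks/python/bin/pip
if [ -x "$PIP" ]; then
  # Install an extra Python package on every node at cluster start.
  "$PIP" install --quiet requests
fi

echo "init script finished"
```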

Databricks SQL release notes

April 20 - 27, 2023

Databricks SQL Version 2023.20 Available

Rollout Schedule

  • Preview rollout for 2023.20: Between Mar 15, 2023 and Mar 23, 2023
  • Current rollout for 2023.20: Between Mar 27, 2023 and Apr 3, 2023

Changes in 2023.20

  • Delta Lake schema evolution supports specifying source columns in merge statements.
  • Remove all NULL elements from an array using array_compact.
  • To append elements to an array, use array_append.
  • To anonymize sensitive string values, use the mask function.
  • Common error conditions now return SQLSTATE.
  • Invoke table-valued generator functions in the regular FROM clause of a query.
  • Use the from_protobuf and to_protobuf functions to exchange data between binary and struct types. See Read and write protocol buffers.
  • Improved consistency of Delta commit behavior for empty transactions arising from update, delete, and merge commands.
  • Behavior change
    • The lateral column alias feature introduces behavior changes during name resolution. See Breaking changes.
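
To make the semantics of the new array and masking functions concrete, here is a pure-Python sketch of what they compute (illustrative only; the real implementations are built into Databricks SQL 2023.20, and the `mask` replacement characters follow its documented defaults):

```python
# Pure-Python sketches of the new SQL functions' semantics (illustrative).

def array_compact(arr):
    """Remove all NULL (None) elements from an array."""
    return [x for x in arr if x is not None]

def array_append(arr, elem):
    """Append an element to the end of an array."""
    return arr + [elem]

def mask(s, upper="X", lower="x", digit="n", other=None):
    """Anonymize a string: uppercase -> 'X', lowercase -> 'x',
    digits -> 'n'; other characters pass through unchanged by default."""
    out = []
    for ch in s:
        if ch.isupper():
            out.append(upper)
        elif ch.islower():
            out.append(lower)
        elif ch.isdigit():
            out.append(digit)
        else:
            out.append(other if other is not None else ch)
    return "".join(out)

print(array_compact([1, None, 3]))    # [1, 3]
print(array_append(["a", "b"], "c"))  # ['a', 'b', 'c']
print(mask("AbCD-1234"))              # XxXX-nnnn
```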

User interface updates

The features listed in this section are independent of the SQL Warehouse compute versions described in the Channels section of the release notes.

April 20, 2023

Improvements:

  • Administrators can change warehouse owners using the user interface or the API. See Transfer ownership of Databricks SQL objects.
  • Introduces new pivot tables that allow you to aggregate more than 64k results.
  • Databricks SQL tables and visualizations now support BigInt, 38-bit decimals, and non-UTF-8 characters. For numbers, the default setting is now user-defined digit precision.
  • Autocomplete now suggests frequent past joins for Unity Catalog tables, powered by Unity Catalog lineage data in Databricks Runtime 12.0 and above.
  • Cloud Fetch is enabled by default in AWS workspaces with bucket versioning enabled. If you have bucket versioning enabled, Databricks recommends setting a lifecycle policy to automatically remove old versions of uploaded query results. See Advanced configurations.
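
For the Cloud Fetch item, an S3 lifecycle rule that expires old object versions might look like the sketch below (the rule ID, empty prefix, and 7-day retention are example values to adapt):

```json
{
  "Rules": [
    {
      "ID": "expire-old-query-result-versions",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "NoncurrentVersionExpiration": { "NoncurrentDays": 7 }
    }
  ]
}
```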

New feature:

  • Return text generated by a selected large language model (LLM) given a prompt with ai_generate_text. This function is available only as a public preview on Databricks SQL Pro and Serverless. To participate in the public preview, complete and submit the AI Functions Public Preview enrollment form.
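
As a rough sketch, a query calling ai_generate_text could be composed as below; the model identifier and argument shape are assumptions for illustration, so check the AI Functions documentation for the exact signature before use:

```python
# Illustrative only: compose a Databricks SQL query that calls
# ai_generate_text. The model identifier below is hypothetical.
prompt = "Summarize this ticket: customer cannot log in after password reset."

query = f"""
SELECT ai_generate_text(
  '{prompt}',
  'openai/gpt-3.5-turbo'
) AS generated_text
"""

print(query)
```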

Databricks runtime releases

April 20 - 27, 2023

