We are excited to share some of the latest updates in Databricks Notebooks. From the AI-powered Databricks Assistant, which speeds up code development, to new charts with better performance, these features help you build faster.
See the latest features live at the webinar on October 24.
Let’s start with Databricks Assistant
Databricks Assistant makes you more productive inside Databricks. Describe your task in English and let the Assistant generate SQL queries, explain complex code, and automatically fix errors. The Assistant leverages context from your notebook and Unity Catalog metadata to provide accurate, personalized responses.
Generate SQL or Python code
Databricks Assistant is natively integrated into Notebooks, the SQL editor, and the file editor. It can help you accelerate projects by writing boilerplate code or providing initial code for you to start from. You can then run the code, copy it, or insert it into a new cell for further development.
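To make this concrete, here is a minimal sketch of the kind of boilerplate the Assistant can produce. In a real notebook the Assistant would typically emit PySpark or SQL against your tables; the pure-Python example below, with illustrative data and names, just shows the shape of a prompt-to-code result for a request like "count how many orders each customer placed":

```python
# Hypothetical example: code the Assistant might generate for the prompt
# "count how many orders each customer placed".
# The data and field names here are illustrative, not a real schema.
from collections import Counter

orders = [
    {"customer": "alice", "amount": 30},
    {"customer": "bob", "amount": 15},
    {"customer": "alice", "amount": 20},
]

# Count orders per customer.
counts = Counter(order["customer"] for order in orders)
print(counts)  # Counter({'alice': 2, 'bob': 1})
```

From here you could run the cell as-is, or copy the snippet into a new cell and adapt it to your own tables.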
Explain code or queries
Databricks Assistant describes complex pieces of code or queries in clear, concise language, making it easier to quickly ramp up on unfamiliar codebases.
Fix issues
Databricks Assistant can identify errors in your code and recommend fixes. When you encounter issues like syntax errors, the Assistant will explain the problem and create a code snippet with a proposed fix.
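As a hedged illustration of the fix workflow, the sketch below pairs a snippet with a logic bug (an off-by-one that silently drops the last element) with the kind of corrected snippet the Assistant might propose. Both functions and their names are hypothetical:

```python
# Buggy snippet a user might run: the off-by-one in range() skips the
# last element, so the total comes out short.
def total(prices):
    s = 0
    for i in range(len(prices) - 1):  # bug: stops one element early
        s += prices[i]
    return s

# The kind of fix the Assistant might propose: sum the list directly,
# which removes the index arithmetic entirely.
def total_fixed(prices):
    return sum(prices)

print(total([10, 20, 30]))        # 30 -- last element was dropped
print(total_fixed([10, 20, 30]))  # 60 -- correct
```

For syntax errors the flow is similar: the Assistant explains what the parser rejected and proposes a snippet with the corrected syntax.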
Databricks Assistant draws on a number of signals to provide more accurate, relevant results. It uses context from code cells, libraries, and tables in your notebook to generate code and queries.
Watch the demo or enable the Assistant by following these instructions.
Better performance and improved colors for visualizations
Databricks has released a Public Preview of new charts for visualizing data in notebooks and in Databricks SQL. The new charts offer better performance, improved colors, and faster interactivity, and they will replace the legacy rendering library currently used for Databricks charts.
Access web terminal from within Notebooks
Databricks web terminal provides a convenient way to run shell commands and use editors, such as Vim or Emacs, on the Spark driver node. Unlike SSH, the web terminal can be used by many users on one cluster and does not require setting up keys. Example uses of the web terminal include monitoring resource usage and installing Linux packages.
View all your data without leaving Notebooks
The new unified schema browser lets you view all of your data without leaving a notebook. You can select “For you” to filter the list to the tables active in your notebook — items that are currently open or were opened earlier in the current session. As you type into the filter box, the display updates to show only the items whose names contain that text.
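The filtering behavior described above amounts to a live, case-insensitive substring match over the visible items. The exact matching rules are an assumption on our part, but a minimal sketch looks like this:

```python
# Minimal sketch of the schema browser's filter-as-you-type behavior:
# keep only items whose names contain the typed text, ignoring case.
# The matching rule here is an assumption for illustration.
def filter_items(items, query):
    q = query.lower()
    return [name for name in items if q in name.lower()]

tables = ["sales_2023", "customers", "sales_forecast", "inventory"]
print(filter_items(tables, "sales"))  # ['sales_2023', 'sales_forecast']
print(filter_items(tables, "INV"))    # ['inventory']
```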
Easily navigate between notebooks with search
Search now returns more relevant results. It surfaces recently used items and, with Unity Catalog, can return tables that match the search query semantically in addition to matching by keyword.
Use SQL warehouses in Notebooks
Databricks SQL warehouses are now in Public Preview in notebooks, combining the flexibility of notebooks with the performance and TCO of Databricks SQL Serverless and Pro warehouses. To use a SQL warehouse in a notebook, simply select an available SQL warehouse from the notebook's compute dropdown.
Improved autocomplete for SQL
Autocomplete now detects when you have already typed a table alias and avoids inserting the alias twice. It also recommends popular joins for Unity Catalog tables, and it now works across all CTEs and subqueries.
Other improvements
- Refreshed developer settings page
- Dark mode in more areas
- Visualizations in notebooks have a larger hit area for mouse actions
Stay tuned for more innovations in Notebooks.