
Issue: UCX Assessment Installation Error in Databricks Automation Script

meghana_tulla
New Contributor III

Hi 

I'm experiencing a problem when installing UCX Assessment through an automation script in Databricks. The script fails with this error:

13:38:06 WARNING [databricks.labs.ucx.hive_metastore.tables] {listing_tables_0} failed-table-crawl: listing tables from database -> ALL : [SCHEMA_NOT_FOUND] The schema `ALL` cannot be found. Verify the spelling and correctness of the schema and catalog.

What Works: When I install manually using:

bash
databricks labs install ucx@v0.57.0

The installation prompts me for:

  • Comma-separated list of workspace group names to migrate (default: <ALL>)
  • Comma-separated list of databases to migrate (default: <ALL>)

Using the default <ALL> values works perfectly - no errors occur and the UCX dashboard populates correctly.

What Doesn't Work: In my automation script, I'm explicitly setting:

bash
WORKSPACE_GROUPS="<ALL>" DATABASES="<ALL>"

Despite using the same <ALL> values, I get the SCHEMA_NOT_FOUND error shown above.

#UCX #Assessment #Databricks

Questions:

1. Why does the manual installation work with <ALL> while the automated script fails?

2. Are there additional parameters I need to configure for automation?

3. How can I resolve this schema lookup issue in the automation script?

Any insights or solutions would be greatly appreciated!

 

1 REPLY

mark_ott
Databricks Employee

The manual installation of UCX Assessment works with the default <ALL> values, but automation scripts that set WORKSPACE_GROUPS="<ALL>" DATABASES="<ALL>" often hit a SCHEMA_NOT_FOUND error because 'ALL' ends up being treated as a literal schema name.

Why Manual Works, Automation Fails

  • During a manual install, the CLI prompts for input and interprets <ALL> as a wildcard that expands to every workspace group and database. It never uses 'ALL' literally as a schema name; the placeholder means "all available options".

  • When automating, explicitly passing <ALL> through an environment variable likely causes it to be treated as a literal name, so the crawler tries to list tables from a schema called 'ALL', which does not exist; that is the failure in your log. The sketch below illustrates the difference.
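
A minimal sketch of the two patterns, assuming the variable names (WORKSPACE_GROUPS, DATABASES) come from your own automation script rather than from UCX itself:

bash
# Failing pattern: the placeholder is passed through verbatim, so the
# installer receives the literal string "<ALL>" and the table crawler
# later looks for a schema named ALL.
WORKSPACE_GROUPS="<ALL>"
DATABASES="<ALL>"

# Safer pattern: leave the variables unset (or empty) so the installer
# falls back to its own defaults, exactly as an interactive install does.
unset WORKSPACE_GROUPS DATABASES
databricks labs install ucx@v0.57.0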

Needed Parameters for Automation

  • Instead of setting DATABASES="<ALL>", leave the variable empty or unset; the installer then defaults to including all schemas, just as it does when you accept the interactive prompt.

  • Check whether the automation command supports a wildcard or a specific keyword (an empty string or "*", for example) rather than "<ALL>". The official UCX docs and troubleshooting guide recommend relying on default behavior instead of overriding it with a literal parameter, especially in automation workflows.

  • Consult the UCX README notebook generated on install, which details the accepted parameters for jobs and automation; a sketch of how to inspect the recorded configuration follows this list.
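
One low-risk approach is to export the configuration that a successful interactive install wrote, then mirror those exact field names and values in your script. A minimal sketch, assuming the default install folder (UCX records the actual folder in the README notebook it generates):

bash
# Print the config.yml that the interactive install created. The workspace
# path below is an assumption; check the generated README notebook for the
# actual install folder in your workspace.
databricks workspace export /Applications/ucx/config.yml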

Resolving Schema Lookup Issues

  • Try omitting the explicit DATABASES and WORKSPACE_GROUPS parameters, letting the script use defaults. If parameters must be set, confirm with UCX documentation what value (empty, wildcard, etc.) triggers the “all” behavior.

  • Verify that your automation script runs in the same context and catalog as your manual install. Workspace default-catalog settings (for example, a default catalog other than hive_metastore) can also affect schema discovery.

  • Review the UCX troubleshooting guide and documented error conditions for advanced configuration or known issues around schema crawl operations. If assessment results are already stored and an incremental run isn't supported, a reinstall of UCX may be needed; otherwise you can simply re-run the assessment, as sketched below.
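
A sketch of re-triggering the assessment after fixing the configuration, assuming a recent UCX release that exposes the ensure-assessment-run command (confirm with databricks labs ucx --help):

bash
# Re-run the assessment workflow against the corrected configuration.
# ensure-assessment-run triggers the assessment job if it has not yet
# completed successfully; verify the command exists in your UCX version.
databricks labs ucx ensure-assessment-run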

Recommendations

  • For automation, remove the explicit <ALL> values, or carefully check how UCX expects "all" to be expressed non-interactively. Avoiding the literal placeholder prevents the invalid schema name and resolves the SCHEMA_NOT_FOUND error.

  • After installation, always check Catalog Explorer and the UCX assessment dashboards to confirm that schema detection succeeded.