Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

500 Error on /ajax-api/2.0/fs/list When Accessing Unity Catalog Volume in Databricks

MrFi
New Contributor

 

We are encountering an issue with volumes created inside Unity Catalog. We are using AWS and Terraform to host Databricks, and our Unity Catalog structure is as follows:
• Catalog: catalog_name
• Schemas: raw, bronze, silver, gold (each backed by an S3 external location)
• Volume: An external volume inside the raw schema
 
The issue:
• Using a notebook and dbutils, I can successfully list the volume’s contents: dbutils.fs.ls("/Volumes/catalog_name/raw/files")
• However, when navigating to the Unity Catalog UI and attempting to access the volume, I receive the following error:
 
Failed to request /ajax-api/2.0/fs/list?path=%2FVolumes%2Fcatalog_name%2Fraw%2Ffiles&page_size=1000: 500
(screenshot attached)
 
Additionally:
• I am not able to upload files using the UI.
• I cannot set up a file trigger inside a workflow.
• This does not appear to be a permissions issue, as I can list the volume’s contents from a notebook (see the sketch below).
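For what it’s worth, the failing /ajax-api/2.0/fs/list call is the UI’s internal endpoint; the same listing should be reachable through the public Files API. A minimal sketch to test that outside the UI, with the workspace host and token as placeholders:

```python
# Request the same volume listing through the public Files API.
# If this also returns a 500, the problem is server-side rather than
# specific to the UI's internal /ajax-api endpoint.
import requests

HOST = "https://<workspace-host>"     # placeholder
TOKEN = "<personal-access-token>"     # placeholder

resp = requests.get(
    f"{HOST}/api/2.0/fs/directories/Volumes/catalog_name/raw/files",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(resp.status_code, resp.text)
```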
 
Any ideas? 😊
1 REPLY

Brahmareddy
Honored Contributor II

Hi @MrFi 

How are you doing today?

From what I can tell, the Unity Catalog UI might have trouble handling external volumes even though dbutils works fine. A few things to try:
• Run SHOW VOLUMES IN catalog_name.raw; to check that the volume is properly registered (see the first sketch below).
• Verify that the IAM role behind your storage credential has s3:ListBucket, s3:GetObject, and s3:PutObject; the UI and workflows can exercise access paths that a notebook does not (see the second sketch below).
• Double-check your Terraform setup to make sure storage_location points at the correct S3 path, then run terraform apply again.
• If the file trigger isn’t working, run the workflow manually; that can help narrow down where it fails.
If everything looks fine, it could be a UI limitation; testing with direct API calls or reaching out to Databricks support should clarify whether external volumes are fully supported in the UI.
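Here is a minimal notebook sketch of the registration check; the names match the example from your post (catalog_name.raw and the files volume):

```python
# Run in a notebook: confirm the volume is registered with Unity Catalog.
spark.sql("SHOW VOLUMES IN catalog_name.raw").show(truncate=False)

# DESCRIBE VOLUME reports the storage_location the volume was created with;
# compare it against the S3 path in your Terraform config.
spark.sql("DESCRIBE VOLUME catalog_name.raw.files").show(truncate=False)
```

And for the permissions check, one way to test the role without clicking through the AWS console is IAM’s policy simulator. A rough sketch; the role ARN and bucket ARNs below are placeholders you would swap for your own:

```python
# Simulate the S3 actions the UI and workflows may need, against the role
# that backs the Unity Catalog storage credential. ARNs are placeholders.
import boto3

iam = boto3.client("iam")
resp = iam.simulate_principal_policy(
    PolicySourceArn="arn:aws:iam::123456789012:role/databricks-uc-access",
    ActionNames=["s3:ListBucket", "s3:GetObject", "s3:PutObject"],
    ResourceArns=["arn:aws:s3:::my-raw-bucket", "arn:aws:s3:::my-raw-bucket/*"],
)
for result in resp["EvaluationResults"]:
    print(result["EvalActionName"], "->", result["EvalDecision"])
```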

I hope this helps! Give it a try and let me know.

Good day.

Regards,

Brahma
