10-07-2024 08:58 AM
Hello!
I'm new to Databricks and I'm exploring some of its features.
I've successfully configured a workspace with Unity Catalog, one external storage location (ADLS Gen2), and the associated storage credential. I granted all privileges to all account users and tried 'Test connection' to make sure everything was OK.
When I run the following command:
Accepted Solutions
10-15-2024 08:00 AM
Hello @Mo,
thank you for the quick feedback, and sorry for the late reply.
The issue was related to the 'schema_location_path' Azure container.
I had forgotten to register the 'schema_location_path' container as an external location, so the script was unable to read from that specific location.
I added the missing external location, which fixed the problem.
Thank you!
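For anyone hitting the same error: registering the container as a Unity Catalog external location and granting access looks roughly like the sketch below. The location name `schema_location`, the credential name `adls_cred`, and the `abfss://` URL are all hypothetical placeholders, not the actual values from this thread.

```sql
-- Hypothetical names: schema_location, adls_cred, and the URL are placeholders.
CREATE EXTERNAL LOCATION IF NOT EXISTS schema_location
URL 'abfss://schema-container@mystorageaccount.dfs.core.windows.net/'
WITH (STORAGE CREDENTIAL adls_cred)
COMMENT 'Registers the schema container so Unity Catalog can access it';

-- Grant access so account users can actually read/write through the location
GRANT READ FILES, WRITE FILES ON EXTERNAL LOCATION schema_location TO `account users`;
```

Without this registration, Unity Catalog blocks direct reads from the container even when the storage credential itself is valid.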
10-08-2024 08:38 AM
Hey @garf,
could you please try creating an external volume on your external location, and then use a file path inside the volume as the input file path?
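The suggestion above can be sketched as follows; the catalog, schema, and volume names and the storage URL are hypothetical placeholders, assuming the external location covering that URL already exists.

```sql
-- Hypothetical names: my_catalog, my_schema, landing_vol, and the URL are placeholders.
CREATE EXTERNAL VOLUME IF NOT EXISTS my_catalog.my_schema.landing_vol
LOCATION 'abfss://schema-container@mystorageaccount.dfs.core.windows.net/landing';

-- Files in the container can then be addressed through the volume path, e.g.:
-- /Volumes/my_catalog/my_schema/landing_vol/input.csv
```

Reading through a volume path lets Unity Catalog govern the access instead of the script hitting the cloud URL directly.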

