03-27-2024 11:22 PM
Problem Statement
Cluster 1 (Shared Cluster) is not able to read the file location at "dbfs:/mnt/landingzone/landingzonecontainer/Inbound/", so we are not able to create an external table in a schema inside the Enterprise Catalog.
Cluster 2 (No Isolation Shared) is able to read the file location at "dbfs:/mnt/landingzone/landingzonecontainer/Inbound/" and we are able to create an external table, but the table is created in the default hive_metastore catalog rather than in the desired Enterprise Catalog.
landingzone is an ADLS Gen2 container mounted to the Databricks workspace through an external location and a storage credential.
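For context, the governance objects behind the mount look roughly like the sketch below. This is a minimal sketch, assuming the storage credential already exists (e.g. created via Catalog Explorer); the credential, location, and storage account names are placeholders, not the actual configuration.

-- Minimal sketch (assumed names): an external location pointing at the ADLS Gen2
-- container that backs the dbfs:/mnt/landingzone mount, secured by an existing
-- storage credential.
CREATE EXTERNAL LOCATION IF NOT EXISTS landingzone_loc
  URL 'abfss://landingzonecontainer@<storage-account>.dfs.core.windows.net/Inbound/'
  WITH (STORAGE CREDENTIAL landingzone_cred)
  COMMENT 'Landing zone for inbound files';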
03-28-2024 05:43 AM
Hi @Retired_mod, wondering if you can please advise a solution. Thanks heaps in advance.
03-28-2024 07:41 AM
Hi @Meshynix,
Can you provide the code snippet you execute to create your tables? This would give us better insight into both use cases. Also, can you share the error returned in the first use case? That would help a lot.
03-28-2024 08:45 AM
%sql
USE CATALOG enterprise_catalog;
USE SCHEMA bronze;
CREATE TABLE bronze_hmi (
  sl_no string,
  date string,
  time string,
  auger_motor_rpm decimal(22,3),
  auger_motor_rpm_act decimal(22,3),
  auger_curr decimal(22,3),
  auger_torque decimal(22,3),
  auger_on_time decimal(22,3),
  auger_off_time decimal(22,3),
  blower_1_rpm decimal(22,3),
  blower_2_rpm decimal(22,3),
  blower_3_rpm decimal(22,3),
  blower_4_rpm decimal(22,3),
  blower_5_rpm decimal(22,3),
  blower_6_rpm decimal(22,3),
  temp1 decimal(22,3),
  temp2 decimal(22,3),
  temp3 decimal(22,3),
  temp4 decimal(22,3),
  temp5 decimal(22,3),
  temp6 decimal(22,3),
  total_power decimal(22,3),
  system_current decimal(22,3),
  cooling_pump_on_time decimal(22,3),
  cooling_pump_off_time decimal(22,3),
  input_load_cell decimal(22,3),
  output_load_cell decimal(22,3),
  misc_col string
)
USING CSV
OPTIONS (header "true", inferSchema "true")
LOCATION "dbfs:/mnt/landingzone/landingzonecontainer/Inbound/hot_test_data/hmi/";
Error Message
[UC_FILE_SCHEME_FOR_TABLE_CREATION_NOT_SUPPORTED] Creating table in Unity Catalog with file scheme dbfs is not supported.
Instead, please create a federated data source connection using the CREATE CONNECTION command for the same table provider, then create a catalog based on the connection with a CREATE FOREIGN CATALOG command to reference the tables therein. SQLSTATE: 0AKUC
03-28-2024 11:29 PM
It seems the answer is in the error. Can you follow this part of the documentation and confirm that your storage credential, external location, and external table are set up correctly:
https://docs.databricks.com/en/sql/language-manual/sql-ref-external-tables.html
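If those objects are in place, the sketch below shows how to check what the metastore actually sees (the location name is an assumption):

-- List all external locations registered in Unity Catalog.
SHOW EXTERNAL LOCATIONS;
-- Inspect the URL and credential backing one of them (assumed name).
DESCRIBE EXTERNAL LOCATION landingzone_loc;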
03-30-2024 03:04 AM
It started working after we gave the exact external location path, i.e. the abfss:// URL behind the mounted landing zone external location. With that path we were able to create an external table in the desired catalog and schema from the shared cluster.
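For anyone hitting the same error: the change that fixed it was replacing the dbfs:/mnt/... path in the LOCATION clause with the underlying abfss:// URL of the external location. A sketch of the working statement, with the storage account name as a placeholder:

%sql
USE CATALOG enterprise_catalog;
USE SCHEMA bronze;
-- Same column list as the original definition, abbreviated here; only the
-- LOCATION clause changes from dbfs:/mnt/... to the abfss:// URL.
CREATE TABLE bronze_hmi (
  sl_no string,
  -- ... remaining columns as in the original CREATE TABLE ...
  misc_col string
)
USING CSV
OPTIONS (header "true", inferSchema "true")
LOCATION "abfss://landingzonecontainer@<storage-account>.dfs.core.windows.net/Inbound/hot_test_data/hmi/";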
05-15-2024 03:08 PM
External table creation is failing with the error:
UnityCatalogServiceException: [RequestId=**** ErrorClass=INVALID_PARAMETER_VALUE] Unsupported path operation PATH_CREATE_TABLE on volume.
We are able to access and create files on the external location.
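This error typically indicates that the target path falls under a Unity Catalog volume, and tables cannot be created at paths governed by a volume. A sketch for checking the overlap, with catalog and schema names assumed:

-- List volumes in the schema to see whether one covers the table path (assumed names).
SHOW VOLUMES IN enterprise_catalog.bronze;
-- If a volume overlaps the path, create the table at a path governed only by an
-- external location, e.g.
-- LOCATION 'abfss://<container>@<storage-account>.dfs.core.windows.net/<path-outside-volumes>/'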