Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Not able to create external table in a schema under a Catalog.

Meshynix
New Contributor III

Problem Statement

Cluster 1 (Shared Cluster) is not able to read the file location at "dbfs:/mnt/landingzone/landingzonecontainer/Inbound/" and hence we are not able to create an external table in a schema inside Enterprise Catalog.

Cluster 2 (No Isolation Shared) is able to read the file location at "dbfs:/mnt/landingzone/landingzonecontainer/Inbound/" and we are able to create an external table but the external table is getting created in default hive metastore catalog and not in the desired Enterprise Catalog.

landingzone is an ADLS Gen2 container mounted to the Databricks workspace through an external location and storage credential.
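A quick way to check whether the cluster can reach the storage through the Unity Catalog external location (rather than the legacy mount) is to list the path directly. The abfss:// URI below is an assumed example derived from the mount path; substitute your own storage account name:

```sql
-- Hypothetical account name; replace <storage-account> with your own.
-- If this fails, the storage credential or external location grants
-- are the problem, not the table definition.
LIST 'abfss://landingzonecontainer@<storage-account>.dfs.core.windows.net/Inbound/';
```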

1 ACCEPTED SOLUTION

Accepted Solutions

It started working after we used the exact external location path (the abfss:// URI of the landing zone's mounted external location) instead of the dbfs:/mnt path. With that, we were able to create the external table in the desired catalog and schema using the shared cluster.
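A sketch of what the corrected statement looks like. The storage account name is a placeholder and the column list is abbreviated; only the LOCATION clause changes from the original statement:

```sql
%sql
USE CATALOG enterprise_catalog;
USE SCHEMA bronze;

-- Same table definition as before, but LOCATION now points at the
-- external location's abfss:// URI instead of the dbfs:/mnt path.
-- <storage-account> is a placeholder; substitute your own.
CREATE TABLE bronze_hmi (
  sl_no STRING,
  -- ... remaining columns as in the original statement ...
  misc_col STRING
)
USING CSV
OPTIONS (header "true", inferSchema "true")
LOCATION "abfss://landingzonecontainer@<storage-account>.dfs.core.windows.net/Inbound/hot_test_data/hmi/";
```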


6 REPLIES

Meshynix
New Contributor III

Hi @Retired_mod, wondering if you can please advise a solution. Thanks heaps in advance.

YuliyanBogdanov
New Contributor III

Hi @Meshynix,

Can you provide the code snippet you execute to create your tables? That would give us better insight into both use cases. Also, can you provide the error that is returned in the first use case? This would help a lot.


%sql
USE CATALOG enterprise_catalog;
USE SCHEMA bronze;

CREATE TABLE bronze_hmi (
  sl_no                 STRING,
  date                  STRING,
  time                  STRING,
  auger_motor_rpm       DECIMAL(22,3),
  auger_motor_rpm_act   DECIMAL(22,3),
  auger_curr            DECIMAL(22,3),
  auger_torque          DECIMAL(22,3),
  auger_on_time         DECIMAL(22,3),
  auger_off_time        DECIMAL(22,3),
  blower_1_rpm          DECIMAL(22,3),
  blower_2_rpm          DECIMAL(22,3),
  blower_3_rpm          DECIMAL(22,3),
  blower_4_rpm          DECIMAL(22,3),
  blower_5_rpm          DECIMAL(22,3),
  blower_6_rpm          DECIMAL(22,3),
  temp1                 DECIMAL(22,3),
  temp2                 DECIMAL(22,3),
  temp3                 DECIMAL(22,3),
  temp4                 DECIMAL(22,3),
  temp5                 DECIMAL(22,3),
  temp6                 DECIMAL(22,3),
  total_power           DECIMAL(22,3),
  system_current        DECIMAL(22,3),
  cooling_pump_on_time  DECIMAL(22,3),
  cooling_pump_off_time DECIMAL(22,3),
  input_load_cell       DECIMAL(22,3),
  output_load_cell      DECIMAL(22,3),
  misc_col              STRING
)
USING CSV
OPTIONS (header "true", inferSchema "true")
LOCATION "dbfs:/mnt/landingzone/landingzonecontainer/Inbound/hot_test_data/hmi/";

Error Message


[UC_FILE_SCHEME_FOR_TABLE_CREATION_NOT_SUPPORTED] Creating table in Unity Catalog with file scheme dbfs is not supported.
Instead, please create a federated data source connection using the CREATE CONNECTION command for the same table provider, then create a catalog based on the connection with a CREATE FOREIGN CATALOG command to reference the tables therein. SQLSTATE: 0AKUC

Seems like the answer is in the error. Can you try to follow this part of the documentation and confirm that you have the storage credential, external location, and external table set up correctly:

https://docs.databricks.com/en/sql/language-manual/sql-ref-external-tables.html
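Per that documentation, the setup boils down to an external location backed by a storage credential, plus grants for whoever creates the table. A minimal sketch, assuming hypothetical names (landingzone_inbound, landingzone_cred, the data_engineers group, and the storage account) and that the credential already exists:

```sql
-- Hypothetical names; run as a metastore admin or a user with the
-- CREATE EXTERNAL LOCATION privilege. The storage credential must
-- already exist (backed by an Azure managed identity or service principal).
CREATE EXTERNAL LOCATION IF NOT EXISTS landingzone_inbound
  URL 'abfss://landingzonecontainer@<storage-account>.dfs.core.windows.net/Inbound/'
  WITH (STORAGE CREDENTIAL landingzone_cred);

-- Let the team read/write files there and create external tables on it.
GRANT READ FILES, WRITE FILES, CREATE EXTERNAL TABLE
  ON EXTERNAL LOCATION landingzone_inbound TO `data_engineers`;
```

With this in place, a CREATE TABLE whose LOCATION uses the abfss:// URI (not the dbfs:/mnt path) should land in the intended Unity Catalog schema.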


Avi_Bricks
New Contributor II

External table creation is failing with the error:

UnityCatalogServiceException:[RequestId=**** ErrorClass=INVALID_PARAMETER_VALUE] Unsupported path operation PATH_CREATE_TABLE on volume.

We are able to access and create files on the external location.
