Databricks Free Edition Help
Engage in discussions about the Databricks Free Edition within the Databricks Community. Share insights, tips, and best practices for getting started, troubleshooting issues, and maximizing the value of your trial experience to explore Databricks' capabilities effectively.

Getting error while creating external delta table in Databricks

Siddalinga
New Contributor

I am getting the below error while creating an external Delta table in Databricks, even though an external location has been created.

[NO_PARENT_EXTERNAL_LOCATION_FOR_PATH] No parent external location was found for path 'abfss://destination@datalakeprojectsid.dfs.core.windows.net/salesdb/Person'. Please create an external location on one of the parent paths and then retry the query or command again.

%sql
CREATE TABLE Sales.Person
(
  id INT,
  name STRING,
  marks INT
)
USING DELTA
LOCATION 'abfss://destination@datalakeprojectsid.dfs.core.windows.net/salesdb/Person'
5 REPLIES

Takuya-Omi
Valued Contributor III

@Siddalinga 

If the path specified during table creation is outside the scope of the external location, you may encounter the [NO_PARENT_EXTERNAL_LOCATION_FOR_PATH] error.
Is the external location defined with a scope that covers that directory, such as abfss://destination@datalakeprojectsid.dfs.core.windows.net/salesdb, including the salesdb subdirectory?

Please double-check to ensure the external location is properly configured.
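To double-check, you can list the external locations registered in Unity Catalog and inspect their URLs. The location name below (my_ext_loc) is just a placeholder for whatever yours is called:

%sql
-- List all external locations visible to you
SHOW EXTERNAL LOCATIONS;

-- Inspect the URL and storage credential of a specific location
DESCRIBE EXTERNAL LOCATION my_ext_loc;

The table's LOCATION path must be equal to, or a child of, the URL shown here.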

--------------------------
Takuya Omi (尾美拓哉)

saisaran_g
Contributor

Hello @Siddalinga,

Is the issue fixed? If not, please check the options below:

1. Is the external location created in Databricks with the proper service account or service principal?

2. Or is a storage location attached with the proper service account?

Either of these establishes the connection between Databricks and the external storage.
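One quick way to test that connectivity is to list the path directly; if the credential behind the external location lacks access, this fails with a permission error. The path below is the one from the original post:

%sql
-- Fails with a permission/location error if the external location is misconfigured
LIST 'abfss://destination@datalakeprojectsid.dfs.core.windows.net/salesdb/';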

Happy learning.
Saran


Swamy09
New Contributor II

@Takuya-Omi @saisaran_g , I'm also facing a similar error. How can I resolve it?

[NO_PARENT_EXTERNAL_LOCATION_FOR_PATH] No parent external location was found for path 'abfss://goldlayer@storagepracde.dfs.core.windows.net/cusdata'. Please create an external location on one of the parent paths and then retry the query or command again. SQLSTATE: 22KD1

LR
New Contributor II

Hi @Swamy09, the error [NO_PARENT_EXTERNAL_LOCATION_FOR_PATH] indicates that there is no External Location configured for the storage path you are trying to use while creating the external table.
Before creating an external table, you must:
1. Create a Storage Credential
2. Create an External Location using that credential
3. Ensure the table path falls under the registered external location path
Example:

-- Step 1: Create external location
CREATE EXTERNAL LOCATION my_ext_loc
URL 'abfss://container@storageaccount.dfs.core.windows.net/path/'
WITH (STORAGE CREDENTIAL my_cred);

-- Step 2: Create external table
CREATE TABLE catalog.schema.my_table
USING DELTA
LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/path/table_folder/';
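Note that even with the external location in place, the user creating the table needs privileges on it. A sketch, reusing the placeholder location name above and a hypothetical user:

-- Step 3 (if needed): grant privileges on the external location
GRANT CREATE EXTERNAL TABLE, READ FILES, WRITE FILES
ON EXTERNAL LOCATION my_ext_loc
TO `user@example.com`;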

 

Lokesh Rajpoot

datagene
New Contributor II

There are several things to check. First, how is Databricks connecting to the storage: Managed Identity with an Azure role assignment, access keys, or a service principal with secrets? Managed Identity with an Azure role assignment is advisable. If you are using Managed Identity, you can create a User-Assigned Managed Identity (UAMI) in the Azure Portal specifically for Databricks.

Next, assign RBAC roles in Azure. On the Storage Account, assign the Storage Blob Data Contributor role to the Managed Identity; this gives the identity permission to read, write, and delete blobs without needing a password. Then create a Databricks Access Connector in Azure to bridge the gap between Azure and Databricks.

Link the Managed Identity you created to the Access Connector. This resource acts as the "handshake" that allows Databricks to use the identity. Within Databricks, register the identity as a governed object.

Create a Credential in Unity Catalog and point it to the Resource ID of the Azure Access Connector. This tells Unity Catalog: "Whenever someone uses this credential, use the Managed Identity to authenticate." Then define the External Location with the specific path where the data lives, and map it to the storage credential.

I believe if all of this is done correctly, you should not have an issue.
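The Unity Catalog side of the steps above can be sketched in SQL. The access connector resource ID and object names are placeholders, many workspaces create the credential through Catalog Explorer instead, and the exact syntax can vary by Databricks release:

-- Create a storage credential backed by the Access Connector's managed identity
-- (placeholder resource ID; copy yours from the Access Connector in the Azure Portal)
CREATE STORAGE CREDENTIAL my_mi_cred
WITH (
  AZURE_MANAGED_IDENTITY (
    ACCESS_CONNECTOR_ID = '/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Databricks/accessConnectors/<name>'
  )
);

-- Map the storage path to that credential
CREATE EXTERNAL LOCATION my_ext_loc
URL 'abfss://container@storageaccount.dfs.core.windows.net/path/'
WITH (STORAGE CREDENTIAL my_mi_cred);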