Issues loading from ADLS in DLT

guostong
New Contributor III

I am using DLT to load CSV files from ADLS. Below is the SQL query in my notebook:

CREATE OR REFRESH STREAMING LIVE TABLE test_account_raw
AS SELECT * FROM cloud_files(
  "abfss://my_container@my_storageaccount.dfs.core.windows.net/test_csv/", 
  "csv", 
  map("header", "true"));

Below is the configuration in my Delta Live Tables pipeline for accessing ADLS:

    "configuration": {
        "fs.azure.account.auth.type.my_storageaccount.dfs.core.windows.net": "OAuth",
        "fs.azure.account.oauth.provider.type.my_storageaccount.dfs.core.windows.net": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id.my_storageaccount.dfs.core.windows.net": "my_client_id",
        "fs.azure.account.oauth2.client.secret.my_storageaccount.dfs.core.windows.net": "my_secret",
        "fs.azure.account.oauth2.client.endpoint.my_storageaccount.dfs.core.windows.net": "https://login.microsoftonline.com/my_tenant_id/oauth2/token"
    }
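
A related note: the client secret above is in plaintext. Databricks configuration values can reference a secret scope instead, using the {{secrets/&lt;scope&gt;/&lt;key&gt;}} syntax. A sketch of just the secret entry, where my_scope and my_sp_secret are placeholder names, not values from this thread:

    "configuration": {
        "fs.azure.account.oauth2.client.secret.my_storageaccount.dfs.core.windows.net": "{{secrets/my_scope/my_sp_secret}}"
    }

The other keys stay exactly as shown above.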

The pipeline fails with the error below:

org.apache.spark.sql.streaming.StreamingQueryException: [STREAM_FAILED] Query [id = 818323fc-80d5-4833-9f46-7d1afc9c5bf7, runId = 722e9aac-0fdd-4206-9d49-683bb151f0bf] terminated with exception: The container in the file event `{"backfill":{"bucket":"root@dbstoragelhdp7mflfxe2y","key":"5810201264315799/Data/Temp/xxxx.csv","size":1801,"eventTime":1682522202000,"newerThan$default$2":false}}` is different from expected by the source: `my_container@my_storageaccount`.
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:395)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.$anonfun$run$2(StreamExecution.scala:257)
....

How can I fix this issue?

Thanks,

3 REPLIES

Anonymous
Not applicable

@Richard Guo:

The error message suggests that the container specified in the cloud_files function and the container specified in the fs.azure configuration settings are different. In the cloud_files function, you are using my_container while in the configuration settings you are using my_container@my_storageaccount.dfs.core.windows.net.

To fix the issue, you need to ensure that the container name used in both places matches exactly. You can try modifying the cloud_files function to use the full container path as follows:

CREATE OR REFRESH STREAMING LIVE TABLE test_account_raw
AS SELECT * FROM cloud_files(
  "abfss://my_storageaccount.dfs.core.windows.net/my_container/test_csv/", 
  "csv", 
  map("header", "true"));

Then, make sure that the fs.azure configuration settings use the same container path:

"configuration": {
        "fs.azure.account.auth.type.my_storageaccount.dfs.core.windows.net": "OAuth",
        "fs.azure.account.oauth.provider.type.my_storageaccount.dfs.core.windows.net": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id.my_storageaccount.dfs.core.windows.net": "my_client_id",
        "fs.azure.account.oauth2.client.secret.my_storageaccount.dfs.core.windows.net": "my_secret",
        "fs.azure.account.oauth2.client.endpoint.my_storageaccount.dfs.core.windows.net": "https://login.microsoftonline.com/my_tenant_id/oauth2/token",
        "fs.azure.createRemoteFileSystemDuringInitialization": "true",
        "fs.abfss.my_container@my_storageaccount.dfs.core.windows.net.tokenProviderType": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.abfss.my_container@my_storageaccount.dfs.core.windows.net.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.abfss.my_container@my_storageaccount.dfs.core.windows.net.oauth2.client.id": "my_client_id",
        "fs.abfss.my_container@my_storageaccount.dfs.core.windows.net.oauth2.client.secret": "my_secret",
        "fs.abfss.my_container@my_storageaccount.dfs.core.windows.net.oauth2.client.endpoint": "https://login.microsoftonline.com/my_tenant_id/oauth2/token"
    }

Note that in the fs.azure configuration settings, the fs.abfss prefix is used instead of fs.azure.account. This is because we are using the ABFS driver to access ADLS.
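
For reference when weighing the suggestion above: the ABFS URI shape documented by Azure places the container before the storage account, which is the shape the original query already used (the angle-bracket names are placeholders):

abfss://&lt;container&gt;@&lt;storage_account&gt;.dfs.core.windows.net/&lt;path&gt;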

Anonymous
Not applicable

Hi @Richard Guo,

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance! 

guostong
New Contributor III

Thank you everyone, the problem is resolved. The error went away once I had workspace admin access.
