Data Engineering

databricks-connect, dbutils, abfss path, URISyntaxException

KrzysztofPrzyso
New Contributor III

When trying to use `dbutils.fs.cp` in the #databricks-connect context to upload files to Azure Data Lake Gen2, I get a malformed URI error.

I have used the code provided here:
https://learn.microsoft.com/en-gb/azure/databricks/dev-tools/databricks-connect/python/databricks-ut...

 

```
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
path = r"abfss://bronze@devstorageacc.dfs.core.windows.net/test/"
w.dbutils.fs.cp('dbfs:/config.json', path)
```

Error:

```
databricks.sdk.errors.mapping.InvalidParameterValue: java.net.URISyntaxException: Relative path in absolute URI: abfss:%5Cbronze@devstorageacc.dfs.core.windows.net%5Ctest
```


The standard `dbutils.fs.cp` works on the cluster without problems. I have positively confirmed access rights.

This may be the known issue described here: databricks-connect: Relative path in absolute URI · Issue #2883 · sparklyr/sparklyr (github.com)

3 REPLIES

Kaniz_Fatma
Community Manager

Hi @KrzysztofPrzyso, it appears that you’re encountering an issue with relative paths in absolute URIs when using dbutils.fs.cp in the context of Databricks Connect to upload files to Azure Data Lake Gen2.

Let’s break down the problem and explore potential solutions.

  1. Error Explanation: The error message you received indicates a malformed URI:

    databricks.sdk.errors.mapping.InvalidParameterValue: java.net.URISyntaxException: Relative path in absolute URI: abfss:%5Cbronze@devstorageacc.dfs.core.windows.net%5Ctest
    
  2. Root Cause: The issue likely stems from the destination URI being interpreted as relative rather than absolute. Notice that in the error message the path appears as abfss:%5Cbronze@devstorageacc.dfs.core.windows.net%5Ctest, where %5C is the URL-encoded backslash: the forward slashes (including the // after the scheme) have been replaced, so the URI loses its authority component and is parsed as a relative path. Let’s delve into the details.

  3. Absolute vs. Relative Paths:

    • An absolute path specifies the complete location of a file or directory from the root directory.
    • A relative path is specified relative to the current working directory.
  4. Solution: To resolve this, ensure that you provide an absolute path when using dbutils.fs.cp. Here are some steps to consider:

    • Check Your Path: Verify that the path you’re passing ('dbfs:/config.json') is indeed an absolute path. If it starts with /, it’s an absolute path; otherwise, it’s relative.

    • Use an Absolute Path Explicitly: Instead of relying on relative paths, specify the full absolute path to the source file. For example:

      w.dbutils.fs.cp('/dbfs/config.json', path)
      
    • URL Encoding: If your path contains special characters (such as colons), ensure proper URL encoding. However, in your case, it seems the issue is not related to special characters.

    • Documentation Reference: Refer to the Databricks documentation for more details on working with DBFS paths.

  5. Known Issue: You mentioned a known issue related to relative paths in absolute URIs. If you suspect this is the case, consider checking the GitHub issue you referenced: databricks-connect : Relative path in absolute URI.

Remember that with Databricks Connect, dbutils operations execute on the remote workspace, so local file paths on your machine won’t work directly. You need to upload files to the Databricks file system. If you encounter further issues, explore the documentation for additional insights. 🚀
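To make the "relative path in absolute URI" diagnosis concrete, here is a rough illustration using Python's urllib.parse as a stand-in for the java.net.URI parsing that happens server-side (the exact behaviour differs, but the authority-vs-path split is the same idea): with the // intact the container@account part is recognised as the URI authority, and with backslashes (what %5C decodes to) it is not, so everything after the scheme looks like a relative path.

```
from urllib.parse import urlparse

# With "//" after the scheme, the container@account part is parsed as the
# URI authority (netloc) and the rest as an absolute path.
good = urlparse("abfss://bronze@devstorageacc.dfs.core.windows.net/test/")
print(good.netloc)  # bronze@devstorageacc.dfs.core.windows.net
print(good.path)    # /test/

# With backslashes (what %5C decodes to) there is no "//", so no authority
# is found and everything after the scheme is treated as a relative path.
bad = urlparse("abfss:\\bronze@devstorageacc.dfs.core.windows.net\\test")
print(bad.netloc)   # '' (empty)
print(bad.path)     # \bronze@devstorageacc.dfs.core.windows.net\test
```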

 

Gen_Vi332

I have the same problem.

Running :

```
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.dbutils.fs.ls("abfss://xxx@yyy.dfs.core.windows.net/zzz")
```

via Databricks Connect gives the same result as running :

```
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.dbutils.fs.ls("/")
```

The problem seems to be the presence of '//': in that case, the string passed to the Databricks library appears to be reduced to just '/'.

When doing the same directly on a normal Databricks Workspace Notebook it works:

```
dbutils.fs.ls("abfss://xxx@yyy.dfs.core.windows.net/zzz")
```
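For what it's worth, here is a quick sketch (reusing the same placeholder container and account names) to confirm the suspicion that the abfss listing is being collapsed to the root listing:

```
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# If the abfss URI is being collapsed to "/", both calls should return the
# same entries; comparing the returned paths makes that easy to check.
abfss_listing = w.dbutils.fs.ls("abfss://xxx@yyy.dfs.core.windows.net/zzz")
root_listing = w.dbutils.fs.ls("/")
print([f.path for f in abfss_listing] == [f.path for f in root_listing])
```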

@Gen_Vi332 ,

Apologies for the late response.
My experiments with other solutions show that using Volume paths instead of `abfss://` paths can potentially solve the issue.
The drawback is that you need to expose the folder (or its parent) as a Volume in Unity Catalog and grant the client access to the volume object, e.g.:
`/Volumes/catalog_dev/operational/vlm_source_landing/file_path`
It seems that in many scenarios a Volume path is treated as a local path, which can potentially simplify the code for non-Spark Python operations.
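As a rough sketch of that workaround over Databricks Connect, assuming a volume matching the example path above already exists and the client principal has been granted access to it:

```
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Destination is a Unity Catalog Volume path instead of an abfss:// URI.
# The volume (and the grants on it) must already exist; the path below
# just mirrors the example from this reply.
dest = "/Volumes/catalog_dev/operational/vlm_source_landing/config.json"

# Same copy as in the original question, but the destination never goes
# through abfss:// URI parsing.
w.dbutils.fs.cp("dbfs:/config.json", dest)
```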
