Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

curl: (26) Failed to open/read local data from file/application in DBFS

kavya08
New Contributor

Hi all,

I am trying to upload a parquet file from S3 to DBFS with an Airflow BashOperator curl command, using the Databricks REST API, as shown below:
databricks_load_task = BashOperator(
    task_id="upload_to_databricks",
    bash_command="""
    curl --location --request POST {{task_instance.xcom_pull(task_ids='get_creds', key='DATABRICKS_HOST')}}/api/2.0/dbfs/put \
        --header "Authorization: Bearer {{task_instance.xcom_pull(task_ids='get_creds', key='DATABRICKS_TOKEN')}}" \
        --form contents="@s3://bucket/test/file.parquet" \
        --form path="{{task_instance.xcom_pull(task_ids='get_creds', key='UPLOAD_PATH')}}" \
        --form overwrite="true"
    """,
)

The parquet file stores a DataFrame result. I am unable to upload the file; it fails with the error below:

curl: (26) Failed to open/read local data from file/application

When I replace the S3 path with literal text (--form contents="test text"), the upload works. Please help me with this.
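For context on why the literal text works but the S3 path does not: curl's `@` prefix in `--form contents="@PATH"` reads PATH from the local filesystem of the machine running curl; it cannot fetch `s3://` URIs, so curl exits with error 26. A minimal Python sketch of that distinction, where the temporary file merely stands in for an object staged to local disk (e.g. via `aws s3 cp` earlier in the same bash_command):

```python
import os
import tempfile

# curl's --form contents="@PATH" opens PATH on the local filesystem; an
# "s3://" URI is not a local path, so curl cannot open it (error 26).
s3_uri = "s3://bucket/test/file.parquet"
print(os.path.exists(s3_uri))  # not a readable local file

# Staging the object to local disk first would give curl a path it can
# open. Simulated here with a temporary file standing in for the
# downloaded parquet bytes:
with tempfile.NamedTemporaryFile(suffix=".parquet", delete=False) as f:
    f.write(b"stand-in parquet bytes")
    local_path = f.name

print(os.path.exists(local_path))  # a path curl could read as "@" + local_path
os.unlink(local_path)
```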

#dbfs 

