Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

curl: (26) Failed to open/read local data from file/application in DBFS

kavya08
New Contributor

Hi all,

I am trying to upload a Parquet file from S3 to DBFS with an Airflow BashOperator curl command that calls the Databricks REST API, as shown below:

databricks_load_task = BashOperator(
    task_id="upload_to_databricks",
    bash_command="""
    curl --location --request POST {{task_instance.xcom_pull(task_ids='get_creds', key='DATABRICKS_HOST')}}/api/2.0/dbfs/put \
    --header "Authorization: Bearer {{task_instance.xcom_pull(task_ids='get_creds', key='DATABRICKS_TOKEN')}}" \
    --form contents="@s3://bucket/test/file.parquet" \
    --form path="{{task_instance.xcom_pull(task_ids='get_creds', key='UPLOAD_PATH')}}" \
    --form overwrite="true"
    """,
)

The Parquet file stores a dataframe result. I am unable to upload the file; it fails with the error below:

curl: (26) Failed to open/read local data from file/application

When I replace the contents with plain text (`--form contents="test text"`), the upload works. Please help me with this.
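[Editor's note, not part of the original question:] curl's `@` prefix in `--form` reads from the local filesystem only, so `@s3://bucket/test/file.parquet` cannot be resolved, which is why curl exits with error 26 while a plain-text `contents` value works. One workaround is to copy the object to local disk first (e.g. with `aws s3 cp`) and then run the same curl command against the local path. Another is the JSON variant of `/api/2.0/dbfs/put`, which takes the file contents as a base64 string (limited to about 1 MB per request). A minimal sketch of building that JSON body, with the helper name and file paths being hypothetical:

```python
import base64
import json


def build_dbfs_put_payload(local_path, dbfs_path, overwrite=True):
    """Build the JSON body for POST /api/2.0/dbfs/put.

    The JSON form of the endpoint accepts base64-encoded file
    contents (capped at roughly 1 MB per request), which avoids
    curl's multipart --form handling entirely. The file must
    already be on the local filesystem, e.g. after `aws s3 cp`.
    """
    with open(local_path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return json.dumps({
        "path": dbfs_path,
        "contents": encoded,
        "overwrite": overwrite,
    })
```

The resulting string can then be sent with `curl --header "Content-Type: application/json" --data @payload.json` (or Python's `urllib.request`) to the same endpoint; for files larger than the 1 MB contents limit, the streaming `create`/`add-block`/`close` DBFS calls are the documented alternative.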

#dbfs 

0 REPLIES
