Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

writing into delta table using databricks connect

MightyMasdo
New Contributor II

Hello, we are currently facing a challenge with writing data from a local machine to Delta tables. Small loads (e.g., 100 rows) work without any issues, but attempting to write larger batches (around 1,000 rows) results in an exception. The issue started to show up last week; before that we were able to write tables to Databricks with a batch size of 10,000. Unfortunately, the error message isn't very descriptive, making it difficult to pinpoint the cause.
Has anyone else experienced a similar issue, or does anyone have insights into potential solutions? We would greatly appreciate any guidance on best practices or troubleshooting steps for writing larger datasets to Delta tables remotely.
Thank you for your help! This is the error we get:

```
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INVALID_ARGUMENT
details = "INVALID_PARAMETER_VALUE: Invalid compute mode headers. Please pass X-Databricks-Cluster-Id for using Spark Connect
with classic compute or pass X-Databricks-Session-Id for using Spark Connect with
serverless compute. (requestId=XXXX)"
debug_error_string = "UNKNOWN:Error received from peer {grpc_message:"INVALID_PARAMETER_VALUE: Invalid compute mode headers. Please pass X-Databricks-Cluster-Id for using Spark Connect\nwith classic compute or pass X-Databricks-Session-Id for using Spark Connect with\nserverless compute. (requestId=98ed3355-9a30-45bf-82c9-10ec05f9336e)", grpc_status:3, created_time:"2024-08-26T11:10:18.3724432+00:00"}"
>
```
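The error says the client sent neither compute-mode header. With databricks-connect, pinning the session to a classic cluster by passing `cluster_id` explicitly makes the client send the `X-Databricks-Cluster-Id` header the message asks for. A minimal sketch, assuming databricks-connect is installed and using placeholder host/token/cluster values:

```python
def make_spark_session(host: str, token: str, cluster_id: str):
    """Build a Spark Connect session pinned to a classic cluster.

    The import lives inside the function so this sketch can be defined
    even on a machine where databricks-connect is not installed.
    """
    from databricks.connect import DatabricksSession

    # Passing cluster_id explicitly makes the client send the
    # X-Databricks-Cluster-Id header for classic compute.
    return (
        DatabricksSession.builder
        .remote(host=host, token=token, cluster_id=cluster_id)
        .getOrCreate()
    )


# Usage (placeholder values — substitute your own workspace details):
# spark = make_spark_session(
#     host="https://<workspace>.cloud.databricks.com",
#     token="<personal-access-token>",
#     cluster_id="<cluster-id>",
# )
```

This does not explain why small writes succeed while large ones fail, but it rules out a missing or mis-detected compute-mode header as the cause.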

3 REPLIES

Mok9rQWafV
New Contributor II

Hi,

I am experiencing the same behavior on LTS 13.3 and LTS 14.0. Any idea what could be happening?


AnatolBeck
New Contributor II

I have also been getting similar logs since the beginning of the week with Databricks 13.3 LTS.
```

grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INVALID_ARGUMENT
details = "INVALID_PARAMETER_VALUE: Invalid compute mode headers. Please pass X-Databricks-Cluster-Id for using Spark Connect
with classic compute or pass X-Databricks-Session-Id for using Spark Connect with
serverless compute. (requestId=XXXX)"
debug_error_string = "UNKNOWN:Error received from peer {grpc_message:"INVALID_PARAMETER_VALUE: Invalid compute mode headers. Please pass X-Databricks-Cluster-Id for using Spark Connect\nwith classic compute or pass X-Databricks-Session-Id for using Spark Connect with\nserverless compute. (requestId=XXXX)", grpc_status:3, created_time:"2024-12-02T14:04:12.790228698+01:00"}"

```

AnatolBeck
New Contributor II

I can confirm, however, that writing the data in chunks (and experimenting with the chunk size) resolves this error.

The error message is misleading: on the one hand it is unrelated to the real issue, and on the other hand the chunk size that allows writing without errors seems to change over time.

This needs a review.
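For reference, the chunked-write workaround can be sketched as below. The helper names and the 500-row default are my own, not from the thread; only the slicing logic runs locally, while the actual write requires a live Databricks Connect session:

```python
from typing import Iterator, List


def chunked(rows: List[dict], size: int) -> Iterator[List[dict]]:
    """Yield successive chunks of at most `size` rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]


def write_in_chunks(spark, rows: List[dict], table_name: str,
                    size: int = 500) -> None:
    """Append rows to a Delta table one chunk at a time.

    Keeping each createDataFrame/write call small keeps the individual
    gRPC requests small, which is the workaround described above.
    """
    for batch in chunked(rows, size):
        spark.createDataFrame(batch).write.mode("append").saveAsTable(table_name)


# Example of the chunking alone: 1,050 rows at size 500
# split into batches of 500, 500, and 50.
batches = list(chunked([{"i": n} for n in range(1050)], 500))
```

If a given chunk size starts failing again, shrinking it further has reportedly helped, though as noted above the threshold seems to move over time.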
