Hello, we are currently facing a challenge with writing data from a local machine to Delta tables. While small loads (e.g., 100 rows) work without any issues, attempting to write larger batches (around 1,000 rows) results in an exception. The issue started showing up last week; until then we were able to write to Databricks with a batch size of 10,000 rows. Unfortunately, the error message isn't very descriptive, making it difficult to pinpoint the cause.
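For context, this is roughly how we connect and write (workspace URL, token, cluster ID, and table name are placeholders, not our real values):

from databricks.connect import DatabricksSession

# Spark Connect session against a classic cluster; passing cluster_id is what
# should make databricks-connect send the X-Databricks-Cluster-Id header.
spark = DatabricksSession.builder.remote(
    host="https://<workspace>.cloud.databricks.com",
    token="<personal-access-token>",
    cluster_id="<cluster-id>",
).getOrCreate()

rows = [(i, f"value_{i}") for i in range(1_000)]  # a ~1,000-row batch like the one that fails
df = spark.createDataFrame(rows, schema="id INT, payload STRING")

# Append the batch to an existing Delta table.
df.write.format("delta").mode("append").saveAsTable("catalog.schema.target_table")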
Has anyone else experienced a similar issue, or does anyone have insights into potential solutions? We would greatly appreciate any guidance on best practices or troubleshooting steps for writing larger datasets to Delta tables remotely.
Thank you for your help! This is the error we get:
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INVALID_ARGUMENT
details = "INVALID_PARAMETER_VALUE: Invalid compute mode headers. Please pass X-Databricks-Cluster-Id for using Spark Connect
with classic compute or pass X-Databricks-Session-Id for using Spark Connect with
serverless compute. (requestId= XXXX)"
debug_error_string = "UNKNOWN:Error received from peer {grpc_message:"INVALID_PARAMETER_VALUE: Invalid compute mode headers. Please pass X-Databricks-Cluster-Id for using Spark Connect\nwith classic compute or pass X-Databricks-Session-Id for using Spark Connect with\nserverless compute. (requestId=98ed3355-9a30-45bf-82c9-10ec05f9336e)", grpc_status:3, created_time:"2024-08-26T11:10:18.3724432+00:00"}"
>
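The headers named in the message suggest the compute-mode metadata isn't reaching the server. For comparison, our understanding (from the Databricks Connect documentation; we haven't verified this ourselves) is that with plain PySpark Spark Connect the cluster ID travels as a connection-string parameter that gets forwarded as a gRPC header, roughly like this (host, token, and cluster ID are again placeholders):

from pyspark.sql import SparkSession

# Spark Connect connection string; parameters not reserved by Spark Connect,
# such as x-databricks-cluster-id, are attached as gRPC request metadata,
# which should be the header the error message asks for.
spark = SparkSession.builder.remote(
    "sc://<workspace>.cloud.databricks.com:443/"
    ";token=<personal-access-token>"
    ";x-databricks-cluster-id=<cluster-id>"
).getOrCreate()

What we don't understand is why the headers would be accepted for small writes but rejected once the batch reaches around 1,000 rows.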