08-15-2025 10:43 AM - edited 08-15-2025 10:46 AM
Is there a limit on the string length that can be passed to a Databricks notebook widget? An ADF Lookup outputs about 1000 table names that I am trying to pass to the Databricks notebook via a widget parameter. ADF spends 30 minutes trying to open the Databricks notebook and then errors out with "Run failed with error message Unexpected failure while fetching notebook". Is there a limit on the length of the parameters?
What concerns me is that it worked a few months ago. After all this time, when we try to run it now, we get this error. Nothing has changed on our end. Did Databricks add a limit in a recent update?
08-15-2025 10:49 AM
Hi @CzarR ,
Yes, there's a limitation. A maximum of 2048 characters can be input to a text widget.
https://docs.databricks.com/aws/en/notebooks/notebook-limitations#databricks-widgets
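For reference, this is the widget that limit applies to. A minimal sketch of how such a parameter is defined and read in a notebook (the widget name table_list is just a placeholder):

```python
# Define a text widget and read its value inside a Databricks notebook.
# Per the docs linked above, the 2048-character limit applies to text
# entered into the widget via the notebook UI.
dbutils.widgets.text("table_list", "")          # "table_list" is a placeholder name
table_list = dbutils.widgets.get("table_list")  # value passed in (e.g. from ADF)
print(len(table_list))
```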
08-15-2025 10:55 AM - edited 08-15-2025 11:05 AM
Hi, I saw that documentation, but there are currently pipelines running with parameters longer than 2048 characters. For example, one pipeline passes 329,130 characters and runs successfully every day. I mean it is passed via ADF; from ADF it is passed as JSON, I think, and there is a 1 MB limit, where my parameter is only about 500 KB, with 610,000 characters.
At least Databricks does not give me a specific error.
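A rough way to check the serialized size (illustrative sketch only; the tables list here is placeholder data standing in for my actual lookup output):

```python
import json

# Placeholder for the ADF Lookup output: ~1000 table names.
tables = [f"schema.table_{i}" for i in range(1000)]

# Measure the size of the JSON that would be embedded in the payload
# and compare it against ADF's documented 1 MB payload limit.
payload = json.dumps(tables)
print(f"{len(payload.encode('utf-8')) / 1024:.0f} KB")
```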
08-15-2025 11:42 AM
Ok, so it seems that this 2048-character limit applies only to text entered via the UI. When passing parameters via the Notebook activity in ADF, you're essentially embedding the value in a JSON payload sent to Databricks. So maybe there's some kind of limit on the accepted payload size on the Databricks side?
In other words, you meet the payload requirements on the ADF side but are hitting a limit on the Databricks side.
As a workaround, can't you split the output into smaller pieces? Or maybe write the output to a Parquet file that can later be consumed by your notebook, as in the sketch below.
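Something along these lines, passing a pointer instead of the payload (the widget name, path, and column name below are hypothetical):

```python
# Sketch of the workaround: instead of passing ~1000 table names through
# the widget, have ADF write the Lookup output to storage and pass only
# the file path. Widget name and path are placeholders.
dbutils.widgets.text("table_list_path", "")
path = dbutils.widgets.get("table_list_path")

# Read the list back in the notebook, assuming ADF wrote it as Parquet
# with a column named `table_name` (adjust to your actual schema).
tables = [row.table_name for row in spark.read.parquet(path).collect()]
print(f"Loaded {len(tables)} table names")
```

This keeps the widget value tiny and stable regardless of how many tables the lookup returns.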
08-15-2025 11:49 AM
Yes, I have workaround ideas; that is not the problem. But how did it work a few months ago? Was an update released that limits it now? I'm looking for some documentation around that so I can prove to my team that things have changed and that I now need to update the pipeline accordingly.
08-15-2025 11:49 AM
And one more thing: could you, just for debugging purposes, reduce the size of the output? You know, to confirm that it's the size of the output passed as the widget input that causes the problem.
08-15-2025 11:55 AM
Yes, if I do that it works fine. I reduced it to 600 tables instead of 1000 tables.
08-15-2025 12:09 PM
Yep, so you're definitely hitting some API limit here. If I were you, I would just redesign this pipeline. I don't think it's a good idea to pass such a long input as a payload. Every API has limits, and if your input grows over time you will run into this issue again.
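If you do want to keep the widget approach, a rough sketch of batching the list so each run stays under a chosen size budget (the 100 KB budget is an arbitrary example, not a documented limit):

```python
import json

def chunk_by_size(items, max_bytes=100_000):
    """Split items into batches whose JSON-serialized size stays under max_bytes."""
    batch, size = [], 2  # 2 bytes for the surrounding "[]"
    for item in items:
        item_size = len(json.dumps(item).encode("utf-8")) + 1  # +1 for the comma
        if batch and size + item_size > max_bytes:
            yield batch
            batch, size = [], 2
        batch.append(item)
        size += item_size
    if batch:
        yield batch

tables = [f"schema.table_{i}" for i in range(1000)]  # placeholder data
for i, batch in enumerate(chunk_by_size(tables)):
    print(i, len(batch))  # each batch would be passed as a separate run's parameter
```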