
Error: Credential size is more than configured size limit. As a result credential passthrough won't work for this notebook run.

carlosancassani
New Contributor III

I get this error when trying to execute parallel "slave" notebooks from a PySpark "master notebook".

Note 1: I use the same class, functions, cluster, and credentials for another parallel-notebook use case in the same Databricks instance, and it works fine.

Note 2: the command works fine when the "master notebook" is launched from a Job, but it returns the error above when the notebook is run manually.


So far I haven't found similar errors in the docs or forums.
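
For context, a minimal sketch of the kind of fan-out I mean, assuming the slave notebooks are launched with dbutils.notebook.run from a thread pool (the paths and parameters below are placeholders, not my actual code):

# Minimal sketch (placeholder paths/arguments), assuming the "master notebook"
# fans out to slave notebooks with dbutils.notebook.run from a thread pool.
# dbutils is available implicitly inside a Databricks notebook.
from concurrent.futures import ThreadPoolExecutor

slave_notebooks = [
    "/Repos/project/slave_notebook_1",
    "/Repos/project/slave_notebook_2",
]

def run_slave(path):
    # Each call starts an ephemeral child run on the same cluster; with
    # credential passthrough enabled, the caller's credential is forwarded
    # to that child run, which appears to be where the size-limit error is raised.
    return dbutils.notebook.run(path, 3600, {"run_mode": "parallel"})

with ThreadPoolExecutor(max_workers=len(slave_notebooks)) as pool:
    results = list(pool.map(run_slave, slave_notebooks))

print(results)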

3 REPLIES

Hubert-Dudek
Esteemed Contributor III

Maybe this is the issue:

"Within PySpark, there is a limit on the size of the Python UDFs you can construct since large UDFs are sent as broadcast variables."

carlosancassani
New Contributor III

Hi,

To add a note: there are no UDFs in the "master notebook" or the "slave notebooks". The command works fine when the "master notebook" is launched from a Job, but it returns the error above when the notebook is run manually.

Anonymous
Not applicable

Hi @carlosancassani,

Hope all is well!

Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!
