Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

RESOURCE_EXHAUSTED dbutils.jobs.taskValues.get

leireroman
New Contributor III

I have a job in Databricks that runs multiple tasks in parallel. Those tasks read the job's parameters using the dbutils utility. I'm getting the following error when trying to read the parameters in my different tasks:


com.databricks.common.client.DatabricksServiceHttpClientException: RESOURCE_EXHAUSTED: Too many requests. Please wait a moment and try again. If the issue persists, consider adjusting your request frequency or reaching out to support for assistance.

This has been working well for a long time, but today it failed. I tried the job a second time, and it failed again. Any help on why it failed?
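Since the error is a rate limit ("Too many requests"), one mitigation while keeping dbutils.jobs.taskValues.get() is to retry with exponential backoff and jitter, so that many parallel tasks don't all hit the service at once. A minimal sketch (the taskKey/key names in the comment are illustrative, not from the original job):

```python
import random
import time

def get_with_backoff(fetch, retries=5, base_delay=1.0):
    """Call fetch(), retrying with exponential backoff plus jitter when it
    raises (e.g. a transient RESOURCE_EXHAUSTED error)."""
    for attempt in range(retries):
        try:
            return fetch()
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries: surface the last error
            # Exponential backoff with jitter so parallel tasks spread out
            # their retries instead of hammering the service in lockstep.
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))

# In a Databricks notebook this could wrap the taskValues call, e.g.:
# value = get_with_backoff(
#     lambda: dbutils.jobs.taskValues.get(taskKey="upstream", key="my_param")
# )
```

This only papers over the rate limit; the accepted solution below avoids the per-task API call entirely.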

1 ACCEPTED SOLUTION


leireroman
New Contributor III

Hi all,

Our solution was to use job parameters with dynamic value references, which are read using dbutils.widgets.get() instead of dbutils.jobs.taskValues.get(). Our ETL is now working well again.
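The switch described here can be sketched as a job-settings fragment: a job-level parameter is passed into the task via a dynamic value reference, so the notebook reads it as a widget and never calls the taskValues API. Names like "env" and the notebook path are illustrative assumptions, not from the original job:

```python
# Illustrative Jobs-API-style settings fragment (names are assumptions):
job_settings_fragment = {
    "parameters": [
        {"name": "env", "default": "prod"},  # job parameter, set once per run
    ],
    "tasks": [
        {
            "task_key": "ingest",
            "notebook_task": {
                "notebook_path": "/Repos/etl/ingest",
                # Dynamic value reference: resolved by the Jobs service before
                # the task starts, so no per-task API call is needed.
                "base_parameters": {"env": "{{job.parameters.env}}"},
            },
        }
    ],
}

# Inside the task notebook the value then arrives as a widget:
# env = dbutils.widgets.get("env")
```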

Pass context about job runs into job tasks - Azure Databricks | Microsoft Learn

Configure settings for Azure Databricks jobs - Azure Databricks | Microsoft Learn

Hope you find this information useful.


3 REPLIES

kesi
New Contributor II

We have also had a RESOURCE_EXHAUSTED error this morning in APAC on AWS. The job has 49 parallel tasks and had been running without issue for about 6 months.

joachimswig
New Contributor II

We've been seeing this as well, on a job with ~50 parallel tasks that had been running without issue for the past year.

