Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

AutoMl Forecasting - Query via REST (Issue with input date field)

prem_raj
New Contributor II

Hi,

I used the AutoML forecasting model on sample data and the model trained successfully. But when I serve the model over a REST endpoint, I get an error when querying it via the built-in browser and via Postman (the error appears to be with the date field type; it surfaces as a Python traceback).

Sample data trained:

DATE      IPG3113N  TYPE
01/01/72  74.6385   RadioActive

Input Json Request:

{
  "dataframe_split": {
    "columns": ["DATE", "TYPE"],
    "data": [["1973-12-02", "RadioActive"]]
  }
}

Error Received:

{"error_code": "BAD_REQUEST", "message": "Encountered an unexpected error while evaluating the model. Verify that the input is compatible with the model for inference. Error 'Cannot use .astype to convert from timezone-aware dtype to timezone-naive dtype. Use obj.tz_localize(None) or obj.tz_convert('UTC').tz_localize(None) instead.'", "stack_trace": "Traceback (most recent call last):\n File \"/opt/conda/envs/mlflow-env/lib/python3.9/site-packages/pandas/core/groupby/groupby.py\", line 1353, i

What I have tried:

  1. Changed the date input to different values, such as 1973-12-02T00:00:00.000+0000 and 1973-12-02 00:00:00.
  2. Used different Databricks Runtime versions: 11.3 LTS and 12.2 LTS.

I suspect my date input doesn't match what the notebook-generated model expects, but I haven't been able to figure out the expected input value.

Can someone help here?

2 REPLIES

Anonymous
Not applicable

@prem_raj:

Based on the error message, the input date format is not compatible with the model for inference: the input date is being parsed as timezone-aware, while the model expects a timezone-naive format.

To fix this issue, you can try converting the input date to a timezone-naive format before sending it to the REST endpoint. You can use the tz_localize and tz_convert methods from the pandas library to perform this conversion. Here's an example of how you can modify your input JSON request:

{
  "dataframe_split": {
    "columns": ["DATE", "TYPE"],
    "data": [["1973-12-02T00:00:00.000Z", "RadioActive"]]
  }
}

Note that the date string has been modified to end in Z, which marks it as UTC. Alternatively, you can remove the timezone information altogether and send a timezone-naive date string:

{
  "dataframe_split": {
    "columns": ["DATE", "TYPE"],
    "data": [["1973-12-02 00:00:00", "RadioActive"]]
  }
}

You can also modify the model code to handle timezone-aware dates, but that would require changes to the code and may not be feasible depending on your use case.
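To illustrate what the server is complaining about, here is a minimal pandas sketch (the data values are taken from your request; everything else is just an illustration) showing how a timezone-aware parse can be converted to the timezone-naive datetime64[ns] dtype the error message asks for:

```python
import pandas as pd

# Recreate the incoming request as a DataFrame, with an offset-bearing
# date string like the one you tried.
df = pd.DataFrame({
    "DATE": ["1973-12-02T00:00:00.000+0000"],
    "TYPE": ["RadioActive"],
})

# Parsing a string with an offset yields a timezone-aware column...
df["DATE"] = pd.to_datetime(df["DATE"], utc=True)   # dtype: datetime64[ns, UTC]

# ...which .astype cannot convert to timezone-naive directly. Following the
# error message's advice, drop the timezone info explicitly instead:
df["DATE"] = df["DATE"].dt.tz_localize(None)        # dtype: datetime64[ns]

print(df["DATE"].dtype)
```

This is the conversion the serving layer would need to perform (or that you avoid entirely by sending a string with no offset in the first place).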

prem_raj
New Contributor II

Hi Suteja,

Thanks for the reply. I tried the suggested time inputs, and it still fails, with a different error:

{"error_code": "BAD_REQUEST", "message": "Invalid input. Data is not compatible with model signature. Failed to convert column DATE to type 'datetime64[ns]'. Error: 'Cannot use .astype to convert from timezone-aware dtype to timezone-naive dtype. Use obj.tz_localize(None) or obj.tz_convert('UTC').tz_localize(None) instead.'"}

After looking into the notebook, I figured out that the forecasting models are not logged with a signature or input_example via MLflow, unlike the regression/classification models. The line below in the notebook just logs the model without specifying a signature (is that the issue?):

mlflow_prophet_log_model(prophet_model)

I am now trying to add an input_example and model signature to enable inference, and will validate.
