Warehousing & Analytics
api/2.0/sql/history/queries endpoint does not return query execution time

Octavian1
Contributor

Hi,

I cannot see the query execution time in the response to the "api/2.0/sql/history/queries" request.

Basically, I get only the following fields:
{
"next_page_token":...,
"has_next_page":...,
"res":[
  {
     "query_id":...,
     "status":..,
     "query_text":...,
     "query_start_time_ms":...,
     "execution_end_time_ms":...,
     "query_end_time_ms":...,
     "user_id":...,
     "user_name":...,
     "spark_ui_url":...,
     "endpoint_id":...,
     "rows_produced":...,
     "lookup_key":...,
     "executed_as_user_id":...,
     "executed_as_user_name":...,
     "is_final":..,
     "channel_used":{
       "name":...,
       "dbsql_version":...
     },
     "plans_state":...,
     "statement_type":...,
     "warehouse_id":...,
     "duration":...,
     "canSubscribeToLiveQuery":...
  },
  ...
]
}
There is an execution_end_time_ms field (which equals query_end_time_ms), but no execution time (which is part of the duration), as can be seen in the Query History UI; see the attached screenshot.
Is there another way to get the query execution time?

Thank you!

1 ACCEPTED SOLUTION


Got it. To get the metrics, you have to call the endpoint with the include_metrics param set to true:

api/2.0/sql/history/queries?include_metrics=true


6 REPLIES

Yeshwanth
Contributor III

Hi @Octavian1 

As per the API documentation, this API should return the execution time and the duration as well. What do you see in the response? Can you attach it here?

feiyun0112
Contributor

You can get the time from the metrics object:

"metrics": {
        "total_time_ms": 1000,
        "read_bytes": 1024,
        "rows_produced_count": 100000,
        "compilation_time_ms": 1000,
        "execution_time_ms": 1000,
        "read_remote_bytes": 1024,
        "write_remote_bytes": 1024,
        "read_cache_bytes": 1024,
        "spill_to_disk_bytes": 1024,
        "task_total_time_ms": 100000,
        "read_files_count": 1,
        "read_partitions_count": 1,
        "photon_total_time_ms": 1000,
        "rows_read_count": 10000,
        "result_fetch_time_ms": 100000,
        "network_sent_bytes": 1024,
        "result_from_cache": false,
        "pruned_bytes": 1024,
        "pruned_files_count": 1,
        "provisioning_queue_start_timestamp": 1595357087200,
        "overloading_queue_start_timestamp": 1595357087200,
        "query_compilation_start_timestamp": 1595357087200,
        "metadata_time_ms": 0,
        "planning_time_ms": 0,
        "query_execution_time_ms": 0,
        "planning_phases": [
          {}
        ]
      },
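As an illustration, here is a minimal Python sketch of reading the execution time out of an already-parsed response body (the sample response, query_id value, and metric values below are placeholders, not real API output):

# Hypothetical parsed JSON body of a api/2.0/sql/history/queries response
# in which each query carries a "metrics" object.
response = {
    "res": [
        {
            "query_id": "example-query-id",
            "metrics": {"total_time_ms": 1000, "execution_time_ms": 750},
        }
    ]
}

# Print the execution time reported for each query, if present.
for query in response.get("res", []):
    metrics = query.get("metrics", {})
    print(query["query_id"], metrics.get("execution_time_ms"))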

Yeshwanth
Contributor III

Spot on @feiyun0112 

So this confirms that the API is working as expected, right?


@Yeshwanth wrote:

Spot on @feiyun0112 

So this confirms that the API is working as expected, right?


Yes, you can compare the data with the UI.

The response I am getting is the one given in the first post (I have tried again just now). I do not get any "metrics" JSON in the response, although I am calling the api/2.0/sql/history/queries endpoint as hinted.

Is a special setup needed to get those metrics, such as extra params in the request or other requirements on the workspace?

Thanks!

Got it. To get the metrics, you have to call the endpoint with the include_metrics param set to true:

api/2.0/sql/history/queries?include_metrics=true
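
For completeness, a minimal sketch of that call in Python (the DATABRICKS_HOST and DATABRICKS_TOKEN environment variable names are placeholders, not something required by the API):

import os
import requests

# Placeholder workspace host (e.g. https://<workspace>.cloud.databricks.com)
# and personal access token.
host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

# Ask the Query History API to include the metrics object for each query.
resp = requests.get(
    f"{host}/api/2.0/sql/history/queries",
    headers={"Authorization": f"Bearer {token}"},
    params={"include_metrics": "true"},
)
resp.raise_for_status()

# Each entry in "res" should now carry a "metrics" object with execution_time_ms.
for query in resp.json().get("res", []):
    metrics = query.get("metrics", {})
    print(query["query_id"], metrics.get("execution_time_ms"))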