12-22-2023 07:07 AM
Hi,
I cannot see the query execution time in the response to the "api/2.0/sql/history/queries" request.
Basically, I get only the following fields:
{
  "next_page_token": ...,
  "has_next_page": ...,
  "res": [
    {
      "query_id": ...,
      "status": ...,
      "query_text": ...,
      "query_start_time_ms": ...,
      "execution_end_time_ms": ...,
      "query_end_time_ms": ...,
      "user_id": ...,
      "user_name": ...,
      "spark_ui_url": ...,
      "endpoint_id": ...,
      "rows_produced": ...,
      "lookup_key": ...,
      "executed_as_user_id": ...,
      "executed_as_user_name": ...,
      "is_final": ...,
      "channel_used": {
        "name": ...,
        "dbsql_version": ...
      },
      "plans_state": ...,
      "statement_type": ...,
      "warehouse_id": ...,
      "duration": ...,
      "canSubscribeToLiveQuery": ...
    },
    ...
  ]
}
There is an execution_end_time_ms (which is equal to the query_end_time_ms), but no execution time (which is part of the duration), as can be seen in the Query History UI; see the attached screenshot.
Is there another way to get the query execution time?
Thank you!
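For context, here is a minimal sketch of how such a request can be issued from Python. The workspace host, token, and the max_results parameter are placeholders for illustration; they are not taken from the original post.

import requests

# Placeholder workspace URL and token -- substitute your own values.
HOST = "https://<workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

# List recent queries via the SQL Query History API.
resp = requests.get(
    f"{HOST}/api/2.0/sql/history/queries",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"max_results": 100},
)
resp.raise_for_status()

for q in resp.json().get("res", []):
    # Without extra parameters, only start/end timestamps are returned,
    # not a breakdown of the execution time.
    print(q["query_id"], q["query_start_time_ms"], q["query_end_time_ms"])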
01-31-2024 10:51 PM
Hi @Octavian1
As per the API documentation, this API should return the execution time and the duration as well. What do you see in the response? Can you attach it here?
01-31-2024 11:48 PM
You can get the time from the metrics object in the response (a short sketch of reading it follows the example below):
"metrics": {
"total_time_ms": 1000,
"read_bytes": 1024,
"rows_produced_count": 100000,
"compilation_time_ms": 1000,
"execution_time_ms": 1000,
"read_remote_bytes": 1024,
"write_remote_bytes": 1024,
"read_cache_bytes": 1024,
"spill_to_disk_bytes": 1024,
"task_total_time_ms": 100000,
"read_files_count": 1,
"read_partitions_count": 1,
"photon_total_time_ms": 1000,
"rows_read_count": 10000,
"result_fetch_time_ms": 100000,
"network_sent_bytes": 1024,
"result_from_cache": false,
"pruned_bytes": 1024,
"pruned_files_count": 1,
"provisioning_queue_start_timestamp": 1595357087200,
"overloading_queue_start_timestamp": 1595357087200,
"query_compilation_start_timestamp": 1595357087200,
"metadata_time_ms": 0,
"planning_time_ms": 0,
"query_execution_time_ms": 0,
"planning_phases": [
{}
]
},
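A minimal sketch of reading the execution time out of such a response, assuming resp_json holds the parsed JSON body from api/2.0/sql/history/queries and that the metrics object is present as shown above:

# `resp_json` is assumed to be the parsed response body of
# GET api/2.0/sql/history/queries with metrics included.
for q in resp_json.get("res", []):
    metrics = q.get("metrics", {})
    exec_ms = metrics.get("execution_time_ms")  # execution time from the metrics object
    total_ms = metrics.get("total_time_ms")     # overall duration
    print(f"{q['query_id']}: execution={exec_ms} ms, total={total_ms} ms")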
02-01-2024 11:06 PM
Spot on, @feiyun0112!
So this confirms that the API is working as expected, right?
02-02-2024 04:26 AM
@Yeshwanth wrote: Spot on, @feiyun0112! So this confirms that the API is working as expected, right?
Yes, you can compare the data with the UI.
02-14-2024 12:59 AM
The response I am getting is the one given in the first post (I have tried again just now). I do not get any "metrics" object in the response, although I am calling the api/2.0/sql/history/queries endpoint as hinted.
Does getting those metrics perhaps require a special setup, extra parameters in the request, or other workspace requirements?
Thanks!
02-14-2024 01:11 AM
Got it, to get the metrics you have to call the endpoint with the include_metrics parameter set to true:
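For reference, a minimal sketch of that call with include_metrics enabled; the host, token, and max_results value are placeholders, not taken from the thread.

import requests

HOST = "https://<workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                   # placeholder

resp = requests.get(
    f"{HOST}/api/2.0/sql/history/queries",
    headers={"Authorization": f"Bearer {TOKEN}"},
    # include_metrics=true makes each entry in "res" carry a "metrics" object,
    # which contains execution_time_ms among other fields.
    params={"include_metrics": "true", "max_results": 100},
)
resp.raise_for_status()

for q in resp.json().get("res", []):
    print(q["query_id"], q.get("metrics", {}).get("execution_time_ms"))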