Direct access to Databricks query history database
02-26-2024 10:14 AM
Hello,
I would like to know if there is direct access to the Databricks query history tables. For compliance reasons, I would like to be able to create reports for something like: who has accessed a particular column in a table in the past 6 months. The query history web interface is quite limited. I would ideally like to use SQL to query the history table. Is this possible?
03-15-2024 07:47 AM
I would like to be able to query the query history tables by running my own queries. I do not want to use the Query History interface supplied by Databricks; I want to be able to create Python scripts that access the underlying tables/views for TAC and query history. From your response, it seems like this is not possible. Can you confirm that?
04-19-2024 11:45 AM
%pip install databricks-sdk
dbutils.library.restartPython()

warehouse_id = "bc1af43449227761"
hours_back_to_check = 2

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import sql
from pyspark.sql.types import StructType, StructField, StringType, LongType
import time

w = WorkspaceClient()

# Build the time window: now minus N hours, in epoch milliseconds.
current_time_ms = int(round(time.time() * 1000))
start_time = current_time_ms - (3600000 * hours_back_to_check)

# Filtering options:
# https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/sql.html#databricks.sdk.service.sql.QueryFilter
query_filter = sql.QueryFilter(
    query_start_time_range=sql.TimeRange(start_time_ms=start_time, end_time_ms=current_time_ms),
    warehouse_ids=[warehouse_id]
)

# Pull the matching query history entries from the API.
query_ls = list(w.query_history.list(filter_by=query_filter, include_metrics=True))

schema = StructType([
    StructField("duration", LongType(), True),
    StructField("query_start_time_ms", LongType(), True),
    StructField("query_end_time_ms", LongType(), True),
    StructField("executed_as_user_name", StringType(), True),
    StructField("query_text", StringType(), True),
])

# Extract the relevant fields from the QueryInfo objects returned by the API
df = spark.createDataFrame([
    (query.duration, query.query_start_time_ms, query.query_end_time_ms, query.executed_as_user_name, query.query_text)
    for query in query_ls
], schema)

df.display()
print([query.metrics for query in query_ls])
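If you need to narrow the results further, the QueryFilter linked above accepts additional fields; here is a hedged sketch (field and enum names per the SDK docs, worth verifying against your SDK version):

# Hedged sketch: the same filter, narrowed to successfully finished queries.
query_filter = sql.QueryFilter(
    query_start_time_range=sql.TimeRange(start_time_ms=start_time, end_time_ms=current_time_ms),
    warehouse_ids=[warehouse_id],
    statuses=[sql.QueryStatus.FINISHED],
)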
04-19-2024 11:47 AM
This provides query history. You may also be interested in System Tables; for compliance purposes, check out Audit Logs:
https://docs.databricks.com/en/administration-guide/system-tables/audit-logs.html
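As a rough sketch of what an audit-log query could look like (assuming the system.access.audit table is enabled for your workspace; verify column names against the audit log docs):

# Hedged sketch: recent audit events per user over the last 6 months.
audit_df = spark.sql("""
    SELECT event_time,
           user_identity.email AS user_email,
           service_name,
           action_name
    FROM system.access.audit
    WHERE event_time >= current_timestamp() - INTERVAL 180 DAYS
    ORDER BY event_time DESC
""")
audit_df.display()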
05-14-2024 07:06 AM
For posterity: there is a query history system table that contains all of this information; it is in preview at the time of writing. If you're reading this later than May 2024, please check the documentation for the query history system table.
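For example, once the table is available, the original compliance question could be approached with plain SQL. A hedged sketch (the table and column names below assume the preview table is system.query.history, so verify them against the current docs):

# Hedged sketch: queries from the last 6 months that mention a given column name.
# Text matching is approximate; a query can reference the column indirectly.
history_df = spark.sql("""
    SELECT executed_by, statement_text, start_time, end_time
    FROM system.query.history
    WHERE start_time >= current_timestamp() - INTERVAL 180 DAYS
      AND statement_text ILIKE '%my_column%'   -- hypothetical column name
""")
history_df.display()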
05-15-2024 01:05 PM
A quick question on this... (First of all, thanks so much for the sample code!) I'm playing around with this and I would like to get statement_type and status. I see that duration, query_start_time_ms and query_end_time_ms are int and defined as LongType(); executed_as_user_name and query_text are str and defined as StringType(). statement_type and status are listed as data types QueryStatementType and QueryStatus respectively. How would I define the StructType for these fields?
07-05-2024 04:19 PM
Sorry, missed this - try the system table and query it in SQL instead; it's much simpler than defining StructTypes!
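That said, if you do stay with the SDK approach: statement_type and status appear to be Python enums in the SDK, so one sketch (an assumption worth verifying, not the official answer) is to declare the fields as StringType() and store each enum's string value, reusing query_ls from the snippet above:

from pyspark.sql.types import StructType, StructField, StringType

schema = StructType([
    StructField("statement_type", StringType(), True),
    StructField("status", StringType(), True),
])

# Assumes the SDK exposes these fields as enums; guard against None.
df = spark.createDataFrame([
    (
        query.statement_type.value if query.statement_type else None,
        query.status.value if query.status else None,
    )
    for query in query_ls
], schema)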
08-19-2024 09:05 AM
The problem with that is that I do not have access to the raw system tables... I am writing a Python script to load the data into a table, so that we can run queries against it.
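For instance, the DataFrame built from the SDK snippet above could be persisted so the team can query it with SQL (a sketch; the target table name is hypothetical):

# Hypothetical sketch: persist the SDK query history into a queryable table.
(df.write
   .mode("append")
   .saveAsTable("main.compliance.query_history_snapshot"))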
05-15-2024 01:07 AM
Thanks @josh_melton! I was just wondering about this (one day after your post!), since I had only found the UI and the API in the documentation and was really puzzled that there is no equivalent in Unity Catalog to the Snowflake query_history table.
02-04-2025 07:26 AM
Throwing it out there that my team is also looking for a way to do this easily, and I don't think the above solutions really fit our need. We are currently migrating schemas (from one name to a better one; we didn't rename in place because we didn't want to break processes for us or anyone else), so we have two sets of tables with the same names. Each schema set contains well over a hundred tables, so going through each one individually and checking whether anyone is still hitting the old tables is tedious, and none of these options gives us a way to loop through them and see what is going on:
- `describe history [table name]` only shows DDL/DML history and doesn't show DQL (for example, a user or process still selecting data from the table, which is exactly what matters to us).
- system.query.history has DQL history, but in our case, with two tables of the same name, if the user doesn't qualify the table with a catalog or schema in the query text, there is no way to distinguish the old table from the new one, or tables between environments (we use different catalogs for different environments). This happens when the schema is set up front with `use schema`; unfortunately, my team tends to do the same with `use catalog` to enforce an environment at the start of a process, so it is quite common for the schema not to appear in the query text.
- Getting request_params.commandText from system.access.audit suffers from the same issues as above.
So I guess what we are looking for is something like `describe history [table name]`, but with DQL history, so that we can more easily run a script to determine whether a table is still in use.
02-04-2025 08:23 AM
If you are trying to see access to a certain table, query_history is a bad way to do it: parsing the SQL statement is prone to many errors. For example, if the current catalog and schema are set, the query may look like `select * from my_view`, where the view is accessing the table you are interested in, so from the SQL alone you cannot determine the catalog or schema. If someone creates a view on top of the tables you are interested in, you may not see the schema or table name in the SQL at all.
The best way to determine this is to use the lineage system tables (https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/lineage). These tables track (among other things) access to metadata and data for objects (tables and views).
To find access for a specific table from 2024-12-01 to 2024-12-31, the query would be something like the sketch below (the table and column names assume system.access.table_lineage; verify against the lineage docs):
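# Hedged sketch: who or what read this table in December 2024?
lineage_df = spark.sql("""
    SELECT created_by, entity_type, event_time
    FROM system.access.table_lineage
    WHERE source_table_full_name = 'my_catalog.my_schema.my_table'  -- hypothetical name
      AND event_time BETWEEN '2024-12-01' AND '2024-12-31'
    ORDER BY event_time DESC
""")
lineage_df.display()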
02-04-2025 08:42 AM
Thanks, this is really helpful. I hadn't checked table_lineage; I thought it would be more strictly metadata-related, about relationships between tables, and wouldn't include simple reads from a table.

