A Databricks cluster is treated as active if any Spark or non-Spark operations are running on it. Even when no Spark jobs are running, driver-specific application code can still be executing, which marks the cluster as non-idle. Driver-specific application code includes things like connecting to a REST API server, polling a database, and similar long-running tasks, as illustrated in the sketch below.
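For example, a notebook or job could run a loop like the following on the driver. This is a minimal sketch; the endpoint URL and polling interval are hypothetical placeholders, but any code of this shape keeps the driver busy and therefore keeps the cluster from being considered idle, even though it never submits a Spark job.

```python
# Minimal sketch of driver-side application code that keeps a cluster non-idle.
# POLL_URL and POLL_INTERVAL_SECONDS are hypothetical placeholders.
import time
import requests

POLL_URL = "https://example.com/api/status"  # hypothetical REST endpoint
POLL_INTERVAL_SECONDS = 60

while True:
    # Each request runs on the driver, so the cluster stays active
    # even though no Spark jobs are being submitted.
    response = requests.get(POLL_URL, timeout=10)
    print(f"Polled {POLL_URL}: HTTP {response.status_code}")
    time.sleep(POLL_INTERVAL_SECONDS)
```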
Another case where the cluster is marked non-idle is when it has an active JDBC/ODBC connection that submits queries at a frequent interval.
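The following sketch shows what such an external client might look like, assuming the databricks-sql-connector Python package; the hostname, HTTP path, and access token are placeholders. Because each query arrives over the JDBC/ODBC endpoint, the cluster never reaches its auto-termination idle threshold while this client runs.

```python
# Minimal sketch of an external client that holds a connection open and
# submits a lightweight query on a fixed interval.
# server_hostname, http_path, and access_token are placeholders.
import time
from databricks import sql

connection = sql.connect(
    server_hostname="<workspace-hostname>",
    http_path="<cluster-http-path>",
    access_token="<personal-access-token>",
)

with connection.cursor() as cursor:
    while True:
        # Each query counts as activity on the cluster's JDBC/ODBC endpoint,
        # so the idle timer keeps resetting.
        cursor.execute("SELECT 1")
        cursor.fetchall()
        time.sleep(300)  # submit a query every 5 minutes
```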