I'm trying to connect the dots from a new event arriving on an Azure Event Hub, through storage, partitions, and Avro records (all of which I can monitor), down to my Delta table. How do I trace what observe, writeStream, and the trigger are actually doing in the method below?
...
elif TABLE_TYPE == "live":
    print("DEBUG: TABLE_TYPE is live observe table")
    print(f"DEBUG: observe {YEAR}, {MONTH}, {DATE} writeStream queryName {EVENTHUB_NAME} CHECKPOINT_PATH {CHECKPOINT_PATH} start ADLS_MOUNT_PATH {ADLS_MOUNT_PATH}")
    query = (table.observe("metric", lit(f"{YEAR}-{MONTH}-{DATE}").alias("batchTime"))  # keep the handle for tracing
             .writeStream.queryName(EVENTHUB_NAME).format("delta")
             .trigger(processingTime="210 seconds")
             .option("checkpointLocation", CHECKPOINT_PATH)
             .start(ADLS_MOUNT_PATH))
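Is polling the query handle the right way to see whether the trigger is firing? Here's a rough sketch of what I'm considering (it assumes the query variable captured above; the sleep length is arbitrary, just long enough to cover one 210-second trigger):

import json, time

time.sleep(240)  # wait past at least one 210-second trigger interval
print("DEBUG: status:", query.status)  # e.g. "Waiting for data" vs. "Processing new data"
progress = query.lastProgress          # dict for the most recent micro-batch, or None
if progress:
    print("DEBUG: numInputRows:", progress["numInputRows"])
    print("DEBUG: sources:", json.dumps(progress["sources"], indent=2))  # start/end offsets per source
    print("DEBUG: observedMetrics:", progress.get("observedMetrics"))    # where observe() values land
if query.exception() is not None:
    print("DEBUG: query failed:", query.exception())

One related thing I'm unsure about: since this runs as a job rather than interactively, do I need query.awaitTermination() at the end to keep the driver alive past the first trigger?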
I've verified that my upstream app's events are captured by the target Azure Event Hub, and I can see the new Avro files landing in Azure Storage. The streaming snippet above, however, never writes new events to the Delta table, even though the batch-mode code below writes that same event data successfully. I'm looking for suggestions on the best way to trace and troubleshoot the live streaming path.
print("DEBUG:  test this write to test_live target")
   spark.catalog.refreshTable(TARGET_TABLE)
   table.write.format("delta").mode("overwrite").option("mergeSchema", "true").saveAsTable(TARGET_TABLE)
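I also wondered whether a StreamingQueryListener would be a better way to watch each trigger and the observed metrics. A sketch of what I had in mind, assuming a runtime where the Python listener API exists (Spark 3.4+ / a recent DBR); DebugListener is just my own name:

from pyspark.sql.streaming import StreamingQueryListener

class DebugListener(StreamingQueryListener):
    def onQueryStarted(self, event):
        print(f"DEBUG: query started, id={event.id} name={event.name}")

    def onQueryProgress(self, event):
        # fires once per completed trigger; silence here would point at the read side
        p = event.progress
        print(f"DEBUG: trigger completed, numInputRows={p.numInputRows} observed={p.observedMetrics}")

    def onQueryTerminated(self, event):
        print(f"DEBUG: query terminated, id={event.id} exception={event.exception}")

spark.streams.addListener(DebugListener())  # register before calling start()

My thinking is that onQueryProgress should fire once per completed 210-second trigger, so if it fires with numInputRows=0 the problem is on the Event Hubs read side, and if it never fires at all the query isn't triggering. Does that reasoning hold?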
 
Thanks, from a new Databricks dev,
David