I've read this article, which covers:
- Using CrossValidator or TrainValidationSplit to tune spark.ml models — random/grid search only, no Hyperopt
- Parallel single-machine model training with Hyperopt via hyperopt.SparkTrials (not spark.ml)
- "Distributed training with Hyperopt and HorovodRunner" — distributed deep learning with Hyperopt (no MLflow)
- It does mention "With HorovodRunner, you do not use the SparkTrials class, and you must manually call MLflow to log trials for Hyperopt."
Is there an example notebook that shows how to hyperparameter-tune a spark.ml model and log its hyperparameters, metrics, and artifacts to MLflow?