@Erik Shilts:
Yes, it is possible to use a trained ML model in a Databricks dashboard. Here are a few approaches you could consider:
- Wrap the model in a Python function and call it from SQL: Train your ML model in Python and save it as a binary file (e.g., with the pickle module). Then write a function that loads the saved model and makes predictions, and register that function as a Spark UDF so the SQL queries behind your dashboard can call it (see the first sketch after this list).
- Deploy the model as a REST API: You can deploy your trained model as a REST API, for example with MLflow Model Serving. Your dashboard queries (or a notebook) then call the API to get predictions. This approach has the advantage of decoupling the model from the dashboard and making it available to other applications as well (see the second sketch after this list).
- Use a Spark-native ML library: Spark MLlib models operate directly on DataFrames, so you can train with MLlib, apply the model to your data with transform(), and expose the scored output as a table or view that your dashboard queries in plain SQL (see the third sketch after this list). Models from other libraries shipped with Databricks (e.g., TensorFlow) can be made SQL-accessible the same way as in the first approach, by wrapping them in a UDF.
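Here's a minimal sketch of the first approach, assuming a scikit-learn-style model with a single numeric feature. The DBFS path, function name, and column names are made up for illustration; `spark` is the SparkSession that Databricks notebooks provide automatically.

```python
import pickle
import pandas as pd
from pyspark.sql.functions import pandas_udf

# Load the pickled model once on the driver; Spark ships it to the
# executors as part of the UDF closure. (Hypothetical path.)
with open("/dbfs/models/my_model.pkl", "rb") as f:
    model = pickle.load(f)

@pandas_udf("double")
def predict(features: pd.Series) -> pd.Series:
    # Assumes a model with a .predict() method that takes a 2-D array.
    return pd.Series(model.predict(features.to_numpy().reshape(-1, 1)))

# Register the UDF so SQL cells and dashboard queries can call it, e.g.:
#   SELECT id, predict(feature) AS score FROM my_table
spark.udf.register("predict", predict)
```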
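For the REST approach, here is a sketch of calling an already-deployed serving endpoint over HTTP. The workspace URL, endpoint name, and token are placeholders you'd take from your own workspace, and the exact request schema can vary by MLflow/serving version, so check your endpoint's documentation.

```python
import requests

DATABRICKS_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder
ENDPOINT = "my-model"  # placeholder endpoint name
TOKEN = "<personal-access-token>"  # placeholder; store real tokens in secrets

def score(rows):
    """Send a list of feature dicts to the endpoint and return its JSON reply."""
    resp = requests.post(
        f"{DATABRICKS_URL}/serving-endpoints/{ENDPOINT}/invocations",
        headers={"Authorization": f"Bearer {TOKEN}"},
        # "dataframe_records" is one of the input formats MLflow scoring
        # servers accept; verify the format your endpoint expects.
        json={"dataframe_records": rows},
    )
    resp.raise_for_status()
    return resp.json()

predictions = score([{"x1": 1.5, "x2": 0.2}])
```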
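And a sketch of the Spark MLlib route end to end, with made-up table and column names. The point is that transform() output is just a DataFrame, so registering it as a view makes the predictions queryable from any SQL cell or dashboard.

```python
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

# Train on an existing table (hypothetical names).
train_df = spark.table("training_data")
pipeline = Pipeline(stages=[
    VectorAssembler(inputCols=["x1", "x2"], outputCol="features"),
    LinearRegression(featuresCol="features", labelCol="y"),
])
model = pipeline.fit(train_df)

# Score new rows and expose them to SQL.
scored = model.transform(spark.table("new_data"))
scored.createOrReplaceTempView("predictions")

# A SQL cell or dashboard query can now run:
#   SELECT x1, x2, prediction FROM predictions
```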
Note that the choice of approach will depend on your specific use case. For example, if you need predictions in real time or at high scale, deploying the model as a REST API may be the most appropriate option. If you're already training and analyzing with Spark MLlib, reusing it for scoring is usually the most efficient approach.