Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Can I run a custom function that contains a trained ML model or access an API endpoint from within a SQL query in the SQL workspace?

Erik_S
New Contributor II

I have a dashboard and I'd like the ability to take the data from a query and then predict a result from a trained ML model within the dashboard. I was thinking I could possibly embed the trained model within a library that I then import to the SQL workspace and create a function to access it, or put it behind an API that a query could then access. However, I can't find documentation on either of these use cases so I'm not sure if it's possible.

If it were a linear model I could save it as a table, but I want to have access to other models.

Other ideas are welcome!

3 REPLIES

Anonymous
Not applicable

@Erik Shilts:

Yes, it is possible to use a trained ML model in a dashboard in Databricks. Here are a few approaches you could consider:

  1. Embed the model in a Python library and call it from SQL: Train your ML model in Python and serialize it as a binary file (e.g., with the pickle module). Then create a Python library that loads the saved model and provides a prediction function. Import this library in your Databricks notebook and register the function as a UDF so your SQL queries can call it. Note that this approach requires an environment that supports calling Python functions from SQL.
  2. Deploy the model as a REST API: Serve your trained model as a REST endpoint, for example with MLflow model serving. You can then call the API from your Databricks notebook to make predictions. This approach has the advantage of decoupling the model from the notebook and allowing other applications to use it as well.
  3. Use an ML library that integrates with Spark SQL: Train your model with a library such as Spark MLlib (or wrap a model from a framework like TensorFlow as a UDF), then apply it to your data from SQL to make predictions in your dashboard.
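A minimal sketch of the first approach. The `LinearModel` class here is a hypothetical stand-in for your real trained model (e.g., a scikit-learn estimator); the point is the save/load/predict pattern the reply describes.

```python
import pickle

# Hypothetical stand-in for a trained model; in practice this would be
# a fitted scikit-learn, XGBoost, etc. object.
class LinearModel:
    def __init__(self, coef, intercept):
        self.coef = coef
        self.intercept = intercept

    def predict(self, xs):
        return [self.coef * x + self.intercept for x in xs]

# Save the "trained" model as a binary file, as the reply describes.
model = LinearModel(coef=2.0, intercept=1.0)
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# Library function that loads the saved model and makes predictions.
def predict(xs):
    with open("model.pkl", "rb") as f:
        loaded = pickle.load(f)
    return loaded.predict(xs)

print(predict([0.0, 1.0, 2.0]))  # -> [1.0, 3.0, 5.0]
```

In a notebook-backed environment, a function like `predict` could then be registered as a Spark UDF (e.g., via `spark.udf.register`) so that SQL queries can call it; whether that is available depends on the workspace, as discussed below.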

Note that the choice of approach depends on your specific use case and environment. If you need to make predictions in real time or at high volume, deploying the model as a REST API may be the most appropriate. If you're already using a Databricks ML library for training and analysis, using that same library for predictions may be the most efficient.
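For the REST API approach, a client call might look like the sketch below. The endpoint URL and token are placeholders, and the request body assumes MLflow model serving's `dataframe_records` input format; adjust both to match your actual deployment.

```python
import json
import urllib.request

def build_payload(rows):
    """Encode input rows in MLflow serving's dataframe_records format."""
    return json.dumps({"dataframe_records": rows}).encode("utf-8")

def score(endpoint_url, token, rows):
    """POST rows to a served-model REST endpoint and return predictions.

    endpoint_url and token are placeholders for your serving endpoint
    and access token.
    """
    req = urllib.request.Request(
        endpoint_url,
        data=build_payload(rows),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["predictions"]

# Example (would require a live endpoint):
# preds = score("https://<workspace>/serving-endpoints/<name>/invocations",
#               token="<token>",
#               rows=[{"feature_a": 1.2, "feature_b": 0.4}])
```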

Erik_S
New Contributor II

Hi, thanks for the response.

What I'm looking for specifically is to call that function from a SQL Workspace dashboard, not a notebook dashboard in the Data Science & Engineering workspace. I have an existing SQL Workspace Dashboard but as far as I know I can't import libraries or call APIs from the SQL queries that power those dashboards. Is what I'm asking for possible in the SQL Workspace?

Anonymous
Not applicable

@Erik Shilts:

In Databricks SQL Workspace, you cannot call custom functions containing trained ML models or access API endpoints directly from within a SQL query that powers a dashboard.
