Can I run a custom function that contains a trained ML model or access an API endpoint from within a SQL query in the SQL workspace?

Erik_S
New Contributor II

I have a dashboard and I'd like the ability to take the data from a query and then predict a result from a trained ML model within the dashboard. I was thinking I could possibly embed the trained model within a library that I then import to the SQL workspace and create a function to access it, or put it behind an API that a query could then access. However, I can't find documentation on either of these use cases so I'm not sure if it's possible.

If it were a linear model I could save it as a table, but I want to have access to other models.

Other ideas are welcome!

4 REPLIES

Anonymous
Not applicable

@Erik Shilts​ :

Yes, it is possible to use a trained ML model in a dashboard in Databricks. Here are a few approaches you could consider:

  1. Embed the model in a Python library and call it from SQL: Train your ML model in Python and save it as a binary file (e.g., using the pickle module). Then create a Python library that loads the saved model and exposes a prediction function. You can import this library in your Databricks notebook and use it to make predictions in your queries. Note that this approach requires an environment that supports calling Python functions from SQL (e.g., by registering the function as a UDF).
  2. Deploy the model as a REST API: You can deploy your trained model as a REST API, for example with MLflow Model Serving. Then you can call the API from your Databricks notebook to make predictions. This approach has the advantage of decoupling the model from the Databricks notebook and allowing it to be used by other applications as well.
  3. Use a pre-built ML library that supports SQL: Databricks supports a variety of ML libraries, including Spark MLlib and TensorFlow. You can train your model using one of these libraries and then apply it to data queried from SQL to make predictions in your dashboard.
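The first approach above can be sketched as follows. This is a minimal, hypothetical example: the stand-in `LinearModel` class replaces whatever trained estimator you actually have, and the function/table names are placeholders.

```python
import pickle

# Stand-in for a trained model (in practice, a scikit-learn or similar
# estimator that was fitted and then pickled after training).
class LinearModel:
    def __init__(self, coef, intercept):
        self.coef = coef
        self.intercept = intercept

    def predict(self, features):
        return sum(c * f for c, f in zip(self.coef, features)) + self.intercept

# Serialize the trained model once, after training; in practice this blob
# would be written to a file shipped inside the library.
blob = pickle.dumps(LinearModel(coef=[2.0, 0.5], intercept=1.0))

# Inside the library: load the model at import time so each prediction
# call does not re-read it.
_model = pickle.loads(blob)

def predict(f1: float, f2: float) -> float:
    """Plain Python prediction function, suitable for registering as a UDF."""
    return float(_model.predict([f1, f2]))

# In a notebook with a SparkSession, the function could then be registered
# and called from SQL (names hypothetical):
#   spark.udf.register("predict", predict, "double")
#   SELECT predict(f1, f2) FROM my_table
```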

Note that the choice of approach will depend on your specific use case and the capabilities of your Databricks notebook. For example, if you need to make predictions in real-time or at high scale, deploying the model as a REST API may be the most appropriate approach. If you're already using a Databricks ML library for your training and analysis, using that same library for making predictions may be the most efficient approach.
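The REST API approach (option 2) might look roughly like the sketch below, assuming an MLflow model serving endpoint that accepts the `dataframe_records` input format. The endpoint URL and token are placeholders you would replace with your workspace's values.

```python
import json
import urllib.request

# Hypothetical values -- substitute your workspace host, endpoint name,
# and a real access token.
ENDPOINT_URL = "https://<workspace-host>/serving-endpoints/<endpoint-name>/invocations"
TOKEN = "<databricks-personal-access-token>"

def build_scoring_request(rows):
    """Build the JSON body for a model serving 'invocations' call."""
    return json.dumps({"dataframe_records": rows}).encode("utf-8")

def score(rows):
    """POST feature rows to the serving endpoint and return its predictions."""
    req = urllib.request.Request(
        ENDPOINT_URL,
        data=build_scoring_request(rows),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["predictions"]

# Example call (requires a live endpoint):
#   score([{"f1": 1.0, "f2": 2.0}])
```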

Erik_S
New Contributor II

Hi, thanks for the response.

What I'm looking for specifically is to call that function from a SQL Workspace dashboard, not a notebook dashboard in the Data Science & Engineering workspace. I have an existing SQL Workspace Dashboard but as far as I know I can't import libraries or call APIs from the SQL queries that power those dashboards. Is what I'm asking for possible in the SQL Workspace?

Anonymous
Not applicable

@Erik Shilts​ :

In Databricks SQL Workspace, you cannot call custom functions containing trained ML models or access API endpoints directly from within a SQL query that powers a dashboard.
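A common workaround, generalizing the "save it as a table" idea from the original question, is to precompute predictions in a scheduled notebook or job and write them to a table that the SQL Workspace dashboard queries directly. A minimal sketch, with a stub standing in for the real trained model and hypothetical table/column names:

```python
import pickle

class _StubModel:
    """Stand-in for a trained estimator with a scikit-learn-style predict()."""
    def predict(self, feature_rows):
        return [sum(f) for f in feature_rows]

# In practice this would load the real pickled model artifact.
model = pickle.loads(pickle.dumps(_StubModel()))

def score_batch(rows):
    """Attach a 'prediction' column to each feature row."""
    preds = model.predict([[r["f1"], r["f2"]] for r in rows])
    return [dict(r, prediction=float(p)) for r, p in zip(rows, preds)]

# In a scheduled notebook job, the scored rows would then be written to a
# table the SQL Workspace dashboard can query (names hypothetical):
#   spark.createDataFrame(score_batch(rows)) \
#        .write.mode("overwrite").saveAsTable("dashboard_predictions")
```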

Kaniz
Community Manager

Hi @Erik Shilts​, we haven't heard from you since the last response from @Suteja Kanuri​, and I was checking back to see if the suggestions helped you.

If you have found a solution, please share it with the community, as it may help others.

Also, please don't forget to click the "Select As Best" button if the information provided helped resolve your question.
