Model serving with Serverless Real-Time Inference - How can I call the endpoint with a JSON file of raw text that needs to be transformed, and get predictions?

notsure
New Contributor

Hi!

I want to call the generated endpoint directly with a JSON file consisting of raw texts. Could this endpoint take the raw texts, transform them into vectors, and then output the predictions?

Is there a way to support this?
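
To make it concrete, below is roughly the kind of call I have in mind. This is only a sketch: the workspace URL, endpoint name, token, and the "text" field are placeholders, and the exact invocation path and payload schema may differ depending on the serving version.

```python
# What I'd like to do: send raw text as JSON and get predictions back.
# Workspace URL, endpoint name, token, and the "text" column are placeholders.
import requests

DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"
ENDPOINT_NAME = "text-classifier"          # placeholder endpoint name
TOKEN = "<personal-access-token>"

payload = {"dataframe_records": [
    {"text": "great product"},
    {"text": "terrible service"},
]}

response = requests.post(
    f"{DATABRICKS_HOST}/serving-endpoints/{ENDPOINT_NAME}/invocations",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    json=payload,
)
print(response.json())   # predictions returned by the served model
```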

Thanks in advance!!!

1 REPLY

Debayan
Esteemed Contributor III

Hi, the updated document is: https://docs.databricks.com/machine-learning/model-inference/serverless/serverless-real-time-inferen...

(as noted in the document above:

  • This documentation has been retired and might not be updated. The products, services, or technologies mentioned in this content are no longer supported.
  • The guidance in this article is for a previous preview version of the Serverless Real-Time Inference functionality. Databricks recommends you migrate your model serving workflows to the refreshed preview functionality. See Model serving with Serverless Real-Time Inference.)

For creating and managing endpoints, please follow: https://docs.databricks.com/machine-learning/model-inference/serverless/create-manage-serverless-end...

Please let us know if this helps.
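
Also, to have the endpoint accept raw text directly, the usual pattern is to package the text-to-vector transformation together with the model before logging it, so the served model does the vectorization itself. A minimal sketch, assuming scikit-learn and an MLflow pyfunc wrapper (the toy data, model names, and the "text" column are illustrative, not from your setup):

```python
# Minimal sketch (not from the docs above): log the text-to-vector step
# together with the classifier so the served model accepts raw text.
import mlflow
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Toy training data; real data would come from your own dataset.
texts = ["great product", "terrible service", "works well", "very bad"]
labels = [1, 0, 1, 0]

sk_pipeline = Pipeline([
    ("vectorizer", TfidfVectorizer()),     # raw text -> TF-IDF vectors
    ("classifier", LogisticRegression()),  # vectors -> prediction
])
sk_pipeline.fit(texts, labels)

class TextModel(mlflow.pyfunc.PythonModel):
    """Pyfunc wrapper: expects a DataFrame with a 'text' column."""
    def __init__(self, pipeline):
        self.pipeline = pipeline

    def predict(self, context, model_input: pd.DataFrame):
        return self.pipeline.predict(model_input["text"].tolist())

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="text_model",
        python_model=TextModel(sk_pipeline),
    )
```

With the transformation logged as part of the model, an endpoint created for it (per the create/manage doc above) can take a JSON payload of raw text, like the one shown in the question, and return predictions directly.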
