Generative AI
Explore discussions on generative artificial intelligence techniques and applications within the Databricks Community. Share ideas, challenges, and breakthroughs in this cutting-edge field.

I need sample code or a process that will help us dynamically select the prompt template

SandipCoder
New Contributor II

We need sample code or a process that will help us dynamically select a prompt template based on the prompt supplied as input through a legacy model serving endpoint.

Sandip Bhowmick
1 REPLY

mark_ott
Databricks Employee

To dynamically select a prompt template in Databricks based on the input prompt received through a legacy model serving endpoint, you can implement a Python function that maps incoming prompts to specific templates. This often involves using conditional logic or a registry of templates, and is best integrated with your endpoint scoring logic. Here is a sample code structure and process that can help achieve this:

Sample Code Using Python

python
# Define your prompt templates
prompt_templates = {
    "classification": "Classify the following input: {{input}}",
    "summarization": "Summarize this text: {{input}}",
    "default": "Respond to the user's request: {{input}}",
}

def select_template(input_prompt):
    """
    Selects a prompt template based on keywords in the input prompt.

    Args:
        input_prompt (str): The user's request or query.

    Returns:
        str: The selected prompt template with the input inserted.
    """
    if "classify" in input_prompt.lower():
        template = prompt_templates["classification"]
    elif "summarize" in input_prompt.lower():
        template = prompt_templates["summarization"]
    else:
        template = prompt_templates["default"]
    return template.replace("{{input}}", input_prompt)

# Example usage for an incoming request
incoming_prompt = "Please classify this sentence."
selected_prompt = select_template(incoming_prompt)
# Send selected_prompt to your model via the legacy serving endpoint

This logic can be extended with more sophisticated routing, such as matching on regular expressions, using a dictionary of patterns, or integrating with a registry service like MLflow's Prompt Registry.
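As one possible extension, the keyword checks can be replaced with an ordered registry of compiled regular expressions. The patterns, template strings, and function name below are illustrative, not a fixed API:

```python
import re

# Illustrative pattern-to-template registry; the first matching pattern wins.
pattern_registry = [
    (re.compile(r"\b(classify|categorize|label)\b", re.IGNORECASE),
     "Classify the following input: {{input}}"),
    (re.compile(r"\b(summarize|tl;dr|shorten)\b", re.IGNORECASE),
     "Summarize this text: {{input}}"),
]

DEFAULT_TEMPLATE = "Respond to the user's request: {{input}}"

def select_template_regex(input_prompt):
    """Return the first template whose pattern matches the prompt,
    with the user input substituted in."""
    for pattern, template in pattern_registry:
        if pattern.search(input_prompt):
            return template.replace("{{input}}", input_prompt)
    return DEFAULT_TEMPLATE.replace("{{input}}", input_prompt)
```

Because the registry is just a list of pairs, new routes can be added from a config file without touching the selection function.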

Process Overview

  • Store Prompt Templates: Define your templates in code, a config file, or use MLflow Prompt Registry for better manageability.

  • Detect Intent/Keyword: Analyze the incoming prompt for specific keywords or intents.

  • Select Template: Map the detected intent or keyword to the right template.

  • Render Final Prompt: Insert variables or structure the prompt appropriately.

  • Forward to Endpoint: Send the formatted prompt to your model endpoint for inference.
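The five steps above can be sketched end to end as follows. The templates are placeholders, and the `dataframe_split` request shape is an assumption about your model's signature (legacy serving endpoints accept several input formats, so adjust to match yours):

```python
import json

# Step 1: store templates (illustrative; could also come from a config file)
TEMPLATES = {
    "summarization": "Summarize this text: {{input}}",
    "default": "Respond to the user's request: {{input}}",
}

def detect_intent(prompt):
    """Step 2: crude keyword-based intent detection."""
    return "summarization" if "summarize" in prompt.lower() else "default"

def render_prompt(prompt):
    """Steps 3-4: select the matching template and insert the user input."""
    return TEMPLATES[detect_intent(prompt)].replace("{{input}}", prompt)

def build_request_body(prompt):
    """Step 5: wrap the rendered prompt in a JSON body for the endpoint.
    The dataframe_split shape shown here is an assumption; use whatever
    input format your served model expects."""
    return json.dumps({
        "dataframe_split": {
            "columns": ["prompt"],
            "data": [[render_prompt(prompt)]],
        }
    })
```

You would then POST this body, with your authorization header, to your legacy serving endpoint's invocations URL.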

This approach allows your Databricks model endpoint to handle a variety of tasks by intelligently selecting how to present user queries to the underlying model.

MLflow/Databricks Integration

You can also leverage MLflow's Prompt Registry and Python SDK to store, retrieve, and manage different versions of prompt templates programmatically, registering prompts by name and loading specific versions at serving time, or storing prompts in a Unity Catalog schema for more robust management.

This modular process will help your Databricks model serving endpoint remain flexible and responsive to a range of user inputs.