DatabricksTV
Community-produced videos to help you leverage Databricks in your Data & AI journey. Tune in to explore industry trends and real-world use cases from leading data practitioners.
Sravani-Vadali
Databricks Employee

Batch inference enables businesses to apply LLMs to large datasets all at once, rather than one record at a time as with real-time inference. Processing data in bulk provides cost efficiency, faster processing, and scalability. Common business uses of batch inference include information extraction, data transformation, and bulk content generation. In this video, Arthur Dooner, a Senior Specialist Solutions Architect at Databricks, covers the benefits of batch inference with AI functions such as ai_query(), using document summarization as an example. The ai_query() function lets users query machine learning models (custom or foundation models) served with Mosaic AI Model Serving. Arthur also briefly covers our latest feature, Agent Bricks: Information Extraction, a simple, no-code approach to building and optimizing domain-specific, high-quality AI agent systems for common AI use cases.
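
As a starting point, here is a minimal sketch of the kind of batch summarization ai_query() enables in Databricks SQL. The endpoint name, table, and column names below are illustrative placeholders, not values from the video; substitute your own serving endpoint and data.

```sql
-- Minimal sketch: batch-summarize every row of a table with ai_query().
-- 'databricks-meta-llama-3-1-70b-instruct' is an assumed Model Serving
-- endpoint; documents/document_text are hypothetical table and column names.
SELECT
  document_id,
  ai_query(
    'databricks-meta-llama-3-1-70b-instruct',
    CONCAT('Summarize the following document: ', document_text)
  ) AS summary
FROM documents;
```

Because the function runs inside a regular SQL query, the same pattern scales from a handful of rows to an entire table without any per-record orchestration code.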

► Speaker - @ArthurDooner https://www.linkedin.com/in/arthur-dooner-450b2b97/
