A technology company plans to deploy a single large pre-trained language model to serve three distinct functions: a customer support chatbot, a document summarizer, and a code generator. To optimize for efficiency, they want to avoid storing and maintaining multiple separate copies of the large model. Which approach best achieves this goal while keeping the original model's architecture and weights intact?
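The scenario above can be sketched in a few lines: one frozen model is shared across all three functions, and only the prompt template changes per task. This is a minimal illustration, not a real deployment; `call_model` is a hypothetical stand-in for whatever LLM inference API is in use.

```python
# One shared model, task-specific prompt templates (prompting, no retraining).
# `call_model` is a hypothetical stand-in for an LLM inference call.

PROMPTS = {
    "support":   "You are a helpful customer support agent. Reply to:\n{input}",
    "summarize": "Summarize the following document:\n{input}",
    "code":      "Write code that satisfies this request:\n{input}",
}

def build_prompt(task: str, user_input: str) -> str:
    """Select the template for the task and fill in the user's input."""
    return PROMPTS[task].format(input=user_input)

def serve(task: str, user_input: str, call_model) -> str:
    # The same frozen weights handle every task; only the prompt differs,
    # so a single copy of the model serves all three functions.
    return call_model(build_prompt(task, user_input))
```

Because no weights are modified, the company stores exactly one copy of the model and routes requests by prompt template alone.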
Tags
Ch.3 Prompting - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Efficiency of Parameter-Efficient Tuning
Match each description of a model adaptation method with its corresponding impact on the model's architecture and deployment efficiency.