Learn Before
Deployment Strategy for a New AI Assistant
A startup is launching a new AI-powered virtual assistant. They have developed a very large, state-of-the-art language model that provides highly accurate and nuanced responses. They are now deciding between two deployment strategies: serving the full model from powerful cloud servers, or compressing the model so it can run efficiently on users' devices. Evaluate the primary trade-offs the company must consider when choosing between these two options, and justify which strategy might be better suited to a product aiming for mass-market adoption.
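One way to ground the trade-off in this question is a back-of-envelope memory estimate. The sketch below uses hypothetical parameter counts and precisions (none of these numbers come from the card): a very large model's weights alone can exceed any consumer device's memory, while a smaller or lower-precision model fits comfortably.

```python
# Rough serving-memory estimate for a language model's weights.
# Parameter counts and byte widths below are illustrative assumptions.
def model_memory_gb(n_params, bytes_per_param):
    """Memory needed just to hold the weights, in gigabytes."""
    return n_params * bytes_per_param / 1e9

big = model_memory_gb(70e9, 2)   # e.g. 70B params at fp16 -> 140 GB (cloud only)
small = model_memory_gb(7e9, 1)  # e.g. 7B params at int8  ->   7 GB (device-feasible)
```

A gap this large is why the choice is rarely about accuracy alone: it also determines hardware cost, latency, and which devices can run the assistant at all.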
Tags
Data Science
Foundations of Large Language Models Course
Computing Sciences
Evaluation in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Ways to compress PTMs
A development team has created a large, high-performance language model for a new smartphone application that provides real-time text summarization. During user testing, they observe that while the summaries are highly accurate, the application is slow to respond and causes the phone's battery to drain rapidly. Which of the following strategies would be the most appropriate first step to address these specific performance issues on the device?
Deployment Strategy for a New AI Assistant
Deployment Challenges of Large Models
For any real-world application, applying compression techniques to a large pre-trained model is the optimal deployment strategy because it reduces model size and improves computational efficiency without compromising the model's performance.
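The trade-off behind the statement above can be made concrete with a toy sketch of symmetric int8 quantization (the weight values are made up for illustration): storing int8 codes instead of fp32 floats shrinks the weights roughly 4x, but dequantizing them reintroduces a small, nonzero reconstruction error, so compression is not automatically free of performance cost.

```python
# Toy symmetric int8 quantization: one shared scale per weight group.
def quantize_int8(weights):
    """Map floats to integers in [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.42, -1.3, 0.05, 0.99]   # pretend fp32 weights (4 bytes each)
q, scale = quantize_int8(weights)    # int8 codes (1 byte each): ~4x smaller
restored = dequantize(q, scale)
# Reconstruction error is bounded by half the quantization step, but not zero.
error = max(abs(w - r) for w, r in zip(weights, restored))
```

The largest-magnitude weight sets the scale, so outlier weights widen the quantization step and increase the error on everything else; real compression pipelines add tricks (per-channel scales, calibration, fine-tuning) to keep that loss small.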