Case Study

Optimizing a Language Model for Mobile Deployment

A company has developed a highly accurate, but computationally expensive, language model for summarizing complex technical reports. Its accuracy relies on providing it with a very long and detailed set of instructions (a prompt) along with each report. The company now wants to deploy a smaller, faster version of this summarization tool on a mobile app, where providing the long instructions for every request is not feasible.

Based on the principle of transferring knowledge from a prompted model into a new model's internal parameters, how could the company create this efficient, mobile-friendly model? Explain the role of the original model and the new model in this process, and why the new model would no longer need the lengthy instructions.
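The technique the question points toward is often called context (or prompt) distillation: the original, prompted model acts as a teacher whose output distributions are computed with the long instructions attached, and the new model acts as a student trained to reproduce those distributions from the report text alone, so the prompt's effect ends up encoded in its weights. Below is a minimal numerical sketch of the core training signal, using toy probability vectors rather than a real language model; the teacher distribution and all names here are illustrative assumptions, not the company's actual system.

```python
import numpy as np

def softmax(z):
    z = z - z.max()               # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kl(p, q):
    """KL divergence KL(p || q) between two probability vectors."""
    return float(np.sum(p * (np.log(p) - np.log(q))))

rng = np.random.default_rng(0)

# Hypothetical teacher: its next-token distribution is produced WITH the
# long instruction prompt prepended to the report.
teacher_probs = softmax(rng.normal(size=8))   # teacher(prompt + report)

# Student: sees only the report, no prompt. Its trainable logits must
# learn to match the teacher, internalizing the prompt's effect.
student_logits = np.zeros(8)                  # student(report)

lr = 1.0
losses = []
for _ in range(200):
    q = softmax(student_logits)
    losses.append(kl(teacher_probs, q))
    # Gradient of KL(p || softmax(s)) with respect to the logits s is (q - p).
    student_logits -= lr * (q - teacher_probs)

print(f"initial KL: {losses[0]:.4f}, final KL: {losses[-1]:.6f}")
```

After training, the student produces the teacher's "prompted" behavior without ever seeing the instructions, which is why the deployed mobile model no longer needs them at inference time. In practice the same loss would be summed over every token position of many report/summary pairs, with the teacher run once offline to generate the targets.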


Updated 2025-09-28


Tags: Ch.3 Prompting - Foundations of Large Language Models; Foundations of Large Language Models; Foundations of Large Language Models Course; Computing Sciences; Application in Bloom's Taxonomy; Cognitive Psychology; Psychology; Social Science; Empirical Science; Science