Multiple Choice

A research team wants to adapt a very large, pre-trained language model (with billions of parameters) to perform a new, specialized task, such as classifying medical reports. The team's primary constraint is a very limited computational budget, which makes it infeasible to update all of the model's original parameters. Which of the following training strategies best resolves this constraint while still effectively adapting the model to the new task?
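Given the chapter's focus on prompting, the constraint points toward parameter-efficient adaptation: freeze the pre-trained weights and train only a small number of new parameters, such as a soft prompt prepended to the input. Below is a minimal toy sketch of that idea, using a small NumPy linear map as a stand-in for the frozen billion-parameter model; all names, sizes, and the learning rate are illustrative assumptions, not a real implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a huge pre-trained model: one frozen linear layer.
W_frozen = rng.normal(size=(4, 4))
W_before = W_frozen.copy()  # kept only to verify the base model never changes

# The only trainable parameters: a small soft-prompt vector added to the input.
prompt = np.zeros(4)

def forward(x, p):
    # Frozen weights are used in the forward pass but never updated.
    return W_frozen @ (x + p)

x = rng.normal(size=4)       # one input example from the new task
target = rng.normal(size=4)  # its desired output

def loss(p):
    err = forward(x, p) - target
    return 0.5 * float(err @ err)

loss_initial = loss(prompt)

# Gradient descent on the prompt alone; W_frozen stays fixed throughout.
lr = 0.01
for _ in range(200):
    err = forward(x, prompt) - target
    grad = W_frozen.T @ err  # gradient of the squared error w.r.t. the prompt
    prompt -= lr * grad

loss_final = loss(prompt)
```

The key property the sketch demonstrates: the task loss drops while the (stand-in) pre-trained weights are untouched, which is what makes such methods feasible under a tight compute budget.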

Updated 2025-09-26

Tags

Ch.3 Prompting - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science