Analyzing Fine-Tuning Resource Requirements
A research lab is planning a fine-tuning project. They are considering two base models: Model A with 7 billion parameters and Model B with 70 billion parameters. They also have two potential datasets: Dataset X with 100,000 examples and Dataset Y with 1,000,000 examples. Analyze the factors that contribute to the computational cost of this project and identify which combination of model and dataset would be the most resource-intensive. Justify your reasoning.
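One way to reason about the combinations is to note that full fine-tuning cost scales roughly with model size times the number of training tokens. The sketch below compares the four combinations using the common ≈6 FLOPs-per-parameter-per-token rule of thumb; the heuristic, the single-epoch assumption, and the average tokens-per-example value are all assumptions, not figures from the question.

```python
# Rough fine-tuning cost comparison using the common ~6 * params * tokens
# FLOPs heuristic (an assumption; real cost also depends on epochs,
# precision, sequence lengths, and whether parameter-efficient methods are used).
MODELS = {"Model A": 7e9, "Model B": 70e9}          # parameter counts
DATASETS = {"Dataset X": 100_000, "Dataset Y": 1_000_000}  # example counts
TOKENS_PER_EXAMPLE = 512  # assumed average sequence length

def train_flops(params, examples, tok_per_ex=TOKENS_PER_EXAMPLE):
    """Approximate FLOPs for one epoch of full fine-tuning."""
    return 6 * params * examples * tok_per_ex

# Estimate every model/dataset pairing and find the most expensive one.
costs = {(m, d): train_flops(p, n)
         for m, p in MODELS.items()
         for d, n in DATASETS.items()}
most_expensive = max(costs, key=costs.get)
print(most_expensive)  # ('Model B', 'Dataset Y')
```

Under this heuristic, Model B with Dataset Y is about 100× more expensive than Model A with Dataset X (10× the parameters times 10× the examples), which matches the intuition that the largest model paired with the largest dataset is the most resource-intensive combination.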
Tags
Ch.4 Alignment - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Parameter-Efficient Methods for Mitigating Fine-Tuning Costs
Evaluating Fine-Tuning Project Feasibility
A machine learning team is fine-tuning a 70-billion parameter language model. They decide to double the size of their high-quality training dataset, from 500,000 examples to 1,000,000 examples. Which of the following best analyzes the primary driver for the substantial increase in computational cost for this project?