Parallelism in Distributed LLM Training
Goal of Parallel Processing: Linear Scalability
The primary objective of parallel processing in distributed training is to achieve linear scalability. This means that the system's throughput, measured as the number of samples processed per unit of time, should increase in direct proportion to the number of processing devices used.
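This goal can be quantified as scaling efficiency: the ratio of measured throughput to the throughput that perfect linear scaling would predict. The sketch below illustrates the arithmetic with hypothetical throughput numbers; the function name and figures are assumptions, not from the original.

```python
def scaling_efficiency(single_device_throughput: float,
                       n_devices: int,
                       measured_throughput: float) -> float:
    """Ratio of measured throughput to the ideal linear-scaling throughput.

    Under perfect linear scalability, N devices process N times as many
    samples per second as one device, so the ratio equals 1.0.
    """
    ideal_throughput = single_device_throughput * n_devices
    return measured_throughput / ideal_throughput

# Hypothetical example: one device handles 100 samples/s, and a cluster
# of 8 devices is measured at 720 samples/s rather than the ideal 800.
efficiency = scaling_efficiency(100.0, 8, 720.0)
print(f"{efficiency:.0%}")  # prints "90%"
```

In practice, communication overhead between devices keeps measured efficiency below 100%, which is why the parallelism strategies discussed in this chapter aim to minimize that overhead.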