Interpretation and Empirical Results of Performance Gap Recovered

A Performance Gap Recovered (PGR) of 1 signifies that weak-to-strong fine-tuning completely bridges the performance gap between the weak baseline and the strong ceiling, while a PGR of 0 means there is no improvement over the weak baseline. Empirical studies, such as the work of Burns et al., have shown that PGR can reach approximately 0.8 across a range of NLP classification tasks. Despite these promising results, achieving substantial weak-to-strong generalization remains a difficult objective that requires further research.
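Concretely, PGR is the fraction of the weak-to-strong gap that fine-tuning recovers: PGR = (weak-to-strong performance − weak performance) / (strong ceiling performance − weak performance). The minimal Python sketch below illustrates this computation; the function name and the accuracy values in the usage example are hypothetical, not taken from the source.

```python
def performance_gap_recovered(weak: float, weak_to_strong: float, strong_ceiling: float) -> float:
    """Fraction of the weak-to-strong performance gap recovered by fine-tuning.

    PGR = (weak_to_strong - weak) / (strong_ceiling - weak)

    Returns 1.0 when fine-tuning fully closes the gap and 0.0 when it
    offers no improvement over the weak baseline.
    """
    gap = strong_ceiling - weak
    if gap <= 0:
        raise ValueError("strong ceiling must exceed the weak baseline")
    return (weak_to_strong - weak) / gap


# Hypothetical accuracies on an NLP classification task:
# weak supervisor 0.70, strong ceiling 0.90, weak-to-strong model 0.86.
print(performance_gap_recovered(weak=0.70, weak_to_strong=0.86, strong_ceiling=0.90))  # ~0.8
```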


Tags

Foundations of Large Language Models

Ch.4 Alignment - Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences