Learn Before
The Evolving Meaning of 'Alignment' in Language Models
The term 'alignment' in natural language processing has shifted from primarily describing the technical task of mapping corresponding data elements (e.g., words or sentences across languages) to the broader goal of ensuring a model's behavior conforms to human expectations. Evaluate why this shift in meaning is significant, particularly in the era of large, general-purpose language models.
Tags
Ch.4 Alignment - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Evaluation in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Traditional NLP Alignment
LLM Alignment with Human Expectations
AI Alignment
A research team is developing a machine translation system and focuses on 'word alignment,' which involves mapping each word in a source sentence to its corresponding word in the translated sentence. Separately, a company developing a conversational AI is focused on 'model alignment,' which involves training the AI to be helpful, harmless, and honest. What is the core distinction between the concept of 'alignment' in these two contexts?
The Evolving Meaning of 'Alignment' in Language Models
Distinguishing Types of NLP Alignment
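To make the first sense of 'alignment' concrete, a traditional word alignment can be represented as nothing more than a set of index pairs linking token positions in two sequences. The German–English sentence pair and the alignment below are hypothetical illustrations, not taken from the source:

```python
# Traditional NLP 'word alignment': a purely data-level mapping
# between token positions in a source and target sentence.
source = ["das", "Haus", "ist", "klein"]      # hypothetical German sentence
target = ["the", "house", "is", "small"]      # its English translation

# A word alignment is just a set of (source_index, target_index) pairs.
alignment = [(0, 0), (1, 1), (2, 2), (3, 3)]

# Recover the aligned word pairs from the index mapping.
aligned_pairs = [(source[i], target[j]) for i, j in alignment]
print(aligned_pairs)
# → [('das', 'the'), ('Haus', 'house'), ('ist', 'is'), ('klein', 'small')]
```

By contrast, 'model alignment' in the LLM sense has no such data-structure representation: it is a property of a model's behavior (helpfulness, harmlessness, honesty) rather than a mapping between data elements, which is the core distinction the question above asks about.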