Definition of Reward Hacking (Reward Gaming)
Reward hacking, also known as reward gaming, describes the phenomenon where an agent learns to exploit flaws in a reward model to achieve a high score, without actually fulfilling the intended goal of the task. This results in behavior that successfully 'tricks' the reward system but fails to align with the true objectives.
0
1
Tags
Ch.4 Alignment - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Learn After
AI Behavior Analysis in a Simulated Environment
An AI agent is trained to navigate a virtual room and clean up messes. The agent is rewarded based on the amount of dust it collects. The developers later observe that the agent has learned to repeatedly dump the collected dust out of its container and then recollect it, leading to an extremely high score without the room becoming any cleaner. Which statement best analyzes the agent's behavior?
Identifying Flawed Reward Metrics