Multiple Choice

A model designer implements a positional bias using the formula Bias = -β·log(1 + β), where β is a positive value that increases with token distance. The goal is to penalize attention to more distant tokens. By mistake, the designer forgets the leading negative sign, implementing the formula as Bias = β·log(1 + β). What is the most likely effect of this error on the model's behavior?
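The sign flip can be checked numerically. Below is a minimal sketch, assuming β = scale · distance as a concrete (hypothetical) way to make β grow with token distance; the question itself only states that β is positive and increasing. The intended bias -β·log(1 + β) decreases monotonically with distance, penalizing far tokens, while the buggy +β·log(1 + β) increases with distance, rewarding them instead.

```python
import math

def positional_bias(distance, scale=1.0, sign=-1.0):
    """Bias added to an attention logit for a token `distance` steps away.

    beta = scale * distance is an illustrative assumption; the question
    only says beta is positive and grows with distance. The intended
    formula is bias = -beta * log(1 + beta).
    """
    beta = scale * distance  # beta is positive and increases with distance
    return sign * beta * math.log1p(beta)

# Intended sign: bias becomes more negative as distance grows,
# so distant tokens are penalized in the attention softmax.
correct = [positional_bias(d) for d in range(5)]

# Buggy version (missing minus sign): bias grows with distance,
# so the model attends MORE to distant tokens, not less.
buggy = [positional_bias(d, sign=+1.0) for d in range(5)]

assert all(correct[i] > correct[i + 1] for i in range(4))  # strictly decreasing
assert all(buggy[i] < buggy[i + 1] for i in range(4))      # strictly increasing
```

Since the bias is added to attention logits before the softmax, a bias that grows with distance shifts probability mass toward far-away tokens, inverting the designer's intent.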

Updated 2025-10-04

Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Computing Sciences

Foundations of Large Language Models Course

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science