Definition

Notation for a Set of Indexed Variables

The notation $\{x_0, x_1, \dots, x_m\}$ represents a finite set of variables. Each variable $x_i$ is distinguished by a numerical subscript, or index, $i$, which ranges from 0 to $m$ (so the set contains $m + 1$ elements). This notation is commonly used to denote the inputs to a function or model, such as features in a dataset or tokens in a sequence.
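To illustrate, the zero-based indexing convention maps directly onto sequences in code. A minimal Python sketch, using a made-up token list:

```python
# Hypothetical example: the variables {x_0, x_1, ..., x_m} as tokens in a sequence.
tokens = ["The", "cat", "sat", "on", "the", "mat"]

# Indices range from 0 to m, so a sequence of m + 1 tokens has m = len - 1.
m = len(tokens) - 1

# Each x_i is recovered by indexing: x_0 is the first token, x_m the last.
x_0, x_m = tokens[0], tokens[m]
print(m, x_0, x_m)
```

The same convention applies whether the $x_i$ are dataset features, sequence tokens, or any other indexed inputs.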


Updated 2025-10-12


Tags

Ch.1 Pre-training - Foundations of Large Language Models


Computing Sciences