Definition

Tokens vs. Words in NLP

In Natural Language Processing (NLP), tokens are the fundamental units of text produced by a process called tokenization. Although the terms 'token' and 'word' are often used interchangeably for simplicity, they have distinct meanings: tokens are the units a model actually processes, and a single token may correspond to a whole word, a subword fragment, a single character, or a punctuation mark. For example, a subword tokenizer might split 'unhappiness' into the pieces 'un', 'happi', and 'ness'.
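To make the token/word distinction concrete, here is a minimal sketch of greedy longest-match subword tokenization, in the spirit of WordPiece. The vocabulary and the matching rule are simplified assumptions for illustration; real tokenizers (e.g., BPE or WordPiece) learn vocabularies of tens of thousands of pieces from large corpora.

```python
def tokenize(word: str, vocab: set[str]) -> list[str]:
    """Greedily split a word into the longest matching vocabulary pieces."""
    tokens = []
    start = 0
    while start < len(word):
        end = len(word)
        # Shrink the candidate piece until it appears in the vocabulary.
        while end > start and word[start:end] not in vocab:
            end -= 1
        if end == start:
            # No piece matches: fall back to a single character.
            tokens.append(word[start])
            start += 1
        else:
            tokens.append(word[start:end])
            start = end
    return tokens

# Hypothetical toy vocabulary, not taken from any real model.
vocab = {"un", "happi", "ness", "token", "ization", "s"}

print(tokenize("unhappiness", vocab))    # ['un', 'happi', 'ness']
print(tokenize("tokenizations", vocab))  # ['token', 'ization', 's']
```

Both example words are split into multiple tokens, which is why token counts and word counts generally differ; libraries such as Hugging Face's tokenizers implement the learned versions of these subword schemes used by large language models.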
