Token

What is a Token?

A token is the basic unit of text that an LLM processes. It can be a whole word, part of a word, or even a single punctuation mark. In English text, roughly 1,000 tokens correspond to about 750 words.
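A minimal sketch of the idea, using a toy regex-based splitter (real LLM tokenizers use learned subword vocabularies such as BPE, so their output differs):

```python
import re

def toy_tokenize(text):
    # Split into word-like runs and individual punctuation marks.
    # Illustration only: LLM tokenizers also split words into subword
    # pieces based on a learned vocabulary, which this does not do.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = toy_tokenize("LLMs don't read words, they read tokens!")
# → ['LLMs', 'don', "'", 't', 'read', 'words', ',', 'they', 'read', 'tokens', '!']
```

Note how the contraction and the punctuation each become separate tokens, which is why token counts run higher than word counts.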

Where did the term "Token" come from?

The term comes from lexical analysis, where a tokenizer splits raw text into discrete units before further processing. It has long been a fundamental concept in NLP and in compiler parsing.

How is "Token" used today?

Tokens are the standard unit for measuring an LLM's context window size and for metering API usage; providers typically price input (prompt) and output (completion) tokens separately.
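A back-of-the-envelope sketch of how token counts feed into pricing, using the 1,000-tokens-per-750-words rule of thumb from above (the per-token rate here is a made-up placeholder, not any provider's actual price):

```python
def estimate_tokens(word_count):
    # Rule of thumb: ~1,000 tokens per 750 English words.
    return round(word_count * 1000 / 750)

def estimate_cost(word_count, price_per_1k_tokens):
    # price_per_1k_tokens is a hypothetical rate for illustration.
    return estimate_tokens(word_count) / 1000 * price_per_1k_tokens

estimate_tokens(750)        # → 1000
estimate_cost(1500, 0.50)   # 2,000 tokens at $0.50/1K → 1.0
```

Actual billing uses the provider's tokenizer output, not word counts, so estimates like this are only for rough budgeting.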

Related Terms