A token is the basic unit of text that an LLM processes: a whole word, part of a word, or even a single punctuation mark. As a rule of thumb, 1,000 tokens correspond to roughly 750 English words.
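The 1,000-tokens-per-750-words rule of thumb can be sketched as a simple estimator. This is only a heuristic: the actual ratio varies by tokenizer, language, and text style, and the function name here is illustrative, not a real library API.

```python
def estimate_tokens(text: str) -> int:
    """Estimate token count from word count using the ~4/3 tokens-per-word heuristic."""
    word_count = len(text.split())
    # 1,000 tokens per ~750 words => about 1000/750 tokens per word.
    return round(word_count * 1000 / 750)

# 750 words should estimate to about 1,000 tokens.
sample = " ".join(["word"] * 750)
print(estimate_tokens(sample))  # → 1000
```

For precise counts, use the tokenizer that matches the model in question rather than a heuristic, since different models segment text differently.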
Tokens are a fundamental concept in NLP and parsing. They are also the standard unit for measuring context window size and for pricing API usage.