Tiktoken is a fast BPE (Byte Pair Encoding) tokenizer used with OpenAI's models, designed specifically to handle the tokenization needs of the GPT family.
It was created by OpenAI for use with models such as GPT-3.5 and GPT-4.
It is the standard tokenizer for OpenAI API users and for researchers working with GPT models.
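A minimal sketch of typical usage, assuming the `tiktoken` package is installed (`pip install tiktoken`). The `cl100k_base` encoding shown here is the one commonly associated with GPT-3.5/GPT-4; the example text is illustrative.

```python
import tiktoken

# Load the encoding used by GPT-3.5/GPT-4 (cl100k_base), either directly
# or looked up by model name via encoding_for_model.
enc = tiktoken.get_encoding("cl100k_base")
# enc = tiktoken.encoding_for_model("gpt-4")  # equivalent lookup by model name

text = "Tiktoken splits text into BPE tokens."
tokens = enc.encode(text)   # list of integer token IDs
print(tokens)
print(len(tokens), "tokens")

# Decoding maps the token IDs back to the original string.
print(enc.decode(tokens))
```

Counting tokens this way is useful for estimating API costs or checking that a prompt fits within a model's context window before sending it.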